hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
4790f3025bbf6b358e4391fc4f4796ff23c3baef | 4,791 | py | Python | test_autolens/point/test_point_source.py | Jammy2211/AutoLens | bc132a21d1a52248f08f198474e29f985e365d85 | [
"MIT"
] | 114 | 2018-03-05T07:31:47.000Z | 2022-03-08T06:40:52.000Z | test_autolens/point/test_point_source.py | Jammy2211/PyAutoLens | 728100a3bf13f89f35030724aa08593ab44e65eb | [
"MIT"
] | 143 | 2018-01-31T09:57:13.000Z | 2022-03-16T09:41:05.000Z | test_autolens/point/test_point_source.py | Jammy2211/AutoLens | bc132a21d1a52248f08f198474e29f985e365d85 | [
"MIT"
] | 33 | 2018-01-31T12:15:57.000Z | 2022-01-08T18:31:02.000Z | from os import path
import shutil
import os
import numpy as np
import autolens as al
def test__point_dataset_structures_as_dict():
point_dataset_0 = al.PointDataset(
name="source_1",
positions=al.Grid2DIrregular([[1.0, 1.0]]),
positions_noise_map=al.ValuesIrregular([1.0]),
)
point_dict = al.PointDict(point_dataset_list=[point_dataset_0])
assert point_dict["source_1"].name == "source_1"
assert point_dict["source_1"].positions.in_list == [(1.0, 1.0)]
assert point_dict["source_1"].positions_noise_map.in_list == [1.0]
assert point_dict["source_1"].fluxes == None
assert point_dict["source_1"].fluxes_noise_map == None
point_dataset_1 = al.PointDataset(
name="source_2",
positions=al.Grid2DIrregular([[1.0, 1.0]]),
positions_noise_map=al.ValuesIrregular([1.0]),
fluxes=al.ValuesIrregular([2.0, 3.0]),
fluxes_noise_map=al.ValuesIrregular([4.0, 5.0]),
)
point_dict = al.PointDict(point_dataset_list=[point_dataset_0, point_dataset_1])
assert point_dict["source_1"].name == "source_1"
assert point_dict["source_1"].positions.in_list == [(1.0, 1.0)]
assert point_dict["source_1"].positions_noise_map.in_list == [1.0]
assert point_dict["source_1"].fluxes == None
assert point_dict["source_1"].fluxes_noise_map == None
assert point_dict["source_2"].name == "source_2"
assert point_dict["source_2"].positions.in_list == [(1.0, 1.0)]
assert point_dict["source_2"].positions_noise_map.in_list == [1.0]
assert point_dict["source_2"].fluxes.in_list == [2.0, 3.0]
assert point_dict["source_2"].fluxes_noise_map.in_list == [4.0, 5.0]
assert (point_dict.positions_list[0] == np.array([1.0, 1.0])).all()
assert (point_dict.positions_list[1] == np.array([1.0, 1.0])).all()
def test__inputs_are_other_python_types__converted_correctly():
point_dataset_0 = al.PointDataset(
name="source_1", positions=[[1.0, 1.0]], positions_noise_map=[1.0]
)
point_dict = al.PointDict(point_dataset_list=[point_dataset_0])
assert point_dict["source_1"].name == "source_1"
assert point_dict["source_1"].positions.in_list == [(1.0, 1.0)]
assert point_dict["source_1"].positions_noise_map.in_list == [1.0]
assert point_dict["source_1"].fluxes == None
assert point_dict["source_1"].fluxes_noise_map == None
point_dataset_0 = al.PointDataset(
name="source_1",
positions=[(1.0, 1.0), (2.0, 2.0)],
positions_noise_map=[1.0],
fluxes=[2.0],
fluxes_noise_map=[3.0],
)
point_dict = al.PointDict(point_dataset_list=[point_dataset_0])
assert point_dict["source_1"].name == "source_1"
assert point_dict["source_1"].positions.in_list == [(1.0, 1.0), (2.0, 2.0)]
assert point_dict["source_1"].positions_noise_map.in_list == [1.0]
assert point_dict["source_1"].fluxes.in_list == [2.0]
assert point_dict["source_1"].fluxes_noise_map.in_list == [3.0]
def test__from_json_and_output_to_json():
point_dataset_0 = al.PointDataset(
name="source_1",
positions=al.Grid2DIrregular([[1.0, 1.0]]),
positions_noise_map=al.ValuesIrregular([1.0]),
)
point_dataset_1 = al.PointDataset(
name="source_2",
positions=al.Grid2DIrregular([[1.0, 1.0]]),
positions_noise_map=al.ValuesIrregular([1.0]),
fluxes=al.ValuesIrregular([2.0, 3.0]),
fluxes_noise_map=al.ValuesIrregular([4.0, 5.0]),
)
point_dict = al.PointDict(point_dataset_list=[point_dataset_0, point_dataset_1])
dir_path = path.join("{}".format(path.dirname(path.realpath(__file__))), "files")
if path.exists(dir_path):
shutil.rmtree(dir_path)
os.makedirs(dir_path)
file_path = path.join(dir_path, "point_dict.json")
point_dict.output_to_json(file_path=file_path, overwrite=True)
point_dict_via_json = al.PointDict.from_json(file_path=file_path)
assert point_dict_via_json["source_1"].name == "source_1"
assert point_dict_via_json["source_1"].positions.in_list == [(1.0, 1.0)]
assert point_dict_via_json["source_1"].positions_noise_map.in_list == [1.0]
assert point_dict_via_json["source_1"].fluxes == None
assert point_dict_via_json["source_1"].fluxes_noise_map == None
assert point_dict_via_json["source_2"].name == "source_2"
assert point_dict_via_json["source_2"].positions.in_list == [(1.0, 1.0)]
assert point_dict_via_json["source_2"].positions_noise_map.in_list == [1.0]
assert point_dict_via_json["source_2"].fluxes.in_list == [2.0, 3.0]
assert point_dict_via_json["source_2"].fluxes_noise_map.in_list == [4.0, 5.0]
| 38.637097 | 86 | 0.669798 | 732 | 4,791 | 4.031421 | 0.086066 | 0.137242 | 0.188072 | 0.177906 | 0.859031 | 0.822433 | 0.815995 | 0.788885 | 0.726533 | 0.705185 | 0 | 0.051106 | 0.179086 | 4,791 | 123 | 87 | 38.95122 | 0.699212 | 0 | 0 | 0.494505 | 0 | 0 | 0.086975 | 0 | 0 | 0 | 0 | 0 | 0.406593 | 1 | 0.032967 | false | 0 | 0.054945 | 0 | 0.087912 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
47aed2508a97b8e90dfb5b45b2e8255b3abb4533 | 1,699 | py | Python | blog/migrations/0011_auto_20210621_2239.py | timptner/farafmb.de | 2b154278d8b44ea3adecafcb8554c1b0b0055e01 | [
"MIT"
] | 1 | 2017-04-06T09:12:45.000Z | 2017-04-06T09:12:45.000Z | blog/migrations/0011_auto_20210621_2239.py | timptner/farafmb.de | 2b154278d8b44ea3adecafcb8554c1b0b0055e01 | [
"MIT"
] | 2 | 2017-09-07T22:09:50.000Z | 2020-06-09T14:46:30.000Z | blog/migrations/0011_auto_20210621_2239.py | timptner/farafmb.de | 2b154278d8b44ea3adecafcb8554c1b0b0055e01 | [
"MIT"
] | null | null | null | # Generated by Django 3.2.4 on 2021-06-21 20:39
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('blog', '0010_auto_20210609_2150'),
]
operations = [
migrations.AlterField(
model_name='document',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='image',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='link',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='post',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='protocol',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='snippet',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
migrations.AlterField(
model_name='video',
name='id',
field=models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID'),
),
]
| 34.673469 | 111 | 0.598587 | 176 | 1,699 | 5.602273 | 0.261364 | 0.085193 | 0.177485 | 0.205882 | 0.779919 | 0.779919 | 0.779919 | 0.779919 | 0.779919 | 0.779919 | 0 | 0.025306 | 0.278988 | 1,699 | 48 | 112 | 35.395833 | 0.779592 | 0.026486 | 0 | 0.666667 | 1 | 0 | 0.058111 | 0.013923 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.02381 | 0 | 0.095238 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
47e87b091ad53e4e470afbbf37fcbf5535e6d681 | 4,462 | py | Python | test/unit/test_iluminumd_data_shims.py | luminousdev0/sentinel | 10dacca9e16b58b8d4512c513a312efd46cfff11 | [
"MIT"
] | null | null | null | test/unit/test_iluminumd_data_shims.py | luminousdev0/sentinel | 10dacca9e16b58b8d4512c513a312efd46cfff11 | [
"MIT"
] | null | null | null | test/unit/test_iluminumd_data_shims.py | luminousdev0/sentinel | 10dacca9e16b58b8d4512c513a312efd46cfff11 | [
"MIT"
] | null | null | null | import pytest
import sys
import os
os.environ['SENTINEL_CONFIG'] = os.path.normpath(os.path.join(os.path.dirname(__file__), '../test_sentinel.conf'))
sys.path.append(os.path.normpath(os.path.join(os.path.dirname(__file__), '../../lib')))
import iluminumlib
@pytest.fixture
def sentinel_proposal_hex():
return '5b2270726f706f73616c222c207b22656e645f65706f6368223a20313439313032323830302c20226e616d65223a2022626565722d7265696d62757273656d656e742d37222c20227061796d656e745f61646472657373223a2022795965384b77796155753559737753596d4233713372797838585455753979375569222c20227061796d656e745f616d6f756e74223a20372e30303030303030302c202273746172745f65706f6368223a20313438333235303430302c202275726c223a202268747470733a2f2f6461736863656e7472616c2e636f6d2f626565722d7265696d62757273656d656e742d37227d5d'
@pytest.fixture
def sentinel_superblock_hex():
return '5b227375706572626c6f636b222c207b226576656e745f626c6f636b5f686569676874223a2036323530302c20227061796d656e745f616464726573736573223a2022795965384b77796155753559737753596d42337133727978385854557539793755697c795443363268755234595145506e39414a486a6e517878726548536267416f617456222c20227061796d656e745f616d6f756e7473223a2022357c33227d5d'
@pytest.fixture
def iluminumd_proposal_hex():
return '5b5b2270726f706f73616c222c207b22656e645f65706f6368223a20313439313336383430302c20226e616d65223a2022626565722d7265696d62757273656d656e742d39222c20227061796d656e745f61646472657373223a2022795965384b77796155753559737753596d4233713372797838585455753979375569222c20227061796d656e745f616d6f756e74223a2034392e30303030303030302c202273746172745f65706f6368223a20313438333235303430302c202274797065223a20312c202275726c223a202268747470733a2f2f7777772e6461736863656e7472616c2e6f72672f702f626565722d7265696d62757273656d656e742d39227d5d5d'
@pytest.fixture
def iluminumd_superblock_hex():
return '5b5b2274726967676572222c207b226576656e745f626c6f636b5f686569676874223a2036323530302c20227061796d656e745f616464726573736573223a2022795965384b77796155753559737753596d42337133727978385854557539793755697c795443363268755234595145506e39414a486a6e517878726548536267416f617456222c20227061796d656e745f616d6f756e7473223a2022357c33222c202274797065223a20327d5d5d'
# ========================================================================
def test_SHIM_deserialise_from_iluminumd(iluminumd_proposal_hex, iluminumd_superblock_hex):
assert iluminumlib.SHIM_deserialise_from_iluminumd(iluminumd_proposal_hex) == '5b2270726f706f73616c222c207b22656e645f65706f6368223a20313439313336383430302c20226e616d65223a2022626565722d7265696d62757273656d656e742d39222c20227061796d656e745f61646472657373223a2022795965384b77796155753559737753596d4233713372797838585455753979375569222c20227061796d656e745f616d6f756e74223a2034392e30303030303030302c202273746172745f65706f6368223a20313438333235303430302c202275726c223a202268747470733a2f2f7777772e6461736863656e7472616c2e6f72672f702f626565722d7265696d62757273656d656e742d39227d5d'
assert iluminumlib.SHIM_deserialise_from_iluminumd(iluminumd_superblock_hex) == '5b227375706572626c6f636b222c207b226576656e745f626c6f636b5f686569676874223a2036323530302c20227061796d656e745f616464726573736573223a2022795965384b77796155753559737753596d42337133727978385854557539793755697c795443363268755234595145506e39414a486a6e517878726548536267416f617456222c20227061796d656e745f616d6f756e7473223a2022357c33227d5d'
def test_SHIM_serialise_for_iluminumd(sentinel_proposal_hex, sentinel_superblock_hex):
assert iluminumlib.SHIM_serialise_for_iluminumd(sentinel_proposal_hex) == '5b5b2270726f706f73616c222c207b22656e645f65706f6368223a20313439313032323830302c20226e616d65223a2022626565722d7265696d62757273656d656e742d37222c20227061796d656e745f61646472657373223a2022795965384b77796155753559737753596d4233713372797838585455753979375569222c20227061796d656e745f616d6f756e74223a20372e30303030303030302c202273746172745f65706f6368223a20313438333235303430302c202274797065223a20312c202275726c223a202268747470733a2f2f6461736863656e7472616c2e636f6d2f626565722d7265696d62757273656d656e742d37227d5d5d'
assert iluminumlib.SHIM_serialise_for_iluminumd(sentinel_superblock_hex) == '5b5b2274726967676572222c207b226576656e745f626c6f636b5f686569676874223a2036323530302c20227061796d656e745f616464726573736573223a2022795965384b77796155753559737753596d42337133727978385854557539793755697c795443363268755234595145506e39414a486a6e517878726548536267416f617456222c20227061796d656e745f616d6f756e7473223a2022357c33222c202274797065223a20327d5d5d'
| 114.410256 | 586 | 0.935007 | 135 | 4,462 | 30.503704 | 0.266667 | 0.008742 | 0.015542 | 0.020398 | 0.10442 | 0.098106 | 0.098106 | 0.019913 | 0.019913 | 0.019913 | 0 | 0.68089 | 0.023084 | 4,462 | 38 | 587 | 117.421053 | 0.263822 | 0.016136 | 0 | 0.166667 | 0 | 0 | 0.77598 | 0.77051 | 0 | 1 | 0 | 0 | 0.166667 | 1 | 0.25 | false | 0 | 0.166667 | 0.166667 | 0.583333 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 10 |
9a04b838605814f10b9a085c34d8b05ebe9fe780 | 22,792 | py | Python | sdk/python/pulumi_vault/aws/auth_backend_role_tag.py | pulumi/pulumi-vault | 1682875f4a5d7d508f36e166529ad2b8aec34090 | [
"ECL-2.0",
"Apache-2.0"
] | 10 | 2019-10-07T17:44:18.000Z | 2022-03-30T20:46:33.000Z | sdk/python/pulumi_vault/aws/auth_backend_role_tag.py | pulumi/pulumi-vault | 1682875f4a5d7d508f36e166529ad2b8aec34090 | [
"ECL-2.0",
"Apache-2.0"
] | 79 | 2019-10-11T18:13:07.000Z | 2022-03-31T21:09:41.000Z | sdk/python/pulumi_vault/aws/auth_backend_role_tag.py | pulumi/pulumi-vault | 1682875f4a5d7d508f36e166529ad2b8aec34090 | [
"ECL-2.0",
"Apache-2.0"
] | 2 | 2019-10-28T10:08:40.000Z | 2020-03-17T14:20:55.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['AuthBackendRoleTagArgs', 'AuthBackendRoleTag']
@pulumi.input_type
class AuthBackendRoleTagArgs:
def __init__(__self__, *,
role: pulumi.Input[str],
allow_instance_migration: Optional[pulumi.Input[bool]] = None,
backend: Optional[pulumi.Input[str]] = None,
disallow_reauthentication: Optional[pulumi.Input[bool]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
max_ttl: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None):
"""
The set of arguments for constructing a AuthBackendRoleTag resource.
:param pulumi.Input[str] role: The name of the AWS auth backend role to read
role tags from, with no leading or trailing `/`s.
:param pulumi.Input[bool] allow_instance_migration: If set, allows migration of the underlying instances where the client resides. Use with caution.
:param pulumi.Input[str] backend: The path to the AWS auth backend to
read role tags from, with no leading or trailing `/`s. Defaults to "aws".
:param pulumi.Input[bool] disallow_reauthentication: If set, only allows a single token to be granted per instance ID.
:param pulumi.Input[str] instance_id: Instance ID for which this tag is intended for. If set, the created tag can only be used by the instance with the given ID.
:param pulumi.Input[str] max_ttl: The maximum TTL of the tokens issued using this role.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: The policies to be associated with the tag. Must be a subset of the policies associated with the role.
"""
pulumi.set(__self__, "role", role)
if allow_instance_migration is not None:
pulumi.set(__self__, "allow_instance_migration", allow_instance_migration)
if backend is not None:
pulumi.set(__self__, "backend", backend)
if disallow_reauthentication is not None:
pulumi.set(__self__, "disallow_reauthentication", disallow_reauthentication)
if instance_id is not None:
pulumi.set(__self__, "instance_id", instance_id)
if max_ttl is not None:
pulumi.set(__self__, "max_ttl", max_ttl)
if policies is not None:
pulumi.set(__self__, "policies", policies)
@property
@pulumi.getter
def role(self) -> pulumi.Input[str]:
"""
The name of the AWS auth backend role to read
role tags from, with no leading or trailing `/`s.
"""
return pulumi.get(self, "role")
@role.setter
def role(self, value: pulumi.Input[str]):
pulumi.set(self, "role", value)
@property
@pulumi.getter(name="allowInstanceMigration")
def allow_instance_migration(self) -> Optional[pulumi.Input[bool]]:
"""
If set, allows migration of the underlying instances where the client resides. Use with caution.
"""
return pulumi.get(self, "allow_instance_migration")
@allow_instance_migration.setter
def allow_instance_migration(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "allow_instance_migration", value)
@property
@pulumi.getter
def backend(self) -> Optional[pulumi.Input[str]]:
"""
The path to the AWS auth backend to
read role tags from, with no leading or trailing `/`s. Defaults to "aws".
"""
return pulumi.get(self, "backend")
@backend.setter
def backend(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backend", value)
@property
@pulumi.getter(name="disallowReauthentication")
def disallow_reauthentication(self) -> Optional[pulumi.Input[bool]]:
"""
If set, only allows a single token to be granted per instance ID.
"""
return pulumi.get(self, "disallow_reauthentication")
@disallow_reauthentication.setter
def disallow_reauthentication(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "disallow_reauthentication", value)
@property
@pulumi.getter(name="instanceId")
def instance_id(self) -> Optional[pulumi.Input[str]]:
"""
Instance ID for which this tag is intended for. If set, the created tag can only be used by the instance with the given ID.
"""
return pulumi.get(self, "instance_id")
@instance_id.setter
def instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_id", value)
@property
@pulumi.getter(name="maxTtl")
def max_ttl(self) -> Optional[pulumi.Input[str]]:
"""
The maximum TTL of the tokens issued using this role.
"""
return pulumi.get(self, "max_ttl")
@max_ttl.setter
def max_ttl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "max_ttl", value)
@property
@pulumi.getter
def policies(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The policies to be associated with the tag. Must be a subset of the policies associated with the role.
"""
return pulumi.get(self, "policies")
@policies.setter
def policies(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "policies", value)
@pulumi.input_type
class _AuthBackendRoleTagState:
def __init__(__self__, *,
allow_instance_migration: Optional[pulumi.Input[bool]] = None,
backend: Optional[pulumi.Input[str]] = None,
disallow_reauthentication: Optional[pulumi.Input[bool]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
max_ttl: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
role: Optional[pulumi.Input[str]] = None,
tag_key: Optional[pulumi.Input[str]] = None,
tag_value: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering AuthBackendRoleTag resources.
:param pulumi.Input[bool] allow_instance_migration: If set, allows migration of the underlying instances where the client resides. Use with caution.
:param pulumi.Input[str] backend: The path to the AWS auth backend to
read role tags from, with no leading or trailing `/`s. Defaults to "aws".
:param pulumi.Input[bool] disallow_reauthentication: If set, only allows a single token to be granted per instance ID.
:param pulumi.Input[str] instance_id: Instance ID for which this tag is intended for. If set, the created tag can only be used by the instance with the given ID.
:param pulumi.Input[str] max_ttl: The maximum TTL of the tokens issued using this role.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: The policies to be associated with the tag. Must be a subset of the policies associated with the role.
:param pulumi.Input[str] role: The name of the AWS auth backend role to read
role tags from, with no leading or trailing `/`s.
:param pulumi.Input[str] tag_key: The key of the role tag.
:param pulumi.Input[str] tag_value: The value to set the role key.
"""
if allow_instance_migration is not None:
pulumi.set(__self__, "allow_instance_migration", allow_instance_migration)
if backend is not None:
pulumi.set(__self__, "backend", backend)
if disallow_reauthentication is not None:
pulumi.set(__self__, "disallow_reauthentication", disallow_reauthentication)
if instance_id is not None:
pulumi.set(__self__, "instance_id", instance_id)
if max_ttl is not None:
pulumi.set(__self__, "max_ttl", max_ttl)
if policies is not None:
pulumi.set(__self__, "policies", policies)
if role is not None:
pulumi.set(__self__, "role", role)
if tag_key is not None:
pulumi.set(__self__, "tag_key", tag_key)
if tag_value is not None:
pulumi.set(__self__, "tag_value", tag_value)
@property
@pulumi.getter(name="allowInstanceMigration")
def allow_instance_migration(self) -> Optional[pulumi.Input[bool]]:
"""
If set, allows migration of the underlying instances where the client resides. Use with caution.
"""
return pulumi.get(self, "allow_instance_migration")
@allow_instance_migration.setter
def allow_instance_migration(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "allow_instance_migration", value)
@property
@pulumi.getter
def backend(self) -> Optional[pulumi.Input[str]]:
"""
The path to the AWS auth backend to
read role tags from, with no leading or trailing `/`s. Defaults to "aws".
"""
return pulumi.get(self, "backend")
@backend.setter
def backend(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "backend", value)
@property
@pulumi.getter(name="disallowReauthentication")
def disallow_reauthentication(self) -> Optional[pulumi.Input[bool]]:
"""
If set, only allows a single token to be granted per instance ID.
"""
return pulumi.get(self, "disallow_reauthentication")
@disallow_reauthentication.setter
def disallow_reauthentication(self, value: Optional[pulumi.Input[bool]]):
pulumi.set(self, "disallow_reauthentication", value)
@property
@pulumi.getter(name="instanceId")
def instance_id(self) -> Optional[pulumi.Input[str]]:
"""
Instance ID for which this tag is intended for. If set, the created tag can only be used by the instance with the given ID.
"""
return pulumi.get(self, "instance_id")
@instance_id.setter
def instance_id(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "instance_id", value)
@property
@pulumi.getter(name="maxTtl")
def max_ttl(self) -> Optional[pulumi.Input[str]]:
"""
The maximum TTL of the tokens issued using this role.
"""
return pulumi.get(self, "max_ttl")
@max_ttl.setter
def max_ttl(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "max_ttl", value)
@property
@pulumi.getter
def policies(self) -> Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]:
"""
The policies to be associated with the tag. Must be a subset of the policies associated with the role.
"""
return pulumi.get(self, "policies")
@policies.setter
def policies(self, value: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]]):
pulumi.set(self, "policies", value)
@property
@pulumi.getter
def role(self) -> Optional[pulumi.Input[str]]:
"""
The name of the AWS auth backend role to read
role tags from, with no leading or trailing `/`s.
"""
return pulumi.get(self, "role")
@role.setter
def role(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "role", value)
@property
@pulumi.getter(name="tagKey")
def tag_key(self) -> Optional[pulumi.Input[str]]:
"""
The key of the role tag.
"""
return pulumi.get(self, "tag_key")
@tag_key.setter
def tag_key(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tag_key", value)
@property
@pulumi.getter(name="tagValue")
def tag_value(self) -> Optional[pulumi.Input[str]]:
"""
The value to set the role key.
"""
return pulumi.get(self, "tag_value")
@tag_value.setter
def tag_value(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "tag_value", value)
class AuthBackendRoleTag(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
allow_instance_migration: Optional[pulumi.Input[bool]] = None,
backend: Optional[pulumi.Input[str]] = None,
disallow_reauthentication: Optional[pulumi.Input[bool]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
max_ttl: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
role: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Reads role tag information from an AWS auth backend in Vault.
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] allow_instance_migration: If set, allows migration of the underlying instances where the client resides. Use with caution.
:param pulumi.Input[str] backend: The path to the AWS auth backend to
read role tags from, with no leading or trailing `/`s. Defaults to "aws".
:param pulumi.Input[bool] disallow_reauthentication: If set, only allows a single token to be granted per instance ID.
:param pulumi.Input[str] instance_id: Instance ID for which this tag is intended for. If set, the created tag can only be used by the instance with the given ID.
:param pulumi.Input[str] max_ttl: The maximum TTL of the tokens issued using this role.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: The policies to be associated with the tag. Must be a subset of the policies associated with the role.
:param pulumi.Input[str] role: The name of the AWS auth backend role to read
role tags from, with no leading or trailing `/`s.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: AuthBackendRoleTagArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Reads role tag information from an AWS auth backend in Vault.
:param str resource_name: The name of the resource.
:param AuthBackendRoleTagArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(AuthBackendRoleTagArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
allow_instance_migration: Optional[pulumi.Input[bool]] = None,
backend: Optional[pulumi.Input[str]] = None,
disallow_reauthentication: Optional[pulumi.Input[bool]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
max_ttl: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
role: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = AuthBackendRoleTagArgs.__new__(AuthBackendRoleTagArgs)
__props__.__dict__["allow_instance_migration"] = allow_instance_migration
__props__.__dict__["backend"] = backend
__props__.__dict__["disallow_reauthentication"] = disallow_reauthentication
__props__.__dict__["instance_id"] = instance_id
__props__.__dict__["max_ttl"] = max_ttl
__props__.__dict__["policies"] = policies
if role is None and not opts.urn:
raise TypeError("Missing required property 'role'")
__props__.__dict__["role"] = role
__props__.__dict__["tag_key"] = None
__props__.__dict__["tag_value"] = None
super(AuthBackendRoleTag, __self__).__init__(
'vault:aws/authBackendRoleTag:AuthBackendRoleTag',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
allow_instance_migration: Optional[pulumi.Input[bool]] = None,
backend: Optional[pulumi.Input[str]] = None,
disallow_reauthentication: Optional[pulumi.Input[bool]] = None,
instance_id: Optional[pulumi.Input[str]] = None,
max_ttl: Optional[pulumi.Input[str]] = None,
policies: Optional[pulumi.Input[Sequence[pulumi.Input[str]]]] = None,
role: Optional[pulumi.Input[str]] = None,
tag_key: Optional[pulumi.Input[str]] = None,
tag_value: Optional[pulumi.Input[str]] = None) -> 'AuthBackendRoleTag':
"""
Get an existing AuthBackendRoleTag resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[bool] allow_instance_migration: If set, allows migration of the underlying instances where the client resides. Use with caution.
:param pulumi.Input[str] backend: The path to the AWS auth backend to
read role tags from, with no leading or trailing `/`s. Defaults to "aws".
:param pulumi.Input[bool] disallow_reauthentication: If set, only allows a single token to be granted per instance ID.
:param pulumi.Input[str] instance_id: Instance ID for which this tag is intended for. If set, the created tag can only be used by the instance with the given ID.
:param pulumi.Input[str] max_ttl: The maximum TTL of the tokens issued using this role.
:param pulumi.Input[Sequence[pulumi.Input[str]]] policies: The policies to be associated with the tag. Must be a subset of the policies associated with the role.
:param pulumi.Input[str] role: The name of the AWS auth backend role to read
role tags from, with no leading or trailing `/`s.
:param pulumi.Input[str] tag_key: The key of the role tag.
:param pulumi.Input[str] tag_value: The value to set the role key.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _AuthBackendRoleTagState.__new__(_AuthBackendRoleTagState)
__props__.__dict__["allow_instance_migration"] = allow_instance_migration
__props__.__dict__["backend"] = backend
__props__.__dict__["disallow_reauthentication"] = disallow_reauthentication
__props__.__dict__["instance_id"] = instance_id
__props__.__dict__["max_ttl"] = max_ttl
__props__.__dict__["policies"] = policies
__props__.__dict__["role"] = role
__props__.__dict__["tag_key"] = tag_key
__props__.__dict__["tag_value"] = tag_value
return AuthBackendRoleTag(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="allowInstanceMigration")
def allow_instance_migration(self) -> pulumi.Output[Optional[bool]]:
"""
If set, allows migration of the underlying instances where the client resides. Use with caution.
"""
return pulumi.get(self, "allow_instance_migration")
@property
@pulumi.getter
def backend(self) -> pulumi.Output[Optional[str]]:
"""
The path to the AWS auth backend to
read role tags from, with no leading or trailing `/`s. Defaults to "aws".
"""
return pulumi.get(self, "backend")
@property
@pulumi.getter(name="disallowReauthentication")
def disallow_reauthentication(self) -> pulumi.Output[Optional[bool]]:
"""
If set, only allows a single token to be granted per instance ID.
"""
return pulumi.get(self, "disallow_reauthentication")
@property
@pulumi.getter(name="instanceId")
def instance_id(self) -> pulumi.Output[Optional[str]]:
"""
Instance ID for which this tag is intended for. If set, the created tag can only be used by the instance with the given ID.
"""
return pulumi.get(self, "instance_id")
@property
@pulumi.getter(name="maxTtl")
def max_ttl(self) -> pulumi.Output[Optional[str]]:
"""
The maximum TTL of the tokens issued using this role.
"""
return pulumi.get(self, "max_ttl")
@property
@pulumi.getter
def policies(self) -> pulumi.Output[Optional[Sequence[str]]]:
"""
The policies to be associated with the tag. Must be a subset of the policies associated with the role.
"""
return pulumi.get(self, "policies")
@property
@pulumi.getter
def role(self) -> pulumi.Output[str]:
"""
The name of the AWS auth backend role to read
role tags from, with no leading or trailing `/`s.
"""
return pulumi.get(self, "role")
@property
@pulumi.getter(name="tagKey")
def tag_key(self) -> pulumi.Output[str]:
"""
The key of the role tag.
"""
return pulumi.get(self, "tag_key")
@property
@pulumi.getter(name="tagValue")
def tag_value(self) -> pulumi.Output[str]:
"""
The value to set the role key.
"""
return pulumi.get(self, "tag_value")
| 45.493014 | 169 | 0.649351 | 2,826 | 22,792 | 5.045294 | 0.061925 | 0.09258 | 0.07757 | 0.063263 | 0.86597 | 0.848085 | 0.831323 | 0.81947 | 0.806635 | 0.782999 | 0 | 0.000059 | 0.251667 | 22,792 | 500 | 170 | 45.584 | 0.835894 | 0.320244 | 0 | 0.745763 | 1 | 0 | 0.095968 | 0.045128 | 0 | 0 | 0 | 0 | 0 | 1 | 0.162712 | false | 0.00339 | 0.016949 | 0 | 0.277966 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
9a1a04d6c5a43933025680e6c35cf944cab4dfa2 | 11,426 | py | Python | SimModel_Python_API/simmodel_swig/Release/SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.py | EnEff-BIM/EnEffBIM-Framework | 6328d39b498dc4065a60b5cc9370b8c2a9a1cddf | [
"MIT"
] | 3 | 2016-05-30T15:12:16.000Z | 2022-03-22T08:11:13.000Z | SimModel_Python_API/simmodel_swig/Release/SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.py | EnEff-BIM/EnEffBIM-Framework | 6328d39b498dc4065a60b5cc9370b8c2a9a1cddf | [
"MIT"
] | 21 | 2016-06-13T11:33:45.000Z | 2017-05-23T09:46:52.000Z | SimModel_Python_API/simmodel_swig/Release/SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.py | EnEff-BIM/EnEffBIM-Framework | 6328d39b498dc4065a60b5cc9370b8c2a9a1cddf | [
"MIT"
] | null | null | null | # This file was automatically generated by SWIG (http://www.swig.org).
# Version 3.0.7
#
# Do not make changes to this file unless you know what you are doing--modify
# the SWIG interface file instead.
from sys import version_info
if version_info >= (2, 6, 0):
def swig_import_helper():
from os.path import dirname
import imp
fp = None
try:
fp, pathname, description = imp.find_module('_SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water', [dirname(__file__)])
except ImportError:
import _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water
if fp is not None:
try:
_mod = imp.load_module('_SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water', fp, pathname, description)
finally:
fp.close()
return _mod
_SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water = swig_import_helper()
del swig_import_helper
else:
import _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water
del version_info
try:
_swig_property = property
except NameError:
pass # Python < 2.2 doesn't have 'property'.
def _swig_setattr_nondynamic(self, class_type, name, value, static=1):
if (name == "thisown"):
return self.this.own(value)
if (name == "this"):
if type(value).__name__ == 'SwigPyObject':
self.__dict__[name] = value
return
method = class_type.__swig_setmethods__.get(name, None)
if method:
return method(self, value)
if (not static):
if _newclass:
object.__setattr__(self, name, value)
else:
self.__dict__[name] = value
else:
raise AttributeError("You cannot add attributes to %s" % self)
def _swig_setattr(self, class_type, name, value):
return _swig_setattr_nondynamic(self, class_type, name, value, 0)
def _swig_getattr_nondynamic(self, class_type, name, static=1):
if (name == "thisown"):
return self.this.own()
method = class_type.__swig_getmethods__.get(name, None)
if method:
return method(self)
if (not static):
return object.__getattr__(self, name)
else:
raise AttributeError(name)
def _swig_getattr(self, class_type, name):
return _swig_getattr_nondynamic(self, class_type, name, 0)
def _swig_repr(self):
try:
strthis = "proxy of " + self.this.__repr__()
except:
strthis = ""
return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)
try:
_object = object
_newclass = 1
except AttributeError:
class _object:
pass
_newclass = 0
try:
import weakref
weakref_proxy = weakref.proxy
except:
weakref_proxy = lambda x: x
import base
import SimFlowEnergyTransfer_ConvectiveHeater_Water
class SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water(SimFlowEnergyTransfer_ConvectiveHeater_Water.SimFlowEnergyTransfer_ConvectiveHeater):
__swig_setmethods__ = {}
for _s in [SimFlowEnergyTransfer_ConvectiveHeater_Water.SimFlowEnergyTransfer_ConvectiveHeater]:
__swig_setmethods__.update(getattr(_s, '__swig_setmethods__', {}))
__setattr__ = lambda self, name, value: _swig_setattr(self, SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water, name, value)
__swig_getmethods__ = {}
for _s in [SimFlowEnergyTransfer_ConvectiveHeater_Water.SimFlowEnergyTransfer_ConvectiveHeater]:
__swig_getmethods__.update(getattr(_s, '__swig_getmethods__', {}))
__getattr__ = lambda self, name: _swig_getattr(self, SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water, name)
__repr__ = _swig_repr
def SimFlowEnergyTrans_InNodeName(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_InNodeName(self, *args)
def SimFlowEnergyTrans_OutNodeName(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_OutNodeName(self, *args)
def SimFlowEnergyTrans_RatedAverageWaterTemp(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_RatedAverageWaterTemp(self, *args)
def SimFlowEnergyTrans_RatedWaterMassFlowRate(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_RatedWaterMassFlowRate(self, *args)
def SimFlowEnergyTrans_RatedCap(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_RatedCap(self, *args)
def SimFlowEnergyTrans_MaxWaterFlowRate(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_MaxWaterFlowRate(self, *args)
def SimFlowEnergyTrans_ConvergTol(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_ConvergTol(self, *args)
def SimFlowEnergyTrans_FracRadiant(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_FracRadiant(self, *args)
def SimFlowEnergyTrans_FractRadiantEnergycidentOnPeople(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_FractRadiantEnergycidentOnPeople(self, *args)
def SimFlowEnergyTrans_SurfName_1_100(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_SurfName_1_100(self, *args)
def SimFlowEnergyTrans_FractRadiantEnergyToSurf_1_20(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_SimFlowEnergyTrans_FractRadiantEnergyToSurf_1_20(self, *args)
def __init__(self, *args):
this = _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.new_SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water(*args)
try:
self.this.append(this)
except:
self.this = this
def _clone(self, f=0, c=None):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water__clone(self, f, c)
__swig_destroy__ = _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.delete_SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water
__del__ = lambda self: None
SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_swigregister = _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_swigregister
SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_swigregister(SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water)
class SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence(base.sequence_common):
__swig_setmethods__ = {}
for _s in [base.sequence_common]:
__swig_setmethods__.update(getattr(_s, '__swig_setmethods__', {}))
__setattr__ = lambda self, name, value: _swig_setattr(self, SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence, name, value)
__swig_getmethods__ = {}
for _s in [base.sequence_common]:
__swig_getmethods__.update(getattr(_s, '__swig_getmethods__', {}))
__getattr__ = lambda self, name: _swig_getattr(self, SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence, name)
__repr__ = _swig_repr
def __init__(self, *args):
this = _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.new_SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence(*args)
try:
self.this.append(this)
except:
self.this = this
def assign(self, n, x):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_assign(self, n, x)
def begin(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_begin(self, *args)
def end(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_end(self, *args)
def rbegin(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_rbegin(self, *args)
def rend(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_rend(self, *args)
def at(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_at(self, *args)
def front(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_front(self, *args)
def back(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_back(self, *args)
def push_back(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_push_back(self, *args)
def pop_back(self):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_pop_back(self)
def detach_back(self, pop=True):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_detach_back(self, pop)
def insert(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_insert(self, *args)
def erase(self, *args):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_erase(self, *args)
def detach(self, position, r, erase=True):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_detach(self, position, r, erase)
def swap(self, x):
return _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_swap(self, x)
__swig_destroy__ = _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.delete_SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence
__del__ = lambda self: None
SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_swigregister = _SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water.SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_swigregister
SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence_swigregister(SimFlowEnergyTransfer_ConvectiveHeater_Radiant_Water_sequence)
# This file is compatible with both classic and new-style classes.
| 49.678261 | 205 | 0.801505 | 1,119 | 11,426 | 7.615728 | 0.141198 | 0.395095 | 0.433701 | 0.482985 | 0.749472 | 0.699601 | 0.670617 | 0.637409 | 0.615114 | 0.541305 | 0 | 0.002946 | 0.138456 | 11,426 | 229 | 206 | 49.895197 | 0.862759 | 0.025731 | 0 | 0.292398 | 1 | 0 | 0.023737 | 0.009531 | 0 | 0 | 0 | 0 | 0 | 1 | 0.204678 | false | 0.011696 | 0.070175 | 0.169591 | 0.596491 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
9a398fd6cbd08cf409dcea20f0929afe9d858bdc | 168 | py | Python | cookbook/schema.py | ticotheps/graphene-django-sample | 76b38ab76da30b0d01b5728c1dad02ccd18fc118 | [
"MIT"
] | null | null | null | cookbook/schema.py | ticotheps/graphene-django-sample | 76b38ab76da30b0d01b5728c1dad02ccd18fc118 | [
"MIT"
] | null | null | null | cookbook/schema.py | ticotheps/graphene-django-sample | 76b38ab76da30b0d01b5728c1dad02ccd18fc118 | [
"MIT"
] | null | null | null | import graphene
import cookbook.ingredients.schema
class Query(cookbook.ingredients.schema.Query, graphene.ObjectType):
pass
schema = graphene.Schema(query=Query) | 24 | 68 | 0.815476 | 20 | 168 | 6.85 | 0.45 | 0.277372 | 0.364964 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.095238 | 168 | 7 | 69 | 24 | 0.901316 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
d0415b6ac543f7f4b09216cba98a6d672515d7f1 | 4,022 | py | Python | interpreter/env/builtin/arith.py | DaRubyMiner360/ParaCode | 02f18c3c23085d1c0108a958d05451724c91e41c | [
"MIT"
] | 1 | 2021-03-16T14:56:32.000Z | 2021-03-16T14:56:32.000Z | interpreter/env/builtin/arith.py | DaRubyMiner360/ParaCode | 02f18c3c23085d1c0108a958d05451724c91e41c | [
"MIT"
] | null | null | null | interpreter/env/builtin/arith.py | DaRubyMiner360/ParaCode | 02f18c3c23085d1c0108a958d05451724c91e41c | [
"MIT"
] | null | null | null | from interpreter.basic_value import BasicValue
def builtin_int_add(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs + rhs))
def builtin_int_sub(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs - rhs))
def builtin_int_mul(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs * rhs))
def builtin_int_expon(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs ** rhs))
def builtin_int_div(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs // rhs))
def builtin_int_bitor(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs | rhs))
def builtin_int_bitand(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs & rhs))
def builtin_int_bitxor(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs ^ rhs))
def builtin_int_mod(arguments):
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs % rhs))
def builtin_int_bitshiftleft(arguments):
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs << rhs))
def builtin_int_bitshiftright(arguments):
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(int(lhs >> rhs))
from interpreter.basic_value import BasicValue
def builtin_float_add(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs + rhs))
def builtin_float_sub(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs - rhs))
def builtin_float_mul(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs * rhs))
def builtin_float_expon(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs ** rhs))
def builtin_float_div(arguments):
interpreter = arguments.interpreter
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs / rhs))
def builtin_float_mod(arguments):
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs % rhs))
def builtin_float_bitshiftleft(arguments):
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs << rhs))
def builtin_float_bitshiftright(arguments):
lhs = arguments.arguments[0].extract_value()
rhs = arguments.arguments[1].extract_value()
return BasicValue(float(lhs >> rhs))
| 30.70229 | 48 | 0.717802 | 469 | 4,022 | 5.989339 | 0.059701 | 0.243503 | 0.142043 | 0.148807 | 0.9911 | 0.9911 | 0.9911 | 0.9911 | 0.954788 | 0.954788 | 0 | 0.011377 | 0.169567 | 4,022 | 130 | 49 | 30.938462 | 0.829641 | 0 | 0 | 0.582418 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208791 | false | 0 | 0.021978 | 0 | 0.43956 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
d0ac5e6a6c21a390c94d28edb2254aba9f7e6119 | 40 | py | Python | Python/Tests/TestData/Signatures/multilinesigs.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | Python/Tests/TestData/Signatures/multilinesigs.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | Python/Tests/TestData/Signatures/multilinesigs.py | nanshuiyu/pytools | 9f9271fe8cf564b4f94e9456d400f4306ea77c23 | [
"Apache-2.0"
] | null | null | null | def func(a, b):
pass
func(a,
b) | 8 | 16 | 0.475 | 8 | 40 | 2.375 | 0.625 | 0.526316 | 0.631579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.35 | 40 | 5 | 17 | 8 | 0.730769 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.25 | 0 | 0 | 0.25 | 0 | 1 | 1 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
efbd1099cb64a7c20c3b5d38099ec2cde57f8842 | 3,035 | py | Python | venv/Lib/site-packages/docutils/parsers/rst/include/isogrk4-wide.txt.py | roshanba/mangal | f7b428811dc07214009cc33f0beb665ead402038 | [
"bzip2-1.0.6",
"MIT"
] | null | null | null | venv/Lib/site-packages/docutils/parsers/rst/include/isogrk4-wide.txt.py | roshanba/mangal | f7b428811dc07214009cc33f0beb665ead402038 | [
"bzip2-1.0.6",
"MIT"
] | null | null | null | venv/Lib/site-packages/docutils/parsers/rst/include/isogrk4-wide.txt.py | roshanba/mangal | f7b428811dc07214009cc33f0beb665ead402038 | [
"bzip2-1.0.6",
"MIT"
] | null | null | null | XX XXXX XXXX XXXX XXX XXXX XXXXXX XX XXX XXXXXX XXXXXXX
XX XXXXXXX XXXX XXX XXXXXXX XXXXXXXXX XXXXXXXX XXXXXXXXX XXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XXXXXXXXX XX XXXXXXXXXXXXXXXXXXX XXXX XX XXXXXXXXX
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXX XXXXXX XXXXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXX XXXXX XXXXXX XXXXXXX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XX
XX XXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXX XXXXXX
XX XXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XX
XX XXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XX XXXXXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXX XXXXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX XXXXX
XX XXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXX
XX XXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXX
XX XXXXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXXX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XXXXXXX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXXXXX
XX XXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXXXX XX
XX XXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XX
XX XXXXXXXX XXXXXXXXX XXXXXXX XX XXXXXXXXXXXX XXXX XXXXX XXXX
| 60.7 | 70 | 0.833937 | 415 | 3,035 | 6.098795 | 0.031325 | 0.181351 | 0.305808 | 0.485974 | 0.904386 | 0.904386 | 0.885816 | 0.849072 | 0.824575 | 0.796918 | 0 | 0 | 0.166063 | 3,035 | 49 | 71 | 61.938776 | 1 | 0 | 0 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
efbdac562c1e012cce96fd3e8da51010a91d5001 | 180 | py | Python | cleverhans/future/jax/attacks/__init__.py | iamgroot42/cleverhans | 53da9cd6daf9d7457800831c3eaa75f729a39145 | [
"MIT"
] | 2 | 2021-06-02T03:08:14.000Z | 2021-07-02T03:32:17.000Z | cleverhans/future/jax/attacks/__init__.py | iamgroot42/cleverhans | 53da9cd6daf9d7457800831c3eaa75f729a39145 | [
"MIT"
] | null | null | null | cleverhans/future/jax/attacks/__init__.py | iamgroot42/cleverhans | 53da9cd6daf9d7457800831c3eaa75f729a39145 | [
"MIT"
] | 1 | 2020-10-29T14:13:21.000Z | 2020-10-29T14:13:21.000Z | from cleverhans.future.jax.attacks.fast_gradient_method import fast_gradient_method
from cleverhans.future.jax.attacks.projected_gradient_descent import projected_gradient_descent
| 60 | 95 | 0.911111 | 24 | 180 | 6.5 | 0.458333 | 0.179487 | 0.25641 | 0.294872 | 0.384615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044444 | 180 | 2 | 96 | 90 | 0.906977 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
efc173a24c1caad2013e7ccf7c829160163abe36 | 108 | py | Python | tests/__init__.py | rwaldron/hy | 57fa5c8127faf067e2fc02b886b22e334d4c027d | [
"MIT"
] | null | null | null | tests/__init__.py | rwaldron/hy | 57fa5c8127faf067e2fc02b886b22e334d4c027d | [
"MIT"
] | null | null | null | tests/__init__.py | rwaldron/hy | 57fa5c8127faf067e2fc02b886b22e334d4c027d | [
"MIT"
] | 1 | 2021-04-06T18:27:34.000Z | 2021-04-06T18:27:34.000Z | #
import hy # noqa
from .native_tests.math import * # noqa
from .native_tests.language import * # noqa
| 15.428571 | 44 | 0.703704 | 15 | 108 | 4.933333 | 0.533333 | 0.216216 | 0.378378 | 0.513514 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.203704 | 108 | 6 | 45 | 18 | 0.860465 | 0.12963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
efc61561aeea1110d2c54af2415d563d5320186b | 215 | py | Python | tmfeedback/homepage/views.py | Andrew-Whitaker/tm-feedback | dd9bc3cc3427fef6e6c06add4ade77def6a6f7ac | [
"MIT"
] | null | null | null | tmfeedback/homepage/views.py | Andrew-Whitaker/tm-feedback | dd9bc3cc3427fef6e6c06add4ade77def6a6f7ac | [
"MIT"
] | 6 | 2020-07-19T14:35:09.000Z | 2021-09-22T19:27:33.000Z | tmfeedback/homepage/views.py | Andrew-Whitaker/tm-feedback | dd9bc3cc3427fef6e6c06add4ade77def6a6f7ac | [
"MIT"
] | null | null | null | from django.shortcuts import render
from django.http import HttpResponse
def home(request):
return render(request, 'homepage/home.html')
def about(request):
return render(request, 'homepage/about.html')
| 19.545455 | 49 | 0.75814 | 28 | 215 | 5.821429 | 0.5 | 0.122699 | 0.233129 | 0.319018 | 0.417178 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139535 | 215 | 10 | 50 | 21.5 | 0.881081 | 0 | 0 | 0 | 0 | 0 | 0.172093 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
4be2234c3556eeb5c51f51f85bfe4ad8cfddc15d | 4,775 | py | Python | hw03.1/tests/unit/test_fields.py | dmryutov/otus-python-0319 | c825166ff16a2e26f3d99e375b13c9e2e8ada49b | [
"MIT"
] | null | null | null | hw03.1/tests/unit/test_fields.py | dmryutov/otus-python-0319 | c825166ff16a2e26f3d99e375b13c9e2e8ada49b | [
"MIT"
] | null | null | null | hw03.1/tests/unit/test_fields.py | dmryutov/otus-python-0319 | c825166ff16a2e26f3d99e375b13c9e2e8ada49b | [
"MIT"
] | null | null | null | from datetime import datetime
import unittest
import api
from ..utils import cases
class TestCharField(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.field = api.CharField()
@cases(['some text', ''])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases([12312, {}, []])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
@cases(['some text', None, []])
def test_to_python(self, value):
self.assertEqual(self.field.to_python(value), value)
class TestArgumentsField(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.field = api.ArgumentsField()
@cases([{'k': 123}, {}])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases(['str', 12312, []])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
class TestEmailField(TestCharField):
@classmethod
def setUpClass(cls):
cls.field = api.EmailField()
@cases(['user1@mail.ru'])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases([12312, {}, []])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
@cases(['some text', ''])
def test_invalid_format(self, value):
self.assertRaises(TypeError, self.field.validate, value)
class TestPhoneField(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.field = api.PhoneField()
@cases([71234567890, '71234567890'])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases([12.312, {}, []])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
@cases([7123456, '7123456789012345'])
def test_invalid_length(self, value):
self.assertRaises(ValueError, self.field.validate, value)
@cases([91234567890, '51234567890', 'abcdefghijk'])
def test_invalid_first_char(self, value):
self.assertRaises(ValueError, self.field.validate, value)
class TestDateField(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.field = api.DateField()
@cases(['01.01.2019'])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases([12312, {}, []])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
@cases(['01012019', '40.01.2019'])
def test_invalid_format(self, value):
self.assertRaises(ValueError, self.field.validate, value)
@cases([
['01.01.2019', datetime(2019, 1, 1)],
['01012019', '01012019'],
])
def test_to_python(self, value):
self.assertEqual(self.field.to_python(value[0]), value[1])
class TestBirthDayField(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.field = api.BirthDayField()
@cases(['01.01.2019'])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases([12312, {}, []])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
@cases(['01012019', '40.01.2019'])
def test_invalid_format(self, value):
self.assertRaises(ValueError, self.field.validate, value)
@cases(['01.01.1910', '01.01.2040'])
def test_invalid_age(self, value):
self.assertRaises(ValueError, self.field.validate, value)
class TestGenderField(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.field = api.GenderField()
@cases([0, 1, 2])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases(['some text', 12.312, {}, []])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
@cases([-1, 3, 1000])
def test_invalid_value(self, value):
self.assertRaises(ValueError, self.field.validate, value)
class TestClientIDsField(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.field = api.ClientIDsField()
@cases([[1, 2, 3]])
def test_ok(self, value):
self.assertIsNone(self.field.validate(value))
@cases(['some text', 12312, {}])
def test_invalid_type(self, value):
self.assertRaises(TypeError, self.field.validate, value)
@cases([[]])
def test_empty_value(self, value):
self.assertRaises(ValueError, self.field.validate, value)
@cases([[10, '45', 'sdfsdfs']])
def test_invalid_value(self, value):
self.assertRaises(TypeError, self.field.validate, value)
if __name__ == '__main__':
unittest.main()
| 28.254438 | 66 | 0.655288 | 554 | 4,775 | 5.545126 | 0.15343 | 0.087891 | 0.114258 | 0.179036 | 0.801758 | 0.797526 | 0.797526 | 0.785156 | 0.767904 | 0.626628 | 0 | 0.058028 | 0.195183 | 4,775 | 168 | 67 | 28.422619 | 0.741348 | 0 | 0 | 0.606557 | 0 | 0 | 0.048168 | 0 | 0 | 0 | 0 | 0 | 0.221311 | 1 | 0.286885 | false | 0 | 0.032787 | 0 | 0.385246 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ef03f4a29900045d793122eed1bf9b0db2b8448d | 170 | py | Python | portality/api/current/bulk/__init__.py | DOAJ/doaj | b11f163c48f51f9e3ada2b02c617b50b847dcb4c | [
"Apache-2.0"
] | 47 | 2015-04-24T13:13:39.000Z | 2022-03-06T03:22:42.000Z | portality/api/current/bulk/__init__.py | DOAJ/doaj | b11f163c48f51f9e3ada2b02c617b50b847dcb4c | [
"Apache-2.0"
] | 1,215 | 2015-01-02T14:29:38.000Z | 2022-03-28T14:19:13.000Z | portality/api/current/bulk/__init__.py | DOAJ/doaj | b11f163c48f51f9e3ada2b02c617b50b847dcb4c | [
"Apache-2.0"
] | 14 | 2015-11-27T13:01:23.000Z | 2021-05-21T07:57:23.000Z | # ~~APIBulk:Feature->API:Feature~~
from portality.api.current.bulk.applications import ApplicationsBulkApi
from portality.api.current.bulk.articles import ArticlesBulkApi | 56.666667 | 71 | 0.847059 | 20 | 170 | 7.2 | 0.6 | 0.180556 | 0.222222 | 0.319444 | 0.375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052941 | 170 | 3 | 72 | 56.666667 | 0.89441 | 0.188235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
ef5f65e76aae5a3da937dd2b8c9fb64ff9a73f37 | 122 | py | Python | build/lib/pycnnum/__init__.py | AdrianTeng/pycnnum | afe23c9c4e7203ba1a12e692aa8c9b643aab20e0 | [
"MIT"
] | null | null | null | build/lib/pycnnum/__init__.py | AdrianTeng/pycnnum | afe23c9c4e7203ba1a12e692aa8c9b643aab20e0 | [
"MIT"
] | null | null | null | build/lib/pycnnum/__init__.py | AdrianTeng/pycnnum | afe23c9c4e7203ba1a12e692aa8c9b643aab20e0 | [
"MIT"
] | 1 | 2021-01-19T06:05:06.000Z | 2021-01-19T06:05:06.000Z | name = 'pycnnum'
try:
from pycnnum import cn2num, num2cn
except ImportError:
from .pycnnum import cn2num, num2cn
| 17.428571 | 39 | 0.729508 | 15 | 122 | 5.933333 | 0.6 | 0.247191 | 0.382022 | 0.516854 | 0.651685 | 0 | 0 | 0 | 0 | 0 | 0 | 0.041237 | 0.204918 | 122 | 6 | 40 | 20.333333 | 0.876289 | 0 | 0 | 0 | 0 | 0 | 0.057377 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3270096dca2a8e188cf3a9c197136e61a8d6aa80 | 39,432 | py | Python | calculate_code.py | Haaroon/ether_parity_hack_Nov_2017 | 616ab274da32ad17f08b70ce137c4808286c2bde | [
"MIT"
] | null | null | null | calculate_code.py | Haaroon/ether_parity_hack_Nov_2017 | 616ab274da32ad17f08b70ce137c4808286c2bde | [
"MIT"
] | null | null | null | calculate_code.py | Haaroon/ether_parity_hack_Nov_2017 | 616ab274da32ad17f08b70ce137c4808286c2bde | [
"MIT"
] | null | null | null | __author__ = 'haaroony'
import requests
from bs4 import BeautifulSoup
#soup = BeautifulSoup(req.text)
#addr = soup.find_all('a', class_="address-tag")
#addresses = set()
#for a in addr:
# addresses.add(a.text)
addresses = {u'0x009f3de1e8878cda9c2e94a6ce6084d9ca86425c',
u'0x0106c3a3392376533db860be6ea418ea01fd51c5',
u'0x0200af2a76b163fb830c8fc5fdd1a408bf35fc61',
u'0x02265cb792a16a03ba32dbe52eff4c53fc7b19e5',
u'0x0285d5528f574f1361009eef75a4f61942767799',
u'0x033dbded32c28994c0d4a943a8c7982edc53f7c1',
u'0x038422e0057c4df8460b73920f28426b8f6cd452',
u'0x0397453bb7db560a039d474c5693578fdb6096c4',
u'0x043dae09e7f51d02b8745bcf82c4c5ee86e4bc96',
u'0x04c2c8b15cbdac0b5d18a19bd2acd704b0378fa0',
u'0x04fbee65aaf9d0d179de1ec22cd6c184b27e7ec1',
u'0x0557f85f4ff6da812e38df9beee476aa4a5a8e55',
u'0x057bff90e3de12c1b5d682412d5bb33d9dc7d6da',
u'0x05b34bf3562c61715f70240104abc6ae8c80055c',
u'0x05cf82965cc412494c5de53bf107ec631accf03e',
u'0x05e415a1626d9d6f7a79916021394c38f5bdd2e4',
u'0x0661daff3c17d04f9f844ad7031f94bd5a1a2209',
u'0x0760bfd74512ed0f143fcc67c957087ba787b3a6',
u'0x07847248ee1f7efe6a0f97caec2e4c675f0cbc65',
u'0x0881538f81a4092bf5a00462c1853a5f2a8b6fa5',
u'0x08a373919c3d99ba6ef5bb85a74811146ce9061b',
u'0x08ca68ecc2cc98f8ba6345531089899fc4c42f57',
u'0x0900d7ecfacdec21aca271b0133b9e387ece1321',
u'0x09d9b2f572f4c7c99631349f2dbad34273aea997',
u'0x09e23cade251c61866c480c247e77a9952d86823',
u'0x0b1a4994d8a0ad27289849dc883d8ae98ca0a45a',
u'0x0b2e21d33a3c6f722c791ae0ab054a0174f5be19',
u'0x0c12e4f17869a331d5169847dff198b1101f8bf9',
u'0x0d6c24d85680a89152012f9dc81e406183489c1f',
u'0x0da3cb3046f72fcbb49edf01b04ab6efc6c0d8dc',
u'0x0ddffcc435d637fd10bf6f962e50c005839a17be',
u'0x0e28db72b24fe9d5cfa5bd7e151541b48bbedab1',
u'0x0ed019efec4fe8516c4e0e9c629de30585dfcd35',
u'0x0f30c808069315b3b7dfbfe149c87448b50c6d8b',
u'0x0f6f59698719113cd8bec98bd87536a776808f81',
u'0x0f93cfe4096b701bf16c8d789ac6a0325bd23e95',
u'0x108d3d8a7a442da1a5c0f5d83c4c05ba04ed30af',
u'0x10e301560860db30dc1bc519a99aa860bc71f076',
u'0x10ec053e06ccc3cabb5c0ace8a1bf222ebc75159',
u'0x116a57abbea386dfdf8e6068d1b161d6fa8774a5',
u'0x11769ccab9fb7888db173965fc3883923594b832',
u'0x1426a8fdcd713132ea4784e905687f68240d5cfe',
u'0x142c10c90aa0a4dd588edf1ac54c3e959646cc2d',
u'0x144e7593edd1e9703c3ba49e5d98a84c055ba0b3',
u'0x145462828d693280a13f3b5ce4ee82fb70162318',
u'0x146f1708d8ab639f21edf029adde35c2b15c5e26',
u'0x14e5ec74e6b53b990d1ebf4bb740f0371d95d0b5',
u'0x15cdeeef5fb18058368e3d67fedee973c464ec39',
u'0x16a3daaa411b75a27ccf5281d52cadc01007853d',
u'0x171658295a93c8129e7e38c36172dd58ff3421c8',
u'0x1763edba84c15c8ddef46897154c292ee2853f56',
u'0x18b4092dee9ed759b0742608be8ad904957c3d08',
u'0x1947c2a678b7cbac00a75d6490ca7d6f8a4b0eda',
u'0x19986fcfbc5ef9b9e377fa8429c5a8d215cbe814',
u'0x199a4567ddbfa4426903e36b4752213ba1f34d64',
u'0x1afb16d06e76e39732728e186e519d99a156ef4e',
u'0x1b25a52f29ef9a0d64f31cb380323f0c3174b81d',
u'0x1b3de683a4ff93457b0a27986361a5090e3fbb50',
u'0x1bace398f996c2d8c14ef111f1f5d704f34c181f',
u'0x1bbd93c4a939da46490dd53f248b4b2967a639fa',
u'0x1c0e9b714da970e6466ba8e6980c55e7636835a6',
u'0x1cad47ca8f30a144d88b573480a78ca3c84e4abc',
u'0x1de748da20266aad571cc24af1c791240ae7f2e4',
u'0x1eb77714944dd6516ed63a1b6a0fa6e91dee140f',
u'0x2006df02a034359fd32e5bb7d64e07aca44b573a',
u'0x20db5d16771a4ebbb83a00cc27b784407a3bae97',
u'0x20dd16b295f83ab305f359a84d22cd99368755cf',
u'0x212d65b0d28d0fbc8ecdf1bdbba44897d8c8c459',
u'0x2178b33171b49484c22168091446495ffc3d431b',
u'0x2243f00fca59f3468033cf3cf2c4bb8534f154f9',
u'0x227b7656129bc07eef947d3c019a7a8f36a24e74',
u'0x228dee661be746e3492990930de3f60d955296ea',
u'0x22ef5434cc2deb6c760c7ebbc88777d1f32757f6',
u'0x23d1435e7ae402fde76b51926f1c7773cb1ac28b',
u'0x2431de5c5948fe71c57c9f271d97d44c1ea88c35',
u'0x258119b345a71bde55aeb67413c0c6e7e96c44bf',
u'0x25f602eb3497cfc37d70436513fca6df45a84181',
u'0x2711cb6a338d054d566f65432ca3f77581e09627',
u'0x2800ec80ea2b19fefb0f56e6070abf602f83afd1',
u'0x28877c4cc1a482378daf961937660e8d4ffeefa1',
u'0x28cd572413840ece39df8fab7f33baa8a176b4a4',
u'0x28ff414bb944b81053389f22113ad305c8ac69fa',
u'0x29cb6ad7b3eac8342438f2a40e7632a98964a0a1',
u'0x2b41c085a79a3bdf5494141cec188cfbbf70b2b9',
u'0x2b70211148e956955db8fd5beea4f634a5ad1744',
u'0x2ba2cd0540a692fa6402c0df5d384bbca5027ad3',
u'0x2bb433d6a7b6bea34f741c8038a8d42edef23756',
u'0x2c066eabb4bfbcd0caf19dd51d0a5234eb8ff6be',
u'0x2cecbecf7bde3e2485d2071c761a57cdfdd5f85f',
u'0x2e3efc824210201065663a56d92aa09ac5762da4',
u'0x2ee4ae9c1774d4e08934ecbf26121d758bc93965',
u'0x2f8d38c727ebac1daf6b42e15cdbe73cad0e2211',
u'0x2f9f02f2ba99ff5c750f95cf27d25352f71cd6a9',
u'0x306f50526ccb718d74184b23b917cc981a94f97c',
u'0x307d9191533274d93c7cfd4f8a505986248a094c',
u'0x31844f51c4e15dc5dd88ff4357ff90acecd5ff0c',
u'0x32a528762b6326ca0e2b314530d412f823a23d51',
u'0x3323f041f2e6445103dd7f287baedc3ebdf58f4a',
u'0x3331fc85bd9cafe7733c61187178310ef7751973',
u'0x33cc8a390496513f767f970c34e23ac4df8e89ef',
u'0x34210c6cc8c1debd1620ad65557f971cb52a12c7',
u'0x34335ca0a2525f6a3b38ce94f3d5524d33856191',
u'0x3467fd15e1d6356b6f74745d4b14aecec25ee6fb',
u'0x348df768093688bd8d2f706d2b770ff89400ee04',
u'0x34b8ae154641782c9e2933d720f31ac3c9a990f8',
u'0x35bd14e205251f3ee0405bc543ceac1d776e5736',
u'0x35dd33232bb08d749036d4ba87fa05352f1014bf',
u'0x3646da9d8e6cb67b0cf86af2c30c8b615d9bb9ce',
u'0x3647237212970569e356c939f6bfaa0821e39128',
u'0x371dd92422ecf0420fc419a0953f9be886be52d8',
u'0x37764fe50340f0158b9facefb3dbaf5222e34a3d',
u'0x37c6772be3e333e8acbc38521fb5090b0abe1a3b',
u'0x37e17c463fa8d98a8c3f33c72a2b65888c288c40',
u'0x38276b91330106451b3eab93c9a2b3fbd19d50be',
u'0x3853a8344c162cc19d2cad87ee7d1e6159f703ca',
u'0x394d8b3c5de759b8258376fde9b394c8f237d8c4',
u'0x39c92cae22c0648cd9382717b0b5ac944c81af14',
u'0x39d46c1824dfc32ad4e80c28a825296a8ac52437',
u'0x3bdb859ca3655417d054a594e011587b4ac46c35',
u'0x3bfc20f0b9afcace800d73d2191166ff16540258',
u'0x3e103db22424567f92d39b3f8b3c14231cdf3073',
u'0x3e4317888097067bf21361941dbde052e044c70e',
u'0x3fcb02a27dc60573a0cb9bff9528fcd77e78d734',
u'0x404f31181ad0700dab41488d7f1a36a40be4d9e1',
u'0x4051e94da0f345653b7decd9d20062b56715e90e',
u'0x4073404129aea005a661f09c38bc64908b27a746',
u'0x409487eb9a4aa50a3f3016f1b58166e8b339485e',
u'0x40dadf59b16d62d4d5ad68fa6d97341640ed915c',
u'0x41849f3bd33ced4a21c73fddd4a595e22a3c2251',
u'0x41cf890bba58ae36a7a37757aaed4c698ac9dfa1',
u'0x41fd44e2babd518e5b4cfbd8e0af38fbfdc13373',
u'0x428c131b323161f549bf61da2a434d1a3a920b0b',
u'0x42a3d814e6e3c25d20120b972e4d174ef76d93d2',
u'0x42ac09396496d2484114625078ed29205ba01244',
u'0x4324181aa7bbc103aa316e762118c72fc55db9e4',
u'0x43ab622752d766d694c005acfb78b1fc60f35b69',
u'0x43f663226d1a452f96136d4c9b0f68752b17fad0',
u'0x4405cdf409d270fa55f9a4020c3b5772bf1a1a10',
u'0x44e64e26bc024a51692e4e0195c45c98ed99ff9c',
u'0x4536cf9204565be19146c642aac0ffdee918dde9',
u'0x4585b138dc13925c65613a511fc1fc642d16d976',
u'0x4623913527a5511822e11490a91dade706f9d854',
u'0x46eb986afccd8e2edacfdc6aab13d81eba4ee99d',
u'0x470efa7764407de9c142745c8c36fd3150ab341b',
u'0x4753edd1b7cbeadf0738943bcec4967292aa411b',
u'0x4768aec11566841cf78c53e4e01a57ab5771f4b0',
u'0x47c663ba238fb5c66fa7ac92c33a86a41da261de',
u'0x4840516ca9e39391c6c77d1d7c89af11cb3d0ee8',
u'0x493f7decbf1e3da9765c4db06abb8c4daf4a7893',
u'0x49a0ffb48c8cbae349d20df2a7e8e74f6d228804',
u'0x49eafa4c392819c009eccdc8d851b4e3c2dda7d0',
u'0x4a24e33338dfd3c065da1485ce3e3c81f5af327f',
u'0x4af4adb2b2e6e499ba9e94a1deb2d7d68688213a',
u'0x4b22e669bddff412ba47e5de4fc38fab4d93ed04',
u'0x4c21b8b53e4c2000f2ab958a88de08b626d28a3b',
u'0x4d8006dc86d6015d5cb1f33c4e98ca12c39fcba2',
u'0x4de05b00797b11ae43e08ad0068fbd0689a0e041',
u'0x4ebcf8a133cce749ee07d4c764e10d1916f84f5c',
u'0x4f1ac562c120f0d47101ebe7909ce3023225b4bc',
u'0x4ff2aa45ac9975b87b410aaec49d98050a4074da',
u'0x518c463889a8866e19ea9d56ceee1f03cede38c4',
u'0x51a4cf4c8866d3663097b132c97bbc2192c0c350',
u'0x51d31282760c7178c7fe6d216ed3e151d00a1fc6',
u'0x52503c16fa494daba7c214519d833c8ff60c2edb',
u'0x52c7f6792b563e0e7739ab4bf921e6dfa049c5f4',
u'0x52d1caa1e63b3fcf8de34108663f89ed2cfc7874',
u'0x5311fce951684e46cefd804704a06c5133030dff',
u'0x53489e80e3647e762b70b4fb3b57b9ac5b7cc779',
u'0x5374282422da994a3f1f2ce38efc16c6dc42fa68',
u'0x5385c808d5a41df4a04e918defc5286559ea3972',
u'0x53ea709e81eefa48a311b2a582ad8057d45d4acc',
u'0x5483c2e726061fa518379820d863076aff39f7ea',
u'0x54f17794723fae852830b68d337f919cfd8649c7',
u'0x557c741c0af8826b9e3459ad456628646221afb9',
u'0x5618d755634d226204e925872c9d236b5627f957',
u'0x562ab60e521b0779b72d87bdcb3471296feb127e',
u'0x56b64e3c98685e8260ed46d23a720577a6b5c0b1',
u'0x56dbf0c4a4dcc8ec254b92c49e7b6b7b24d38efd',
u'0x57354a283cda64560867939d7579952fb1b3e5a9',
u'0x57e7a9668a60d9a6fad927c048051610ce5feb03',
u'0x57f4f696e46ef9d2371b21ff79f40c7823b6aa08',
u'0x58d7b92089b22422bd04e27d02146111adf53cd8',
u'0x5a1a5986a8c13cd723f8ed4ad9c4738387e7ac1b',
u'0x5aa393731c8cbb83c6c4dd9c5241c59c68c1dfa2',
u'0x5ab18d3b796bae844e243d0bc906b0209106c10a',
u'0x5bcf689874163503c5117fba7dda3c920becd769',
u'0x5bf4cb94818897be2d150773473265f380468019',
u'0x5ccba1eab776fc4d7cc89084c1825f5ffd87ffda',
u'0x5cef6ef48e2ee1c1c9aecf36a669d8a4eb6609da',
u'0x5d56defa47efe98a74342b551e0937058da280cb',
u'0x5e7abd3ce3b4aa69d9a8b75810e0388a0d9c65fa',
u'0x5e8d3d110015523dd0125a4bbeff584b226fbd55',
u'0x5ec8dcdcc9932c954077f5eabfcbb13beb6c0840',
u'0x5f3ce3907e7e4c5b5b8d04dd3211ca8b81a64733',
u'0x602ac8d3e6da55a7bc97d2e8bbfe13e60d0a8d2b',
u'0x614b748b4338e1e112759f8abe9bec5c64c252d0',
u'0x62577e292951fdcf7af3a38570aaf954d0a913a3',
u'0x629efa63717ffc6b621368617dcfe500ae040d5e',
u'0x62c00230b47cc17d6c9a871352568b3f4ffc5f1a',
u'0x633a12e0ccd8f2618de13ac87bf35d2d647266d5',
u'0x64397573d38897dd860cf8344ed55baf9a29e752',
u'0x64678668edc31af86023aa36bf1142ff810fec7d',
u'0x6492780dc59598c6f8a4984c6deffd4600ba0003',
u'0x64a23f1e128b9d8147c8cb0ce49490b4ab9e3c64',
u'0x64ba2d5cb84b90b7e8107552bf7301ca3c26b295',
u'0x64d9f57b88899bebfb17a3644749c237eeb9e525',
u'0x6524cbb0eae90069cebf4772a47166b600e20a62',
u'0x655402a4433233085c23bf10556c4ecd094d4e43',
u'0x6673446bd38e2ec301de13ee1aa01b343199fc5c',
u'0x669653151c8067a2d193d20f50cdaadff5eb3867',
u'0x66b88e6e950310627061c2d373954eed7ce1521a',
u'0x6747507e44b7a2c5c2bb9acc5199cd93c54455bc',
u'0x67c1c8063510f9f5d8e4c8af9e7512060c979cf3',
u'0x6924d3ad691e7f641ffe1c95aa6297e4c10e5e86',
u'0x69bb3d4d29249814845c8634a03ba3aa4165c933',
u'0x69c90fa30d03a82a64451e971c39882054b0ad5d',
u'0x6b9e38a26b3f2e4aeb7d31850adb16daa71b1c0e',
u'0x6ba3dbb0c41d3ffb3b3055fbcaca343442d69484',
u'0x6bf4e17e0f21dd8b36b5d08406ad706642fe6bcb',
u'0x6d71296a8d19c902985f0942e2a6fc78e2ffc6f1',
u'0x6e314220258a6fa41c2d50cd98f123ffff247d9e',
u'0x6e94bcfca0db7f91bdea4d2a1f515924f7ca21c6',
u'0x70fb2712bccb7a416b28b09951e5093eaf394966',
u'0x7100c7ce94607ef68983f133cfd59cc1833a115d',
u'0x71331c46fba44d85e293d63d1d5a8cdadf264451',
u'0x7273ebcd172ea59de951585fee3f3cd6749f277c',
u'0x728dbf45456de6b51b1227d5cd5e2507167688c0',
u'0x7304aa271d18f8ed3295bddb63ad5cbed2f47301',
u'0x7388fd3117ad58ca09d6655b00170bdb1f218db8',
u'0x7426c4ee3214f3be6d55314c5662322ac19fc44d',
u'0x7614ba4b95cc4f456cae349b94b8a6992d4818ea',
u'0x766b580356f40dd56d272b6941ebb02a6dee981d',
u'0x7693f7100a671d0cbfca63bd766fd698c17d6f04',
u'0x769512eed08245828c705a186a09709d0afb52e1',
u'0x78d653cde46185653dfac71464b000daa5eedf1f',
u'0x791ec02d41f9ce7f096b5f86d58b2bfa827c3eec',
u'0x79e8a847fccfc180a059d220bd50743412466454',
u'0x79f80f2c4c1c88027ef1824b5c33b2c3a4957d9f',
u'0x7b0f1fa8eb11f40b6e28822a9ffe3719301f4b58',
u'0x7b6bce3cf38ee602030662fa24ac2ed5a32d0a02',
u'0x7b7d2a5ec99d35ce636556e20898ad8328926c48',
u'0x7bd1a91a8b83a8f3939c704a4b44d1c4f6b2b83f',
u'0x7c922218294246fc1e8c99c737f87afd94361f4f',
u'0x7db3af869d3db2159cd89faa2848e2ea4cd36c1c',
u'0x7ea366ed75b557294486c7349d383a59c549fced',
u'0x7f2a7eaa423970c29570b6c67fd4c2e271e9dd81',
u'0x80285e16f04a37b01ff4f2b57fad3dedbb3d72f2',
u'0x809e38eb4b32ff260f20869c97ae7a1cbb9ce91e',
u'0x80c9d21293620f4f472e337dab1e5589cbe4645d',
u'0x80f6a38c0f71db64e28c770913715a97c7f5be82',
u'0x8113493fde5b23b87b2c4baec9bd78f4a70f28d5',
u'0x81158ddcdba6a6341027e771d72865d86666c100',
u'0x81822027f08cff852817d24e8be679d5201ac8a4',
u'0x81a7fbdc79ffd84daa53c6ba9bb1d7fdd85a1ccb',
u'0x81ac45c30cfc5478067498029233d6107ab3d93b',
u'0x8211a8c59d601f8205746081eeb6e02f589c2e1f',
u'0x8254c841544151fbfc656e2fd2e57f71aaadd592',
u'0x8266bcb06a66efe3150a6f3ddd4f43caeb566245',
u'0x82878f8cc60eaa2aa903c1409a8cda5cfd9edc16',
u'0x829778a21eabc3e4c6835689eea6eeb0857d1e03',
u'0x830389b854770e9102eb957379c6b70da4283d60',
u'0x8327f619e90d576407e5497106e546e29213cd0e',
u'0x83cd2f9f47744b9c3110dad55da1eb66c47ac163',
u'0x83f3159c3293fd4becb2a8fa3e545d33eaf7e1e7',
u'0x84b7cf537a44878db297fca310a2de246c0d40f9',
u'0x85c9397f39b87cb18c04f5c11170cdcf2485d8ec',
u'0x8644f9a3df787089ca76a41cf4158ad1991e80f4',
u'0x8655d6bf4abd2aa47a7a4ac19807b26b7609b61d',
u'0x870c2f61a0ea4b34a8db470c121ee5232bb5f260',
u'0x87c10b82320f7c8586851602b2430c0a09d29714',
u'0x87d6119051e3e84055ed6899325db292fef20344',
u'0x87db2b7144de20063153661fe626c9aee3d43800',
u'0x87f5b0d8f79182830248382a0b6aa2c86757f51c',
u'0x88370cad496cebc663a6b4b3127582ad2fcfc99a',
u'0x88c27b1c75b5e8d0391e08100153d38a94ed06ac',
u'0x89a90af92cc4048005cdf29db58cc6d42dc5a6e4',
u'0x8ab0bea750695b06b810e6fccc02bd758c90f9ec',
u'0x8ab3b07dd8c16d03124cc0ed77fbfd105d7b8f58',
u'0x8b1172e3ace1796dc2adf937ddb5d7e106789bb7',
u'0x8b3cb840c24ecd4f045f5cfb8ea14beec17b51e0',
u'0x8b910bb7a02af0e13eb199b728218850c731dbab',
u'0x8c53795efc5dca289a703ecb40f95934b1a92362',
u'0x8cfab485f4de196cc65073dc8d1196c7b6449940',
u'0x8d1d3e87e4807a4849348866ba7226758db85dae',
u'0x8d358fdc7e1ed7e69f7fb6d5daccc77282a2d737',
u'0x8da8bb9d085cd73b35da7f36be737f6d3fb8b0c2',
u'0x8f7d95abcab6775651dc35bfec346632affff9b0',
u'0x90c58937ff2de4759a71296868a18b5b33dc9faa',
u'0x92c5c001c7c33ee6071432e3d8aa91f5fe1e7a50',
u'0x94169c702cd55f42f73d45a4f490359692c4cbc0',
u'0x94535a9cee64a0258af01c8cc41cfaf7bfb58f76',
u'0x946d9f75f627a297054fd62cf35a433b899bbd17',
u'0x94bd4150e41c717b7e7564484693073239715376',
u'0x9523fb92ba88381c78975753c24afd1acef725c0',
u'0x95f49c4ac16563ebe09e98543cdc38a0265c1934',
u'0x96cf4b5448b47a2e29a8d8875c19791a6bb15aaa',
u'0x97695b2bb33736b7517303ac4be0863a4f0d7fe9',
u'0x97e285c10c1195421e965c3edeb235ce28c65e1c',
u'0x9cf6a3a9e65646a988e3dd16a5d382ba54cc7013',
u'0x9cfd4cbad59b0597bc567b450ec85e9d44f1c9dc',
u'0x9d1c821881647a4dbd7e6fafe38a45ed59947082',
u'0x9d26550cf519321984b59399caf1595bce8e2317',
u'0x9d8d4ff2b1dfb9a14e50e7d84952b6f14fcb8377',
u'0x9e5a4063237844f097f556c3e888bde76fcfbb22',
u'0x9f040a1adac8c601881e41ade9b5b0dd65ad0c1c',
u'0x9fe4e371dd53cd7bb588cd59faf0baf34fba9a80',
u'0xa00d54e9b7bbb2dad4094270520a12f3b28f1adf',
u'0xa0584912b347c682aa66a2b3f60f2e5e32aac6da',
u'0xa08c1134cdd73ad41889f7f914ecc4d3b30c1333',
u'0xa09484f00643f63a6211f75777c297551910c167',
u'0xa14703b1da572e3ddf4803113eb32159209199db',
u'0xa24975b43f25e25a31795b879e49617bf23c6657',
u'0xa31e1018badf408309e56e17080b1a47e768cf29',
u'0xa386560ac173a436c0c592272bb419c94cca8bc9',
u'0xa57b2cf597996a92c9967bd0f3e9d22f565b3a62',
u'0xa588465e80f3198eda3d68d4a607853abbc745cc',
u'0xa678be52ec645836c2fecdc08dfc403117748cd8',
u'0xa764c7e498711efa686e0d126a0ded7ffc453173',
u'0xa7bd27d13c95dc3fafc98a4c412ac0199a395353',
u'0xa7f2266d244e208e5e4231fb72420a3883a5f4fa',
u'0xa87a4b0465225a986b577de509038af022db33d4',
u'0xa8871d303c501c39deb2abe118691eeeea813e30',
u'0xa8df1a82c3602b7d4f2e35013715f48586bef9f7',
u'0xa9139277a57a86dbe1ab916e111b982f12ed7fdf',
u'0xa976626a522fde68abd9b773c2981d45c62e374e',
u'0xa9eebb32a1d459eb1eb5078c543427c34da44313',
u'0xaab0dceb238e7c54d46f895637299dbadd09104e',
u'0xab6b0f01ae694d8ab462df865cb432698b8eeb9d',
u'0xabfe9ee7512e2291b95666f5f0e9de1b43659e56',
u'0xac3c64644d3ca6f960308ed7cd79bfa794bf12b1',
u'0xaca612ba532c64e87c4dd4edda68b2161dc5f593',
u'0xad0d6a9c97d6d401a7e4444859f41f0606d07b62',
u'0xad418b2824bae8dd82fe7da87900aaafae4c5ed8',
u'0xad556c3f5de12a50afe9086a375a3014c6d7dec5',
u'0xadb1b1fded055f31a375c4f4e01c3e10fe396ead',
u'0xadc2f95304136c70dd5ec2ad1da1c87d182803d5',
u'0xadf189ba5aa503c8b3faf104e5175ee8d65b6a86',
u'0xaea2a12c382101c1e9e8bb6b6f14213aa97ad7a9',
u'0xaf85b07a6c61ed0317e61a8493db74a29c622ef1',
u'0xafe1c046029ac21f6b45462dde38f61d3cbc607a',
u'0xb03cc3fe8f21e0933eba76bc9817dfea3b33f3a3',
u'0xb374ad7c24e5ca585d43884d9a1f5ff8165d9bfd',
u'0xb4210e53e5b13cbfa29f72e4764f4c4b9a6ae72a',
u'0xb4923e52d5d1555d78559f372c02612b5ccc6530',
u'0xb4a2870e783d7b7b98b0a69eaa9926ef3b0e92da',
u'0xb4b867a8a5872c185dc65f9f756bec7e1f26a8db',
u'0xb5bda6ea026aad1013b277ccc403f297deb124a2',
u'0xb696bee07c81b57ef2832353a7e8b26f4c80cc65',
u'0xb69e024300f63c452ba0572405d53ca5e991dd77',
u'0xb6bd9c764effc12875e818bd3be827eb8cf7603a',
u'0xb78a748efa48b0c96779ec625e1d017531e3cb65',
u'0xb7e6dff80f694501d6c347486b0c08bca4f141e9',
u'0xb865823d4ce616620e76c2677800427f3a365efb',
u'0xb93e14d4f0ed692447dfa30672d974a7bdc364f3',
u'0xb950466511bb3dbb2b57f44694e7ec295b643e32',
u'0xbbc75b6812a908a3b3d5c6622a3cd1b496990f2b',
u'0xbd055fbd618f610bfca5075d57576675ddb676e3',
u'0xbd13904c10be5fb680e1f6f950bbd4a317d7098c',
u'0xbe17d91c518f1743aa0556425421d59de0372766',
u'0xbe52298db4ba17149b91b57bfb2d3f236f85b25d',
u'0xbe9a4452d319ae42b4b23eb55b46d882dc12e005',
u'0xbf357b557fcb2506041f52b475d91f3b6397cfb8',
u'0xbfbe471a1b31f1ee3a9700652736ec57269db290',
u'0xbfffb0522bcbc5e38be9325c204d883e05786a37',
u'0xc01283f05079d6a143c12079cecac8e3f966694c',
u'0xc0bffe3169967e18a07602bb5312eb8fda2c6b5c',
u'0xc1ad4aebc470176e5ecc2c87717096aa17e0af1a',
u'0xc1bef33095fc3ff1bfa38a193a028fd670751462',
u'0xc1d787c7a1a98b187c31362b588447e5a945fcbb',
u'0xc24eff02ded1c5bd133c285b9d368995dc8f15b7',
u'0xc32050abac7dbfef4fc8dc7b96d9617394cb4e1b',
u'0xc3501dad78f27c7147b65701c5da2d1d2a71285e',
u'0xc45f944b479880a71e976323ae8803e0fea69574',
u'0xc503e2a06eab6a7e4c61f7f80befb51e614655f2',
u'0xc6bbe0ad6032a93705e8552392c586f631164f2b',
u'0xc7cd9d874f93f2409f39a95987b3e3c738313925',
u'0xc89b38417b525c9fb3c3381e45ca3f1233dcb758',
u'0xc94be6acc592957291d787e802b0c932a76d671a',
u'0xc99f0813a601eb1e696f7c8e4d8d3cf197c95719',
u'0xca58f4f8687bc333cc44f41f3e04a3cc3137eb1f',
u'0xcaa3cfe68ae9fe20450926156ccd9671fc815280',
u'0xcb94c18f5c9cb3f6f7241d25b2d1b57ad1203808',
u'0xcbc29209108f329a170fedef0bb78cc2881203c0',
u'0xcc04ac9d65f3a629c764610da872718c6710b377',
u'0xcc2b7a67c9896e75fb0fde83482baead682dcbd7',
u'0xcd9e1a9ce2fb95c548d599da3a61de09342db4d5',
u'0xcda9eb698fc1966d0e92746b465ba07a6c6e981c',
u'0xcdbca5b8fa98d60843806014eabc6e4b785e4f7c',
u'0xce5574ff9d1fd16a411c09c488935f4fc613498c',
u'0xce8e7257b640cf9eef096b188e1345e96c4a80d9',
u'0xce9a6705a416aa6d5c488e5754f612c915e31bae',
u'0xcf27f7edddb7af3744a545c0f8fc5b27bc652fcc',
u'0xcf34dba1bf9b67210de159215190699d8f34e78b',
u'0xcf46cc20deba6b802707961ca3c6f3602566c2cf',
u'0xcf914d0dc4535261afc0b95324376a0fc381d7b6',
u'0xcfe56f64824c75dcfc2d9860cca088f287d4768b',
u'0xd1c56f6aaa42ea2f58909bbbd273e86225c35850',
u'0xd1e594199f020448a5d477ac807747cf753a89b0',
u'0xd31a34d621122bebe0dee360e33bbe61193d5b90',
u'0xd32db75141f77a1174ba3130f69f0aa002cb1ccc',
u'0xd341a30c9946f61cb90e3870ad3fa9019f981ae2',
u'0xd341f357138dc3d1488e203a0138de71f4e0de63',
u'0xd4abeea47c1034affb790a7f0237ef6b4a43ee5e',
u'0xd51f04d699a929c86695d0ef00973189dc928429',
u'0xd5ca44255601a3fa439dfeb715d4f3515548368f',
u'0xd64cea43b17e3e8801bf95ee1b53c0cdfaca95e1',
u'0xd68345e6cc0e821c6dd4154a42072c84c1536d25',
u'0xd6934a115916a7144f81d92fb8b61626a338c726',
u'0xd78331e9dd5b7dc506da403d37a43b1335fbdd11',
u'0xd7b349624208a406627010b7d440dd2f25ddd63a',
u'0xd7dfc49e5d13f77830029134fb06f5fa6d5e8ec4',
u'0xd8b9f5a8f4c79273a34876abd045eb0513f3b1f3',
u'0xd95a6aa3e20397211e487b231211e16790a21ac9',
u'0xda443a5da728e6f0e1aac46517dcdc7069821e37',
u'0xdb0e7d784d6a7ca2cbda6ce26ac3b1bd348c06f8',
u'0xdb46b29957b3021a5ea79c49f443083aba994a33',
u'0xdb890da7d1ff40f80d56bfce9edf210fd0efab9a',
u'0xdb9bf3575c9014065130b09d4febbc3fbb4b4227',
u'0xdc7f356bfa601aeaa96d79efad3e0eb505169572',
u'0xdcab43b6ef9dd156c54e1c4f055aa60e317c6f99',
u'0xdcca93041d97849ab486c76d3d9279eac1ee8298',
u'0xdd0c464c5ef163badeb6d3f4d71ed956771d599f',
u'0xdd21d75db9ed2fe97775ffa46e8fa1c8072cd15d',
u'0xde5e63efdbd02951c89e835a12dad5ff36fe5cd0',
u'0xde6d19362a97c72e9cfce5bb7c136bee9d70130b',
u'0xdf07bc8465d9e47150ce92c9b6d4c668f05de160',
u'0xdf665ab68393bd31cf244171ba57abdbe871f81d',
u'0xdf99823dac53d2d4b993f0391d8e88aa487c2f31',
u'0xdfc20878af74a424de3b0cbdfd42cfcde53932e2',
u'0xe01c0bdc8f2a8a6220a4bed665ceeb1d2c716bcb',
u'0xe0b93a625693a33221cf9bd534ae790ea59a9ba7',
u'0xe0fc954a459a039da40386533311ec878a0f6f81',
u'0xe1c251b56e3de227c45b9a43f89927fabdbad6ab',
u'0xe1db2fffbb97ca9c3cf9b46c382d41d85932b44d',
u'0xe28f9d0ddf33f4e788bc5918cb8f8ebebfa83467',
u'0xe2f30da7d29c64e05bae5848ee1185a272e6dd68',
u'0xe3a482efacc86b55cd60fa1ae07b658548e00c2e',
u'0xe4aa399ac8c2c636c3f084f8176c01c5c73ed90e',
u'0xe4bd8e54814c54244b93beb8c83e924fdcc1da4c',
u'0xe50058012b9a881d983cb5347c75354e055331e8',
u'0xe53d49a207ad911b53600f4a23dd34c3c6cf3a99',
u'0xe55c8759f2d619843319fe5d4058e180aa73ec52',
u'0xe5fea9091c533cbe0069248a56fd2226ec949d5a',
u'0xe60b077ddf5e1ddb5ffb8595069aae0e1e3af9f0',
u'0xe64bae6b0e8b89a1a3f8152dc3fb5e3910166689',
u'0xe64ff72d09515593bf11f43aa4ec6765ce4270c4',
u'0xe705daf2f65228aade8c8ac4f60a586b1391228d',
u'0xe7ccfc7ae9ad1e3514dabc6a7b888c81d82bfe7a',
u'0xe9a7e86799fe3a0cc538b8a09d18e39a9b87766e',
u'0xe9d31bc446c6a258250045080baa7f8576ab91f2',
u'0xe9d7d845388311e478be278bc2b48afed5bdadde',
u'0xea11357575e03a80651783877c1bea6553e5cebf',
u'0xeb011ebe376358de424adfbd7807ba4d1e424ce7',
u'0xeb0f8d2f1efbeeaecf5ef61b16f45e1a1c0082dc',
u'0xeb7f7d58e29f02b079157e8bacfac40ed211359b',
u'0xebf5206c23b66456aa1afa2a55591869568689df',
u'0xec2644ae44eee9b9f476ef8dda8d7efa2340ce37',
u'0xec51a8d16011460c7c0195052e9f898b6ac6d031',
u'0xec6ebd53ec5be5555968a8213a43a2bd9e0eb070',
u'0xec86ad3c96a242f5f108a081194a9c5650bd8701',
u'0xecf58466edce800c40468115350980e038497aa3',
u'0xed68e61b4b8c459fa77e82addc490972b8a3d102',
u'0xed89922d1fe9e5ad9b997ef315a4a14ba7b090cd',
u'0xeea89e8967902b786f00b6ecfbc5962ee59ef207',
u'0xef0613ab211cfb5eeb5a160b65303d6e927f3f85',
u'0xef5da7752c084df1cc719c64bbe06fa98b2c554c',
u'0xefa1994328e59f8e24d85458810d67a27289679a',
u'0xf1eae5c823956a1970712d1473dafc7b13d22ba4',
u'0xf22438c3c3c2f59bdac48a1a84db33e4fa82c701',
u'0xf22c1657f714983eb58cd9822a04314fba4908d6',
u'0xf256ab5436a72a0b0d690b5a2817a3f5b31714be',
u'0xf29a9f84e392095b4c45cd7df8af304fa77ce99c',
u'0xf2a6f6efe5d92605f6a8564caf39d16d5727fe07',
u'0xf2b1ac18de7b1f39bc1b4fb0bd548592efd056ca',
u'0xf353e8e64570858a0028648fbc0655e963fbe197',
u'0xf35ed576a74eac269380f36ee6f44ba64a57fce1',
u'0xf3d7524b8b4d1568b6f1fbd9d186525d51960512',
u'0xf44ddf4ebfb88c956ff0dd99e63255babf75f267',
u'0xf451de832887f56952e606392a6c9786d8629028',
u'0xf5706748b4d1489d54e17eba8bce0f86dd7ac862',
u'0xf590bbd25f47d7af634af1f37acbcc631b7363fe',
u'0xf5948877f4a351db245282af3cbec5791283b61c',
u'0xf6c68965cdc903164284b482ef5dfdb640d9e0de',
u'0xf6e51ae30705cd7248d4d9ac602cb58cc4b61a52',
u'0xf807a52f7f825e893d08f53104c124882e33b01d',
u'0xf82ea8a8e821461abdbaa988ab65948828a6389d',
u'0xfa524ac26ec6ec26a446172086344da9e75e14fa',
u'0xfbdf2c384ea983ae4ea18bace01b422494485096',
u'0xfc244c837f0df2b82c8995a3e334ec3845c8ac38',
u'0xfc7b0f63e1ad6b23482627ac798d59c1f585dd17',
u'0xfd2b3eb22bac1634f8b554a6d67fd11849dc3a0f',
u'0xfd41ac16c108852cea07debdf01880a54e7eaffe',
u'0xfd5e80260d7d7644ea4b1ca81d6992dead65f08b',
u'0xfd6a51dba51330ad60e15ca9c465b64dc656dd81',
u'0xfe3460f80961c466f2afe9a9291dc0c66a12e7c5',
u'0xfea0191dba1ca73f9728e904335ec98ad44203ef',
u'0xfeb7abbce1108d16278fb322d65265211bff6cc9',
u'0xff2cfdb1ab4067247c550f7e1a0c321e62558399',
u'0xff7bb940b38edf80e964ec756479d362fddf1532',
u'0xff911dacc99c7667896aab32efd59f01059f2a73'}
addressesSet = set()
for addr in addresses:
addressesSet.add(str(addr))
docsAddresses = {"0x1c0e9b714da970e6466ba8e6980c55e7636835a6":[299830.00,1000],
"0x227b7656129bc07eef947d3c019a7a8f36a24e74":[196418.63,655],
"0xa8871d303c501c39deb2abe118691eeeea813e30":[134923.50,450],
"0xc7cd9d874f93f2409f39a95987b3e3c738313925":[4939859.40,16475.53],
"0x3bfc20f0b9afcace800d73d2191166ff16540258":[91830814.71,306276.27],
"0xd7dfc49e5d13f77830029134fb06f5fa6d5e8ec4":[351691.80,1173.40],
"0xe705daf2f65228aade8c8ac4f60a586b1391228d":[102024.69,340.4],
"0x43ab622752d766d694c005acfb78b1fc60f35b69":[6503918.20,21704.33],
"0x71331c46fba44d85e293d63d1d5a8cdadf264451":[122760.21,409.665],
"0x0397453bb7db560a039d474c5693578fdb6096c4":[179346.00,600],
"0x39d46c1824dfc32ad4e80c28a825296a8ac52437":[118667.27,397],
"0x94bd4150e41c717b7e7564484693073239715376":[200776.16,671.6943575],
"0x22ef5434cc2deb6c760c7ebbc88777d1f32757f6":[119564.00,397],
"0x7693f7100a671d0cbfca63bd766fd698c17d6f04":[232869.78,779.0892447],
"0x376c3e5547c68bc26240d8dcc6729fff665a4448":[34355267.10,114939],
"0x35bd14e205251f3ee0405bc543ceac1d776e5736":[119560.00,400],
"0x47c663ba238fb5c66fa7ac92c33a86a41da261de":[177546.60,594],
"0xd341f357138dc3d1488e203a0138de71f4e0de63":[411605.49,1376.33],
"0x0da3cb3046f72fcbb49edf01b04ab6efc6c0d8dc":[753862.49,2520.77],
"0x4d8006dc86d6015d5cb1f33c4e98ca12c39fcba2":[107661.60,360],
"0x8655d6bf4abd2aa47a7a4ac19807b26b7609b61d":[897180.00,3000],
"0xa08c1134cdd73ad41889f7f914ecc4d3b30c1333":[97254.31,325.5],
"0x41849f3bd33ced4a21c73fddd4a595e22a3c2251":[968070.51,3237.04],
"0xa14703b1da572e3ddf4803113eb32159209199db":[179364.00,600],
"0xbe17d91c518f1743aa0556425421d59de0372766":[1303579.44,4360.67],
"0x28ff414bb944b81053389f22113ad305c8ac69fa":[99248.08,332],
"0xdb46b29957b3021a5ea79c49f443083aba994a33":[148600.00,500],
"0x3fcb02a27dc60573a0cb9bff9528fcd77e78d734":[466102.80,1568.31],
"0x49eafa4c392819c009eccdc8d851b4e3c2dda7d0":[1344825.13,4524.98],
"0x10e301560860db30dc1bc519a99aa860bc71f076":[4524.98,108478.00],
"0x37c6772be3e333e8acbc38521fb5090b0abe1a3b":[104020.00,350],
"0x7100c7ce94607ef68983f133cfd59cc1833a115d":[97343.68,327.535947],
"0x0881538f81a4092bf5a00462c1853a5f2a8b6fa5":[104911.60,353],
"0x05cf82965cc412494c5de53bf107ec631accf03e":[118582.80,399],
"0xf6e51ae30705cd7248d4d9ac602cb58cc4b61a52":[416080.00,1400.00],
"0xd95a6aa3e20397211e487b231211e16790a21ac9":[44722.49,150.4794447],
"0x7b6bce3cf38ee602030662fa24ac2ed5a32d0a02":[43036.00,144.8486948],
"0x2f9f02f2ba99ff5c750f95cf27d25352f71cd6a9":[95075.50,320.001],
"0xd31a34d621122bebe0dee360e33bbe61193d5b90":[420226.44,1416.10],
"0x05b34bf3562c61715f70240104abc6ae8c80055c":[468065.67,1577.31],
"0x428c131b323161f549bf61da2a434d1a3a920b0b":[148375.00,500],
"0xcf46cc20deba6b802707961ca3c6f3602566c2cf":[103869.92,350.025],
"0xe4aa399ac8c2c636c3f084f8176c01c5c73ed90e":[103558.00,350],
"0x6492780dc59598c6f8a4984c6deffd4600ba0003":[516902.36,1747],
"0x4ebcf8a133cce749ee07d4c764e10d1916f84f5c":[101731.84,342.8200312],
"0x728dbf45456de6b51b1227d5cd5e2507167688c0":[103862.50,350],
"0xef5da7752c084df1cc719c64bbe06fa98b2c554c":[102526.98,45.49952754],
"0x53ea709e81eefa48a311b2a582ad8057d45d4acc":[103988.50,350],
"0x0f30c808069315b3b7dfbfe149c87448b50c6d8b":[84897.09,285.7429561],
"0x7e5b6dd9ba1abf42bfb41e5ae8f46fe5e01aae14":[84675.31,284.9965155],
"0x66ea39aee3f4a2e39d2f28b397a4daf0bffafd89":[6750.75,22.72137362],
"0xdb0e7d784d6a7ca2cbda6ce26ac3b1bd348c06f8":[2058110.00,6925],
"0xc1bd4f07421571364617adce98a8d657f52498b7":[32236.34,108.4668393],
"0xa9eebb32a1d459eb1eb5078c543427c34da44313":[17205.30,57.89130953],
"0x2006df02a034359fd32e5bb7d64e07aca44b573a":[3094.55,10.41233802],
"0x8f7070b6b8e8ac245cc8735c32cccc12e178a99e":[2235.90,7.5232],
"0x009f3de1e8878cda9c2e94a6ce6084d9ca86425c":[3094.13,10.4109228],
"0x570f77473c329a5149fe5d5786d8759e38ed15be":[4458.00,15],
"0xbd13904c10be5fb680e1f6f950bbd4a317d7098c":[3148.16,10.59271815],
"0x7c922218294246fc1e8c99c737f87afd94361f4f":[1980.34,6.663314377],
"0xe0b93a625693a33221cf9bd534ae790ea59a9ba7":[1687.82,5.679066238],
"0xdcab43b6ef9dd156c54e1c4f055aa60e317c6f99":[3010.62,10.071],
"0x394d8b3c5de759b8258376fde9b394c8f237d8c4":[24089.85,80.58423587],
"0xc1bef33095fc3ff1bfa38a193a028fd670751462":[1554.37,5.199601654],
"0x1b3de683a4ff93457b0a27986361a5090e3fbb50":[2081.71,6.960849822],
"0x21675f1b593ac15c5585bca5e7778e4f391620bd":[870.51,2.91083644],
"0x37764fe50340f0158b9facefb3dbaf5222e34a3d":[1644.83,5.5],
"0x4073404129aea005a661f09c38bc64908b27a746":[1493.80,4.995],
"0x2f56c5f0b2548ce52fac5512b76eadbb2c511a7f":[1300.25,4.35012337],
"0x2f8d38c727ebac1daf6b42e15cdbe73cad0e2211":[7.42,0.02548275862],
"0x4de05b00797b11ae43e08ad0068fbd0689a0e041":[223574.73,768.8],
"0xc32050abac7dbfef4fc8dc7b96d9617394cb4e1b":[98931.68,340.2286337],
"0x5f3ce3907e7e4c5b5b8d04dd3211ca8b81a64733":[215955.51,741.4272475],
"0x19986fcfbc5ef9b9e377fa8429c5a8d215cbe814":[583400.00,2000],
"0x2f8d38c727ebac1daf6b42e15cdbe73cad0e2211":[7.4,0.02548275862],
"0x4de05b00797b11ae43e08ad0068fbd0689a0e041":[223574.73,768.8],
"0xc32050abac7dbfef4fc8dc7b96d9617394cb4e1b":[98931.68,340.2286337],
"0x5f3ce3907e7e4c5b5b8d04dd3211ca8b81a64733":[215955.51,741.4272475],
"0x19986fcfbc5ef9b9e377fa8429c5a8d215cbe814":[583400.00,2000],
}
# paddy 2
paddy2 = {
"0x1c0e9b714da970e6466ba8e6980c55e7636835a6":1000,
"0x227b7656129bc07eef947d3c019a7a8f36a24e74":655,
"0xa8871d303c501c39deb2abe118691eeeea813e30":450,
"0xc7cd9d874f93f2409f39a95987b3e3c738313925":16475.53,
"0x3bfc20f0b9afcace800d73d2191166ff16540258":306276.27,
"0xd7dfc49e5d13f77830029134fb06f5fa6d5e8ec4":1173.40,
"0xe705daf2f65228aade8c8ac4f60a586b1391228d":340.4,
"0x43ab622752d766d694c005acfb78b1fc60f35b69":21704.33,
"0x71331c46fba44d85e293d63d1d5a8cdadf264451":409.665,
"0x0397453bb7db560a039d474c5693578fdb6096c4":600,
"0x39d46c1824dfc32ad4e80c28a825296a8ac52437":397,
"0x94bd4150e41c717b7e7564484693073239715376":671.6943575,
"0x22ef5434cc2deb6c760c7ebbc88777d1f32757f6":397,
"0x7693f7100a671d0cbfca63bd766fd698c17d6f04":779.0892447,
"0x376c3e5547c68bc26240d8dcc6729fff665a4448":114939,
"0x35bd14e205251f3ee0405bc543ceac1d776e5736":400,
"0x47c663ba238fb5c66fa7ac92c33a86a41da261de":594,
"0xd341f357138dc3d1488e203a0138de71f4e0de63":1376.33,
"0x0da3cb3046f72fcbb49edf01b04ab6efc6c0d8dc":2520.77,
"0x4d8006dc86d6015d5cb1f33c4e98ca12c39fcba2":360,
"0x8655d6bf4abd2aa47a7a4ac19807b26b7609b61d":3000,
"0xa08c1134cdd73ad41889f7f914ecc4d3b30c1333":325.5,
"0x41849f3bd33ced4a21c73fddd4a595e22a3c2251":3237.04,
"0xa14703b1da572e3ddf4803113eb32159209199db":600,
"0xbe17d91c518f1743aa0556425421d59de0372766":4360.67,
"0x28ff414bb944b81053389f22113ad305c8ac69fa":332,
"0xdb46b29957b3021a5ea79c49f443083aba994a33":500,
"0x3fcb02a27dc60573a0cb9bff9528fcd77e78d734":1568.31,
"0x49eafa4c392819c009eccdc8d851b4e3c2dda7d0":4524.98,
"0x10e301560860db30dc1bc519a99aa860bc71f076":108478.00,
"0x37c6772be3e333e8acbc38521fb5090b0abe1a3b":350,
"0x7100c7ce94607ef68983f133cfd59cc1833a115d":327.535947,
"0x0881538f81a4092bf5a00462c1853a5f2a8b6fa5":353,
"0x05cf82965cc412494c5de53bf107ec631accf03e":399,
"0xf6e51ae30705cd7248d4d9ac602cb58cc4b61a52":1400.00,
"0xd95a6aa3e20397211e487b231211e16790a21ac9":150.4794447,
"0x7b6bce3cf38ee602030662fa24ac2ed5a32d0a02":144.8486948,
"0x2f9f02f2ba99ff5c750f95cf27d25352f71cd6a9":320.001,
"0xd31a34d621122bebe0dee360e33bbe61193d5b90":1416.10,
"0x05b34bf3562c61715f70240104abc6ae8c80055c":1577.31,
"0x428c131b323161f549bf61da2a434d1a3a920b0b":500,
"0xcf46cc20deba6b802707961ca3c6f3602566c2cf":350.025,
"0xe4aa399ac8c2c636c3f084f8176c01c5c73ed90e":350,
"0x6492780dc59598c6f8a4984c6deffd4600ba0003":1747,
"0x4ebcf8a133cce749ee07d4c764e10d1916f84f5c":342.8200312,
"0x728dbf45456de6b51b1227d5cd5e2507167688c0":350,
"0xef5da7752c084df1cc719c64bbe06fa98b2c554c":45.49952754,
"0x53ea709e81eefa48a311b2a582ad8057d45d4acc":350,
"0x0f30c808069315b3b7dfbfe149c87448b50c6d8b":285.7429561,
"0x7e5b6dd9ba1abf42bfb41e5ae8f46fe5e01aae14":284.9965155,
"0x66ea39aee3f4a2e39d2f28b397a4daf0bffafd89":22.72137362,
"0xdb0e7d784d6a7ca2cbda6ce26ac3b1bd348c06f8":6925,
"0xc1bd4f07421571364617adce98a8d657f52498b7":108.4668393,
"0xa9eebb32a1d459eb1eb5078c543427c34da44313":57.89130953,
"0x2006df02a034359fd32e5bb7d64e07aca44b573a":10.41233802,
"0x8f7070b6b8e8ac245cc8735c32cccc12e178a99e":7.5232,
"0x009f3de1e8878cda9c2e94a6ce6084d9ca86425c":10.4109228,
"0x570f77473c329a5149fe5d5786d8759e38ed15be":15,
"0xbd13904c10be5fb680e1f6f950bbd4a317d7098c":10.59271815,
"0x7c922218294246fc1e8c99c737f87afd94361f4f":6.663314377,
"0xe0b93a625693a33221cf9bd534ae790ea59a9ba7":5.679066238,
"0xdcab43b6ef9dd156c54e1c4f055aa60e317c6f99":10.071,
"0x394d8b3c5de759b8258376fde9b394c8f237d8c4":80.58423587,
"0xc1bef33095fc3ff1bfa38a193a028fd670751462":5.199601654,
"0x1b3de683a4ff93457b0a27986361a5090e3fbb50":6.960849822,
"0x21675f1b593ac15c5585bca5e7778e4f391620bd":2.91083644,
"0x37764fe50340f0158b9facefb3dbaf5222e34a3d":5.5,
"0x4073404129aea005a661f09c38bc64908b27a746":4.995,
"0x2f56c5f0b2548ce52fac5512b76eadbb2c511a7f":4.35012337,
"0x19986fcfbc5ef9b9e377fa8429c5a8d215cbe814":2000,
"0x4de05b00797b11ae43e08ad0068fbd0689a0e041":768.8,
"0x5f3ce3907e7e4c5b5b8d04dd3211ca8b81a64733":741.4272475,
"0x6e314220258a6fa41c2d50cd98f123ffff247d9e":501,
"0x043dae09e7f51d02b8745bcf82c4c5ee86e4bc96":360,
"0xc32050abac7dbfef4fc8dc7b96d9617394cb4e1b":340.2286337,
"0xe9d7d845388311e478be278bc2b48afed5bdadde":2,
"0xdd0c464c5ef163badeb6d3f4d71ed956771d599f":1.022,
"0xdd21d75db9ed2fe97775ffa46e8fa1c8072cd15d":1,
"0xdc7f356bfa601aeaa96d79efad3e0eb505169572":1,
"0xc94be6acc592957291d787e802b0c932a76d671a":0.999997094,
"0x4585b138dc13925c65613a511fc1fc642d16d976":0.9997583302,
"0x5483c2e726061fa518379820d863076aff39f7ea":0.8,
"0x5ccba1eab776fc4d7cc89084c1825f5ffd87ffda":0.793952825,
"0xc3501dad78f27c7147b65701c5da2d1d2a71285e":0.495,
"0x08ca68ecc2cc98f8ba6345531089899fc4c42f57":0.4687,
"0x4405cdf409d270fa55f9a4020c3b5772bf1a1a10":0.2990756894,
"0x62c00230b47cc17d6c9a871352568b3f4ffc5f1a":0.2,
"0x1947c2a678b7cbac00a75d6490ca7d6f8a4b0eda":0.2,
"0xce8e7257b640cf9eef096b188e1345e96c4a80d9":0.184728438,
"0x42a3d814e6e3c25d20120b972e4d174ef76d93d2":0.141,
"0x3646da9d8e6cb67b0cf86af2c30c8b615d9bb9ce":0.133737051,
"0xefa1994328e59f8e24d85458810d67a27289679a":0.115,
"0x829778a21eabc3e4c6835689eea6eeb0857d1e03":0.1,
"0x09d9b2f572f4c7c99631349f2dbad34273aea997":0.1,
"0xa9139277a57a86dbe1ab916e111b982f12ed7fdf":0.1,
"0x9d8d4ff2b1dfb9a14e50e7d84952b6f14fcb8377":0.08305596935,
"0xa386560ac173a436c0c592272bb419c94cca8bc9":0.08,
"0x830389b854770e9102eb957379c6b70da4283d60":0.05,
"0xef0613ab211cfb5eeb5a160b65303d6e927f3f85":0.05,
"0x5311fce951684e46cefd804704a06c5133030dff":0.05,
"0xe01c0bdc8f2a8a6220a4bed665ceeb1d2c716bcb":0.05,
"0xf6c68965cdc903164284b482ef5dfdb640d9e0de":0.05,
"0xe64bae6b0e8b89a1a3f8152dc3fb5e3910166689":0.05,
"0xe3a482efacc86b55cd60fa1ae07b658548e00c2e":0.049693242,
"0x0285d5528f574f1361009eef75a4f61942767799":0.044433042,
"0xd32db75141f77a1174ba3130f69f0aa002cb1ccc":0.04,
"0xa57b2cf597996a92c9967bd0f3e9d22f565b3a62":0.039,
"0x42ac09396496d2484114625078ed29205ba01244":0.03,
"0xabfe9ee7512e2291b95666f5f0e9de1b43659e56":0.03,
"0x2f8d38c727ebac1daf6b42e15cdbe73cad0e2211":0.02548275862,
"0x8b3cb840c24ecd4f045f5cfb8ea14beec17b51e0":0.024,
"0xc01283f05079d6a143c12079cecac8e3f966694c":0.023,
"0xfd2b3eb22bac1634f8b554a6d67fd11849dc3a0f":0.02187378,
"0xdf665ab68393bd31cf244171ba57abdbe871f81d":0.02,
"0x32a528762b6326ca0e2b314530d412f823a23d51":0.02,
"0x25f602eb3497cfc37d70436513fca6df45a84181":0.019345831,
"0x8cfab485f4de196cc65073dc8d1196c7b6449940":0.019,
"0x199a4567ddbfa4426903e36b4752213ba1f34d64":0.01760873,
"0xd5ca44255601a3fa439dfeb715d4f3515548368f":0.015,
"0x1afb16d06e76e39732728e186e519d99a156ef4e":0.013,
"0x69bb3d4d29249814845c8634a03ba3aa4165c933":0.012,
"0xb4210e53e5b13cbfa29f72e4764f4c4b9a6ae72a":0.01008999993,
"0x4623913527a5511822e11490a91dade706f9d854":0.010003182,
"0x28877c4cc1a482378daf961937660e8d4ffeefa1":0.01,
"0x94535a9cee64a0258af01c8cc41cfaf7bfb58f76":0.01,
"0x18b4092dee9ed759b0742608be8ad904957c3d08":0.01,
"0xb696bee07c81b57ef2832353a7e8b26f4c80cc65":0.01,
"0xcf27f7edddb7af3744a545c0f8fc5b27bc652fcc":0.01,
"0xd7b349624208a406627010b7d440dd2f25ddd63a":0.01,
"0x8266bcb06a66efe3150a6f3ddd4f43caeb566245":0.01,
"0x6924d3ad691e7f641ffe1c95aa6297e4c10e5e86":0.01,
"0x769512eed08245828c705a186a09709d0afb52e1":0.01,
"0x87f5b0d8f79182830248382a0b6aa2c86757f51c":0.01,
"0x8c53795efc5dca289a703ecb40f95934b1a92362":0.009246044699,
"0xac3c64644d3ca6f960308ed7cd79bfa794bf12b1":0.007,
"0xd51f04d699a929c86695d0ef00973189dc928429":0.006,
"0xb69e024300f63c452ba0572405d53ca5e991dd77":0.005,
"0xc1d787c7a1a98b187c31362b588447e5a945fcbb":0.005,
"0xd78331e9dd5b7dc506da403d37a43b1335fbdd11":0.005,
"0xcfe56f64824c75dcfc2d9860cca088f287d4768b":0.003720439749,
"0x7614ba4b95cc4f456cae349b94b8a6992d4818ea":0.003,
"0xad0d6a9c97d6d401a7e4444859f41f0606d07b62":0.0026,
"0x97695b2bb33736b7517303ac4be0863a4f0d7fe9":0.002,
"0x493f7decbf1e3da9765c4db06abb8c4daf4a7893":0.001,
"0x142c10c90aa0a4dd588edf1ac54c3e959646cc2d":0.001,
"0x5ab18d3b796bae844e243d0bc906b0209106c10a":0.0005,
"0x20db5d16771a4ebbb83a00cc27b784407a3bae97":0.0005,
"0x8d358fdc7e1ed7e69f7fb6d5daccc77282a2d737":0.0004,
"0x39c92cae22c0648cd9382717b0b5ac944c81af14":0.00022839,
"0x0d6c24d85680a89152012f9dc81e406183489c1f":0.00019,
"0x5cef6ef48e2ee1c1c9aecf36a669d8a4eb6609da":0.000039389,
}
print "paddy"
print str(len(paddy2))
paddy2set = set(paddy2.keys())
print str(len(paddy2set))
# merge
for addr in docsAddresses.keys():
addressesSet.add(addr)
print "google+etherscan"
print str(len(addressesSet))
for addr in paddy2set:
addressesSet.add(addr)
print "google+etherscan+paddy"
print str(len(addressesSet))
with open("/Users/Haaroony/Downloads/parity_hack.html", 'r') as content_file:
content = content_file.read()
soup=BeautifulSoup(content)
# extract the table values
addresses = {}
rate = 300
table = soup.find("table")
for row in table.findAll("tr"):
cells = row.findAll("td")
if len(cells) == 6:
addr = str(cells[3].text)
amount = float(str(cells[4].text).split("Ether")[0].replace(",", ""))
usd = amount*rate
addresses[addr] = [usd, amount]
# if addr in docsAddresses:
# if amount != docsAddresses[addr][1]:
# print "INCORRECT VALUES"
# print addr
# print docsAddresses[addr]
# print amount
if addr in paddy2:
if amount != paddy2[addr]:
print "INCORRECT NEW VALUE"
print "addr "+ str(addr)
print "paddy "+ str(paddy2[addr])
print "ethscan "+str(amount)
for addr in docsAddresses.keys():
if addr not in addresses:
addresses[addr] = docsAddresses[addr]
print addresses
totalUSD = 0.0
totalETH = 0.0
for addr in addresses:
totalUSD += addresses[addr][0]
totalETH += addresses[addr][1]
import csv
with open('mycsvfile.csv', 'wb') as f: # Just use 'w' mode in 3.x
w = csv.writer(f)
w.writerow(["addr", "usd", "eth"])
for addr in addresses:
w.writerow([addr, addresses[addr][0], addresses[addr][1]])
w.writerow(["total", totalUSD, totalETH])
| 47.912515 | 79 | 0.873986 | 2,069 | 39,432 | 16.652489 | 0.407443 | 0.000871 | 0.001567 | 0.001567 | 0.017647 | 0.016137 | 0.013874 | 0.013874 | 0.013874 | 0.013874 | 0 | 0.556004 | 0.044532 | 39,432 | 822 | 80 | 47.970803 | 0.358478 | 0.009409 | 0 | 0.021629 | 0 | 0 | 0.789889 | 0.786867 | 0 | 0 | 0.785228 | 0 | 0 | 0 | null | null | 0 | 0.003817 | null | null | 0.015267 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
32c7834f958143fe7b883c6665b720fd79a089e5 | 33 | py | Python | 101_people_counter/helper_functions.py | Ampel2Go/community | 08759509287efef08218a4eb6e6e2b029b862b4a | [
"Apache-2.0"
] | 2 | 2020-08-08T15:38:08.000Z | 2020-11-18T13:13:45.000Z | 101_people_counter/helper_functions.py | Ampel2Go/community | 08759509287efef08218a4eb6e6e2b029b862b4a | [
"Apache-2.0"
] | 1 | 2021-09-22T19:46:49.000Z | 2021-09-22T19:46:49.000Z | 101_people_counter/helper_functions.py | Ampel2Go/community | 08759509287efef08218a4eb6e6e2b029b862b4a | [
"Apache-2.0"
] | 1 | 2022-02-16T09:52:31.000Z | 2022-02-16T09:52:31.000Z | import cv2
import logging
| 3.3 | 14 | 0.666667 | 4 | 33 | 5.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.045455 | 0.333333 | 33 | 9 | 15 | 3.666667 | 0.954545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
08f87f776171394d101faccda1aa407763fe9740 | 44 | py | Python | src/amuse/community/ph4/__init__.py | sibonyves/amuse | 5557bf88d14df1aa02133a199b6d60c0c57dcab7 | [
"Apache-2.0"
] | null | null | null | src/amuse/community/ph4/__init__.py | sibonyves/amuse | 5557bf88d14df1aa02133a199b6d60c0c57dcab7 | [
"Apache-2.0"
] | 12 | 2021-11-15T09:13:03.000Z | 2022-02-02T14:53:04.000Z | src/amuse/community/ph4/__init__.py | sibonyves/amuse | 5557bf88d14df1aa02133a199b6d60c0c57dcab7 | [
"Apache-2.0"
] | null | null | null | # generated file
from .interface import Ph4
| 14.666667 | 26 | 0.795455 | 6 | 44 | 5.833333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027027 | 0.159091 | 44 | 2 | 27 | 22 | 0.918919 | 0.318182 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
3ea0d80b17a42030686df0d496bb0ca659f0069c | 12,777 | py | Python | grl/rl_apps/scenarios/catalog/poker_psro_scenarios.py | indylab/xdo | 1ddd92aa56ba10fa468396de8f8824c83ba9d0ba | [
"MIT"
] | 12 | 2021-03-12T07:18:52.000Z | 2022-03-15T22:30:44.000Z | grl/rl_apps/scenarios/catalog/poker_psro_scenarios.py | indylab/xdo | 1ddd92aa56ba10fa468396de8f8824c83ba9d0ba | [
"MIT"
] | 1 | 2021-11-22T16:39:46.000Z | 2022-02-02T22:13:03.000Z | grl/rl_apps/scenarios/catalog/poker_psro_scenarios.py | indylab/xdo | 1ddd92aa56ba10fa468396de8f8824c83ba9d0ba | [
"MIT"
] | 4 | 2021-06-21T03:54:45.000Z | 2022-01-13T10:28:26.000Z | from ray.rllib.agents.dqn import DQNTrainer
from ray.rllib.agents.ppo import PPOTrainer, PPOTorchPolicy
from grl.envs.poker_multi_agent_env import PokerMultiAgentEnv
from grl.rl_apps.scenarios.catalog import scenario_catalog
from grl.rl_apps.scenarios.catalog.common import default_if_creating_ray_head
from grl.rl_apps.scenarios.psro_scenario import PSROScenario
from grl.rl_apps.scenarios.stopping_conditions import *
from grl.rl_apps.scenarios.trainer_configs.poker_psro_configs import *
from grl.rllib_tools.modified_policies.simple_q_torch_policy import SimpleQTorchPolicyPatched
scenario_catalog.add(PSROScenario(
name="kuhn_psro_dqn",
ray_cluster_cpus=default_if_creating_ray_head(default=8),
ray_cluster_gpus=default_if_creating_ray_head(default=0),
ray_object_store_memory_cap_gigabytes=1,
env_class=PokerMultiAgentEnv,
env_config={
"version": "kuhn_poker",
"fixed_players": True,
},
mix_metanash_with_uniform_dist_coeff=0.0,
allow_stochastic_best_responses=False,
trainer_class=DQNTrainer,
policy_classes={
"metanash": SimpleQTorchPolicyPatched,
"best_response": SimpleQTorchPolicyPatched,
"eval": SimpleQTorchPolicyPatched,
},
num_eval_workers=8,
games_per_payoff_eval=20000,
p2sro=False,
p2sro_payoff_table_exponential_avg_coeff=None,
p2sro_sync_with_payoff_table_every_n_episodes=None,
single_agent_symmetric_game=False,
get_trainer_config=psro_kuhn_dqn_params,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(2e4),
check_plateau_every_n_episodes=int(2e4),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
calc_exploitability_for_openspiel_env=True,
))
leduc_psro_dqn = PSROScenario(
name="leduc_psro_dqn",
ray_cluster_cpus=default_if_creating_ray_head(default=8),
ray_cluster_gpus=default_if_creating_ray_head(default=0),
ray_object_store_memory_cap_gigabytes=1,
env_class=PokerMultiAgentEnv,
env_config={
'version': "leduc_poker",
"fixed_players": True,
"append_valid_actions_mask_to_obs": True,
},
mix_metanash_with_uniform_dist_coeff=0.0,
allow_stochastic_best_responses=False,
trainer_class=DQNTrainer,
policy_classes={
"metanash": SimpleQTorchPolicyPatched,
"best_response": SimpleQTorchPolicyPatched,
"eval": SimpleQTorchPolicyPatched,
},
num_eval_workers=8,
games_per_payoff_eval=3000,
p2sro=False,
p2sro_payoff_table_exponential_avg_coeff=None,
p2sro_sync_with_payoff_table_every_n_episodes=None,
single_agent_symmetric_game=False,
get_trainer_config=psro_leduc_dqn_params,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(2e4),
check_plateau_every_n_episodes=int(2e4),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
calc_exploitability_for_openspiel_env=True,
)
scenario_catalog.add(leduc_psro_dqn)
scenario_catalog.add(leduc_psro_dqn.with_updates(
name="symmetric_leduc_psro_dqn",
env_config={
'version': "leduc_poker",
"fixed_players": False,
"append_valid_actions_mask_to_obs": True,
},
mix_metanash_with_uniform_dist_coeff=0.1,
single_agent_symmetric_game=True,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(5e3),
check_plateau_every_n_episodes=int(5e3),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
))
scenario_catalog.add(leduc_psro_dqn.with_updates(
name="symmetric_leduc_p2sro_dqn_3_br_learners",
ray_cluster_cpus=default_if_creating_ray_head(default=4 * 3), # 4 cpus for each of 3 workers
env_config={
'version': "leduc_poker",
"fixed_players": False,
"append_valid_actions_mask_to_obs": True,
},
mix_metanash_with_uniform_dist_coeff=0.1,
p2sro=True,
p2sro_payoff_table_exponential_avg_coeff=1.0 / 3000,
p2sro_sync_with_payoff_table_every_n_episodes=100,
single_agent_symmetric_game=True,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(5e3),
check_plateau_every_n_episodes=int(5e3),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
))
scenario_catalog.add(leduc_psro_dqn.with_updates(
name="symmetric_leduc_p2sro_dqn_3_br_learners_fast_update",
ray_cluster_cpus=default_if_creating_ray_head(default=4 * 3), # 4 cpus for each of 3 workers
env_config={
'version': "leduc_poker",
"fixed_players": False,
"append_valid_actions_mask_to_obs": True,
},
mix_metanash_with_uniform_dist_coeff=0.1,
p2sro=True,
p2sro_payoff_table_exponential_avg_coeff=1.0 / 1000,
p2sro_sync_with_payoff_table_every_n_episodes=100,
single_agent_symmetric_game=True,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(5e3),
check_plateau_every_n_episodes=int(5e3),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
))
# 20_clone_leduc_psro_dqn
# 40_clone_leduc_psro_dqn
# 80_clone_leduc_psro_dqn
for dummy_action_multiplier in [20, 40, 80]:
scenario_catalog.add(PSROScenario(
name=f"{dummy_action_multiplier}_clone_leduc_psro_dqn",
ray_cluster_cpus=default_if_creating_ray_head(default=8),
ray_cluster_gpus=default_if_creating_ray_head(default=0),
ray_object_store_memory_cap_gigabytes=1,
env_class=PokerMultiAgentEnv,
env_config={
'version': "leduc_poker",
"fixed_players": True,
"append_valid_actions_mask_to_obs": True,
"dummy_action_multiplier": dummy_action_multiplier,
},
mix_metanash_with_uniform_dist_coeff=0.0,
allow_stochastic_best_responses=False,
trainer_class=DQNTrainer,
policy_classes={
"metanash": SimpleQTorchPolicyPatched,
"best_response": SimpleQTorchPolicyPatched,
"eval": SimpleQTorchPolicyPatched,
},
num_eval_workers=8,
games_per_payoff_eval=3000,
p2sro=False,
p2sro_payoff_table_exponential_avg_coeff=None,
p2sro_sync_with_payoff_table_every_n_episodes=None,
single_agent_symmetric_game=False,
get_trainer_config=psro_leduc_dqn_params,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(2e4),
check_plateau_every_n_episodes=int(2e4),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
calc_exploitability_for_openspiel_env=True,
))
# 12_no_limit_leduc_psro_dqn
# 30_no_limit_leduc_psro_dqn
# 60_no_limit_leduc_psro_dqn
# 1000_no_limit_leduc_psro_ppo
# 1000_no_limit_leduc_psro_ppo_stochastic
for stack_size in [12, 30, 60, 100, 1000]:
for allow_stochastic_best_responses in [True, False]:
stochastic_str = "_stochastic" if allow_stochastic_best_responses else ""
scenario_catalog.add(PSROScenario(
name=f"{stack_size}_no_limit_leduc_psro_dqn{stochastic_str}",
ray_cluster_cpus=default_if_creating_ray_head(default=8),
ray_cluster_gpus=default_if_creating_ray_head(default=0),
ray_object_store_memory_cap_gigabytes=1,
env_class=PokerMultiAgentEnv,
env_config={
'version': "universal_poker",
"fixed_players": True,
"append_valid_actions_mask_to_obs": True,
"universal_poker_stack_size": stack_size,
},
mix_metanash_with_uniform_dist_coeff=0.0,
allow_stochastic_best_responses=allow_stochastic_best_responses,
trainer_class=DQNTrainer,
policy_classes={
"metanash": SimpleQTorchPolicyPatched,
"best_response": SimpleQTorchPolicyPatched,
"eval": SimpleQTorchPolicyPatched,
},
num_eval_workers=8,
games_per_payoff_eval=3000,
p2sro=False,
p2sro_payoff_table_exponential_avg_coeff=None,
p2sro_sync_with_payoff_table_every_n_episodes=None,
single_agent_symmetric_game=False,
get_trainer_config=psro_leduc_dqn_params,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(2e4),
check_plateau_every_n_episodes=int(2e4),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
calc_exploitability_for_openspiel_env=False,
))
ppo_no_limit = PSROScenario(
name=f"{stack_size}_no_limit_leduc_psro_ppo{stochastic_str}",
ray_cluster_cpus=default_if_creating_ray_head(default=8),
ray_cluster_gpus=default_if_creating_ray_head(default=0),
ray_object_store_memory_cap_gigabytes=1,
env_class=PokerMultiAgentEnv,
env_config={
'version': "universal_poker",
"fixed_players": True,
"append_valid_actions_mask_to_obs": False,
"universal_poker_stack_size": stack_size,
"continuous_action_space": True,
},
mix_metanash_with_uniform_dist_coeff=0.0,
allow_stochastic_best_responses=allow_stochastic_best_responses,
trainer_class=PPOTrainer,
policy_classes={
"metanash": PPOTorchPolicy,
"best_response": PPOTorchPolicy,
"eval": PPOTorchPolicy,
},
num_eval_workers=8,
games_per_payoff_eval=10000,
p2sro=False,
p2sro_payoff_table_exponential_avg_coeff=None,
p2sro_sync_with_payoff_table_every_n_episodes=None,
single_agent_symmetric_game=False,
get_trainer_config=psro_leduc_ppo_params,
psro_get_stopping_condition=lambda: EpisodesSingleBRRewardPlateauStoppingCondition(
br_policy_id="best_response",
dont_check_plateau_before_n_episodes=int(2e4),
check_plateau_every_n_episodes=int(2e4),
minimum_reward_improvement_otherwise_plateaued=0.01,
max_train_episodes=int(1e5),
),
calc_exploitability_for_openspiel_env=False,
)
scenario_catalog.add(ppo_no_limit)
scenario_catalog.add(ppo_no_limit.with_updates(
name=f"{stack_size}_3_round_no_limit_leduc_psro_ppo{stochastic_str}",
env_config={
'version': "universal_poker",
"fixed_players": True,
"append_valid_actions_mask_to_obs": False,
"universal_poker_stack_size": stack_size,
"continuous_action_space": True,
"universal_poker_num_rounds": 3,
},
))
scenario_catalog.add(ppo_no_limit.with_updates(
name=f"{stack_size}_4_round_no_limit_leduc_psro_ppo{stochastic_str}",
env_config={
'version': "universal_poker",
"fixed_players": True,
"append_valid_actions_mask_to_obs": False,
"universal_poker_stack_size": stack_size,
"continuous_action_space": True,
"universal_poker_num_rounds": 4,
},
))
scenario_catalog.add(ppo_no_limit.with_updates(
name=f"{stack_size}_6_rank_no_limit_leduc_psro_ppo{stochastic_str}",
env_config={
'version': "universal_poker",
"fixed_players": True,
"append_valid_actions_mask_to_obs": False,
"universal_poker_stack_size": stack_size,
"continuous_action_space": True,
"universal_poker_num_ranks": 6,
},
))
| 41.618893 | 97 | 0.702356 | 1,497 | 12,777 | 5.443554 | 0.11022 | 0.032397 | 0.023561 | 0.031906 | 0.896429 | 0.87385 | 0.842312 | 0.841944 | 0.837894 | 0.827586 | 0 | 0.022803 | 0.224309 | 12,777 | 306 | 98 | 41.754902 | 0.799415 | 0.021836 | 0 | 0.785965 | 0 | 0 | 0.137103 | 0.08689 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.031579 | 0 | 0.031579 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
3ea4482d1a87bcf59d8a06181fb17dc510517c0b | 49 | py | Python | utils_demo/__init__.py | IBM/nesa-demo | 4e87217f44ff66414f78df6962ee8633d89f0cf5 | [
"MIT"
] | 2 | 2021-12-16T13:16:56.000Z | 2022-01-19T14:23:18.000Z | utils_demo/__init__.py | SocioProphet/nesa-demo | 4e87217f44ff66414f78df6962ee8633d89f0cf5 | [
"MIT"
] | null | null | null | utils_demo/__init__.py | SocioProphet/nesa-demo | 4e87217f44ff66414f78df6962ee8633d89f0cf5 | [
"MIT"
] | 1 | 2022-03-07T19:57:59.000Z | 2022-03-07T19:57:59.000Z | from .percentage_format import percentage_format
| 24.5 | 48 | 0.897959 | 6 | 49 | 7 | 0.666667 | 0.761905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.081633 | 49 | 1 | 49 | 49 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
411d7319a793903751cd9bc80ae0d2ed805367db | 16,851 | py | Python | data/level/level39.py | levelupai/match3-level-similarity | cc9b28b8741b41bea1273c8bc9b4d265d79a1dca | [
"Apache-2.0"
] | null | null | null | data/level/level39.py | levelupai/match3-level-similarity | cc9b28b8741b41bea1273c8bc9b4d265d79a1dca | [
"Apache-2.0"
] | 6 | 2020-07-04T02:53:08.000Z | 2022-03-11T23:53:14.000Z | data/level/level39.py | levelupai/match3-level-similarity | cc9b28b8741b41bea1273c8bc9b4d265d79a1dca | [
"Apache-2.0"
] | 3 | 2019-12-31T11:42:59.000Z | 2021-03-28T20:06:13.000Z | data = {
'level_index': 39,
'move_count': 31,
'board_info': {
(0, 0): {
'base': (4, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(0, 1): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 2): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 3): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 4): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 6): {
'base': (1, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(0, 7): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 8): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 9): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 10): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 11): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 13): {
'base': (2, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(0, 14): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 15): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 16): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(0, 17): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 0): {
'base': (4, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(1, 1): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 2): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 3): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 4): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 6): {
'cover': (60, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(1, 7): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 8): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 9): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 10): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 11): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 13): {
'base': (1, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(1, 14): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 15): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 16): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(1, 17): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 0): {
'base': (2, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(2, 1): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 2): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 3): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 4): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 6): {
'base': (5, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(2, 7): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 8): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 9): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 10): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 11): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 13): {
'base': (1, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(2, 14): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 15): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 16): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(2, 17): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 0): {
'base': (5, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(3, 1): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 2): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 3): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 4): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 13): {
'base': (5, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(3, 14): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 15): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 16): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(3, 17): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 0): {
'base': (4, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(4, 1): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 2): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 3): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 4): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 6): {
'base': (5, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(4, 7): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 8): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 9): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 10): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 11): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 13): {
'base': (2, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(4, 14): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 15): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 16): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(4, 17): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 0): {
'base': (4, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(5, 1): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 2): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 3): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 4): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 6): {
'cover': (60, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(5, 7): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 8): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 9): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 10): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 11): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 13): {
'base': (1, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(5, 14): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 15): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 16): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(5, 17): {
'base': (4, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 0): {
'base': (1, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(6, 1): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 2): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 3): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 4): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 6): {
'base': (4, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(6, 7): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 8): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 9): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 10): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 11): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 13): {
'base': (1, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(6, 14): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 15): {
'base': (1, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 16): {
'base': (2, 1),
'next': (0, 1),
'prev': (0, -1)
},
(6, 17): {
'base': (5, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 0): {
'base': (66, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(8, 1): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 2): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 3): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 4): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 5): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 6): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 7): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 8): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 9): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 10): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 11): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 12): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 13): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 14): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 15): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 16): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(8, 17): {
'base': (66, 1),
'next': (0, 1),
'prev': (0, -1)
},
(10, 1): {
'cover': (60, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(10, 2): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(10, 6): {
'cover': (60, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(10, 7): {
'cover': (60, 1),
'next': (0, 1),
'prev': (0, -1)
},
(10, 10): {
'cover': (60, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(10, 13): {
'cover': (60, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
},
(10, 16): {
'cover': (60, 1),
'fall_point': (0, -1, {'random_list': [1, 2, 4, 5]}),
'next': (0, 1),
'prev': (0, -1)
}
},
'trans_info': {
(0, 0): {
60: 12
},
(0, 9): {
60: 11
}
}
}
| 24.211207 | 65 | 0.208415 | 1,720 | 16,851 | 2.009302 | 0.022093 | 0.167245 | 0.227431 | 0.379051 | 0.960938 | 0.960938 | 0.960938 | 0.960648 | 0.958912 | 0.948206 | 0 | 0.171984 | 0.54246 | 16,851 | 695 | 66 | 24.246043 | 0.276265 | 0 | 0 | 0.608633 | 0 | 0 | 0.129488 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
41223de2bba09a3096518f73fdd8313bd4d9bb1a | 6,383 | py | Python | bender_service/bender/tests/users_tests.py | Dreem-Organization/bender-api | 9ddc817f130b853127a1925b2a9dced2662f66fc | [
"MIT"
] | null | null | null | bender_service/bender/tests/users_tests.py | Dreem-Organization/bender-api | 9ddc817f130b853127a1925b2a9dced2662f66fc | [
"MIT"
] | 2 | 2021-03-19T22:20:06.000Z | 2021-06-10T21:17:12.000Z | bender_service/bender/tests/users_tests.py | Dreem-Organization/bender-api | 9ddc817f130b853127a1925b2a9dced2662f66fc | [
"MIT"
] | null | null | null | from rest_framework import status
from .helpers import BenderTestCase
class UserViewsTests(BenderTestCase):
"""Logging"""
def test_user_not_logged(self):
response = self.client.get("/api/users/")
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
""" LIST """
def test_list_user(self):
self.client.login(username="Toto1", password="123456")
response = self.client.get("/api/users/")
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(list(response.json()["results"][0].keys()), ["username"])
def test_list_user_admin(self):
self.client.login(username="admin", password="123456")
response = self.client.get("/api/users/")
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual("id" in set(response.json()["results"][0].keys()), True)
self.assertEqual("username" in set(response.json()["results"][0].keys()), True)
self.assertEqual("email" in set(response.json()["results"][0].keys()), True)
""" UPDATE """
def test_update_user(self):
self.client.login(username=self.user2.username, password="123456")
n_experiments = self.user2.shared_experiments.count()
self.assertEqual(n_experiments >= 1, True)
n_algos_total = self.user2.algos.count()
n_algos_to_delete = sum([experiment.algos.filter(owner=self.user2).count()
for experiment in self.user2.shared_experiments.all()])
self.assertEqual(n_algos_total >= 1, True)
self.assertEqual(n_algos_total > n_algos_to_delete, True)
data = {
'username': "Toto3",
'email': "lol@toto.com",
'shared_experiments': [],
}
response = self.client.patch("/api/users/{}/".format(self.user2.pk),
data=data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.user2.refresh_from_db()
self.assertEqual(self.user2.shared_experiments.count(), 0)
self.assertEqual(self.user2.username, "Toto3")
self.assertEqual(self.user2.email, "lol@toto.com")
self.assertEqual(self.user2.algos.count(), n_algos_total - n_algos_to_delete)
def test_update_user_username_already_exist(self):
self.client.login(username=self.user1.username, password="123456")
data = {
'username': "Toto2",
}
response = self.client.patch("/api/users/{}/".format(self.user1.pk),
data=data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_update_user_email_already_exist(self):
self.client.login(username=self.user1.username, password="123456")
data = {
'email': "toto2@gmail.com",
}
response = self.client.patch("/api/users/{}/".format(self.user1.pk),
data=data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_update_other_user(self):
self.client.login(username=self.user1.username, password="123456")
data = {
'username': "Toto3",
'email': "lol@toto.com",
'shared_experiments': [],
}
response = self.client.patch("/api/users/{}/".format(self.user2.pk),
data=data)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_update_add_shared_experiment(self):
self.client.login(username=self.user1.username, password="123456")
data = {
'shared_experiments': [self.user2.experiments.exclude(shared_with=self.user1)[0].pk],
}
response = self.client.patch("/api/users/{}/".format(self.user1.pk),
data=data)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_update_add_own_shared_experiment(self):
self.client.login(username=self.user1.username, password="123456")
data = {
'shared_experiments': [self.user1.experiments.all()[0].pk],
}
response = self.client.patch("/api/users/{}/".format(self.user1.pk),
data=data)
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# """ DELETE """
def test_delete_user(self):
self.client.login(username="Toto1", password="123456")
response = self.client.delete("/api/users/{}/".format(self.user1.pk))
self.assertEqual(response.status_code, status.HTTP_204_NO_CONTENT)
response = self.client.get("/api/")
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
self.client.login(username="Toto1", password="123456")
response = self.client.get("/api/")
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_delete_other_user(self):
"""This would not be nice."""
self.client.login(username="Toto1", password="123456")
response = self.client.delete("/api/users/{}/".format(self.user2.pk))
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
# """ DELETE """
def test_retrieve_user(self):
self.client.login(username="Toto1", password="123456")
response = self.client.get("/api/users/{}/".format(self.user1.pk))
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_retrieve_other_user(self):
self.client.login(username="Toto1", password="123456")
response = self.client.get("/api/users/{}/".format(self.user2.pk))
self.assertEqual(response.status_code, status.HTTP_403_FORBIDDEN)
def test_retrieve_other_user_admin(self):
self.client.login(username="admin", password="123456")
response = self.client.get("/api/users/{}/".format(self.user2.pk))
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_contact(self):
self.client.login(username="Toto1", password="123456")
response = self.client.post("/api/users/contact/", {
'title': "Some title",
'content': "Some content",
'email': "jus@an.email"
})
self.assertEqual(response.status_code, status.HTTP_200_OK) | 39.645963 | 97 | 0.630581 | 741 | 6,383 | 5.255061 | 0.132254 | 0.082178 | 0.078582 | 0.126605 | 0.803801 | 0.765794 | 0.754494 | 0.72907 | 0.720596 | 0.705187 | 0 | 0.038633 | 0.225443 | 6,383 | 161 | 98 | 39.645963 | 0.748989 | 0.009713 | 0 | 0.530435 | 0 | 0 | 0.10105 | 0 | 0 | 0 | 0 | 0 | 0.243478 | 1 | 0.130435 | false | 0.130435 | 0.017391 | 0 | 0.156522 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
eb308e0b19144279954a45587b441a1df4d27b95 | 2,264 | py | Python | tests/test_extract_content.py | wanasit/kiji | 5d44e188c09f20c38483a474db444ccd5a9ae4db | [
"MIT"
] | null | null | null | tests/test_extract_content.py | wanasit/kiji | 5d44e188c09f20c38483a474db444ccd5a9ae4db | [
"MIT"
] | null | null | null | tests/test_extract_content.py | wanasit/kiji | 5d44e188c09f20c38483a474db444ccd5a9ae4db | [
"MIT"
] | null | null | null | from bs4 import Tag
from kiji.inspection import inspect, inspect_file
from kiji.extraction import extract_content_element
def test_extract_content_one_layer():
page_inspection = inspect("""
<div class="content-wrapper">
<div class="content">
<h2>Lorem Ipsum</h2>
<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus ullamcorper dolor eget lacus condimentum, at laoreet neque consequat.</p>
<p>Mauris non mauris in est pellentesque egestas ullamcorper eu magna. Ut faucibus tempus dolor vel efficitur.</p>
</div>
</div>
""")
content_inspection = extract_content_element(page_inspection)
assert content_inspection
assert content_inspection.is_element
assert content_inspection._element_class() == ['content']
def test_extract_content_multiple_nested_sections():
page_inspection = inspect("""
<div class="content-wrapper">
<div class="content">
<h2>Lorem Ipsum</h2>
<div class="section1">
<h2>Lorem Ipsum 1</h2>
<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus ullamcorper dolor eget lacus condimentum, at laoreet neque consequat.</p>
</div>
<div class="section2">
<h2>Lorem Ipsum 2</h2>
<p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Vivamus ullamcorper dolor eget lacus condimentum, at laoreet neque consequat.</p>
<p>Mauris non mauris in est pellentesque egestas ullamcorper eu magna. Ut faucibus tempus dolor vel efficitur.</p>
<p>Mauris non mauris in est pellentesque egestas ullamcorper eu magna. Ut faucibus tempus dolor vel efficitur.</p>
<p>Mauris non mauris in est pellentesque egestas ullamcorper eu magna. Ut faucibus tempus dolor vel efficitur.</p>
<p>Mauris non mauris in est pellentesque egestas ullamcorper eu magna. Ut faucibus tempus dolor vel efficitur.</p>
</div>
</div>
</div>
""")
content_inspection = extract_content_element(page_inspection)
assert content_inspection
assert content_inspection.is_element
assert content_inspection._element_class() == ['content']
| 40.428571 | 157 | 0.678445 | 273 | 2,264 | 5.501832 | 0.223443 | 0.090546 | 0.091878 | 0.036618 | 0.842876 | 0.842876 | 0.842876 | 0.842876 | 0.842876 | 0.842876 | 0 | 0.007554 | 0.239841 | 2,264 | 55 | 158 | 41.163636 | 0.865195 | 0 | 0 | 0.780488 | 0 | 0.195122 | 0.698406 | 0.021258 | 0 | 0 | 0 | 0 | 0.146341 | 1 | 0.04878 | false | 0 | 0.073171 | 0 | 0.121951 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
eb427419ae48680e5a3892864f126696271886ac | 21,599 | py | Python | talentmap_api/saml2/attribute_maps/basic.py | burgwyn/State-TalentMAP-API | 1f4f3659c5743ebfd558cd87af381f5460f284b3 | [
"CC0-1.0"
] | 7 | 2018-10-17T15:13:05.000Z | 2021-12-10T14:53:38.000Z | talentmap_api/saml2/attribute_maps/basic.py | burgwyn/State-TalentMAP-API | 1f4f3659c5743ebfd558cd87af381f5460f284b3 | [
"CC0-1.0"
] | 232 | 2017-06-16T02:09:54.000Z | 2018-05-10T16:15:48.000Z | talentmap_api/saml2/attribute_maps/basic.py | MetaPhase-Consulting/State-TalentMAP-API | 4e238cbfe241fd3d0a718a9a0fc038dbed00f13b | [
"CC0-1.0"
] | 4 | 2018-06-13T14:49:27.000Z | 2021-06-30T22:29:15.000Z | MAP = {
"identifier": "urn:oasis:names:tc:SAML:2.0:attrname-format:uri",
"fro": {
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress": "emailaddress", # E-Mail Address, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname": "givenname", # Given Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name": "name", # Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn": "upn", # UPN, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/claims/CommonName": "CommonName", # Common Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/claims/EmailAddress": "EmailAddress", # AD FS 1.x E-Mail Address, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/claims/Group": "Group", # Group, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/claims/UPN": "UPN", # AD FS 1.x UPN, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/role": "role", # Role, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname": "surname", # Surname, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/privatepersonalidentifier": "privatepersonalidentifier", # PPID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier": "nameidentifier", # Name ID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationinstant": "authenticationinstant", # Authentication time stamp, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod": "authenticationmethod", # Authentication method, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/denyonlysid": "denyonlysid", # Deny only group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/denyonlyprimarysid": "denyonlyprimarysid", # Deny only primary SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/denyonlyprimarygroupsid": "denyonlyprimarygroupsid", # Deny only primary group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/groupsid": "groupsid", # Group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/primarygroupsid": "primarygroupsid", # Primary group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/primarysid": "primarysid", # Primary SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname": "windowsaccountname", # Windows account name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/devicecontext/claims/isregistereduser": "isregistereduser", # Is Registered User, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/devicecontext/claims/identifier": "identifier", # Device Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/devicecontext/claims/registrationid": "registrationid", # Device Registration Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/devicecontext/claims/displayname": "displayname", # Device Registration DisplayName, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/devicecontext/claims/ostype": "ostype", # Device OS type, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/devicecontext/claims/osversion": "osversion", # Device OS Version, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/devicecontext/claims/ismanaged": "ismanaged", # Is Managed Device, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip": "x-ms-forwarded-client-ip", # Forwarded Client IP, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-client-application": "x-ms-client-application", # Client Application, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-client-user-agent": "x-ms-client-user-agent", # Client User Agent, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-client-ip": "x-ms-client-ip", # Client IP, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-endpoint-absolute-path": "x-ms-endpoint-absolute-path", # Endpoint Path, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-proxy": "x-ms-proxy", # Proxy, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/relyingpartytrustid": "relyingpartytrustid", # Application Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/applicationpolicy": "applicationpolicy", # Application policies, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/authoritykeyidentifier": "authoritykeyidentifier", # Authority Key Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/basicconstraints": "basicconstraints", # Basic Constraint, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/eku": "eku", # Enhanced Key Usage, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/issuer": "issuer", # Issuer, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/issuername": "issuername", # Issuer Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/keyusage": "keyusage", # Key Usage, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/notafter": "notafter", # Not After, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/notbefore": "notbefore", # Not Before, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/certificatepolicy": "certificatepolicy", # Certificate Policies, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/rsa": "rsa", # Public Key, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/rawdata": "rawdata", # Certificate Raw Data, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/san": "san", # Subject Alternative Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2008/06/identity/claims/serialnumber": "serialnumber", # Serial Number, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/signaturealgorithm": "signaturealgorithm", # Signature Algorithm, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/subject": "subject", # Subject, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/subjectkeyidentifier": "subjectkeyidentifier", # Subject Key Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/subjectname": "subjectname", # Subject Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/certificatetemplateinformation": "certificatetemplateinformation", # V2 Template Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/extension/certificatetemplatename": "certificatetemplatename", # V1 Template Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.xmlsoap.org/ws/2005/05/identity/claims/thumbprint": "thumbprint", # Thumbprint, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/12/certificatecontext/field/x509version": "x509version", # X.509 Version, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork": "insidecorporatenetwork", # Inside Corporate Network, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2012/01/passwordexpirationtime": "passwordexpirationtime", # Password Expiration Time, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2012/01/passwordexpirationdays": "passwordexpirationdays", # Password Expiration Days, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2012/01/passwordchangeurl": "passwordchangeurl", # Update Password URL, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/claims/authnmethodsreferences": "authnmethodsreferences", # Authentication Methods References, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/2012/01/requestcontext/claims/client-request-id": "client-request-id", # Client Request ID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"http://schemas.microsoft.com/ws/2013/11/alternateloginid": "alternateloginid", # Alternate Login ID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
},
"to": {
"emailaddress": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress", # E-Mail Address, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"givenname": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/givenname", # Given Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"name": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/name", # Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"upn": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn", # UPN, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"CommonName": "http://schemas.xmlsoap.org/claims/CommonName", # Common Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"EmailAddress": "http://schemas.xmlsoap.org/claims/EmailAddress", # AD FS 1.x E-Mail Address, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"Group": "http://schemas.xmlsoap.org/claims/Group", # Group, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"UPN": "http://schemas.xmlsoap.org/claims/UPN", # AD FS 1.x UPN, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"role": "http://schemas.microsoft.com/ws/2008/06/identity/claims/role", # Role, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"surname": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/surname", # Surname, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"privatepersonalidentifier": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/privatepersonalidentifier", # PPID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"nameidentifier": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/nameidentifier", # Name ID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"authenticationinstant": "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationinstant", # Authentication time stamp, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"authenticationmethod": "http://schemas.microsoft.com/ws/2008/06/identity/claims/authenticationmethod", # Authentication method, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"denyonlysid": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/denyonlysid", # Deny only group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"denyonlyprimarysid": "http://schemas.microsoft.com/ws/2008/06/identity/claims/denyonlyprimarysid", # Deny only primary SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"denyonlyprimarygroupsid": "http://schemas.microsoft.com/ws/2008/06/identity/claims/denyonlyprimarygroupsid", # Deny only primary group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"groupsid": "http://schemas.microsoft.com/ws/2008/06/identity/claims/groupsid", # Group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"primarygroupsid": "http://schemas.microsoft.com/ws/2008/06/identity/claims/primarygroupsid", # Primary group SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"primarysid": "http://schemas.microsoft.com/ws/2008/06/identity/claims/primarysid", # Primary SID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"windowsaccountname": "http://schemas.microsoft.com/ws/2008/06/identity/claims/windowsaccountname", # Windows account name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"isregistereduser": "http://schemas.microsoft.com/2012/01/devicecontext/claims/isregistereduser", # Is Registered User, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"identifier": "http://schemas.microsoft.com/2012/01/devicecontext/claims/identifier", # Device Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"registrationid": "http://schemas.microsoft.com/2012/01/devicecontext/claims/registrationid", # Device Registration Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"displayname": "http://schemas.microsoft.com/2012/01/devicecontext/claims/displayname", # Device Registration DisplayName, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"ostype": "http://schemas.microsoft.com/2012/01/devicecontext/claims/ostype", # Device OS type, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"osversion": "http://schemas.microsoft.com/2012/01/devicecontext/claims/osversion", # Device OS Version, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"ismanaged": "http://schemas.microsoft.com/2012/01/devicecontext/claims/ismanaged", # Is Managed Device, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"x-ms-forwarded-client-ip": "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-forwarded-client-ip", # Forwarded Client IP, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"x-ms-client-application": "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-client-application", # Client Application, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"x-ms-client-user-agent": "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-client-user-agent", # Client User Agent, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"x-ms-client-ip": "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-client-ip", # Client IP, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"x-ms-endpoint-absolute-path": "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-endpoint-absolute-path", # Endpoint Path, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"x-ms-proxy": "http://schemas.microsoft.com/2012/01/requestcontext/claims/x-ms-proxy", # Proxy, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"relyingpartytrustid": "http://schemas.microsoft.com/2012/01/requestcontext/claims/relyingpartytrustid", # Application Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"applicationpolicy": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/applicationpolicy", # Application policies, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"authoritykeyidentifier": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/authoritykeyidentifier", # Authority Key Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"basicconstraints": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/basicconstraints", # Basic Constraint, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"eku": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/eku", # Enhanced Key Usage, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"issuer": "http://schemas.microsoft.com/2012/12/certificatecontext/field/issuer", # Issuer, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"issuername": "http://schemas.microsoft.com/2012/12/certificatecontext/field/issuername", # Issuer Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"keyusage": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/keyusage", # Key Usage, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"notafter": "http://schemas.microsoft.com/2012/12/certificatecontext/field/notafter", # Not After, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"notbefore": "http://schemas.microsoft.com/2012/12/certificatecontext/field/notbefore", # Not Before, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"certificatepolicy": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/certificatepolicy", # Certificate Policies, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"rsa": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/rsa", # Public Key, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"rawdata": "http://schemas.microsoft.com/2012/12/certificatecontext/field/rawdata", # Certificate Raw Data, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"san": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/san", # Subject Alternative Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"serialnumber": "http://schemas.microsoft.com/ws/2008/06/identity/claims/serialnumber", # Serial Number, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"signaturealgorithm": "http://schemas.microsoft.com/2012/12/certificatecontext/field/signaturealgorithm", # Signature Algorithm, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"subject": "http://schemas.microsoft.com/2012/12/certificatecontext/field/subject", # Subject, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"subjectkeyidentifier": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/subjectkeyidentifier", # Subject Key Identifier, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"subjectname": "http://schemas.microsoft.com/2012/12/certificatecontext/field/subjectname", # Subject Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"certificatetemplateinformation": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/certificatetemplateinformation", # V2 Template Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"certificatetemplatename": "http://schemas.microsoft.com/2012/12/certificatecontext/extension/certificatetemplatename", # V1 Template Name, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"thumbprint": "http://schemas.xmlsoap.org/ws/2005/05/identity/claims/thumbprint", # Thumbprint, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"x509version": "http://schemas.microsoft.com/2012/12/certificatecontext/field/x509version", # X.509 Version, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"insidecorporatenetwork": "http://schemas.microsoft.com/ws/2012/01/insidecorporatenetwork", # Inside Corporate Network, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"passwordexpirationtime": "http://schemas.microsoft.com/ws/2012/01/passwordexpirationtime", # Password Expiration Time, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"passwordexpirationdays": "http://schemas.microsoft.com/ws/2012/01/passwordexpirationdays", # Password Expiration Days, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"passwordchangeurl": "http://schemas.microsoft.com/ws/2012/01/passwordchangeurl", # Update Password URL, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"authnmethodsreferences": "http://schemas.microsoft.com/claims/authnmethodsreferences", # Authentication Methods References, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"client-request-id": "http://schemas.microsoft.com/2012/01/requestcontext/claims/client-request-id", # Client Request ID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
"alternateloginid": "http://schemas.microsoft.com/ws/2013/11/alternateloginid", # Alternate Login ID, urn:oasis:names:tc:SAML:2.0:attrname-format:uri
}
}
| 158.816176 | 210 | 0.741238 | 2,982 | 21,599 | 5.36888 | 0.056003 | 0.06446 | 0.104747 | 0.120862 | 0.953342 | 0.949469 | 0.9406 | 0.938601 | 0.922361 | 0.820175 | 0 | 0.05063 | 0.092875 | 21,599 | 135 | 211 | 159.992593 | 0.766498 | 0.389509 | 0 | 0 | 0 | 0.103704 | 0.819331 | 0.053841 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.044444 | 0 | 0 | 0 | 0.014815 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
de3480cf308205a5767926416c787399523beee1 | 6,594 | py | Python | sensio/sphinx/phpcode.py | greg0ire/sphinx-php | 910374e0fa056104548e3cc50509ec90c13747e0 | [
"MIT"
] | 83 | 2015-01-11T06:02:29.000Z | 2021-11-08T09:48:26.000Z | sensio/sphinx/phpcode.py | greg0ire/sphinx-php | 910374e0fa056104548e3cc50509ec90c13747e0 | [
"MIT"
] | 20 | 2015-01-08T16:55:22.000Z | 2020-09-12T16:33:17.000Z | sensio/sphinx/phpcode.py | greg0ire/sphinx-php | 910374e0fa056104548e3cc50509ec90c13747e0 | [
"MIT"
] | 29 | 2015-01-10T17:55:36.000Z | 2021-05-06T12:45:10.000Z | # -*- coding: utf-8 -*-
"""
:copyright: (c) 2010-2012 Fabien Potencier
:license: MIT, see LICENSE for more details.
"""
from docutils import nodes, utils
from sphinx.util.nodes import split_explicit_title
def php_namespace_role(typ,
rawtext,
text,
lineno,
inliner,
options={},
content=[]):
text = utils.unescape(text)
env = inliner.document.settings.env
base_url = env.app.config.api_url
has_explicit_title, title, namespace = split_explicit_title(text)
try:
full_url = base_url % namespace.replace('\\', '/')
except (TypeError, ValueError):
env.warn(
env.docname, 'unable to expand %s api_url with base '
'URL %r, please make sure the base contains \'%%s\' '
'exactly once' % (typ, base_url))
full_url = base_url + utils.escape(full_class)
if not has_explicit_title:
name = namespace.lstrip('\\')
ns = name.rfind('\\')
if ns != -1:
name = name[ns + 1:]
title = name
list = [
nodes.reference(title,
title,
internal=False,
refuri=full_url,
reftitle=namespace)
]
pnode = nodes.literal('', '', *list)
return [pnode], []
def php_class_role(typ, rawtext, text, lineno, inliner, options={},
content=[]):
text = utils.unescape(text)
env = inliner.document.settings.env
base_url = env.app.config.api_url
has_explicit_title, title, full_class = split_explicit_title(text)
try:
full_url = base_url % full_class.replace('\\', '/')
except (TypeError, ValueError):
env.warn(
env.docname, 'unable to expand %s api_url with base '
'URL %r, please make sure the base contains \'%%s\' '
'exactly once' % (typ, base_url))
full_url = base_url + utils.escape(full_class)
if not has_explicit_title:
class_name = full_class.lstrip('\\')
ns = class_name.rfind('\\')
if ns != -1:
class_name = class_name[ns + 1:]
title = class_name
list = [
nodes.reference(title,
title,
internal=False,
refuri=full_url,
reftitle=full_class)
]
pnode = nodes.literal('', '', *list)
return [pnode], []
def php_method_role(typ,
rawtext,
text,
lineno,
inliner,
options={},
content=[]):
text = utils.unescape(text)
env = inliner.document.settings.env
base_url = env.app.config.api_url
has_explicit_title, title, class_and_method = split_explicit_title(text)
ns = class_and_method.rfind('::')
full_class = class_and_method[:ns]
method = class_and_method[ns + 2:]
try:
full_url = base_url % full_class.replace('\\', '/')
except (TypeError, ValueError):
env.warn(
env.docname, 'unable to expand %s api_url with base '
'URL %r, please make sure the base contains \'%%s\' '
'exactly once' % (typ, base_url))
full_url = base_url + utils.escape(full_class)
if not has_explicit_title:
title = method + '()'
list = [
nodes.reference(title,
title,
internal=False,
refuri=full_url,
reftitle=full_class + '::' + method + '()')
]
pnode = nodes.literal('', '', *list)
return [pnode], []
def php_phpclass_role(typ,
rawtext,
text,
lineno,
inliner,
options={},
content=[]):
text = utils.unescape(text)
has_explicit_title, title, full_class = split_explicit_title(text)
full_url = 'https://www.php.net/manual/en/class.%s.php' % full_class.lower(
)
if not has_explicit_title:
title = full_class
list = [
nodes.reference(title,
title,
internal=False,
refuri=full_url,
reftitle=full_class)
]
pnode = nodes.literal('', '', *list)
return [pnode], []
def php_phpmethod_role(typ,
rawtext,
text,
lineno,
inliner,
options={},
content=[]):
text = utils.unescape(text)
has_explicit_title, title, class_and_method = split_explicit_title(text)
ns = class_and_method.rfind('::')
full_class = class_and_method[:ns]
method = class_and_method[ns + 2:]
full_url = 'https://www.php.net/manual/en/%s.%s.php' % (
full_class.lower(), method.lower())
if not has_explicit_title:
title = full_class + '::' + method + '()'
list = [
nodes.reference(title,
title,
internal=False,
refuri=full_url,
reftitle=full_class)
]
pnode = nodes.literal('', '', *list)
return [pnode], []
def php_phpfunction_role(typ,
rawtext,
text,
lineno,
inliner,
options={},
content=[]):
text = utils.unescape(text)
has_explicit_title, title, full_function = split_explicit_title(text)
full_url = 'https://www.php.net/manual/en/function.%s.php' % full_function.replace(
'_', '-').lower()
if not has_explicit_title:
title = full_function
list = [
nodes.reference(title,
title,
internal=False,
refuri=full_url,
reftitle=full_function)
]
pnode = nodes.literal('', '', *list)
return [pnode], []
def setup(app):
app.add_config_value('api_url', '', 'env')
app.add_role('namespace', php_namespace_role)
app.add_role('class', php_class_role)
app.add_role('method', php_method_role)
app.add_role('phpclass', php_phpclass_role)
app.add_role('phpmethod', php_phpmethod_role)
app.add_role('phpfunction', php_phpfunction_role)
return {'parallel_read_safe': True}
| 31.4 | 87 | 0.510919 | 678 | 6,594 | 4.747788 | 0.154867 | 0.076732 | 0.059646 | 0.065238 | 0.800249 | 0.78192 | 0.775396 | 0.764523 | 0.721031 | 0.684063 | 0 | 0.003635 | 0.374128 | 6,594 | 209 | 88 | 31.550239 | 0.776351 | 0.016682 | 0 | 0.732955 | 0 | 0 | 0.080396 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039773 | false | 0 | 0.011364 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7213446934e5ab478cf0a057ba4b5f4ecabcf63f | 62,598 | py | Python | plotting_for_publication_plos_bio/plot_kegg_pi_distribution_multispecies.py | zhiru-liu/microbiome_evolution | 5a08fbf41357d845236e3ff46c31315929d2b649 | [
"BSD-2-Clause"
] | null | null | null | plotting_for_publication_plos_bio/plot_kegg_pi_distribution_multispecies.py | zhiru-liu/microbiome_evolution | 5a08fbf41357d845236e3ff46c31315929d2b649 | [
"BSD-2-Clause"
] | null | null | null | plotting_for_publication_plos_bio/plot_kegg_pi_distribution_multispecies.py | zhiru-liu/microbiome_evolution | 5a08fbf41357d845236e3ff46c31315929d2b649 | [
"BSD-2-Clause"
] | null | null | null | import matplotlib
matplotlib.use('Agg')
import os
import pylab
import sys
import numpy
from utils import diversity_utils
from parsers import parse_midas_data
from numpy.random import choice
import matplotlib.cm as cmx
import matplotlib.colors as colors
import pickle
import pandas
import seaborn as sns
import pandas
import random
# plotting tools
import matplotlib.colors as colors
from matplotlib.colors import LogNorm
import matplotlib.cm as cmx
from math import log10,ceil
import matplotlib as mpl
mpl.rcParams['font.size'] = 8
mpl.rcParams['lines.linewidth'] = 1.0
mpl.rcParams['legend.frameon'] = False
mpl.rcParams['legend.fontsize'] = 'small'
#############
#species_names=['Alistipes_putredinis_61533','Bacteroides_uniformis_57318']
species_names = parse_midas_data.parse_good_species_list()
files={}
for species_name in species_names:
if (os.path.exists(os.path.expanduser('~/tmp_intermediate_files/kegg_pi_%s.dat' % species_name))):
files[species_name]=pickle.load(open(os.path.expanduser('~/tmp_intermediate_files/kegg_pi_%s.dat' % species_name),'rb'))
else:
print species_name
colors=['#a1d99b','#c994c7']
####################################################################
# Order the species according to their phylogenetic relationships. #
####################################################################
# to make this plot I will have to feed the list of species to R. When I am in R, I will drop the species that are not in the list. Then, R will output a tree.
# output a file to the temp dir with the species number only
outFN=os.path.expanduser("~/tmp_intermediate_files/species_names.txt")
out_file=open(outFN, 'w')
for species in species_names:
print species
id_no=species.strip().split('_')[2]
out_file.write(species + '\t' + id_no +'\n')
# run the R script to prune the species tree
os.system('Rscript ~/ben_nandita_hmp_scripts/prune_species_tree.R ' + outFN)
# read in the order of the species that are on the tree:
file=open(os.path.expanduser("~/tmp_intermediate_files/species_order.txt"))
species_order=[]
for line in file:
species_order.append(line.strip())
######################################################################
# plot a comparison of pi for core vs variable genes accross species
#####################################################################
# try ordering species based on median piS in core
species_list=[]
mean_avg_pi=[]
sample_size=[]
for species_name in species_names:
if species_name in files.keys():
avg_pi_matrix_core=files[species_name]['avg_pi_matrix_core']
avg_pi_matrix_variable=files[species_name]['avg_pi_matrix_variable']
avg_pi_per_pathway_core=files[species_name]['avg_pi_per_pathway_core']
avg_pi_per_pathway_variable=files[species_name]['avg_pi_per_pathway_variable']
passed_sites_per_pathway_core=files[species_name]['passed_sites_per_pathway_core']
passed_sites_per_pathway_variable=files[species_name]['passed_sites_per_pathway_variable'] #need to fix this
num_genes_per_pathway_core=files[species_name]['num_genes_per_pathway_core']
num_genes_per_pathway_variable=files[species_name]['num_genes_per_pathway_variable']
num_people_with_data_pathway_core=files[species_name]['num_people_with_data_pathway_core']
num_people_with_data_pathway_variable=files[species_name]['num_people_with_data_pathway_variable']
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
fraction_nonsynonymous_per_pathway_core=files[species_name]['fraction_nonsynonymous_per_pathway_core']
fraction_nonsynonymous_per_pathway_variable=files[species_name]['fraction_nonsynonymous_per_pathway_variable']
fixation_opportunities_per_pathway_syn_non_core=files[species_name]['fixation_opportunities_per_pathway_syn_non_core']
fixation_opportunities_per_pathway_syn_non_variable=files[species_name]['fixation_opportunities_per_pathway_syn_non_variable']
num_genes_per_pathway_syn_non_core=files[species_name]['num_genes_per_pathway_syn_non_core']
num_genes_per_pathway_syn_non_variable=files[species_name]['num_genes_per_pathway_syn_non_variable']
num_people_with_data_per_pathway_fixations_core=files[species_name]['num_people_with_data_per_pathway_fixations_core']
num_people_with_data_per_pathway_fixations_variable=files[species_name]['num_people_with_data_per_pathway_fixations_variable']
dtot_core=files[species_name]['dtot_core']
dtot_variable=files[species_name]['dtot_variable']
dtot_per_pathway_core=files[species_name]['dtot_per_pathway_core']
dtot_per_pathway_variable=files[species_name]['dtot_per_pathway_variable']
fixation_opportunities_per_pathway_all_core=files[species_name]['fixation_opportunities_per_pathway_all_core']
fixation_opportunities_per_pathway_all_variable=files[species_name]['fixation_opportunities_per_pathway_all_variable']
num_genes_per_pathway_tot_core=files[species_name]['num_genes_per_pathway_tot_core']
num_genes_per_pathway_tot_variable=files[species_name]['num_genes_per_pathway_tot_variable']
num_people_with_data_per_pathway_tot_core=files[species_name]['num_people_with_data_per_pathway_tot_core']
num_people_with_data_per_pathway_tot_variable=files[species_name]['num_people_with_data_per_pathway_tot_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
same_sample_idxs=files[species_name]['same_sample_idxs']
avg_pi=numpy.median(avg_pi_matrix_core[same_sample_idxs])
species_list.append(species_name)
mean_avg_pi.append(avg_pi)
sample_size.append(len(same_sample_idxs[0]))
df=pandas.DataFrame({'species':species_list,'sample_size':sample_size})
df=df.sort('sample_size')
sorted_species=list(df['species']) # this has a list of species sorted by sample size
######################################################################
# plot a comparison of pi for core vs variable genes accross species
#####################################################################
pylab.figure(figsize=(6,20))
pylab.xlabel('Pi/bp')
pylab.ylabel("Species/gene type")
pylab.xlim(1e-7,1e-1)
data=[]
labels=[]
for species_name in sorted_species:
if species_name in files.keys():
#for species_name in sorted_species:
avg_pi_matrix_core=files[species_name]['avg_pi_matrix_core']
avg_pi_matrix_variable=files[species_name]['avg_pi_matrix_variable']
same_sample_idxs=files[species_name]['same_sample_idxs']
data.append(avg_pi_matrix_core[same_sample_idxs])
labels.append(species_name + '_core, m=' + str(len(same_sample_idxs[0])))
data.append(avg_pi_matrix_variable[same_sample_idxs])
labels.append(species_name + '_variable, m=' + str(len(same_sample_idxs[0])))
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
k=0
for patch in bp['boxes']:
patch.set_facecolor(colors[k%2])
k+=1
pylab.xscale('log')
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.axvline(x=0.001, ymin=0, ymax=1, hold=None)
pylab.savefig('%s/core_vs_variable_genes_pi_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
######################################################################
# plot a comparison of fraction nonsyn for core vs variable genes accross species
#####################################################################
pylab.figure(figsize=(6,20))
pylab.xlabel('Fraction nonsynonymous fixations')
pylab.ylabel("Species/gene type")
pylab.xlim(0,1)
data=[]
labels=[]
for species_name in sorted_species:
if species_name in files.keys():
#for species_name in sorted_species:
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
tmp1=numpy.array(diff_subject_idxs[0], dtype=numpy.int32)
tmp2=numpy.array(diff_subject_idxs[1], dtype=numpy.int32)
diff_subject_idxs=(tmp1,tmp2)
same_sample_idxs=files[species_name]['same_sample_idxs']
data.append(fraction_nonsynonymous_core[diff_subject_idxs])
labels.append(species_name + '_core, n=' + str(len(same_sample_idxs[0])))
data.append(fraction_nonsynonymous_variable[diff_subject_idxs])
labels.append(species_name + '_variable, n=' + str(len(same_sample_idxs[0])))
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
k=0
for patch in bp['boxes']:
patch.set_facecolor(colors[k%2])
k+=1
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/core_vs_variable_genes_fraction_nonsyn_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
######################################################################
# plot a comparison of fraction nonsyn for core vs variable genes for just A. putredinis and B. vulgatus
#####################################################################
fig=pylab.figure(figsize=(4,6))
pylab.ylabel('Fraction nonsynonymous fixations')
pylab.xlabel("Species/gene type")
pylab.ylim(0,0.4)
data=[]
labels=[]
colors=['#de2d26','#3182bd']
ax = fig.add_subplot(111)
ax.patch.set_facecolor('white')
ax.patch.set_alpha(0)
for species_name in ['Bacteroides_vulgatus_57955', 'Alistipes_putredinis_61533']:
if species_name in files.keys():
#for species_name in sorted_species:
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
tmp1=numpy.array(diff_subject_idxs[0], dtype=numpy.int32)
tmp2=numpy.array(diff_subject_idxs[1], dtype=numpy.int32)
diff_subject_idxs=(tmp1,tmp2)
same_sample_idxs=files[species_name]['same_sample_idxs']
data.append(fraction_nonsynonymous_core[diff_subject_idxs])
labels.append(species_name + '_core, n=' + str(len(same_sample_idxs[0])))
data.append(fraction_nonsynonymous_variable[diff_subject_idxs])
labels.append(species_name + '_variable, n=' + str(len(same_sample_idxs[0])))
bp=pylab.boxplot(data,patch_artist=True)
k=0
for patch in bp['boxes']:
patch.set_facecolor(colors[k%2])
k+=1
locs, dummy_labels = pylab.xticks()
pylab.xticks(locs, labels, fontsize=9)
#fig.set_facecolor("white")
fig.savefig('%s/core_vs_variable_genes_fraction_nonsyn_B_vul_A_put.png' % (parse_midas_data.analysis_directory), facecolor=fig.get_facecolor(), bbox_inches='tight', dpi=300)
######################################################################
# plot a comparison of total fixations for core vs variable genes accross species
#####################################################################
pylab.figure(figsize=(6,20))
pylab.xlabel('Total fixations')
pylab.ylabel("Species/gene type")
pylab.xlim(1e-7,1e-1)
data=[]
labels=[]
for species_name in sorted_species:
if species_name in files.keys():
#for species_name in sorted_species:
dtot_core=files[species_name]['dtot_core']
dtot_variable=files[species_name]['dtot_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
same_sample_idxs=files[species_name]['same_sample_idxs']
data.append(dtot_core[diff_subject_idxs])
labels.append(species_name + '_core, m=' + str(len(same_sample_idxs[0])))
data.append(dtot_variable[diff_subject_idxs])
labels.append(species_name + '_variable, m=' + str(len(same_sample_idxs[0])))
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
k=0
for patch in bp['boxes']:
patch.set_facecolor(colors[k%2])
k+=1
pylab.xscale('log')
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/core_vs_variable_genes_fixations_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################################################
# Plot fraction dN vs dtot for all species core vs var
################################################################
pylab.figure(figsize=(8,len(files.keys()*3)))
plot_no=1
for species_name in species_names:
if species_name in files.keys():
#for species_name in sorted_species:
pylab.subplot(len(files.keys()), 1, plot_no)
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
dtot_core=files[species_name]['dtot_core']
dtot_variable=files[species_name]['dtot_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
same_sample_idxs=files[species_name]['same_sample_idxs']
pylab.scatter(dtot_core[diff_subject_idxs],fraction_nonsynonymous_core[diff_subject_idxs], color='red',marker='.', label=species_name)
pylab.scatter(dtot_variable[diff_subject_idxs],fraction_nonsynonymous_variable[diff_subject_idxs], color='blue',marker='.')
pylab.title(species_name)
pylab.ylabel("Fraction nonsynonymous")
pylab.xlim(1e-7,1e-1)
pylab.ylim(0,1)
pylab.xscale('log')
plot_no+=1
pylab.xlabel('Total fixations')
pylab.legend(loc='lower right',frameon=False)
pylab.savefig('%s/core_vs_variable_genes_fraction_nonsyn_vs_fixations_multispecies.png' % (
parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
###################################################
# Bar plot of number of species with each pathway #
###################################################
# iterate through the species
# add the kegg pathway to a dictionary
# How many species share each pathway?
# make a bar plot showing this info
pathways_variable={}
pathways_core={}
for species_name in species_names:
if species_name in files.keys():
avg_pi_per_pathway_core=files[species_name]['avg_pi_per_pathway_core']
avg_pi_per_pathway_variable=files[species_name]['avg_pi_per_pathway_variable']
passed_sites_per_pathway_core=files[species_name]['passed_sites_per_pathway_core']
num_genes_per_pathway_core=files[species_name]['num_genes_per_pathway_core']
num_people_with_data_pathway_core=files[species_name]['num_people_with_data_pathway_core']
#iterate through the pathways
for pathway in avg_pi_per_pathway_variable.keys():
if pathway not in pathways_variable:
pathways_variable[pathway]=[]
pathways_variable[pathway].append(species_name)
for pathway in avg_pi_per_pathway_core.keys():
if pathway not in pathways_core:
pathways_core[pathway]=[]
pathways_core[pathway].append(species_name)
#plot for core pathways
species_counts=[]
pathway_labels=[]
for pathway in pathways_core:
species_counts.append(len(pathways_core[pathway]))
pathway_labels.append(pathway)
table=[pathway_labels,species_counts]
df=pandas.DataFrame({'num_species':species_counts,'pathways':pathway_labels})
df_sorted=df.sort(['num_species'], ascending=0)
ypos=numpy.arange(len(df_sorted['num_species']))
pylab.figure(figsize=(12,15))
pylab.barh(ypos, df_sorted['num_species'])
pylab.title("Number of species sharing each pathway (core genes)", fontsize=8)
pylab.yticks(ypos, df_sorted['pathways'], fontsize=9)
pylab.xlabel('Number of species')
pylab.savefig('%s/num_species_per_pathway_core.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# plot for variable pathways
species_counts=[]
pathway_labels=[]
for pathway in pathways_variable:
species_counts.append(len(pathways_variable[pathway]))
pathway_labels.append(pathway)
table=[pathway_labels,species_counts]
df=pandas.DataFrame({'num_species':species_counts,'pathways':pathway_labels})
df_sorted=df.sort(['num_species'], ascending=0)
ypos=numpy.arange(len(df_sorted['num_species']))
pylab.figure(figsize=(12,15))
pylab.barh(ypos, df_sorted['num_species'])
pylab.title("Number of species sharing each pathway (variable genes)", fontsize=8)
pylab.yticks(ypos, df_sorted['pathways'], fontsize=9)
pylab.xlabel('Number of species')
pylab.savefig('%s/num_species_per_pathway_variable.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
#########################################################################
# plot pi within pathways with the most data, with species side-by-side #
#########################################################################
pylab.figure(figsize=(6,100))
pylab.xlabel('pi/bp')
pylab.ylabel("Species/gene type")
pylab.xlim(1e-7,1e-1)
pathways_core={}
num_genes={}
num_people={}
min_passed_sites_per_person=100
pathways_core['All core genes']=[]
pathways_core['All variable genes']=[]
min_mean_number_genes=10 # this is the min mean number of genes that need to be in a pathway in order for it to be plotted.
for species_name in species_names:
if species_name in files.keys():
avg_pi_matrix_core=files[species_name]['avg_pi_matrix_core']
avg_pi_matrix_variable=files[species_name]['avg_pi_matrix_variable']
avg_pi_per_pathway_core=files[species_name]['avg_pi_per_pathway_core']
same_sample_idxs=files[species_name]['same_sample_idxs']
passed_sites_per_pathway_core=files[species_name]['passed_sites_per_pathway_core']
num_genes_per_pathway_core=files[species_name]['num_genes_per_pathway_core']
num_people_with_data_pathway_core=files[species_name]['num_people_with_data_pathway_core']
# add the full genome data:
pathways_core['All core genes'].append(avg_pi_matrix_core[same_sample_idxs])
pathways_core['All variable genes'].append(avg_pi_matrix_variable[same_sample_idxs])
for pathway in avg_pi_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(passed_sites_per_pathway_core[pathway][same_sample_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if pathway not in pathways_core:
pathways_core[pathway]=[]
num_genes[pathway]=[]
num_people[pathway]=[]
pathways_core[pathway].append(avg_pi_per_pathway_core[pathway][same_sample_idxs][high_num_sites_idx])
num_genes[pathway].append(num_genes_per_pathway_core[pathway])
num_people[pathway].append(num_people_with_data_pathway_core[pathway])
# iterate through all pathways and concatenate into a single list
data=[]
labels=[]
color_list=[]
k=0
for pathway in ['All core genes','All variable genes','','Annotated pathways']:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
if pathway =='':
labels.append('Unannotated pathways')
else:
labels.append(pathway)
else:
labels.append('')
k+=1
k=0
for pathway in pathways_core:
if pathway !='' and pathway !='Annotated pathways' and pathway !='All core genes' and pathway != 'All variable genes':
# check if the mean_num_genes is >=min_mean_number_genes:
mean_num_genes=numpy.mean(numpy.asarray(num_genes[pathway]))
mean_num_people=numpy.mean(numpy.asarray(num_people[pathway]))
if mean_num_genes>=min_mean_number_genes:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
labels.append(pathway+ ', n=' + str(int(mean_num_genes))+ ', m='+str(int(mean_num_people)))
else:
labels.append('')
k+=1
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
i=0
for patch in bp['boxes']:
patch.set_facecolor(color_list[i])
i+=1
pylab.xscale('log')
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/pi_per_pathway_core_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
#########################################################################
# plot pi within pathways with the most data, with species side-by-side #
# repeat for variable genes
#########################################################################
pylab.figure(figsize=(6,25))
pylab.xlabel('pi/bp')
pylab.ylabel("Species/gene type")
pylab.xlim(1e-7,1e-1)
pathways_core={}
num_genes={}
num_people={}
min_passed_sites_per_person=100
pathways_core['All core genes']=[]
pathways_core['All variable genes']=[]
min_mean_number_genes=5 # this is the min mean number of genes that need to be in a pathway in order for it to be plotted.
for species_name in species_names:
if species_name in files.keys():
avg_pi_matrix_core=files[species_name]['avg_pi_matrix_core']
avg_pi_matrix_variable=files[species_name]['avg_pi_matrix_variable']
avg_pi_per_pathway_core=files[species_name]['avg_pi_per_pathway_variable']
same_sample_idxs=files[species_name]['same_sample_idxs']
passed_sites_per_pathway_core=files[species_name]['passed_sites_per_pathway_variable']
num_genes_per_pathway_core=files[species_name]['num_genes_per_pathway_variable']
num_people_with_data_pathway_core=files[species_name]['num_people_with_data_pathway_variable']
# add the full genome data:
pathways_core['All core genes'].append(avg_pi_matrix_core[same_sample_idxs])
pathways_core['All variable genes'].append(avg_pi_matrix_variable[same_sample_idxs])
for pathway in avg_pi_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(passed_sites_per_pathway_core[pathway][same_sample_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if pathway not in pathways_core:
pathways_core[pathway]=[]
num_genes[pathway]=[]
num_people[pathway]=[]
pathways_core[pathway].append(avg_pi_per_pathway_core[pathway][same_sample_idxs][high_num_sites_idx])
num_genes[pathway].append(num_genes_per_pathway_core[pathway])
num_people[pathway].append(num_people_with_data_pathway_core[pathway])
# iterate through all pathways and concatenate into a single list
data=[]
labels=[]
color_list=[]
k=0
for pathway in ['All core genes','All variable genes','','Annotated pathways']:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
if pathway =='':
labels.append('Unannotated pathways')
else:
labels.append(pathway)
else:
labels.append('')
k+=1
k=0
for pathway in pathways_core:
if pathway !='' and pathway !='Annotated pathways' and pathway !='All core genes' and pathway != 'All variable genes':
# check if the mean_num_genes is >=min_mean_number_genes:
mean_num_genes=numpy.mean(numpy.asarray(num_genes[pathway]))
mean_num_people=numpy.mean(numpy.asarray(num_people[pathway]))
if mean_num_genes>=min_mean_number_genes:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
labels.append(pathway+ ', n=' + str(int(mean_num_genes))+ ', m='+str(int(mean_num_people)))
else:
labels.append('')
k+=1
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
i=0
for patch in bp['boxes']:
patch.set_facecolor(color_list[i])
i+=1
pylab.xscale('log')
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/pi_per_pathway_variable_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################################################################################
# plot fraction nonsyn fixations within pathways with the most data, with species side-by-side #
################################################################################################
pylab.figure(figsize=(6,100))
pylab.xlabel('Fraction of fixations that are nonsynonymous')
pylab.ylabel("Species/gene type")
pylab.xlim(0,1)
pathways_core={}
num_genes={}
num_people={}
min_passed_sites_per_person=100
pathways_core['All core genes']=[]
pathways_core['All variable genes']=[]
min_mean_number_genes=10 # this is the min mean number of genes that need to be in a pathway in order for it to be plotted.
for species_name in species_names:
if species_name in files.keys():
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
fraction_nonsynonymous_per_pathway_core=files[species_name]['fraction_nonsynonymous_per_pathway_core']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
fixation_opportunities_per_pathway_syn_non_core=files[species_name]['fixation_opportunities_per_pathway_syn_non_core']
num_genes_per_pathway_syn_non_core=files[species_name]['num_genes_per_pathway_syn_non_core']
num_people_with_data_per_pathway_fixations_core=files[species_name]['num_people_with_data_per_pathway_fixations_core']
# add the full genome data:
pathways_core['All core genes'].append(fraction_nonsynonymous_core[diff_subject_idxs])
pathways_core['All variable genes'].append(fraction_nonsynonymous_variable[diff_subject_idxs])
for pathway in fraction_nonsynonymous_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_syn_non_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if pathway not in pathways_core:
pathways_core[pathway]=[]
num_genes[pathway]=[]
num_people[pathway]=[]
pathways_core[pathway].append(fraction_nonsynonymous_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
num_genes[pathway].append(num_genes_per_pathway_syn_non_core[pathway])
num_people[pathway].append(num_people_with_data_per_pathway_fixations_core[pathway])
# iterate through all pathways and concatenate into a single list
data=[]
labels=[]
color_list=[]
k=0
for pathway in ['All core genes','All variable genes','','Annotated pathways']:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
if pathway =='':
labels.append('Unannotated pathways')
else:
labels.append(pathway)
else:
labels.append('')
k+=1
k=0
for pathway in pathways_core:
if pathway !='' and pathway !='Annotated pathways' and pathway !='All core genes' and pathway != 'All variable genes':
# check if the mean_num_genes is >=min_mean_number_genes:
mean_num_genes=numpy.mean(numpy.asarray(num_genes[pathway]))
mean_num_people=numpy.mean(numpy.asarray(num_people[pathway]))
if mean_num_genes>=min_mean_number_genes:
for i in range(0, len(pathways_core[pathway])):
data.append(pathways_core[pathway][i])
color_list.append(colors[k%2])
if i ==0:
labels.append(pathway+ ', n=' + str(int(mean_num_genes))+ ', m='+str(int(mean_num_people)))
else:
labels.append('')
k+=1
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
i=0
for patch in bp['boxes']:
patch.set_facecolor(color_list[i])
i+=1
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/fraction_nonsynonymous_per_pathway_core_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################################################################################
# plot fraction nonsyn fixations within pathways with the most data, with species side-by-side #
# repeat for variable genes
################################################################################################
pylab.figure(figsize=(6,25))
pylab.xlabel('Fraction of fixations that are nonsynonymous')
pylab.ylabel("Species/gene type")
pylab.xlim(0,1)
pathways_core={}
num_genes={}
num_people={}
min_passed_sites_per_person=100
pathways_core['All core genes']=[]
pathways_core['All variable genes']=[]
min_mean_number_genes=5 # this is the min mean number of genes that need to be in a pathway in order for it to be plotted.
for species_name in species_names:
if species_name in files.keys():
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
fraction_nonsynonymous_per_pathway_core=files[species_name]['fraction_nonsynonymous_per_pathway_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
fixation_opportunities_per_pathway_syn_non_core=files[species_name]['fixation_opportunities_per_pathway_syn_non_variable']
num_genes_per_pathway_syn_non_core=files[species_name]['num_genes_per_pathway_syn_non_variable']
num_people_with_data_per_pathway_fixations_core=files[species_name]['num_people_with_data_per_pathway_fixations_variable']
# add the full genome data:
pathways_core['All core genes'].append(fraction_nonsynonymous_core[diff_subject_idxs])
pathways_core['All variable genes'].append(fraction_nonsynonymous_variable[diff_subject_idxs])
for pathway in fraction_nonsynonymous_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_syn_non_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if pathway not in pathways_core:
pathways_core[pathway]=[]
num_genes[pathway]=[]
num_people[pathway]=[]
pathways_core[pathway].append(fraction_nonsynonymous_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
num_genes[pathway].append(num_genes_per_pathway_syn_non_core[pathway])
num_people[pathway].append(num_people_with_data_per_pathway_fixations_core[pathway])
# iterate through all pathways and concatenate into a single list
data=[]
labels=[]
color_list=[]
k=0
for pathway in ['All core genes','All variable genes','','Annotated pathways']:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
if pathway =='':
labels.append('Unannotated pathways')
else:
labels.append(pathway)
else:
labels.append('')
k+=1
k=0
for pathway in pathways_core:
if pathway !='' and pathway !='Annotated pathways' and pathway !='All core genes' and pathway != 'All variable genes':
# check if the mean_num_genes is >=min_mean_number_genes:
mean_num_genes=numpy.mean(numpy.asarray(num_genes[pathway]))
mean_num_people=numpy.mean(numpy.asarray(num_people[pathway]))
if mean_num_genes>=min_mean_number_genes:
for i in range(0, len(pathways_core[pathway])):
data.append(pathways_core[pathway][i])
color_list.append(colors[k%2])
if i ==0:
labels.append(pathway+ ', n=' + str(int(mean_num_genes))+ ', m='+str(int(mean_num_people)))
else:
labels.append('')
k+=1
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
i=0
for patch in bp['boxes']:
patch.set_facecolor(color_list[i])
i+=1
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/fraction_nonsynonymous_per_pathway_variable_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################################################################################
# plot fixations within pathways with the most data, with species side-by-side #
################################################################################################
pylab.figure(figsize=(6,100))
pylab.xlabel('Total divergence')
pylab.ylabel("Species/gene type")
pylab.xlim(1e-7,1)
pathways_core={}
num_genes={}
num_people={}
min_passed_sites_per_person=100
pathways_core['All core genes']=[]
pathways_core['All variable genes']=[]
min_mean_number_genes=10 # this is the min mean number of genes that need to be in a pathway in order for it to be plotted.
for species_name in species_names:
if species_name in files.keys():
dtot_core=files[species_name]['dtot_core']
dtot_variable=files[species_name]['dtot_variable']
dtot_per_pathway_core=files[species_name]['dtot_per_pathway_core']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
fixation_opportunities_per_pathway_all_core=files[species_name]['fixation_opportunities_per_pathway_all_core']
num_genes_per_pathway_tot_core=files[species_name]['num_genes_per_pathway_tot_core']
num_people_with_data_per_pathway_tot_core=files[species_name]['num_people_with_data_per_pathway_tot_core']
# add the full genome data:
pathways_core['All core genes'].append(dtot_core[diff_subject_idxs])
pathways_core['All variable genes'].append(dtot_variable[diff_subject_idxs])
for pathway in dtot_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_all_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if pathway not in pathways_core:
pathways_core[pathway]=[]
num_genes[pathway]=[]
num_people[pathway]=[]
pathways_core[pathway].append(dtot_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
num_genes[pathway].append(num_genes_per_pathway_tot_core[pathway])
num_people[pathway].append(num_people_with_data_per_pathway_tot_core[pathway])
# iterate through all pathways and concatenate into a single list
data=[]
labels=[]
color_list=[]
k=0
for pathway in ['All core genes','All variable genes','','Annotated pathways']:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
if pathway =='':
labels.append('Unannotated pathways')
else:
labels.append(pathway)
else:
labels.append('')
k+=1
k=0
for pathway in pathways_core:
if pathway !='' and pathway !='Annotated pathways' and pathway !='All core genes' and pathway != 'All variable genes':
# check if the mean_num_genes is >=min_mean_number_genes:
mean_num_genes=numpy.mean(numpy.asarray(num_genes[pathway]))
mean_num_people=numpy.mean(numpy.asarray(num_people[pathway]))
if mean_num_genes>=min_mean_number_genes:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
labels.append(pathway+ ', n=' + str(int(mean_num_genes))+ ', m='+str(int(mean_num_people)))
else:
labels.append('')
k+=1
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
i=0
for patch in bp['boxes']:
patch.set_facecolor(color_list[i])
i+=1
pylab.xscale('log')
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/fixations_per_pathway_core_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################################################################################
# plot fixations within pathways with the most data, with species side-by-side #
# repeat for variable genes
################################################################################################
pylab.figure(figsize=(6,25))
pylab.xlabel('Total divergence')
pylab.ylabel("Species/gene type")
pylab.xlim(1e-7,1)
pathways_core={}
num_genes={}
num_people={}
min_passed_sites_per_person=100
pathways_core['All core genes']=[]
pathways_core['All variable genes']=[]
min_mean_number_genes=5 # this is the min mean number of genes that need to be in a pathway in order for it to be plotted.
for species_name in species_names:
if species_name in files.keys():
dtot_core=files[species_name]['dtot_core']
dtot_variable=files[species_name]['dtot_variable']
dtot_per_pathway_core=files[species_name]['dtot_per_pathway_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
fixation_opportunities_per_pathway_all_core=files[species_name]['fixation_opportunities_per_pathway_all_variable']
num_genes_per_pathway_tot_core=files[species_name]['num_genes_per_pathway_tot_variable']
num_people_with_data_per_pathway_tot_core=files[species_name]['num_people_with_data_per_pathway_tot_variable']
# add the full genome data:
pathways_core['All core genes'].append(dtot_core[diff_subject_idxs])
pathways_core['All variable genes'].append(dtot_variable[diff_subject_idxs])
for pathway in dtot_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_all_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if pathway not in pathways_core:
pathways_core[pathway]=[]
num_genes[pathway]=[]
num_people[pathway]=[]
pathways_core[pathway].append(dtot_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
num_genes[pathway].append(num_genes_per_pathway_tot_core[pathway])
num_people[pathway].append(num_people_with_data_per_pathway_tot_core[pathway])
# iterate through all pathways and concatenate into a single list
data=[]
labels=[]
color_list=[]
k=0
for pathway in ['All core genes','All variable genes','','Annotated pathways']:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
if pathway =='':
labels.append('Unannotated pathways')
else:
labels.append(pathway)
else:
labels.append('')
k+=1
k=0
for pathway in pathways_core:
if pathway !='' and pathway !='Annotated pathways' and pathway !='All core genes' and pathway != 'All variable genes':
# check if the mean_num_genes is >=min_mean_number_genes:
mean_num_genes=numpy.mean(numpy.asarray(num_genes[pathway]))
mean_num_people=numpy.mean(numpy.asarray(num_people[pathway]))
if mean_num_genes>=min_mean_number_genes:
for i in range(0, len(pathways_core[pathway])):
data.append(numpy.clip(pathways_core[pathway][i],5*1e-7,1))
color_list.append(colors[k%2])
if i ==0:
labels.append(pathway+ ', n=' + str(int(mean_num_genes))+ ', m='+str(int(mean_num_people)))
else:
labels.append('')
k+=1
bp=pylab.boxplot(data,0,'.',0, widths=0.75,patch_artist=True)
i=0
for patch in bp['boxes']:
patch.set_facecolor(color_list[i])
i+=1
pylab.xscale('log')
locs, dummy_labels = pylab.yticks()
pylab.yticks(locs, labels, fontsize=9)
pylab.savefig('%s/fixations_per_pathway_variable_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
########################
# HEATMAPS #
########################
###################
# pi per pathway #
###################
min_number_genes=10
min_passed_sites_per_person=100
data_dict={}
# load the data into data_dict, where keys are species_name, values are mean_pi
for species_name in species_names:
if species_name in files.keys():
avg_pi_matrix_core=files[species_name]['avg_pi_matrix_core']
avg_pi_matrix_variable=files[species_name]['avg_pi_matrix_variable']
avg_pi_per_pathway_core=files[species_name]['avg_pi_per_pathway_core']
same_sample_idxs=files[species_name]['same_sample_idxs']
passed_sites_per_pathway_core=files[species_name]['passed_sites_per_pathway_core']
num_genes_per_pathway_core=files[species_name]['num_genes_per_pathway_core']
num_people_with_data_pathway_core=files[species_name]['num_people_with_data_pathway_core']
mean_pi_dict={}
mean_pi_dict['All core genes']=numpy.mean(avg_pi_matrix_core[same_sample_idxs])
mean_pi_dict['All variable genes']=numpy.mean(avg_pi_matrix_variable[same_sample_idxs])
for pathway in avg_pi_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(passed_sites_per_pathway_core[pathway][same_sample_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if num_genes_per_pathway_core[pathway]>=min_number_genes:
mean_pi=numpy.clip(numpy.mean(avg_pi_per_pathway_core[pathway][same_sample_idxs][high_num_sites_idx]),1e-7,1)
if pathway!='':
mean_pi_dict[pathway]=mean_pi
else:
mean_pi_dict['Unannotated pathways']=mean_pi
data_dict[species_name]=mean_pi_dict
# convert data_dict into a pandas dataframe
df=pandas.DataFrame(data_dict)
pathway_order=[]
pathway_order.append('All core genes')
pathway_order.append('All variable genes')
pathway_order.append('Annotated pathways')
pathway_order.append('Unannotated pathways')
pathway_names=list(df.index)
for pathway in pathway_names:
if pathway not in pathway_order:
pathway_order.append(pathway)
df=df.reindex(pathway_order)
# use seaborn to plot a heat map of the pi values
pylab.figure(figsize=(24,8))
pylab.subplot(1,2,1)
sns.heatmap(df, cmap='RdYlGn_r',norm=LogNorm(vmin=1e-7, vmax=1e-2))
#pylab.savefig('%s/pi_per_pathway_heatmap_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# Iterate through each species again and compute the rank order of each column.
for species_name in species_names:
if species_name in files.keys():
df[species_name]=df[species_name].rank(ascending=True)
# use seaborn to plot a heat map of the rank orders
#pylab.figure(figsize=(8,8))
pylab.subplot(1,2,2)
sns.heatmap(df, cmap='RdYlGn_r',yticklabels=False)
pylab.savefig('%s/pi_per_pathway_ranking_heatmap_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################
# fraction nonsyn per pathway #
# core genes #
################################
min_number_genes=10
min_passed_sites_per_person=100
data_dict={}
# load the data into data_dict, where keys are species_name, values are mean_pi
for species_name in species_names:
if species_name in files.keys():
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
fraction_nonsynonymous_per_pathway_core=files[species_name]['fraction_nonsynonymous_per_pathway_core']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
fixation_opportunities_per_pathway_syn_non_core=files[species_name]['fixation_opportunities_per_pathway_syn_non_core']
num_genes_per_pathway_syn_non_core=files[species_name]['num_genes_per_pathway_syn_non_core']
num_people_with_data_per_pathway_fixations_core=files[species_name]['num_people_with_data_per_pathway_fixations_core']
mean_pi_dict={}
mean_pi_dict['All core genes']=numpy.mean(fraction_nonsynonymous_core[diff_subject_idxs])
mean_pi_dict['All variable genes']=numpy.mean(fraction_nonsynonymous_variable[diff_subject_idxs])
for pathway in fraction_nonsynonymous_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_syn_non_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if num_genes_per_pathway_syn_non_core[pathway]>=min_number_genes:
mean_pi=numpy.mean(fraction_nonsynonymous_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
if pathway!='':
mean_pi_dict[pathway]=mean_pi
else:
mean_pi_dict['Unannotated pathways']=mean_pi
data_dict[species_name]=mean_pi_dict
# convert data_dict into a pandas dataframe
df=pandas.DataFrame(data_dict)
pathway_order=[]
pathway_order.append('All core genes')
pathway_order.append('All variable genes')
pathway_order.append('Annotated pathways')
pathway_order.append('Unannotated pathways')
pathway_names=list(df.index)
for pathway in pathway_names:
if pathway not in pathway_order:
pathway_order.append(pathway)
df=df.reindex(pathway_order)
# use seaborn to plot a heat map of the pi values
pylab.figure(figsize=(24,8))
pylab.subplot(1,2,1)
sns.heatmap(df, cmap='RdYlGn_r',vmin=0.0, vmax=1)
#pylab.savefig('%s/pi_per_pathway_heatmap_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# Iterate through each species again and compute the rank order of each column.
for species_name in species_names:
if species_name in files.keys():
df[species_name]=df[species_name].rank(ascending=True)
# use seaborn to plot a heat map of the rank orders
#pylab.figure(figsize=(8,8))
pylab.subplot(1,2,2)
sns.heatmap(df, cmap='RdYlGn_r',yticklabels=False)
pylab.savefig('%s/fraction_nonsyn_per_pathway_core_ranking_heatmap_multispecies.png' % (
parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################
# fraction nonsyn per pathway #
# variable genes #
################################
min_number_genes=5
min_passed_sites_per_person=100
data_dict={}
# load the data into data_dict, where keys are species_name, values are mean_pi
for species_name in species_names:
if species_name in files.keys():
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
fraction_nonsynonymous_per_pathway_core=files[species_name]['fraction_nonsynonymous_per_pathway_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
fixation_opportunities_per_pathway_syn_non_core=files[species_name]['fixation_opportunities_per_pathway_syn_non_variable']
num_genes_per_pathway_syn_non_core=files[species_name]['num_genes_per_pathway_syn_non_variable']
num_people_with_data_per_pathway_fixations_core=files[species_name]['num_people_with_data_per_pathway_fixations_variable']
mean_pi_dict={}
mean_pi_dict['All core genes']=numpy.mean(fraction_nonsynonymous_core[diff_subject_idxs])
mean_pi_dict['All variable genes']=numpy.mean(fraction_nonsynonymous_variable[diff_subject_idxs])
for pathway in fraction_nonsynonymous_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_syn_non_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if num_genes_per_pathway_syn_non_core[pathway]>=min_number_genes:
mean_pi=numpy.mean(fraction_nonsynonymous_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
if pathway!='':
mean_pi_dict[pathway]=mean_pi
else:
mean_pi_dict['Unannotated pathways']=mean_pi
data_dict[species_name]=mean_pi_dict
# convert data_dict into a pandas dataframe
df=pandas.DataFrame(data_dict)
pathway_order=[]
pathway_order.append('All core genes')
pathway_order.append('All variable genes')
pathway_order.append('Annotated pathways')
pathway_order.append('Unannotated pathways')
pathway_names=list(df.index)
for pathway in pathway_names:
if pathway not in pathway_order:
pathway_order.append(pathway)
df=df.reindex(pathway_order)
# use seaborn to plot a heat map of the pi values
pylab.figure(figsize=(24,8))
pylab.subplot(1,2,1)
sns.heatmap(df, cmap='RdYlGn_r')
#pylab.savefig('%s/pi_per_pathway_heatmap_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# Iterate through each species again and compute the rank order of each column.
for species_name in species_names:
if species_name in files.keys():
df[species_name]=df[species_name].rank(ascending=True)
# use seaborn to plot a heat map of the rank orders
#pylab.figure(figsize=(8,8))
pylab.subplot(1,2,2)
sns.heatmap(df, cmap='RdYlGn_r',yticklabels=False)
pylab.savefig('%s/fraction_nonsyn_per_pathway_variable_ranking_heatmap_multispecies.png' % (
parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################
# fraction nonsyn per pathway #
# core and variable genes #
################################
min_number_genes=5
min_passed_sites_per_person=100
data_dict={}
# load the data into data_dict, where keys are species_name, values are mean_pi
for species_name in species_names:
if species_name in files.keys():
fraction_nonsynonymous_core=files[species_name]['fraction_nonsynonymous_core']
fraction_nonsynonymous_variable=files[species_name]['fraction_nonsynonymous_variable']
fraction_nonsynonymous_per_pathway_core=files[species_name]['fraction_nonsynonymous_per_pathway_core_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
fixation_opportunities_per_pathway_syn_non_core=files[species_name]['fixation_opportunities_per_pathway_syn_non_core_variable']
num_genes_per_pathway_syn_non_core=files[species_name]['num_genes_per_pathway_syn_non_core_variable']
num_people_with_data_per_pathway_fixations_core=files[species_name]['num_people_with_data_per_pathway_fixations_core_variable']
mean_pi_dict={}
mean_pi_dict['All core genes']=numpy.mean(fraction_nonsynonymous_core[diff_subject_idxs])
mean_pi_dict['All variable genes']=numpy.mean(fraction_nonsynonymous_variable[diff_subject_idxs])
for pathway in fraction_nonsynonymous_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_syn_non_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if num_genes_per_pathway_syn_non_core[pathway]>=min_number_genes:
mean_pi=numpy.mean(fraction_nonsynonymous_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
if pathway!='':
mean_pi_dict[pathway]=mean_pi
else:
mean_pi_dict['Unannotated pathways']=mean_pi
data_dict[species_name]=mean_pi_dict
# convert data_dict into a pandas dataframe
df=pandas.DataFrame(data_dict)
df.index.name='pathways'
# get the mean value across species:
df['mean']=df.mean(axis=1)
df=df.sort('mean')
pathway_names=list(df.index)
pathway_order=[]
pathway_order.append('All core genes')
pathway_order.append('All variable genes')
pathway_order.append('Annotated pathways')
pathway_order.append('Unannotated pathways')
for pathway in pathway_names:
if pathway not in pathway_order:
pathway_order.append(pathway)
df=df.reindex(pathway_order)
del df['mean']
# drop Acidaminococcus_intestini_54097 because it messes up coloring
#df=df.drop('Acidaminococcus_intestini_54097',axis=1)
# make a boxplot where each species is a data point. Order the pathways based on mean fraction nonsyn
pylab.figure(figsize=(8,8))
df.T.boxplot(vert=False)
pylab.savefig('%s/fraction_nonsyn_per_pathway_boxplot_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
#sns.pairplot(df)
# plot a scatter plot of every pair of points
# randomly sample n species to plot
n=8
random_species_names=random.sample(list(df.columns.values),n)
pylab.figure(figsize=(30,30))
plot_no=1
for i in range(0, n):
for j in range(0,n):
pylab.subplot(n,n,plot_no)
x=list(df[random_species_names[i]])
y=list(df[random_species_names[j]])
pylab.scatter(x,y, color='red',marker='.')
pylab.xlabel(random_species_names[i])
pylab.ylabel(random_species_names[i])
pylab.xlim(0,0.5)
pylab.ylim(0,0.5)
#ax=df.plot(x=random_species_names[i], y=random_species_names[j], style='o', xlim=[0,0.8], ylim=[0,0.8],legend=False,subplots=True)
#ax.set_xlabel(random_species_names[i])
#ax.set_ylabel(random_species_names[j])
plot_no+=1
pylab.savefig('%s/fraction_nonsyn_per_pathway_core_variable_pairplot_multispecies.png' % (
parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# plot the spearman correlation coefficients between every pair of species
df = df[species_order]
spearman_corr=df.corr(method='spearman')
# plot a heatmap of the spearman correlation coefficients
pylab.figure(figsize=(15,15))
sns.heatmap(spearman_corr, cmap='RdYlBu_r')
pylab.savefig('%s/fraction_nonsyn_per_pathway_core_variable_spearman_multispecies.png' % (
parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# use seaborn to plot a heat map of the pi values
pylab.figure(figsize=(24,8))
pylab.subplot(1,2,1)
sns.heatmap(df, cmap='RdYlBu_r')
# Iterate through each species again and compute the rank order of each column.
for species_name in species_names:
if species_name in df.columns:
df[species_name]=df[species_name].rank(ascending=True)
# use seaborn to plot a heat map of the rank orders
pylab.subplot(1,2,2)
sns.heatmap(df, cmap='RdYlBu_r',yticklabels=False)
pylab.savefig('%s/fraction_nonsyn_per_pathway_core_variable_ranking_heatmap_multispecies.png' % (
parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################
# Total fixations per pathway #
# core genes #
################################
min_number_genes=10
min_passed_sites_per_person=100
data_dict={}
# load the data into data_dict, where keys are species_name, values are mean_pi
for species_name in species_names:
if species_name in files.keys():
dtot_core=files[species_name]['dtot_core']
dtot_variable=files[species_name]['dtot_variable']
fixation_opportunities_per_pathway_all_core=files[species_name]['fixation_opportunities_per_pathway_all_core']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
dtot_per_pathway_core=files[species_name]['dtot_per_pathway_core']
num_genes_per_pathway_tot_core=files[species_name]['num_genes_per_pathway_tot_core']
num_people_with_data_per_pathway_tot_core=files[species_name]['num_people_with_data_per_pathway_tot_core']
mean_pi_dict={}
mean_pi_dict['All core genes']=numpy.mean(dtot_core[diff_subject_idxs])
mean_pi_dict['All variable genes']=numpy.mean(dtot_variable[diff_subject_idxs])
for pathway in dtot_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_all_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if num_genes_per_pathway_tot_core[pathway]>=min_number_genes:
mean_pi=numpy.mean(dtot_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
if pathway!='':
mean_pi_dict[pathway]=mean_pi
else:
mean_pi_dict['Unannotated pathways']=mean_pi
data_dict[species_name]=mean_pi_dict
# convert data_dict into a pandas dataframe
df=pandas.DataFrame(data_dict)
pathway_order=[]
pathway_order.append('All core genes')
pathway_order.append('All variable genes')
pathway_order.append('Annotated pathways')
pathway_order.append('Unannotated pathways')
pathway_names=list(df.index)
for pathway in pathway_names:
if pathway not in pathway_order:
pathway_order.append(pathway)
df=df.reindex(pathway_order)
# use seaborn to plot a heat map of the pi values
pylab.figure(figsize=(24,8))
pylab.subplot(1,2,1)
sns.heatmap(df, cmap='RdYlGn_r')
#pylab.savefig('%s/pi_per_pathway_heatmap_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# Iterate through each species again and compute the rank order of each column.
for species_name in species_names:
if species_name in files.keys():
df[species_name]=df[species_name].rank(ascending=True)
# use seaborn to plot a heat map of the rank orders
#pylab.figure(figsize=(8,8))
pylab.subplot(1,2,2)
sns.heatmap(df, cmap='RdYlGn_r',yticklabels=False)
pylab.savefig('%s/fixations_per_pathway_core_ranking_heatmap_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
################################
# Total fixations per pathway #
# variable genes #
################################
min_number_genes=5
min_passed_sites_per_person=100
data_dict={}
# load the data into data_dict, where keys are species_name, values are mean_pi
for species_name in species_names:
if species_name in files.keys():
dtot_core=files[species_name]['dtot_core']
dtot_variable=files[species_name]['dtot_variable']
fixation_opportunities_per_pathway_all_core=files[species_name]['fixation_opportunities_per_pathway_all_variable']
diff_subject_idxs=files[species_name]['diff_subject_idxs']
dtot_per_pathway_core=files[species_name]['dtot_per_pathway_variable']
num_genes_per_pathway_tot_core=files[species_name]['num_genes_per_pathway_tot_variable']
num_people_with_data_per_pathway_tot_core=files[species_name]['num_people_with_data_per_pathway_tot_variable']
mean_pi_dict={}
mean_pi_dict['All core genes']=numpy.mean(dtot_core[diff_subject_idxs])
mean_pi_dict['All variable genes']=numpy.mean(dtot_variable[diff_subject_idxs])
for pathway in dtot_per_pathway_core:
#check which people have enough data. Otherwise don't add this.
high_num_sites_idx=numpy.where(fixation_opportunities_per_pathway_all_core[pathway][diff_subject_idxs]>=min_passed_sites_per_person)
if len(high_num_sites_idx)>0:
if num_genes_per_pathway_tot_core[pathway]>=min_number_genes:
mean_pi=numpy.mean(dtot_per_pathway_core[pathway][diff_subject_idxs][high_num_sites_idx])
if pathway!='':
mean_pi_dict[pathway]=mean_pi
else:
mean_pi_dict['Unannotated pathways']=mean_pi
data_dict[species_name]=mean_pi_dict
# convert data_dict into a pandas dataframe
df=pandas.DataFrame(data_dict)
pathway_order=[]
pathway_order.append('All core genes')
pathway_order.append('All variable genes')
pathway_order.append('Annotated pathways')
pathway_order.append('Unannotated pathways')
pathway_names=list(df.index)
for pathway in pathway_names:
if pathway not in pathway_order:
pathway_order.append(pathway)
df=df.reindex(pathway_order)
# use seaborn to plot a heat map of the pi values
pylab.figure(figsize=(24,8))
pylab.subplot(1,2,1)
sns.heatmap(df, cmap='RdYlGn_r')
#pylab.savefig('%s/pi_per_pathway_heatmap_multispecies.png' % (parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
# Iterate through each species again and compute the rank order of each column.
for species_name in species_names:
if species_name in files.keys():
df[species_name]=df[species_name].rank(ascending=True)
# use seaborn to plot a heat map of the rank orders
#pylab.figure(figsize=(8,8))
pylab.subplot(1,2,2)
sns.heatmap(df, cmap='RdYlGn_r',yticklabels=False)
pylab.savefig('%s/fixations_per_pathway_variable_ranking_heatmap_multispecies.png' % (
parse_midas_data.analysis_directory), bbox_inches='tight', dpi=300)
| 45.164502 | 173 | 0.700773 | 8,639 | 62,598 | 4.740248 | 0.043292 | 0.064199 | 0.055872 | 0.041513 | 0.914923 | 0.901687 | 0.89463 | 0.886645 | 0.875827 | 0.862836 | 0 | 0.011516 | 0.159382 | 62,598 | 1,385 | 174 | 45.197112 | 0.766709 | 0.125531 | 0 | 0.841046 | 0 | 0 | 0.168085 | 0.095601 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.030181 | 0.020121 | null | null | 0.002012 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
724e4ad8607e4aeec4bfca2728201a3bfc2dea41 | 39,144 | py | Python | tests/unit/test_elements.py | klmitch/micropath | 3de7f3d3da59dea802b502ebc71ec5e139e25e1f | [
"Apache-2.0"
] | 1 | 2018-06-07T22:17:14.000Z | 2018-06-07T22:17:14.000Z | tests/unit/test_elements.py | klmitch/micropath | 3de7f3d3da59dea802b502ebc71ec5e139e25e1f | [
"Apache-2.0"
] | null | null | null | tests/unit/test_elements.py | klmitch/micropath | 3de7f3d3da59dea802b502ebc71ec5e139e25e1f | [
"Apache-2.0"
] | null | null | null | # Copyright (C) 2018 by Kevin L. Mitchell <klmitch@mit.edu>
#
# Licensed under the Apache License, Version 2.0 (the "License"); you
# may not use this file except in compliance with the License. You may
# obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
# implied. See the License for the specific language governing
# permissions and limitations under the License.
import pytest
from micropath import elements
class ElementForTest(elements.Element):
def set_ident(self, ident):
super(ElementForTest, self).set_ident(ident)
class OtherElement(elements.Element):
def set_ident(self, ident):
pass
class TestElement(object):
def test_init_base(self):
result = ElementForTest('ident')
assert result.ident == 'ident'
assert result.parent is None
assert result.paths == {}
assert result.bindings is None
assert result.methods == {}
assert result.delegation is None
def test_init_alt(self):
result = ElementForTest('ident', 'parent')
assert result.ident == 'ident'
assert result.parent == 'parent'
assert result.paths == {}
assert result.bindings is None
assert result.methods == {}
assert result.delegation is None
def test_set_ident_base(self):
obj = ElementForTest(None)
obj.set_ident('ident')
assert obj.ident == 'ident'
def test_set_ident_set(self):
obj = ElementForTest('ident')
with pytest.raises(ValueError):
obj.set_ident('spam')
assert obj.ident == 'ident'
@staticmethod
def sub_sel(subs, *elems):
return {elem: subs[elem] for elem in elems}
def test_path_base(self, mocker):
mock_Path = mocker.patch.object(
elements, 'Path',
return_value=mocker.Mock(ident=None),
)
obj = ElementForTest('ident')
result = obj.path()
assert result == mock_Path.return_value
mock_Path.assert_called_once_with(None, parent=obj)
assert obj.paths == {}
def test_path_with_ident(self, mocker):
mock_Path = mocker.patch.object(
elements, 'Path',
return_value=mocker.Mock(ident='spam'),
)
obj = ElementForTest('ident')
result = obj.path('spam')
assert result == mock_Path.return_value
mock_Path.assert_called_once_with('spam', parent=obj)
assert obj.paths == {'spam': result}
def test_path_conflict(self, mocker):
mock_Path = mocker.patch.object(
elements, 'Path',
return_value=mocker.Mock(ident='spam'),
)
obj = ElementForTest('ident')
obj.paths['spam'] = 'conflict'
with pytest.raises(ValueError):
obj.path('spam')
mock_Path.assert_called_once_with('spam', parent=obj)
assert obj.paths == {'spam': 'conflict'}
def test_binding_base(self, mocker):
mock_Binding = mocker.patch.object(
elements, 'Binding',
return_value=mocker.Mock(ident=None),
)
obj = ElementForTest('ident')
result = obj.bind()
assert result == mock_Binding.return_value
mock_Binding.assert_called_once_with(
None,
parent=obj,
)
assert obj.bindings is None
def test_binding_with_ident(self, mocker):
mock_Binding = mocker.patch.object(
elements, 'Binding',
return_value=mocker.Mock(ident='spam'),
)
obj = ElementForTest('ident')
result = obj.bind('spam')
assert result == mock_Binding.return_value
mock_Binding.assert_called_once_with(
'spam',
parent=obj,
)
assert obj.bindings == result
def test_binding_conflict(self, mocker):
mock_Binding = mocker.patch.object(
elements, 'Binding',
return_value=mocker.Mock(ident='spam'),
)
obj = ElementForTest('ident')
obj.bindings = 'conflict'
with pytest.raises(ValueError):
obj.bind('spam')
mock_Binding.assert_called_once_with(
'spam',
parent=obj,
)
assert obj.bindings == 'conflict'
def test_route_func(self, mocker):
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
obj = ElementForTest('ident')
func = mocker.Mock(_micropath_handler=False)
result = obj.route(func)
assert result == func
mock_Method.assert_called_once_with(None, func, parent=obj)
mock_from_func.assert_called_once_with(func)
assert obj.methods == {None: mock_Method.return_value}
assert func._micropath_handler is True
assert func._micropath_elem is obj
def test_route_no_methods(self, mocker):
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
obj = ElementForTest('ident')
func = mocker.Mock(_micropath_handler=False)
decorator = obj.route()
assert callable(decorator)
assert decorator != func
mock_Method.assert_not_called()
mock_from_func.assert_not_called()
assert obj.methods == {}
result = decorator(func)
assert result == func
mock_Method.assert_called_once_with(None, func, parent=obj)
mock_from_func.assert_called_once_with(func)
assert obj.methods == {None: mock_Method.return_value}
assert func._micropath_handler is True
assert func._micropath_elem is obj
def test_route_with_methods(self, mocker):
methods = {
'get': mocker.Mock(ident='get'),
'put': mocker.Mock(ident='put'),
}
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f, parent: methods[x],
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
obj = ElementForTest('ident')
func = mocker.Mock(_micropath_handler=False)
decorator = obj.route('get', 'put')
assert callable(decorator)
assert decorator != func
mock_Method.assert_not_called()
mock_from_func.assert_not_called()
assert obj.methods == {}
result = decorator(func)
assert result == func
mock_Method.assert_has_calls([
mocker.call('get', func, parent=obj),
mocker.call('put', func, parent=obj),
])
assert mock_Method.call_count == 2
mock_from_func.assert_called_once_with(func)
assert obj.methods == methods
assert func._micropath_handler is True
assert func._micropath_elem is obj
def test_route_with_methods_internal_duplicate(self, mocker):
methods = {
'get': mocker.Mock(ident='get'),
'put': mocker.Mock(ident='put'),
}
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f, parent: methods[x],
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
obj = ElementForTest('ident')
func = mocker.Mock(_micropath_handler=False)
decorator = obj.route('get', 'put', 'get')
assert callable(decorator)
assert decorator != func
mock_Method.assert_not_called()
mock_from_func.assert_not_called()
assert obj.methods == {}
result = decorator(func)
assert result == func
mock_Method.assert_has_calls([
mocker.call('get', func, parent=obj),
mocker.call('put', func, parent=obj),
])
assert mock_Method.call_count == 2
mock_from_func.assert_called_once_with(func)
assert obj.methods == methods
assert func._micropath_handler is True
assert func._micropath_elem is obj
def test_route_with_methods_external_duplicate(self, mocker):
methods = {
'get': mocker.Mock(ident='get'),
'put': mocker.Mock(ident='put'),
}
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f, parent: methods[x],
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
obj = ElementForTest('ident')
obj.methods['get'] = 'conflict'
with pytest.raises(ValueError):
obj.route('get', 'put')
mock_Method.assert_not_called()
mock_from_func.assert_not_called()
assert obj.methods == {'get': 'conflict'}
def test_mount_base(self, mocker):
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
obj = ElementForTest('ident')
result = obj.mount('delegation')
assert isinstance(result, elements.Delegation)
assert result.element == obj
assert obj.methods == {}
assert obj.delegation == result
mock_init.assert_called_once_with('delegation', {})
mock_Method.assert_not_called()
def test_mount_with_methods(self, mocker):
methods = {
'get': mocker.Mock(ident='get', delegation=None),
'put': mocker.Mock(ident='put', delegation=None),
}
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f, parent: methods[x],
)
obj = ElementForTest('ident')
result = obj.mount('delegation', 'get', 'put', a=1, b=2)
assert isinstance(result, elements.Delegation)
assert result.element == obj
assert obj.methods == methods
for meth in methods.values():
assert meth.delegation == result
assert obj.delegation is None
mock_init.assert_called_once_with('delegation', {'a': 1, 'b': 2})
mock_Method.assert_has_calls([
mocker.call('get', None, parent=obj),
mocker.call('put', None, parent=obj),
])
assert mock_Method.call_count == 2
def test_mount_with_methods_internal_duplication(self, mocker):
methods = {
'get': mocker.Mock(ident='get', delegation=None),
'put': mocker.Mock(ident='put', delegation=None),
}
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f, parent: methods[x],
)
obj = ElementForTest('ident')
result = obj.mount('delegation', 'get', 'put', 'get', a=1, b=2)
assert isinstance(result, elements.Delegation)
assert result.element == obj
assert obj.methods == methods
for meth in methods.values():
assert meth.delegation == result
assert obj.delegation is None
mock_init.assert_called_once_with('delegation', {'a': 1, 'b': 2})
mock_Method.assert_has_calls([
mocker.call('get', None, parent=obj),
mocker.call('put', None, parent=obj),
])
assert mock_Method.call_count == 2
def test_mount_with_methods_external_duplication(self, mocker):
methods = {
'get': mocker.Mock(ident='get', delegation=None),
'put': mocker.Mock(ident='put', delegation=None),
}
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f, parent: methods[x],
)
obj = ElementForTest('ident')
obj.methods['get'] = 'conflict'
with pytest.raises(ValueError):
obj.mount('delegation', 'get', 'put', a=1, b=2)
assert obj.methods == {'get': 'conflict'}
for meth in methods.values():
assert meth.delegation is None
assert obj.delegation is None
mock_init.assert_called_once_with('delegation', {'a': 1, 'b': 2})
mock_Method.assert_not_called()
def test_mount_delegation(self, mocker):
delegation = elements.Delegation('delegation', {})
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
obj = ElementForTest('ident')
result = obj.mount(delegation)
assert result == delegation
assert result.element == obj
assert obj.methods == {}
assert obj.delegation == delegation
mock_init.assert_not_called()
mock_Method.assert_not_called()
def test_mount_delegation_set(self, mocker):
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
obj = ElementForTest('ident')
obj.delegation = 'spam'
with pytest.raises(ValueError):
obj.mount('delegation')
assert obj.methods == {}
assert obj.delegation == 'spam'
mock_init.assert_not_called()
mock_Method.assert_not_called()
class TestRoot(object):
def test_init(self, mocker):
mock_init = mocker.patch.object(
elements.Element, '__init__',
return_value=None,
)
result = elements.Root()
assert isinstance(result, elements.Root)
mock_init.assert_called_once_with(None)
def test_set_ident(self):
obj = elements.Root()
with pytest.raises(ValueError):
obj.set_ident('ident')
def test_add_elem_path(self, mocker):
elem = mocker.Mock(spec=elements.Path, ident='spam')
elem.parent = None
obj = elements.Root()
obj.add_elem(elem)
assert obj.paths == {'spam': elem}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident == 'spam'
assert elem.parent is obj
elem.set_ident.assert_not_called()
def test_add_elem_path_conflict(self, mocker):
elem = mocker.Mock(spec=elements.Path, ident='spam')
elem.parent = None
obj = elements.Root()
obj.paths['spam'] = 'conflict'
with pytest.raises(ValueError):
obj.add_elem(elem)
assert obj.paths == {'spam': 'conflict'}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident == 'spam'
assert elem.parent is None
elem.set_ident.assert_not_called()
def test_add_elem_binding(self, mocker):
elem = mocker.Mock(spec=elements.Binding, ident='spam')
elem.parent = None
obj = elements.Root()
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings == elem
assert obj.methods == {}
assert elem.ident == 'spam'
assert elem.parent is obj
elem.set_ident.assert_not_called()
def test_add_elem_binding_conflict(self, mocker):
elem = mocker.Mock(spec=elements.Binding, ident='spam')
elem.parent = None
obj = elements.Root()
obj.bindings = 'conflict'
with pytest.raises(ValueError):
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings == 'conflict'
assert obj.methods == {}
assert elem.ident == 'spam'
assert elem.parent is None
elem.set_ident.assert_not_called()
def test_add_elem_method(self, mocker):
elem = mocker.Mock(spec=elements.Method, ident='spam')
elem.parent = None
obj = elements.Root()
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {'spam': elem}
assert elem.ident == 'spam'
assert elem.parent is obj
elem.set_ident.assert_not_called()
def test_add_elem_method_conflict(self, mocker):
elem = mocker.Mock(spec=elements.Method, ident='spam')
elem.parent = None
obj = elements.Root()
obj.methods['spam'] = 'conflict'
with pytest.raises(ValueError):
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {'spam': 'conflict'}
assert elem.ident == 'spam'
assert elem.parent is None
elem.set_ident.assert_not_called()
def test_add_elem_method_all(self, mocker):
elem = mocker.Mock(spec=elements.Method, ident=None)
elem.parent = None
obj = elements.Root()
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {None: elem}
assert elem.ident is None
assert elem.parent is obj
elem.set_ident.assert_not_called()
def test_add_elem_method_all_conflict(self, mocker):
elem = mocker.Mock(spec=elements.Method, ident=None)
elem.parent = None
obj = elements.Root()
obj.methods[None] = 'conflict'
with pytest.raises(ValueError):
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {None: 'conflict'}
assert elem.ident is None
assert elem.parent is None
elem.set_ident.assert_not_called()
def test_add_elem_other(self, mocker):
elem = mocker.Mock(ident='spam')
elem.parent = None
obj = elements.Root()
with pytest.raises(ValueError):
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident == 'spam'
assert elem.parent is None
elem.set_ident.assert_not_called()
def test_add_elem_self(self, mocker):
obj = elements.Root()
obj.add_elem(obj)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {}
def test_add_elem_root(self, mocker):
elem = mocker.Mock(spec=elements.Root, ident=None)
elem.parent = None
obj = elements.Root()
with pytest.raises(ValueError):
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident is None
assert elem.parent is None
elem.set_ident.assert_not_called()
def test_add_elem_path_no_ident(self, mocker):
elem = mocker.Mock(spec=elements.Path, ident=None)
elem.parent = None
obj = elements.Root()
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident is None
assert elem.parent is obj
elem.set_ident.assert_not_called()
def test_add_elem_binding_no_ident(self, mocker):
elem = mocker.Mock(spec=elements.Binding, ident=None)
elem.parent = None
obj = elements.Root()
obj.add_elem(elem)
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident is None
assert elem.parent is obj
elem.set_ident.assert_not_called()
def test_add_elem_set_ident(self, mocker):
elem = mocker.Mock(spec=elements.Path, ident=None)
elem.parent = None
obj = elements.Root()
obj.add_elem(elem, 'spam')
assert obj.paths == {}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident is None
assert elem.parent is obj
elem.set_ident.assert_called_once_with('spam')
def test_add_elem_parents(self, mocker):
elem = mocker.Mock(spec=elements.Path, ident='spam')
elem.parent = None
descendant = mocker.Mock(spec=elements.Path, ident=None)
descendant.parent = elem
obj = elements.Root()
obj.add_elem(descendant, 'descendant')
assert obj.paths == {'spam': elem}
assert obj.bindings is None
assert obj.methods == {}
assert elem.ident == 'spam'
assert elem.parent is obj
elem.set_ident.assert_not_called()
descendant.set_ident.assert_called_once_with('descendant')
class TestPath(object):
def test_set_ident_no_parent(self, mocker):
mock_set_ident = mocker.patch.object(elements.Element, 'set_ident')
obj = elements.Path(None)
obj.set_ident('ident')
mock_set_ident.assert_called_once_with('ident')
def test_set_ident_with_parent(self, mocker):
mock_set_ident = mocker.patch.object(elements.Element, 'set_ident')
obj = elements.Path(None)
obj.parent = mocker.Mock(paths={})
obj.set_ident('ident')
assert obj.parent.paths == {None: obj}
mock_set_ident.assert_called_once_with('ident')
def test_set_ident_conflict(self, mocker):
mock_set_ident = mocker.patch.object(elements.Element, 'set_ident')
obj = elements.Path(None)
obj.parent = mocker.Mock(paths={None: 'conflict'})
with pytest.raises(ValueError):
obj.set_ident('ident')
assert obj.parent.paths == {None: 'conflict'}
mock_set_ident.assert_called_once_with('ident')
class TestBinding(object):
def test_init_base(self, mocker):
mock_init = mocker.patch.object(
elements.Element, '__init__',
return_value=None,
)
result = elements.Binding('ident')
assert result._validator is None
assert result._formatter is None
mock_init.assert_called_once_with('ident', None)
def test_init_alt(self, mocker):
mock_init = mocker.patch.object(
elements.Element, '__init__',
return_value=None,
)
result = elements.Binding(
'ident', 'parent',
)
assert result._validator is None
assert result._formatter is None
mock_init.assert_called_once_with('ident', 'parent')
def test_set_ident_no_parent(self, mocker):
mock_set_ident = mocker.patch.object(elements.Element, 'set_ident')
obj = elements.Binding(None)
obj.set_ident('ident')
mock_set_ident.assert_called_once_with('ident')
def test_set_ident_with_parent(self, mocker):
mock_set_ident = mocker.patch.object(elements.Element, 'set_ident')
obj = elements.Binding(None)
obj.parent = mocker.Mock(bindings=None)
obj.set_ident('ident')
assert obj.parent.bindings == obj
mock_set_ident.assert_called_once_with('ident')
def test_set_ident_conflict(self, mocker):
mock_set_ident = mocker.patch.object(elements.Element, 'set_ident')
obj = elements.Binding(None)
obj.parent = mocker.Mock(bindings='conflict')
with pytest.raises(ValueError):
obj.set_ident('ident')
assert obj.parent.bindings == 'conflict'
mock_set_ident.assert_called_once_with('ident')
def test_validator_base(self, mocker):
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
obj = elements.Binding('ident')
result = obj.validator('func')
assert result == 'func'
assert obj._validator == 'func'
mock_from_func.assert_called_once_with('func')
def test_validator_already_set(self, mocker):
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
obj = elements.Binding('ident')
obj._validator = 'spam'
with pytest.raises(ValueError):
obj.validator('func')
assert obj._validator == 'spam'
mock_from_func.assert_not_called()
def test_validate_unset(self, mocker):
inj = mocker.Mock()
obj = elements.Binding('ident')
result = obj.validate('controller', inj, 'value')
assert result == 'value'
inj.assert_not_called()
def test_validate_set(self, mocker):
inj = mocker.Mock()
obj = elements.Binding('ident')
obj._validator = 'validator'
result = obj.validate('controller', inj, 'value')
assert result == inj.return_value
inj.assert_called_once_with('validator', 'controller', value='value')
def test_formatter_base(self):
obj = elements.Binding('ident')
result = obj.formatter('func')
assert result == 'func'
assert obj._formatter == 'func'
def test_formatter_already_set(self):
obj = elements.Binding('ident')
obj._formatter = 'spam'
with pytest.raises(ValueError):
obj.formatter('func')
assert obj._formatter == 'spam'
def test_format_unset(self):
obj = elements.Binding('ident')
result = obj.format('controller', 1234)
assert result == '1234'
def test_format_set(self, mocker):
obj = elements.Binding('ident')
obj._formatter = mocker.Mock(return_value='string')
result = obj.format('controller', 1234)
assert result == 'string'
obj._formatter.assert_called_once_with('controller', 1234)
class TestMethod(object):
def test_init_base(self, mocker):
mock_init = mocker.patch.object(
elements.Element, '__init__',
return_value=None,
)
result = elements.Method('get', 'func')
assert result.func == 'func'
mock_init.assert_called_once_with('GET', None)
def test_init_alt(self, mocker):
mock_init = mocker.patch.object(
elements.Element, '__init__',
return_value=None,
)
result = elements.Method(None, 'func', 'parent')
assert result.func == 'func'
mock_init.assert_called_once_with(None, 'parent')
def test_set_ident(self):
obj = elements.Method(None, 'func')
with pytest.raises(ValueError):
obj.set_ident('ident')
def test_path_base(self):
obj = elements.Method(None, 'func')
with pytest.raises(ValueError):
obj.path()
def test_path_alt(self):
obj = elements.Method(None, 'func')
with pytest.raises(ValueError):
obj.path('ident')
def test_bind_base(self):
obj = elements.Method(None, 'func')
with pytest.raises(ValueError):
obj.bind()
def test_bind_alt(self):
obj = elements.Method(None, 'func')
with pytest.raises(ValueError):
obj.bind('ident')
def test_route_base(self):
obj = elements.Method(None, 'func')
with pytest.raises(ValueError):
obj.route()
def test_route_alt(self):
obj = elements.Method(None, 'func')
with pytest.raises(ValueError):
obj.route('get', 'put')
def test_mount(self, mocker):
mock_mount = mocker.patch.object(elements.Element, 'mount')
obj = elements.Method(None, 'func')
result = obj.mount('delegation')
assert result == mock_mount.return_value
mock_mount.assert_called_once_with('delegation')
class TestDelegation(object):
def test_init(self):
result = elements.Delegation('controller', 'kwargs')
assert result.controller == 'controller'
assert result.kwargs == 'kwargs'
assert result.element is None
assert result._cache == {}
def test_dunder_get_class(self, mocker):
mock_get = mocker.patch.object(elements.Delegation, 'get')
obj = elements.Delegation('controller', {})
result = obj.__get__(None, 'class')
assert result is obj
mock_get.assert_not_called()
def test_dunder_get_object(self, mocker):
mock_get = mocker.patch.object(elements.Delegation, 'get')
obj = elements.Delegation('controller', {})
result = obj.__get__('object', 'class')
assert result == mock_get.return_value
mock_get.assert_called_once_with('object')
def test_set(self):
target = object()
obj = elements.Delegation('controller', {})
obj.__set__(target, 'value')
assert obj._cache == {id(target): 'value'}
def test_delete_exists(self):
target = object()
obj = elements.Delegation('controller', {})
obj._cache = {id(target): 'value'}
obj.__delete__(target)
assert obj._cache == {}
def test_delete_missing(self):
target = object()
obj = elements.Delegation('controller', {})
obj.__delete__(target)
assert obj._cache == {}
def test_get_cached(self, mocker):
mock_construct = mocker.patch.object(elements.Delegation, 'construct')
target = object()
obj = elements.Delegation('controller', {})
obj._cache = {id(target): 'value'}
result = obj.get(target)
assert result == 'value'
assert obj._cache == {id(target): 'value'}
mock_construct.assert_not_called()
def test_get_uncached(self, mocker):
mock_construct = mocker.patch.object(elements.Delegation, 'construct')
target = object()
obj = elements.Delegation('controller', {})
obj.element = 'element'
result = obj.get(target)
assert result == mock_construct.return_value
assert obj._cache == {id(target): mock_construct.return_value}
assert mock_construct.return_value._micropath_parent is target
assert mock_construct.return_value._micropath_elem == 'element'
mock_construct.assert_called_once_with(target)
def test_construct(self, mocker):
target = mocker.Mock()
obj = elements.Delegation('controller', 'kwargs')
result = obj.construct(target)
assert result == target.micropath_construct.return_value
target.micropath_construct.assert_called_once_with(
'controller', 'kwargs',
)
class TestPathFunc(object):
def test_base(self, mocker):
mock_Path = mocker.patch.object(elements, 'Path')
result = elements.path()
assert result == mock_Path.return_value
mock_Path.assert_called_once_with(None)
def test_alt(self, mocker):
mock_Path = mocker.patch.object(elements, 'Path')
result = elements.path('ident')
assert result == mock_Path.return_value
mock_Path.assert_called_once_with('ident')
class TestBind(object):
def test_base(self, mocker):
mock_Binding = mocker.patch.object(elements, 'Binding')
result = elements.bind()
assert result == mock_Binding.return_value
mock_Binding.assert_called_once_with(None)
def test_alt(self, mocker):
mock_Binding = mocker.patch.object(elements, 'Binding')
result = elements.bind('ident')
assert result == mock_Binding.return_value
mock_Binding.assert_called_once_with(
'ident',
)
class TestRoute(object):
def test_func(self, mocker):
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
func = mocker.Mock(_micropath_methods=None, _micropath_handler=False)
result = elements.route(func)
assert result == func
mock_Method.assert_called_once_with(None, func)
mock_from_func.assert_called_once_with(func)
assert func._micropath_methods == [mock_Method.return_value]
assert func._micropath_handler is True
def test_no_methods(self, mocker):
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
func = mocker.Mock(_micropath_methods=None, _micropath_handler=False)
decorator = elements.route()
assert callable(decorator)
assert decorator != func
mock_Method.assert_not_called()
mock_from_func.assert_not_called()
assert func._micropath_methods is None
assert func._micropath_handler is False
result = decorator(func)
assert result == func
mock_Method.assert_called_once_with(None, func)
mock_from_func.assert_called_once_with(func)
assert func._micropath_methods == [mock_Method.return_value]
assert func._micropath_handler is True
def test_with_methods(self, mocker):
methods = {
'get': mocker.Mock(ident='get'),
'put': mocker.Mock(ident='put'),
}
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f: methods[x],
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
func = mocker.Mock(_micropath_methods=None, _micropath_handler=False)
decorator = elements.route('get', 'put')
assert callable(decorator)
assert decorator != func
mock_Method.assert_not_called()
mock_from_func.assert_not_called()
assert func._micropath_methods is None
assert func._micropath_handler is False
result = decorator(func)
assert result == func
mock_Method.assert_has_calls([
mocker.call('get', func),
mocker.call('put', func),
])
assert mock_Method.call_count == 2
mock_from_func.assert_called_once_with(func)
assert func._micropath_methods == [methods[x] for x in ('get', 'put')]
assert func._micropath_handler is True
def test_with_methods_internal_duplicate(self, mocker):
methods = {
'get': mocker.Mock(ident='get'),
'put': mocker.Mock(ident='put'),
}
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f: methods[x],
)
mock_from_func = mocker.patch.object(
elements.injector.WantSignature, 'from_func',
)
func = mocker.Mock(_micropath_methods=None, _micropath_handler=False)
decorator = elements.route('get', 'put', 'get')
assert callable(decorator)
assert decorator != func
mock_Method.assert_not_called()
mock_from_func.assert_not_called()
assert func._micropath_methods is None
assert func._micropath_handler is False
result = decorator(func)
assert result == func
mock_Method.assert_has_calls([
mocker.call('get', func),
mocker.call('put', func),
])
assert mock_Method.call_count == 2
mock_from_func.assert_called_once_with(func)
assert func._micropath_methods == [methods[x] for x in ('get', 'put')]
assert func._micropath_handler is True
class TestMount(object):
def test_base(self, mocker):
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
result = elements.mount('delegation')
assert isinstance(result, elements.Delegation)
assert not hasattr(result, '_micropath_methods')
mock_init.assert_called_once_with('delegation', {})
mock_Method.assert_not_called()
def test_with_methods(self, mocker):
methods = {
'get': mocker.Mock(ident='get', delegation=None),
'put': mocker.Mock(ident='put', delegation=None),
}
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f: methods[x],
)
result = elements.mount('delegation', 'get', 'put', a=1, b=2)
assert isinstance(result, elements.Delegation)
assert result._micropath_methods == [methods['get'], methods['put']]
mock_init.assert_called_once_with('delegation', {'a': 1, 'b': 2})
mock_Method.assert_has_calls([
mocker.call('get', None),
mocker.call('put', None),
])
def test_with_methods_internal_duplication(self, mocker):
methods = {
'get': mocker.Mock(ident='get', delegation=None),
'put': mocker.Mock(ident='put', delegation=None),
}
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
side_effect=lambda x, f: methods[x],
)
result = elements.mount('delegation', 'get', 'put', 'get', a=1, b=2)
assert isinstance(result, elements.Delegation)
assert result._micropath_methods == [methods['get'], methods['put']]
mock_init.assert_called_once_with('delegation', {'a': 1, 'b': 2})
mock_Method.assert_has_calls([
mocker.call('get', None),
mocker.call('put', None),
])
def test_delegation(self, mocker):
delegation = elements.Delegation('delegation', {})
mock_init = mocker.patch.object(
elements.Delegation, '__init__',
return_value=None,
)
mock_Method = mocker.patch.object(
elements, 'Method',
return_value=mocker.Mock(ident=None),
)
result = elements.mount(delegation)
assert result == delegation
assert not hasattr(result, '_micropath_methods')
mock_init.assert_not_called()
mock_Method.assert_not_called()
| 31.61874 | 78 | 0.610004 | 4,460 | 39,144 | 5.133184 | 0.042152 | 0.044116 | 0.049008 | 0.072071 | 0.890059 | 0.868787 | 0.837075 | 0.808203 | 0.785402 | 0.767232 | 0 | 0.001773 | 0.279685 | 39,144 | 1,237 | 79 | 31.644301 | 0.810186 | 0.014792 | 0 | 0.703206 | 0 | 0 | 0.054006 | 0 | 0 | 0 | 0 | 0 | 0.334023 | 1 | 0.091003 | false | 0.001034 | 0.002068 | 0.001034 | 0.106515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
a0df099e09d707110269891222744b0475f7cdbd | 152 | py | Python | pauxy/walkers/__init__.py | pauxy-qmc/pauxy | 1da80284284769b59361c73cfa3c2d914c74a73f | [
"Apache-2.0"
] | 16 | 2020-08-05T17:17:17.000Z | 2022-03-18T04:06:18.000Z | pauxy/walkers/__init__.py | pauxy-qmc/pauxy | 1da80284284769b59361c73cfa3c2d914c74a73f | [
"Apache-2.0"
] | 4 | 2020-05-17T21:28:20.000Z | 2021-04-22T18:05:50.000Z | pauxy/walkers/__init__.py | pauxy-qmc/pauxy | 1da80284284769b59361c73cfa3c2d914c74a73f | [
"Apache-2.0"
] | 5 | 2020-05-18T01:03:18.000Z | 2021-04-13T15:36:29.000Z | from pauxy.walkers.single_det import SingleDetWalker
from pauxy.walkers.multi_det import MultiDetWalker
from pauxy.walkers.thermal import ThermalWalker
| 38 | 52 | 0.881579 | 20 | 152 | 6.6 | 0.55 | 0.204545 | 0.363636 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 152 | 3 | 53 | 50.666667 | 0.942857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9d0dbe880016f84f149485440a6719b7ce873592 | 17,337 | py | Python | platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/ocelot/phys/Phys_Studio_WiSUN.py | lmnotran/gecko_sdk | 2e82050dc8823c9fe0e8908c1b2666fb83056230 | [
"Zlib"
] | 82 | 2016-06-29T17:24:43.000Z | 2021-04-16T06:49:17.000Z | platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/ocelot/phys/Phys_Studio_WiSUN.py | lmnotran/gecko_sdk | 2e82050dc8823c9fe0e8908c1b2666fb83056230 | [
"Zlib"
] | 2 | 2017-02-13T10:07:17.000Z | 2017-03-22T21:28:26.000Z | platform/radio/efr32_multiphy_configurator/pyradioconfig/parts/ocelot/phys/Phys_Studio_WiSUN.py | lmnotran/gecko_sdk | 2e82050dc8823c9fe0e8908c1b2666fb83056230 | [
"Zlib"
] | 56 | 2016-08-02T10:50:50.000Z | 2021-07-19T08:57:34.000Z | from pyradioconfig.calculator_model_framework.interfaces.iphy import IPhy
class PHYS_IEEE802154_WiSUN_Ocelot(IPhy):
### EU Region ###
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-166
def PHY_IEEE802154_WISUN_868MHz_2GFSK_50kbps_1a_EU(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, EU-868MHz, 1a (2FSK 50kbps mi=0.5)', phy_name=phy_name)
#Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode1a
#Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 863100000 # FAN EU Mode #1a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 100000 # FAN EU Mode #1a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN EU Mode #1a, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
#Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
#Apps-verified
def PHY_IEEE802154_WISUN_873MHz_2GFSK_50kbps_1a_EU(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, EU-873MHz, 1a (2FSK 50kbps mi=0.5)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode1a
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 870100000 # FAN EU Mode #1a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 100000 # FAN EU Mode #1a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN EU Mode #1a, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-165
def PHY_IEEE802154_WISUN_868MHz_2GFSK_100kbps_2a_EU(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, EU-868MHz, 2a (2FSK 100kbps mi=0.5)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode2a
#Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 863100000 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
# Apps-verified
def PHY_IEEE802154_WISUN_873MHz_2GFSK_100kbps_2a_EU(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, EU-873MHz, 2a (2FSK 100kbps mi=0.5)',
phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode2a
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 870200000 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
# Apps-verified
def PHY_IEEE802154_WISUN_868MHz_2GFSK_150kbps_3_EU(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, EU-868MHz, 3 (2FSK 150kbps mi=0.5)',
phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode3
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 863100000 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 12 * 8 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
# Apps-verified
def PHY_IEEE802154_WISUN_873MHz_2GFSK_150kbps_3_EU(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, EU-873MHz, 3 (2FSK 150kbps mi=0.5)',
phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode3
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 870200000 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 12 * 8 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
### NA Region ###
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-168
def PHY_IEEE802154_WISUN_915MHz_2GFSK_50kbps_1b_NA(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, NA-915MHz, 1b (2FSK 50kbps mi=1.0)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode1b
#Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 902200000 # FAN NA Mode #1b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN NA Mode #1b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN NA Mode #1b, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
# Apps-verified
def PHY_IEEE802154_WISUN_915MHz_2GFSK_100kbps_2a_NA(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN,
readable_name='Wi-SUN FAN, NA-915MHz, 2a (2FSK 100kbps mi=0.5)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode2a
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 902200000 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN EU Mode #2a, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-167
def PHY_IEEE802154_WISUN_915MHz_2GFSK_150kbps_3_NA(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, NA-915MHz, 3 (2FSK 150kbps mi=0.5)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode3
#Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 902400000 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 400000 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 12 * 8 # FAN NA Mode #3, WiSUN 20140727-PHY-Profile Table 6
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
#Apps-verified
def PHY_IEEE802154_WISUN_915MHz_2GFSK_200kbps_4a_NA(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN,
readable_name='Wi-SUN FAN, NA-915MHz, 4a (2GFSK 200kbps mi=0.5)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode4a
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 902400000
phy.profile_inputs.channel_spacing_hz.value = 400000
phy.profile_inputs.preamble_length.value = 12 * 8
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
# Apps-verified
def PHY_IEEE802154_WISUN_915MHz_2GFSK_300kbps_5_NA(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN,
readable_name='Wi-SUN FAN, NA-915MHz, 5 (2GFSK 300kbps mi=0.5)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode5
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 902600000
phy.profile_inputs.channel_spacing_hz.value = 600000
phy.profile_inputs.preamble_length.value = 24 * 8
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
### JP Region ###
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-170
def PHY_IEEE802154_WISUN_920MHz_2GFSK_50kbps_1b_JP_ECHONET(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN ECHONET, JP-920MHz, 1b (2FSK 50kbps mi=1.0)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode1b
#Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 920600000 # Echonet JP Mode #1b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # Echonet JP Mode #1b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # Echonet JP Mode #1b, WiSUN 20140727-PHY-Profile Table 5
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.CCITT_16 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-169
def PHY_IEEE802154_WISUN_920MHz_2GFSK_100kbps_2b_JP_ECHONET(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN ECHONET, JP-920MHz, 2b (2FSK 100kbps mi=1.0)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode2b
#Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 920900000 # Echonet JP Mode #2b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 400000 # Echonet JP Mode #2b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 15 * 8 # Echonet JP Mode #2b, WiSUN 20140727-PHY-Profile Table 5
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.CCITT_16 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
#Apps-verified
def PHY_IEEE802154_WISUN_920MHz_2GFSK_100kbps_2b_JP(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, JP-920MHz, 2b (2FSK 100kbps mi=1.0)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode2b
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 920900000
phy.profile_inputs.channel_spacing_hz.value = 400000
phy.profile_inputs.preamble_length.value = 8 * 8
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
#Apps-verified
def PHY_IEEE802154_WISUN_920MHz_2GFSK_200kbps_4b_JP(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN,
readable_name='Wi-SUN FAN, JP-920MHz, 4b (2GFSK 200kbps mi=1.0)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode4b
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 920800000
phy.profile_inputs.channel_spacing_hz.value = 600000
phy.profile_inputs.preamble_length.value = 12 * 8
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
### CN Region ###
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-1218
def PHY_IEEE802154_WISUN_470MHz_2GFSK_50kbps_1b_CN(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, CN-470MHz, 1b (2FSK 50kbps mi=1.0)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode1b
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 470200000 # FAN CN Mode #1b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN CN Mode #1b, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN CN Mode #1b, WiSUN 20140727-PHY-Profile Table 5
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy
# Owner: Casey Weltzin
# JIRA Link: https://jira.silabs.com/browse/PGOCELOTVALTEST-1219
def PHY_IEEE802154_WISUN_470MHz_2GFSK_100kbps_2a_CN(self, model, phy_name=None):
phy = self._makePhy(model, model.profiles.WiSUN, readable_name='Wi-SUN FAN, CN-470MHz, 2a (2FSK 100kbps mi=0.5)', phy_name=phy_name)
# Select the correct SUNFSK mode
phy.profile_inputs.wisun_mode.value = model.vars.wisun_mode.var_enum.Mode2a
# Define WiSUN Profile / Region specific inputs
phy.profile_inputs.base_frequency_hz.value = 470200000 # FAN CN Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.channel_spacing_hz.value = 200000 # FAN CN Mode #2a, WiSUN 20140727-PHY-Profile Table 3
phy.profile_inputs.preamble_length.value = 8 * 8 # FAN CN Mode #2a, WiSUN 20140727-PHY-Profile Table 5
phy.profile_inputs.crc_poly.value = model.vars.crc_poly.var_enum.ANSIX366_1979 # 802.15.4-2015, 7.2.10
# Default xtal frequency of 39MHz
phy.profile_inputs.xtal_frequency_hz.value = 39000000
return phy | 53.841615 | 144 | 0.716099 | 2,563 | 17,337 | 4.62895 | 0.054623 | 0.118847 | 0.137559 | 0.075607 | 0.974039 | 0.974039 | 0.955496 | 0.955496 | 0.953978 | 0.946898 | 0 | 0.114405 | 0.196343 | 17,337 | 322 | 145 | 53.841615 | 0.737099 | 0.286439 | 0 | 0.711538 | 0 | 0 | 0.065764 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.108974 | false | 0 | 0.00641 | 0 | 0.192308 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
9d39d726a72ca9b680c21d923c67ba239050bc73 | 287 | py | Python | autoload/leaderf/python/bookmark/commands/__init__.py | tamago324/LeaderF-bookmark | 17cf0fb3b1e956f1265be7e6d7e7d7b68317c566 | [
"Apache-2.0"
] | 4 | 2020-02-02T09:12:49.000Z | 2020-05-30T18:44:59.000Z | autoload/leaderf/python/bookmark/commands/__init__.py | tamago324/LeaderF-bookmark | 17cf0fb3b1e956f1265be7e6d7e7d7b68317c566 | [
"Apache-2.0"
] | 2 | 2020-04-10T01:07:21.000Z | 2020-06-01T23:08:40.000Z | autoload/leaderf/python/bookmark/commands/__init__.py | tamago324/LeaderF-bookmark | 17cf0fb3b1e956f1265be7e6d7e7d7b68317c566 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
from bookmark.commands.add_dir import *
from bookmark.commands.add_file import *
from bookmark.commands.delete import *
from bookmark.commands.edit import *
from bookmark.commands.input import command___input_cancel, command___input_prompt
| 31.888889 | 82 | 0.797909 | 40 | 287 | 5.475 | 0.5 | 0.273973 | 0.456621 | 0.474886 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003876 | 0.101045 | 287 | 8 | 83 | 35.875 | 0.844961 | 0.146341 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
9d60ce135a129c1dc6829bd797967d6ade579e31 | 176 | py | Python | ckan/overrides/extra-helper-functions.py | USSBA/sba-gov-ckan | 75baf061f8496ed0a1d0b7a3b0927088b7a0f29d | [
"Apache-2.0"
] | 8 | 2019-06-10T16:26:12.000Z | 2021-10-19T06:44:25.000Z | ckan/overrides/extra-helper-functions.py | USSBA/sba-gov-ckan | 75baf061f8496ed0a1d0b7a3b0927088b7a0f29d | [
"Apache-2.0"
] | 4 | 2019-06-14T17:50:35.000Z | 2021-02-17T17:07:54.000Z | ckan/overrides/extra-helper-functions.py | USSBA/sba-gov-ckan | 75baf061f8496ed0a1d0b7a3b0927088b7a0f29d | [
"Apache-2.0"
] | 3 | 2019-06-14T13:24:04.000Z | 2019-12-30T13:52:12.000Z |
@core_helper
def google_analytics_id():
return config.get('google_analytics.id', None)
@core_helper
def google_analytics_enabled():
return len(google_analytics_id()) > 0
| 19.555556 | 48 | 0.778409 | 25 | 176 | 5.12 | 0.52 | 0.46875 | 0.398438 | 0.296875 | 0.4375 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006369 | 0.107955 | 176 | 8 | 49 | 22 | 0.808917 | 0 | 0 | 0.333333 | 0 | 0 | 0.108571 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0 | 0.333333 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 9 |
c22b693243fd88a91e1b147402c43d6ffa7f07de | 23,063 | py | Python | psono/restapi/tests/link_share.py | psono/psono-server | e4253aa343857d1cf1102bf7a2cf08dfc40c92c6 | [
"Apache-2.0",
"CC0-1.0"
] | 48 | 2018-04-19T15:50:58.000Z | 2022-01-23T15:58:11.000Z | psono/restapi/tests/link_share.py | psono/psono-server | e4253aa343857d1cf1102bf7a2cf08dfc40c92c6 | [
"Apache-2.0",
"CC0-1.0"
] | 9 | 2018-09-13T14:56:18.000Z | 2020-01-17T16:44:33.000Z | psono/restapi/tests/link_share.py | psono/psono-server | e4253aa343857d1cf1102bf7a2cf08dfc40c92c6 | [
"Apache-2.0",
"CC0-1.0"
] | 11 | 2019-09-20T11:53:47.000Z | 2021-07-18T22:41:31.000Z | from django.urls import reverse
from django.contrib.auth.hashers import make_password
from django.utils import timezone
from datetime import timedelta
from rest_framework import status
from .base import APITestCaseExtended
from restapi import models
import random
import string
BAD_URL='BAD_URL'
def mock_request_post(url, data=None, json=None, **kwargs):
if url == BAD_URL:
raise Exception
class UserCreateLinkShareTest(APITestCaseExtended):
"""
Test to create a link share (PUT)
"""
def setUp(self):
self.test_email = "test@example.com"
self.test_email_bcrypt = "a"
self.test_email2 = "test2@example.com"
self.test_email_bcrypt2 = "b"
self.test_email3 = "test3@example.com"
self.test_email_bcrypt3 = "test3@example.com"
self.test_username = "test@psono.pw"
self.test_username2 = "test2@psono.pw"
self.test_username3 = "test3@psono.pw"
self.test_password = "myPassword"
self.test_authkey = "c55066421a559f76d8ed5227622e9f95a0c67df15220e40d7bc98a8a598124fa15373ac553ef3ee27c7" \
"123d6be058e6d43cc71c1b666bdecaf33b734c8583a93"
self.test_public_key = "5706a5648debec63e86714c8c489f08aee39477487d1b3f39b0bbb05dbd2c649"
self.test_secret_key = "a7d028388e9d80f2679c236ebb2d0fedc5b7b0a28b393f6a20cc8f6be636aa71"
self.test_secret_key_enc = "77cde8ff6a5bbead93588fdcd0d6346bb57224b55a49c0f8a22a807bf6414e4d82ff60711422" \
"996e4a26de599982d531eef3098c9a531a05f75878ac0739571d6a242e6bf68c2c28eadf1011" \
"571a48eb"
self.test_secret_key_nonce = "f580cc9900ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_secret_key_nonce2 = "f580cc9902ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_secret_key_nonce3 = "f580c29902ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_private_key = "d636f7cc20384475bdc30c3ede98f719ee09d1fd4709276103772dd9479f353c"
self.test_private_key_enc = "abddebec9d20cecf7d1cab95ad6c6394db3826856bf21c2c6af9954e9816c2239f5df697e52" \
"d60785eb1136803407b69729c38bb50eefdd2d24f2fa0f104990eee001866ba83704cf4f576" \
"a74b9b2452"
self.test_private_key_nonce = "4298a9ab3d9d5d8643dfd4445adc30301b565ab650497fb9"
self.test_private_key_nonce2 = "4298a9ab3d8d5d8643dfd4445adc30301b565ab650497fb9"
self.test_private_key_nonce3 = "4228a9ab3d8d5d8643dfd4445adc30301b565ab650497fb9"
self.test_user_obj = models.User.objects.create(
username=self.test_username,
email=self.test_email,
email_bcrypt=self.test_email_bcrypt,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce,
user_sauce='6b84c6bca05de45714f224e4707fa4e02a59fa21b1e6539f5f3f35fdbf914022',
is_email_active=True
)
self.test_user2_obj = models.User.objects.create(
email=self.test_email2,
email_bcrypt=self.test_email_bcrypt2,
username=self.test_username2,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce2,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce2,
user_sauce='4b01f5914b95005b011442ff6a88039627909e77e67f84066973b22131958ac2',
is_email_active=True
)
self.test_user3_obj = models.User.objects.create(
email=self.test_email3,
email_bcrypt=self.test_email_bcrypt3,
username=self.test_username3,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce3,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce3,
user_sauce='dd8e55859b0542320fc4c442cfa7d751ef16ffcabbbefd0129c10cdc0ea79b00',
is_email_active=True
)
self.test_secret_obj = models.Secret.objects.create(
user_id=self.test_user_obj.id,
data='12345',
data_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
type="dummy"
)
self.test_datastore_obj = models.Data_Store.objects.create(
user_id=self.test_user_obj.id,
type="my-type",
description= "my-description",
data= "12345",
data_nonce= ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
secret_key= ''.join(random.choice(string.ascii_lowercase) for _ in range(256)),
secret_key_nonce= ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
)
self.secret_link_obj = models.Secret_Link.objects.create(
link_id = '0493017f-47b0-446e-9a41-6533721ade71',
secret_id = self.test_secret_obj.id,
parent_datastore_id = self.test_datastore_obj.id,
parent_share_id = None
)
self.test_secret2_obj = models.Secret.objects.create(
user_id=self.test_user2_obj.id,
data='12345',
data_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
type="dummy"
)
def test_create_link_share(self):
"""
Tests to create a link share
"""
url = reverse('link_share')
data = {
'secret_id': str(self.test_secret_obj.id),
'node': '12345',
'node_nonce': ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
'public_title': 'A public title',
'allowed_reads': 1,
'passphrase': '',
}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.put(url, data)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(models.Link_Share.objects.count(), 1)
def test_create_link_share_without_secret_and_file(self):
"""
Tests to create a link share without secret nor file
"""
url = reverse('link_share')
data = {
# 'secret_id': str(self.test_secret_obj.id),
'node': '12345',
'node_nonce': ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
'public_title': 'A public title',
'allowed_reads': 1,
'passphrase': '',
}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.put(url, data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_create_link_share_with_secret_without_permission(self):
"""
Tests to create a link share for a secret that the user has no permission for
"""
url = reverse('link_share')
data = {
'secret_id': str(self.test_secret2_obj.id),
'node': '12345',
'node_nonce': ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
'public_title': 'A public title',
'allowed_reads': 1,
'passphrase': '',
}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.put(url, data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_create_link_share_with_valid_till_already_expired(self):
"""
Tests to create a link share with a valid till that is already expired
"""
url = reverse('link_share')
data = {
'secret_id': str(self.test_secret_obj.id),
'node': '12345',
'node_nonce': ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
'public_title': 'A public title',
'allowed_reads': 1,
'passphrase': '',
'valid_till': timezone.now() - timedelta(seconds=1),
}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.put(url, data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
class UserGetLinkShareTest(APITestCaseExtended):
"""
Test to get a link share (GET)
"""
def setUp(self):
self.test_email = "test@example.com"
self.test_email_bcrypt = "a"
self.test_email2 = "test2@example.com"
self.test_email_bcrypt2 = "b"
self.test_email3 = "test3@example.com"
self.test_email_bcrypt3 = "test3@example.com"
self.test_username = "test@psono.pw"
self.test_username2 = "test2@psono.pw"
self.test_username3 = "test3@psono.pw"
self.test_password = "myPassword"
self.test_authkey = "c55066421a559f76d8ed5227622e9f95a0c67df15220e40d7bc98a8a598124fa15373ac553ef3ee27c7" \
"123d6be058e6d43cc71c1b666bdecaf33b734c8583a93"
self.test_public_key = "5706a5648debec63e86714c8c489f08aee39477487d1b3f39b0bbb05dbd2c649"
self.test_secret_key = "a7d028388e9d80f2679c236ebb2d0fedc5b7b0a28b393f6a20cc8f6be636aa71"
self.test_secret_key_enc = "77cde8ff6a5bbead93588fdcd0d6346bb57224b55a49c0f8a22a807bf6414e4d82ff60711422" \
"996e4a26de599982d531eef3098c9a531a05f75878ac0739571d6a242e6bf68c2c28eadf1011" \
"571a48eb"
self.test_secret_key_nonce = "f580cc9900ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_secret_key_nonce2 = "f580cc9902ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_secret_key_nonce3 = "f580c29902ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_private_key = "d636f7cc20384475bdc30c3ede98f719ee09d1fd4709276103772dd9479f353c"
self.test_private_key_enc = "abddebec9d20cecf7d1cab95ad6c6394db3826856bf21c2c6af9954e9816c2239f5df697e52" \
"d60785eb1136803407b69729c38bb50eefdd2d24f2fa0f104990eee001866ba83704cf4f576" \
"a74b9b2452"
self.test_private_key_nonce = "4298a9ab3d9d5d8643dfd4445adc30301b565ab650497fb9"
self.test_private_key_nonce2 = "4298a9ab3d8d5d8643dfd4445adc30301b565ab650497fb9"
self.test_private_key_nonce3 = "4228a9ab3d8d5d8643dfd4445adc30301b565ab650497fb9"
self.test_user_obj = models.User.objects.create(
username=self.test_username,
email=self.test_email,
email_bcrypt=self.test_email_bcrypt,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce,
user_sauce='6b84c6bca05de45714f224e4707fa4e02a59fa21b1e6539f5f3f35fdbf914022',
is_email_active=True
)
self.test_user2_obj = models.User.objects.create(
email=self.test_email2,
email_bcrypt=self.test_email_bcrypt2,
username=self.test_username2,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce2,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce2,
user_sauce='4b01f5914b95005b011442ff6a88039627909e77e67f84066973b22131958ac2',
is_email_active=True
)
self.test_user3_obj = models.User.objects.create(
email=self.test_email3,
email_bcrypt=self.test_email_bcrypt3,
username=self.test_username3,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce3,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce3,
user_sauce='dd8e55859b0542320fc4c442cfa7d751ef16ffcabbbefd0129c10cdc0ea79b00',
is_email_active=True
)
self.test_secret_obj = models.Secret.objects.create(
user_id=self.test_user_obj.id,
data='12345',
data_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
type="dummy"
)
self.test_datastore_obj = models.Data_Store.objects.create(
user_id=self.test_user_obj.id,
type="my-type",
description= "my-description",
data='12345',
data_nonce= ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
secret_key= ''.join(random.choice(string.ascii_lowercase) for _ in range(256)),
secret_key_nonce= ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
)
self.secret_link_obj = models.Secret_Link.objects.create(
link_id = '0493017f-47b0-446e-9a41-6533721ade71',
secret_id = self.test_secret_obj.id,
parent_datastore_id = self.test_datastore_obj.id,
parent_share_id = None
)
self.link_share = models.Link_Share.objects.create(
user=self.test_user_obj,
secret=self.test_secret_obj,
file_id=None,
allowed_reads=True,
public_title='A public title',
node=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
node_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
passphrase=None,
valid_till=None,
)
self.test_secret2_obj = models.Secret.objects.create(
user_id=self.test_user2_obj.id,
data='12345',
data_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
type="dummy"
)
def test_read_link_share_success(self):
"""
Tests to read a specific link share successful
"""
url = reverse('link_share', kwargs={'link_share_id': str(self.link_share.id)})
data = {}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.get(url, data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
def test_without_uuid_and_existing_link_shares(self):
"""
Tests to get all link shares without specifying a uuid
"""
url = reverse('link_share')
data = {}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.get(url, data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertIn('link_shares', response.data)
self.assertEqual(len(response.data['link_shares']), 1)
def test_without_uuid_and_no_existing_link_shares(self):
"""
Tests to get all link shares without specifying a uuid, while having no link shares
"""
url = reverse('link_share')
data = {}
self.client.force_authenticate(user=self.test_user3_obj)
response = self.client.get(url, data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertIn('link_shares', response.data)
self.assertEqual(len(response.data['link_shares']), 0)
def test_with_not_existing_link_share(self):
"""
Tests to get a specific link share without rights
"""
url = reverse('link_share', kwargs={'link_share_id': 'cf84fbd5-c606-4d5b-aa96-88c68a06cde4'})
data = {}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.get(url, data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
class UserUpdateLinkShareTest(APITestCaseExtended):
"""
Test to update a link share (POST)
"""
def setUp(self):
self.test_email = "test@example.com"
self.test_email_bcrypt = "a"
self.test_email2 = "test2@example.com"
self.test_email_bcrypt2 = "b"
self.test_username = "test@psono.pw"
self.test_username2 = "test2@psono.pw"
self.test_password = "myPassword"
self.test_authkey = "c55066421a559f76d8ed5227622e9f95a0c67df15220e40d7bc98a8a598124fa15373ac553ef3ee27c7" \
"123d6be058e6d43cc71c1b666bdecaf33b734c8583a93"
self.test_public_key = "5706a5648debec63e86714c8c489f08aee39477487d1b3f39b0bbb05dbd2c649"
self.test_secret_key = "a7d028388e9d80f2679c236ebb2d0fedc5b7b0a28b393f6a20cc8f6be636aa71"
self.test_secret_key_enc = "77cde8ff6a5bbead93588fdcd0d6346bb57224b55a49c0f8a22a807bf6414e4d82ff60711422" \
"996e4a26de599982d531eef3098c9a531a05f75878ac0739571d6a242e6bf68c2c28eadf1011" \
"571a48eb"
self.test_secret_key_nonce = "f580cc9900ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_secret_key_nonce2 = "f580cc9902ce7ae8b6f7d2bab4627e9e689dca0f13a53e3c"
self.test_private_key = "d636f7cc20384475bdc30c3ede98f719ee09d1fd4709276103772dd9479f353c"
self.test_private_key_enc = "abddebec9d20cecf7d1cab95ad6c6394db3826856bf21c2c6af9954e9816c2239f5df697e52" \
"d60785eb1136803407b69729c38bb50eefdd2d24f2fa0f104990eee001866ba83704cf4f576" \
"a74b9b2452"
self.test_private_key_nonce = "4298a9ab3d9d5d8643dfd4445adc30301b565ab650497fb9"
self.test_private_key_nonce2 = "4298a9ab3d8d5d8643dfd4445adc30301b565ab650497fb9"
self.test_user_obj = models.User.objects.create(
email=self.test_email,
email_bcrypt=self.test_email_bcrypt,
username=self.test_username,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce,
user_sauce='8b32efae0a4940bafa236ee35ee975f71833860b7fa747d44659717b18719d84',
is_email_active=True
)
self.test_user2_obj = models.User.objects.create(
email=self.test_email2,
email_bcrypt=self.test_email_bcrypt2,
username=self.test_username2,
authkey=make_password(self.test_authkey),
public_key=self.test_public_key,
private_key=self.test_private_key_enc,
private_key_nonce=self.test_private_key_nonce2,
secret_key=self.test_secret_key_enc,
secret_key_nonce=self.test_secret_key_nonce2,
user_sauce='14f79675e9b28c25d633b0e4511beb041cca41da864bd36c94c67d60c1d3f716',
is_email_active=True
)
self.test_secret_obj = models.Secret.objects.create(
user_id=self.test_user_obj.id,
data='12345',
data_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
type="dummy"
)
self.test_datastore_obj = models.Data_Store.objects.create(
user_id=self.test_user_obj.id,
type="my-type",
description= "my-description",
data='12345',
data_nonce= ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
secret_key= ''.join(random.choice(string.ascii_lowercase) for _ in range(256)),
secret_key_nonce= ''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
)
self.secret_link_obj = models.Secret_Link.objects.create(
link_id = '0493017f-47b0-446e-9a41-6533721ade71',
secret_id = self.test_secret_obj.id,
parent_datastore_id = self.test_datastore_obj.id,
parent_share_id = None
)
self.link_share = models.Link_Share.objects.create(
user=self.test_user_obj,
secret=self.test_secret_obj,
file_id=None,
allowed_reads=True,
public_title='A public title',
node=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
node_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
passphrase=None,
valid_till=None,
)
self.test_secret2_obj = models.Secret.objects.create(
user_id=self.test_user2_obj.id,
data='12345',
data_nonce=''.join(random.choice(string.ascii_lowercase) for _ in range(64)),
type="dummy"
)
def test_success(self):
"""
Tests to update a specific link share successful
"""
url = reverse('link_share')
data = {
'link_share_id': str(self.link_share.id),
'public_title': 'Another public title',
'allowed_reads': 2,
}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.post(url, data)
self.assertEqual(response.status_code, status.HTTP_200_OK)
updated_link_share = models.Link_Share.objects.get(pk=self.link_share.id)
self.assertEqual(updated_link_share.public_title, data['public_title'])
def test_without_permission(self):
"""
Tests to update a specific link share without permission
"""
url = reverse('link_share')
data = {
'link_share_id': str(self.link_share.id),
'public_title': 'Another public title',
'allowed_reads': 2,
}
self.client.force_authenticate(user=self.test_user2_obj)
response = self.client.post(url, data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_with_valid_till_already_expired(self):
"""
Tests to update a specific link share with a valid till that is already expired
"""
url = reverse('link_share')
data = {
'link_share_id': str(self.link_share.id),
'public_title': 'Another public title',
'allowed_reads': 2,
'valid_till': timezone.now() - timedelta(seconds=1),
}
self.client.force_authenticate(user=self.test_user_obj)
response = self.client.post(url, data)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
| 41.480216 | 115 | 0.659931 | 2,406 | 23,063 | 6.008728 | 0.080632 | 0.101819 | 0.039704 | 0.035277 | 0.927993 | 0.915335 | 0.913191 | 0.907104 | 0.890503 | 0.885315 | 0 | 0.135199 | 0.253393 | 23,063 | 555 | 116 | 41.554955 | 0.704396 | 0.034514 | 0 | 0.831731 | 0 | 0 | 0.205499 | 0.149777 | 0 | 0 | 0 | 0 | 0.040865 | 1 | 0.036058 | false | 0.043269 | 0.021635 | 0 | 0.064904 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5f0fa3a8e974ceecea86564bc9b65a41730ea0a4 | 21,914 | py | Python | tests/test_suite.py | a4k-openproject/a4kStreaming | c9674ddfede5b322496e79ec2a265d0af192d2f8 | [
"MIT"
] | 18 | 2021-01-20T23:33:17.000Z | 2022-01-11T07:51:16.000Z | tests/test_suite.py | newt-sc/a4kStreaming | c9674ddfede5b322496e79ec2a265d0af192d2f8 | [
"MIT"
] | 1 | 2021-02-04T18:20:52.000Z | 2021-02-04T18:20:52.000Z | tests/test_suite.py | newt-sc/a4kStreaming | c9674ddfede5b322496e79ec2a265d0af192d2f8 | [
"MIT"
] | 4 | 2021-01-17T23:46:29.000Z | 2022-01-23T22:20:24.000Z | # -*- coding: utf-8 -*-
from .common import (
sys,
os,
json,
re,
time,
api,
utils,
pytest
)
provider_url = os.environ.get('A4KSTREAMING_PROVIDER_URL')
premiumize_apikey = os.environ.get('A4KSTREAMING_PREMIUMIZE_APIKEY')
realdebrid_apikey = os.environ.get('A4KSTREAMING_REALDEBRID_APIKEY')
alldebrid_apikey = os.environ.get('A4KSTREAMING_ALLDEBRID_APIKEY')
imdb_token = os.environ.get('A4KSTREAMING_IMDB_TOKEN')
trakt_apikey = os.environ.get('A4KSTREAMING_TRAKT_APIKEY')
trakt_username = os.environ.get('A4KSTREAMING_TRAKT_USERNAME')
def __remove_cache(a4kstreaming_api):
try:
os.remove(a4kstreaming_api.core.cache.__search_filepath)
except: pass
try:
os.remove(a4kstreaming_api.core.cache.__provider_filepath)
except: pass
try:
os.remove(a4kstreaming_api.core.cache.__last_results_filepath)
except: pass
def __setup_provider(a4kstreaming_api, settings={}):
def select(*args, **kwargs): return 1
a4kstreaming_api.core.kodi.xbmcgui.Dialog().select = select
keyboard = a4kstreaming_api.core.kodi.xbmc.Keyboard('', '')
keyboard.getText = lambda: provider_url
keyboard.isConfirmed = lambda: True
__invoke(a4kstreaming_api, 'provider', { 'type': 'install' }, settings=settings)
provider = a4kstreaming_api.core.cache.get_provider()
selected = {}
for key in provider.keys():
if len(key) == 8 or settings.get('provider.use_recommended', 'false') == 'true':
selected[key] = True
a4kstreaming_api.core.cache.save_provider(selected)
def __invoke(a4kstreaming_api, action, params={}, settings={}, prerun=None, remove_cache=True):
global premiumize_apikey, realdebrid_apikey, imdb_token, trakt_apikey, trakt_username
if remove_cache:
__remove_cache(a4kstreaming_api)
fn = lambda: None
fn.params = a4kstreaming_api.core.utils.DictAsObject(params)
fn.settings = {
'general.timeout': '30',
'general.max_quality': '3',
'general.autoplay': 'false',
'general.mark_as_watched_rating': '7',
'general.page_size': '29',
'general.lists_page_size': '29',
'general.season_title_template': '0',
'general.episode_title_template': '0',
'general.max_movie_size': '200',
'views.menu': '0',
'views.titles': '0',
'views.seasons': '0',
'views.episodes': '0',
'views.season': '0',
'views.episode': '0',
'provider.use_recommended': 'false',
'premiumize.apikey': premiumize_apikey,
'realdebrid.apikey': realdebrid_apikey,
'alldebrid.apikey': alldebrid_apikey,
'imdb.at-main': imdb_token,
'trakt.clientid': trakt_apikey,
'trakt.username': trakt_username,
}
fn.settings.update(settings)
if prerun:
prerun()
start = time.time()
fn.results = getattr(a4kstreaming_api, action)(fn.params, fn.settings)
end = time.time()
print(end - start)
return fn
def test_provider_install():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
def select(*args, **kwargs): return 1
a4kstreaming_api.core.kodi.xbmcgui.Dialog().select = select
keyboard = a4kstreaming_api.core.kodi.xbmc.Keyboard('', '')
keyboard.getText = lambda: provider_url
keyboard.isConfirmed = lambda: True
__invoke(a4kstreaming_api, 'provider', { 'type': 'install' })
assert len(a4kstreaming_api.core.cache.get_provider()) > 0
def test_provider_manage():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
sources = ['SRC1', 'SRC2', 'SRC3', 'SRC4']
expected_selection = [1, 3]
def prerun():
a4kstreaming_api.core.cache.save_provider({
'SRC1': True,
'SRC2': False,
'SRC3': True,
'SRC4': False,
})
def multiselect(*args, **kwargs): return expected_selection
a4kstreaming_api.core.kodi.xbmcgui.Dialog().multiselect = multiselect
__invoke(a4kstreaming_api, 'provider', { 'type': 'manage' }, prerun=prerun)
provider = a4kstreaming_api.core.cache.get_provider()
for index in [0, 1, 2, 3]:
if index in expected_selection:
assert provider[sources[index]] is True
else:
assert provider[sources[index]] is False
def test_play_movie_pm():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
settings = { 'realdebrid.apikey': '', 'alldebrid.apikey': '' }
def prerun():
__setup_provider(a4kstreaming_api, settings)
title = b'eyJtZWRpYXR5cGUiOiAibW92aWUiLCAiaW1kYm51bWJlciI6ICJ0dDAxMDgxNjAiLCAidGl0bGUiOiAiU2xlZXBsZXNzIGluIFNlYXR0bGUiLCAib3JpZ2luYWx0aXRsZSI6ICJTbGVlcGxlc3MgaW4gU2VhdHRsZSIsICJ0dnNob3dpZCI6IG51bGwsICJzZWFzb25zIjogbnVsbCwgInR2c2hvd3RpdGxlIjogIiIsICJ5ZWFyIjogMTk5MywgInByZW1pZXJlZCI6ICIxOTkzLTYtMjUiLCAiZHVyYXRpb24iOiA2MzAwLCAibXBhYSI6ICJQRyIsICJnZW5yZSI6IFsiQ29tZWR5IiwgIkRyYW1hIiwgIlJvbWFuY2UiXSwgImNvdW50cnkiOiBbIlVuaXRlZCBTdGF0ZXMiXSwgInRyYWlsZXIiOiAiP2FjdGlvbj10cmFpbGVyJmlkPXZpNzI3MzY3NDQ5IiwgInBsb3QiOiAiQSByZWNlbnRseSB3aWRvd2VkIG1hbidzIHNvbiBjYWxscyBhIHJhZGlvIHRhbGstc2hvdyBpbiBhbiBhdHRlbXB0IHRvIGZpbmQgaGlzIGZhdGhlciBhIHBhcnRuZXIuIiwgInRhZ2xpbmUiOiAiV2hhdCBpZiBzb21lb25lIHlvdSBuZXZlciBtZXQsIHNvbWVvbmUgeW91IG5ldmVyIHNhdywgc29tZW9uZSB5b3UgbmV2ZXIga25ldyB3YXMgdGhlIG9ubHkgc29tZW9uZSBmb3IgeW91PyIsICJvdmVybGF5IjogMCwgInN0dWRpbyI6IFsiVHJpU3RhciBQaWN0dXJlcyIsICJUcmlTdGFyIFBpY3R1cmVzIiwgIkNvbHVtYmlhIFRyaVN0YXIgRmlsbSJdLCAiZGlyZWN0b3IiOiBbIk5vcmEgRXBocm9uIl0sICJ3cml0ZXIiOiBbIkplZmYgQXJjaCIsICJOb3JhIEVwaHJvbiIsICJEYXZpZCBTLiBXYXJkIl19'
play = __invoke(a4kstreaming_api, 'play', { 'type': title }, settings=settings, prerun=prerun)
assert play.results is not None
def test_play_movie_pm_using_recommended():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
settings = { 'realdebrid.apikey': '', 'alldebrid.apikey': '', 'provider.use_recommended': 'true' }
def prerun():
__setup_provider(a4kstreaming_api, settings)
title = b'eyJtZWRpYXR5cGUiOiAibW92aWUiLCAiaW1kYm51bWJlciI6ICJ0dDAxMDgxNjAiLCAidGl0bGUiOiAiU2xlZXBsZXNzIGluIFNlYXR0bGUiLCAib3JpZ2luYWx0aXRsZSI6ICJTbGVlcGxlc3MgaW4gU2VhdHRsZSIsICJ0dnNob3dpZCI6IG51bGwsICJzZWFzb25zIjogbnVsbCwgInR2c2hvd3RpdGxlIjogIiIsICJ5ZWFyIjogMTk5MywgInByZW1pZXJlZCI6ICIxOTkzLTYtMjUiLCAiZHVyYXRpb24iOiA2MzAwLCAibXBhYSI6ICJQRyIsICJnZW5yZSI6IFsiQ29tZWR5IiwgIkRyYW1hIiwgIlJvbWFuY2UiXSwgImNvdW50cnkiOiBbIlVuaXRlZCBTdGF0ZXMiXSwgInRyYWlsZXIiOiAiP2FjdGlvbj10cmFpbGVyJmlkPXZpNzI3MzY3NDQ5IiwgInBsb3QiOiAiQSByZWNlbnRseSB3aWRvd2VkIG1hbidzIHNvbiBjYWxscyBhIHJhZGlvIHRhbGstc2hvdyBpbiBhbiBhdHRlbXB0IHRvIGZpbmQgaGlzIGZhdGhlciBhIHBhcnRuZXIuIiwgInRhZ2xpbmUiOiAiV2hhdCBpZiBzb21lb25lIHlvdSBuZXZlciBtZXQsIHNvbWVvbmUgeW91IG5ldmVyIHNhdywgc29tZW9uZSB5b3UgbmV2ZXIga25ldyB3YXMgdGhlIG9ubHkgc29tZW9uZSBmb3IgeW91PyIsICJvdmVybGF5IjogMCwgInN0dWRpbyI6IFsiVHJpU3RhciBQaWN0dXJlcyIsICJUcmlTdGFyIFBpY3R1cmVzIiwgIkNvbHVtYmlhIFRyaVN0YXIgRmlsbSJdLCAiZGlyZWN0b3IiOiBbIk5vcmEgRXBocm9uIl0sICJ3cml0ZXIiOiBbIkplZmYgQXJjaCIsICJOb3JhIEVwaHJvbiIsICJEYXZpZCBTLiBXYXJkIl19'
play = __invoke(a4kstreaming_api, 'play', { 'type': title }, settings=settings, prerun=prerun)
assert play.results is not None
def test_trailer():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
trailer = __invoke(a4kstreaming_api, 'trailer', { 'id': 'vi727367449' })
assert len(trailer.results) > 0
def test_popular():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'popular' })
assert len(fn.results) > 0
def test_fan_picks():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'fan_picks' })
assert len(fn.results) > 0
def test_top_picks():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'top_picks' })
assert len(fn.results) > 0
def test_play_movie_rd():
if not os.environ.get('A4KSTREAMING_LOCAL'):
pytest.skip("No Key")
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
settings = { 'premiumize.apikey': '', 'alldebrid.apikey': '' }
def prerun():
__setup_provider(a4kstreaming_api, settings)
title = b'eyJtZWRpYXR5cGUiOiAibW92aWUiLCAiaW1kYm51bWJlciI6ICJ0dDAxMDgxNjAiLCAidGl0bGUiOiAiU2xlZXBsZXNzIGluIFNlYXR0bGUiLCAib3JpZ2luYWx0aXRsZSI6ICJTbGVlcGxlc3MgaW4gU2VhdHRsZSIsICJ0dnNob3dpZCI6IG51bGwsICJzZWFzb25zIjogbnVsbCwgInR2c2hvd3RpdGxlIjogIiIsICJ5ZWFyIjogMTk5MywgInByZW1pZXJlZCI6ICIxOTkzLTYtMjUiLCAiZHVyYXRpb24iOiA2MzAwLCAibXBhYSI6ICJQRyIsICJnZW5yZSI6IFsiQ29tZWR5IiwgIkRyYW1hIiwgIlJvbWFuY2UiXSwgImNvdW50cnkiOiBbIlVuaXRlZCBTdGF0ZXMiXSwgInRyYWlsZXIiOiAiP2FjdGlvbj10cmFpbGVyJmlkPXZpNzI3MzY3NDQ5IiwgInBsb3QiOiAiQSByZWNlbnRseSB3aWRvd2VkIG1hbidzIHNvbiBjYWxscyBhIHJhZGlvIHRhbGstc2hvdyBpbiBhbiBhdHRlbXB0IHRvIGZpbmQgaGlzIGZhdGhlciBhIHBhcnRuZXIuIiwgInRhZ2xpbmUiOiAiV2hhdCBpZiBzb21lb25lIHlvdSBuZXZlciBtZXQsIHNvbWVvbmUgeW91IG5ldmVyIHNhdywgc29tZW9uZSB5b3UgbmV2ZXIga25ldyB3YXMgdGhlIG9ubHkgc29tZW9uZSBmb3IgeW91PyIsICJvdmVybGF5IjogMCwgInN0dWRpbyI6IFsiVHJpU3RhciBQaWN0dXJlcyIsICJUcmlTdGFyIFBpY3R1cmVzIiwgIkNvbHVtYmlhIFRyaVN0YXIgRmlsbSJdLCAiZGlyZWN0b3IiOiBbIk5vcmEgRXBocm9uIl0sICJ3cml0ZXIiOiBbIkplZmYgQXJjaCIsICJOb3JhIEVwaHJvbiIsICJEYXZpZCBTLiBXYXJkIl19'
play = __invoke(a4kstreaming_api, 'play', { 'type': title }, settings=settings, prerun=prerun)
assert play.results is not None
def test_more_like_this():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'more_like_this', 'id': 'tt0383574' })
assert len(fn.results) > 0
def test_watchlist():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'watchlist' })
assert len(fn.results) > 0
def test_lists():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'lists' })
assert len(fn.results) > 0
def test_list():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'list', 'id': 'ls091520106' })
assert len(fn.results) > 0
def test_play_movie_ad():
if not os.environ.get('A4KSTREAMING_LOCAL'):
pytest.skip("Fails on CI - NO SERVER error")
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
settings = { 'premiumize.apikey': '', 'realdebrid.apikey': '' }
def prerun():
__setup_provider(a4kstreaming_api, settings)
title = b'eyJtZWRpYXR5cGUiOiAibW92aWUiLCAiaW1kYm51bWJlciI6ICJ0dDAxMDgxNjAiLCAidGl0bGUiOiAiU2xlZXBsZXNzIGluIFNlYXR0bGUiLCAib3JpZ2luYWx0aXRsZSI6ICJTbGVlcGxlc3MgaW4gU2VhdHRsZSIsICJ0dnNob3dpZCI6IG51bGwsICJzZWFzb25zIjogbnVsbCwgInR2c2hvd3RpdGxlIjogIiIsICJ5ZWFyIjogMTk5MywgInByZW1pZXJlZCI6ICIxOTkzLTYtMjUiLCAiZHVyYXRpb24iOiA2MzAwLCAibXBhYSI6ICJQRyIsICJnZW5yZSI6IFsiQ29tZWR5IiwgIkRyYW1hIiwgIlJvbWFuY2UiXSwgImNvdW50cnkiOiBbIlVuaXRlZCBTdGF0ZXMiXSwgInRyYWlsZXIiOiAiP2FjdGlvbj10cmFpbGVyJmlkPXZpNzI3MzY3NDQ5IiwgInBsb3QiOiAiQSByZWNlbnRseSB3aWRvd2VkIG1hbidzIHNvbiBjYWxscyBhIHJhZGlvIHRhbGstc2hvdyBpbiBhbiBhdHRlbXB0IHRvIGZpbmQgaGlzIGZhdGhlciBhIHBhcnRuZXIuIiwgInRhZ2xpbmUiOiAiV2hhdCBpZiBzb21lb25lIHlvdSBuZXZlciBtZXQsIHNvbWVvbmUgeW91IG5ldmVyIHNhdywgc29tZW9uZSB5b3UgbmV2ZXIga25ldyB3YXMgdGhlIG9ubHkgc29tZW9uZSBmb3IgeW91PyIsICJvdmVybGF5IjogMCwgInN0dWRpbyI6IFsiVHJpU3RhciBQaWN0dXJlcyIsICJUcmlTdGFyIFBpY3R1cmVzIiwgIkNvbHVtYmlhIFRyaVN0YXIgRmlsbSJdLCAiZGlyZWN0b3IiOiBbIk5vcmEgRXBocm9uIl0sICJ3cml0ZXIiOiBbIkplZmYgQXJjaCIsICJOb3JhIEVwaHJvbiIsICJEYXZpZCBTLiBXYXJkIl19'
play = __invoke(a4kstreaming_api, 'play', { 'type': title }, settings=settings, prerun=prerun)
assert play.results is not None
def test_seasons():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'seasons', 'id': 'tt3288518' })
assert len(fn.results) > 0
def test_seasons_with_paging():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'seasons', 'id': 'tt0239195' })
assert len(fn.results) > 0
def test_episodes():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', {
'type': 'episodes',
'id': 'tt0108778',
'season': '4',
'year': '1997',
'month': '09',
'day': '25',
'year_end': '1998',
'month_end': '05',
'day_end': '07',
})
assert len(fn.results) > 0
def test_year():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'year', 'target_year': '1990' })
assert len(fn.results) > 0
def test_play_episode_pm():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
settings = { 'realdebrid.apikey': '', 'alldebrid.apikey': '' }
def prerun():
__setup_provider(a4kstreaming_api, settings)
title = b'eyJtZWRpYXR5cGUiOiAiZXBpc29kZSIsICJpbWRibnVtYmVyIjogInR0MDU4MzYzMiIsICJ0aXRsZSI6ICJUaGUgT25lIHdpdGggdGhlIE5hcCBQYXJ0bmVycyIsICJvcmlnaW5hbHRpdGxlIjogIlRoZSBPbmUgd2l0aCB0aGUgTmFwIFBhcnRuZXJzIiwgInR2c2hvd2lkIjogInR0MDEwODc3OCIsICJzZWFzb25zIjogWzEsIDIsIDMsIDQsIDUsIDYsIDcsIDgsIDksIDEwXSwgInR2c2hvd3RpdGxlIjogIkZyaWVuZHMiLCAieWVhciI6IDIwMDAsICJwcmVtaWVyZWQiOiAiMjAwMC0xMS05IiwgImR1cmF0aW9uIjogMTMyMCwgIm1wYWEiOiAiVFYtUEciLCAiZ2VucmUiOiBbIkNvbWVkeSIsICJSb21hbmNlIl0sICJjb3VudHJ5IjogWyJVbml0ZWQgU3RhdGVzIl0sICJ0cmFpbGVyIjogbnVsbCwgInBsb3QiOiAiSm9leSBhbmQgUm9zcyBhY2NpZGVudGFsbHkgdGFrZSBhIG5hcCB0b2dldGhlciAtIGFuZCBtdWNoIHRvIHRoZWlyIGRpc21heSwgZmluZCB0aGF0IHRoZXkgbGlrZSBpdC4gUGhvZWJlIGFuZCBSYWNoZWwgY29tcGV0ZSB0byBiZWNvbWUgTW9uaWNhJ3MgbWFpZCBvZiBob25vci4iLCAidGFnbGluZSI6IG51bGwsICJvdmVybGF5IjogMCwgImVwaXNvZGUiOiA2LCAic2Vhc29uIjogNywgInN0dWRpbyI6IFsiQnJpZ2h0L0thdWZmbWFuL0NyYW5lIFByb2R1Y3Rpb25zIiwgIldhcm5lciBCcm9zLiBUZWxldmlzaW9uIiwgIk5hdGlvbmFsIEJyb2FkY2FzdGluZyBDb21wYW55IChOQkMpIl0sICJkaXJlY3RvciI6IFsiR2FyeSBIYWx2b3Jzb24iXSwgIndyaXRlciI6IFsiRGF2aWQgQ3JhbmUiLCAiTWFydGEgS2F1ZmZtYW4iLCAiQnJpYW4gQnVja25lciIsICJTZWJhc3RpYW4gSm9uZXMiXX0='
play = __invoke(a4kstreaming_api, 'play', { 'type': title }, settings=settings, prerun=prerun)
assert play.results is not None
def test_browse_movie():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'browse', 'id': 'tt6723592' })
assert len(fn.results) > 0
def test_browse_episode():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'browse', 'id': 'tt13052876' })
assert len(fn.results) > 0
def test_knownfor():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'knownfor', 'id': 'nm1434871' })
assert len(fn.results) > 0
def test_status():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'status', 'ids': ['tt8111088', 'tt7126948'] })
assert len(fn.results) > 0
def test_ratings():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'query', { 'type': 'ratings', 'ids': ['tt8111088', 'tt7126948'] })
assert len(fn.results) > 0
def test_search():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'search', { 'query': 'tenet' })
assert len(fn.results) > 0
def test_play_episode_ad():
if not os.environ.get('A4KSTREAMING_LOCAL'):
pytest.skip("Fails on CI - NO SERVER error")
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
settings = { 'premiumize.apikey': '', 'realdebrid.apikey': '' }
def prerun():
__setup_provider(a4kstreaming_api, settings)
title = b'eyJtZWRpYXR5cGUiOiAiZXBpc29kZSIsICJpbWRibnVtYmVyIjogInR0MDU4MzYzMiIsICJ0aXRsZSI6ICJUaGUgT25lIHdpdGggdGhlIE5hcCBQYXJ0bmVycyIsICJvcmlnaW5hbHRpdGxlIjogIlRoZSBPbmUgd2l0aCB0aGUgTmFwIFBhcnRuZXJzIiwgInR2c2hvd2lkIjogInR0MDEwODc3OCIsICJzZWFzb25zIjogWzEsIDIsIDMsIDQsIDUsIDYsIDcsIDgsIDksIDEwXSwgInR2c2hvd3RpdGxlIjogIkZyaWVuZHMiLCAieWVhciI6IDIwMDAsICJwcmVtaWVyZWQiOiAiMjAwMC0xMS05IiwgImR1cmF0aW9uIjogMTMyMCwgIm1wYWEiOiAiVFYtUEciLCAiZ2VucmUiOiBbIkNvbWVkeSIsICJSb21hbmNlIl0sICJjb3VudHJ5IjogWyJVbml0ZWQgU3RhdGVzIl0sICJ0cmFpbGVyIjogbnVsbCwgInBsb3QiOiAiSm9leSBhbmQgUm9zcyBhY2NpZGVudGFsbHkgdGFrZSBhIG5hcCB0b2dldGhlciAtIGFuZCBtdWNoIHRvIHRoZWlyIGRpc21heSwgZmluZCB0aGF0IHRoZXkgbGlrZSBpdC4gUGhvZWJlIGFuZCBSYWNoZWwgY29tcGV0ZSB0byBiZWNvbWUgTW9uaWNhJ3MgbWFpZCBvZiBob25vci4iLCAidGFnbGluZSI6IG51bGwsICJvdmVybGF5IjogMCwgImVwaXNvZGUiOiA2LCAic2Vhc29uIjogNywgInN0dWRpbyI6IFsiQnJpZ2h0L0thdWZmbWFuL0NyYW5lIFByb2R1Y3Rpb25zIiwgIldhcm5lciBCcm9zLiBUZWxldmlzaW9uIiwgIk5hdGlvbmFsIEJyb2FkY2FzdGluZyBDb21wYW55IChOQkMpIl0sICJkaXJlY3RvciI6IFsiR2FyeSBIYWx2b3Jzb24iXSwgIndyaXRlciI6IFsiRGF2aWQgQ3JhbmUiLCAiTWFydGEgS2F1ZmZtYW4iLCAiQnJpYW4gQnVja25lciIsICJTZWJhc3RpYW4gSm9uZXMiXX0='
play = __invoke(a4kstreaming_api, 'play', { 'type': title }, settings=settings, prerun=prerun)
assert play.results is not None
def test_watchlist_add():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'watchlist_add', 'id': 'tt8111088' })
assert fn.results is True
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'watchlist_add', 'ids': 'tt8111088__tt7126948' })
assert fn.results is True
def test_watchlist_remove():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'watchlist_remove', 'id': 'tt8111088' })
assert fn.results is True
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'watchlist_remove', 'ids': 'tt8111088__tt7126948' })
assert fn.results is True
def test_rate_rate():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
def select(*args, **kwargs): return 1
a4kstreaming_api.core.kodi.xbmcgui.Dialog().select = select
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'rate', 'id': 'tt8111088' })
assert fn.results is True
def test_rate_unrate():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
def select(*args, **kwargs): return 0
a4kstreaming_api.core.kodi.xbmcgui.Dialog().select = select
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'rate', 'id': 'tt8111088' })
assert fn.results is True
def test_mark_as_watched():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'mark_as_watched', 'id': 'tt2011749' })
assert fn.results is True
def test_mark_as_unwatched():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'mark_as_unwatched', 'id': 'tt2011749' })
assert fn.results is True
def test_season_mark_as_watched():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'mark_as_watched', 'ids': 'tt3676824__tt3676822__tt3676826__tt3676830__tt3676828__tt3676832__tt3676836__tt3676834__tt3676844__tt3676840__tt3676846__tt3676848' })
assert fn.results is True
def test_season_mark_as_unwatched():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'profile', { 'type': 'mark_as_unwatched', 'ids': 'tt3676824__tt3676822__tt3676826__tt3676830__tt3676828__tt3676832__tt3676836__tt3676834__tt3676844__tt3676840__tt3676846__tt3676848' })
assert fn.results is True
def test_play_episode_rd():
if not os.environ.get('A4KSTREAMING_LOCAL'):
pytest.skip("NO Key")
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
settings = { 'premiumize.apikey': '', 'alldebrid.apikey': '' }
def prerun():
__setup_provider(a4kstreaming_api, settings)
title = b'eyJtZWRpYXR5cGUiOiAiZXBpc29kZSIsICJpbWRibnVtYmVyIjogInR0MDU4MzYzMiIsICJ0aXRsZSI6ICJUaGUgT25lIHdpdGggdGhlIE5hcCBQYXJ0bmVycyIsICJvcmlnaW5hbHRpdGxlIjogIlRoZSBPbmUgd2l0aCB0aGUgTmFwIFBhcnRuZXJzIiwgInR2c2hvd2lkIjogInR0MDEwODc3OCIsICJzZWFzb25zIjogWzEsIDIsIDMsIDQsIDUsIDYsIDcsIDgsIDksIDEwXSwgInR2c2hvd3RpdGxlIjogIkZyaWVuZHMiLCAieWVhciI6IDIwMDAsICJwcmVtaWVyZWQiOiAiMjAwMC0xMS05IiwgImR1cmF0aW9uIjogMTMyMCwgIm1wYWEiOiAiVFYtUEciLCAiZ2VucmUiOiBbIkNvbWVkeSIsICJSb21hbmNlIl0sICJjb3VudHJ5IjogWyJVbml0ZWQgU3RhdGVzIl0sICJ0cmFpbGVyIjogbnVsbCwgInBsb3QiOiAiSm9leSBhbmQgUm9zcyBhY2NpZGVudGFsbHkgdGFrZSBhIG5hcCB0b2dldGhlciAtIGFuZCBtdWNoIHRvIHRoZWlyIGRpc21heSwgZmluZCB0aGF0IHRoZXkgbGlrZSBpdC4gUGhvZWJlIGFuZCBSYWNoZWwgY29tcGV0ZSB0byBiZWNvbWUgTW9uaWNhJ3MgbWFpZCBvZiBob25vci4iLCAidGFnbGluZSI6IG51bGwsICJvdmVybGF5IjogMCwgImVwaXNvZGUiOiA2LCAic2Vhc29uIjogNywgInN0dWRpbyI6IFsiQnJpZ2h0L0thdWZmbWFuL0NyYW5lIFByb2R1Y3Rpb25zIiwgIldhcm5lciBCcm9zLiBUZWxldmlzaW9uIiwgIk5hdGlvbmFsIEJyb2FkY2FzdGluZyBDb21wYW55IChOQkMpIl0sICJkaXJlY3RvciI6IFsiR2FyeSBIYWx2b3Jzb24iXSwgIndyaXRlciI6IFsiRGF2aWQgQ3JhbmUiLCAiTWFydGEgS2F1ZmZtYW4iLCAiQnJpYW4gQnVja25lciIsICJTZWJhc3RpYW4gSm9uZXMiXX0='
play = __invoke(a4kstreaming_api, 'play', { 'type': title }, settings=settings, prerun=prerun)
assert play.results is not None
def test_premiumize_files():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'cloud', { 'type': 'premiumize_files' })
assert len(fn.results) > 0
result = [r for r in fn.results if 'aws' in r['label'].lower()][0]
fn = __invoke(a4kstreaming_api, 'cloud', { 'type': 'premiumize_files', 'id': result['params']['id'] })
assert len(fn.results) > 0
result = [r for r in fn.results if 'aws' in r['label'].lower()][0]
fn = __invoke(a4kstreaming_api, 'cloud', { 'type': 'premiumize_files', 'id': result['params']['id'] })
assert len(fn.results) > 0
def test_premiumize_transfers():
a4kstreaming_api = api.A4kStreamingApi({'kodi': True})
fn = __invoke(a4kstreaming_api, 'cloud', { 'type': 'premiumize_transfers' })
assert len(fn.results) > 0
| 48.5898 | 1,147 | 0.779775 | 1,680 | 21,914 | 9.904762 | 0.127381 | 0.096454 | 0.054267 | 0.073377 | 0.869651 | 0.85012 | 0.836599 | 0.813281 | 0.810337 | 0.797596 | 0 | 0.068759 | 0.121977 | 21,914 | 450 | 1,148 | 48.697778 | 0.796061 | 0.000958 | 0 | 0.472313 | 0 | 0 | 0.46156 | 0.372893 | 0 | 1 | 0 | 0 | 0.136808 | 1 | 0.172638 | false | 0.009772 | 0.003257 | 0.016287 | 0.179153 | 0.003257 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
a039b19f7d7a615aa7460685d42678cebcc1cc67 | 112 | py | Python | util.py | sanjitk7/ClonalAlgorithmAI | 8bad2ab758817985edee4decdc85f39769a1acda | [
"MIT"
] | null | null | null | util.py | sanjitk7/ClonalAlgorithmAI | 8bad2ab758817985edee4decdc85f39769a1acda | [
"MIT"
] | null | null | null | util.py | sanjitk7/ClonalAlgorithmAI | 8bad2ab758817985edee4decdc85f39769a1acda | [
"MIT"
] | null | null | null | import uuid
def create_uuid():
return uuid.uuid4()
# if (__name__=="__main__"):
# print(create_uuid()) | 16 | 28 | 0.651786 | 14 | 112 | 4.5 | 0.714286 | 0.31746 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01087 | 0.178571 | 112 | 7 | 29 | 16 | 0.673913 | 0.455357 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | true | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
a0ae6a469e31fa1feceb87b37d8523ad70a22ebb | 2,074 | py | Python | libeqs/eqs_wrapper.py | Panchatantra/EQSignal | ac04d71a5b011bc3feb66562d6dda0b4e7e4e67b | [
"MIT"
] | 5 | 2018-04-19T12:41:26.000Z | 2021-09-21T12:39:02.000Z | libeqs/eqs_wrapper.py | jiayingqi/EQSignal | ac04d71a5b011bc3feb66562d6dda0b4e7e4e67b | [
"MIT"
] | null | null | null | libeqs/eqs_wrapper.py | jiayingqi/EQSignal | ac04d71a5b011bc3feb66562d6dda0b4e7e4e67b | [
"MIT"
] | 2 | 2018-08-04T09:57:06.000Z | 2020-06-01T03:24:26.000Z | from ctypes import POINTER, c_int, c_double, byref
import numpy as np
libeqs = np.ctypeslib.load_library("libeqs", ".")
# subroutine acc2vd(a,v,d,n,dt,v0,d0)
libeqs.acc2vd.argtypes = [
POINTER(c_double), POINTER(c_double), POINTER(c_double),
POINTER(c_int), POINTER(c_double), POINTER(c_double), POINTER(c_double)
]
# subroutine ratacc2vd(a,v,d,n,dt,v0,d0)
libeqs.ratacc2vd.argtypes = [
POINTER(c_double), POINTER(c_double), POINTER(c_double),
POINTER(c_int), POINTER(c_double), POINTER(c_double), POINTER(c_double)
]
# subroutine spectrum(acc,n,dt,zeta,P,np,SPA,SPI,SM)
libeqs.spectrum.argtypes = [
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_int), POINTER(c_int)
]
# subroutine spectrumavd(acc,n,dt,zeta,P,np,SPA,SPI,SPV,SPD,SPEV,SM)
libeqs.spectrumavd.argtypes = [
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_int),
POINTER(c_double), POINTER(c_double), POINTER(c_double), POINTER(c_int)
]
# subroutine spmu(acc,n,dt,zeta,P,nP,SPA,SPI,SPV,SPD,SPE,mu,model,tol,maxiter,uy,rk)
libeqs.spmu.argtypes = [
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_int),
POINTER(c_double), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_int),
POINTER(c_double), POINTER(c_double)
]
# subroutine r(acc,n,dt,zeta,P,ra,rv,rd,SM)
libeqs.r.argtypes = [
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_int)
]
# subroutine rnl(acc,n,dt,zeta,P,ra,rv,rd,SM,mu)
libeqs.rnl.argtypes = [
POINTER(c_double), POINTER(c_int), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_double), POINTER(c_double),
POINTER(c_double), POINTER(c_int), POINTER(c_double)
] | 39.884615 | 88 | 0.729026 | 338 | 2,074 | 4.254438 | 0.147929 | 0.400556 | 0.506259 | 0.700974 | 0.819889 | 0.819889 | 0.819889 | 0.812935 | 0.777469 | 0.751043 | 0 | 0.00432 | 0.10704 | 2,074 | 52 | 89 | 39.884615 | 0.772138 | 0.175506 | 0 | 0.405405 | 0 | 0 | 0.00411 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.054054 | 0 | 0.054054 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
266e0277bcae11db50326d39aa9dffd144bed3ed | 156 | py | Python | util.py | odgiv/SegAN | d7a91fbc10139dc81c61737326649a3a758cdf94 | [
"MIT"
] | null | null | null | util.py | odgiv/SegAN | d7a91fbc10139dc81c61737326649a3a758cdf94 | [
"MIT"
] | null | null | null | util.py | odgiv/SegAN | d7a91fbc10139dc81c61737326649a3a758cdf94 | [
"MIT"
] | null | null | null | from scipy.spatial.distance import directed_hausdorff
def hausdorf_distance(a, b):
return max(directed_hausdorff(a, b)[0], directed_hausdorff(b, a)[0]) | 39 | 72 | 0.775641 | 24 | 156 | 4.875 | 0.583333 | 0.435897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014286 | 0.102564 | 156 | 4 | 72 | 39 | 0.821429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 7 |
cd00a7ad85df0e8214491d3d6eb67c2014975921 | 56 | py | Python | tests/m3u/__init__.py | grdorin/mopidy | 76db44088c102d7ad92a3fc6a15a938e66b99b0d | [
"Apache-2.0"
] | 6,700 | 2015-01-01T03:57:59.000Z | 2022-03-30T09:31:31.000Z | tests/m3u/__init__.py | pnijhara/mopidy | 7168787ea6c82b66e138fc2b388d78fa1c7661ba | [
"Apache-2.0"
] | 1,141 | 2015-01-02T09:48:59.000Z | 2022-03-28T22:25:30.000Z | tests/m3u/__init__.py | pnijhara/mopidy | 7168787ea6c82b66e138fc2b388d78fa1c7661ba | [
"Apache-2.0"
] | 735 | 2015-01-01T21:15:50.000Z | 2022-03-20T16:13:44.000Z | def generate_song(i):
return f"dummy:track:song{i}"
| 18.666667 | 33 | 0.696429 | 10 | 56 | 3.8 | 0.8 | 0.263158 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 56 | 2 | 34 | 28 | 0.791667 | 0 | 0 | 0 | 1 | 0 | 0.339286 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 7 |
cd429585c3289f773ca51de9caa6d60f03383447 | 529 | py | Python | The Game API.py | Yofenry/Text-Game-API | a8ad920f3e0e4f12e4141b0e4c6ba19f00c42b46 | [
"CC0-1.0"
] | null | null | null | The Game API.py | Yofenry/Text-Game-API | a8ad920f3e0e4f12e4141b0e4c6ba19f00c42b46 | [
"CC0-1.0"
] | null | null | null | The Game API.py | Yofenry/Text-Game-API | a8ad920f3e0e4f12e4141b0e4c6ba19f00c42b46 | [
"CC0-1.0"
] | null | null | null | api = input ("Enter Text Here")
print("Enter Text Here" + api +"Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here" + api + "Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
print("Enter Text Here")
| 26.45 | 51 | 0.678639 | 81 | 529 | 4.432099 | 0.074074 | 0.501393 | 0.724234 | 0.852368 | 0.977716 | 0.977716 | 0.977716 | 0.977716 | 0.977716 | 0.977716 | 0 | 0 | 0.168242 | 529 | 19 | 52 | 27.842105 | 0.815909 | 0 | 0 | 0.944444 | 0 | 0 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.944444 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 11 |
cd888fc36ba22077c4d97dadbb32947eac092abc | 3,670 | py | Python | adabox/plot_tools.py | jnfran92/adaptive-boxes | bcf03a91d48877b3a24125b74a233bda5bd8e044 | [
"MIT"
] | 7 | 2020-06-05T23:18:14.000Z | 2021-12-27T01:27:06.000Z | adabox/plot_tools.py | jnfran92/adaptive-boxes | bcf03a91d48877b3a24125b74a233bda5bd8e044 | [
"MIT"
] | 3 | 2019-09-15T15:43:29.000Z | 2020-11-19T16:27:22.000Z | adabox/plot_tools.py | jnfran92/adaptive-boxes | bcf03a91d48877b3a24125b74a233bda5bd8e044 | [
"MIT"
] | 1 | 2020-09-24T08:01:39.000Z | 2020-09-24T08:01:39.000Z |
import matplotlib
import matplotlib.pyplot as plt
import numpy as np
from .tools import Rectangle
from itertools import cycle
cycle_color = cycle('bgrcmk')
def plot_rectangles(recs_arg, sep_value_arg):
max_area_val = np.max([item.get_area() for item in recs_arg])
fig = plt.figure()
plt.axis('off')
ax = fig.add_subplot(111)
sep_to_plot = sep_value_arg / 2
for rec_val in recs_arg:
plot_rectangle(rec_val, sep_to_plot, max_area_val, ax)
plt.axis('scaled')
fig.tight_layout()
def plot_rectangles_only_lines(recs_arg, sep_value_arg):
max_area_val = np.max([item.get_area() for item in recs_arg])
fig = plt.figure()
plt.axis('off')
ax = fig.add_subplot(111)
sep_to_plot = sep_value_arg / 2
for rec_val in recs_arg:
plot_rectangle_lines(rec_val, sep_to_plot, max_area_val, ax)
plt.axis('scaled')
fig.tight_layout()
def plot_rectangle(rec_arg: Rectangle, sep_to_plot_arg, max_area_arg, ax):
p1 = np.array([rec_arg.x1 - sep_to_plot_arg, rec_arg.y1 - sep_to_plot_arg])
p2 = np.array([rec_arg.x1 - sep_to_plot_arg, rec_arg.y2 + sep_to_plot_arg])
p3 = np.array([rec_arg.x2 + sep_to_plot_arg, rec_arg.y1 - sep_to_plot_arg])
p4 = np.array([rec_arg.x2 + sep_to_plot_arg, rec_arg.y2 + sep_to_plot_arg])
ps = np.array([p1, p2, p4, p3, p1])
max_n = 300
max_log = np.log2(max_n + 1)
area_ratio = (max_n*(rec_arg.get_area()/max_area_arg))+1
line_w = np.log2(area_ratio)/max_log
plt.plot(ps[:, 0], ps[:, 1], linewidth=0.1*line_w + 0.08, c='black')
# rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color='black', lw=0)
# rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color=next(cycol), lw=0)
rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color=np.random.rand(3,), lw=0)
ax.add_patch(rect)
def plot_rectangle_lines(rec_arg: Rectangle, sep_to_plot_arg, max_area_arg, ax):
p1 = np.array([rec_arg.x1 - sep_to_plot_arg, rec_arg.y1 - sep_to_plot_arg])
p2 = np.array([rec_arg.x1 - sep_to_plot_arg, rec_arg.y2 + sep_to_plot_arg])
p3 = np.array([rec_arg.x2 + sep_to_plot_arg, rec_arg.y1 - sep_to_plot_arg])
p4 = np.array([rec_arg.x2 + sep_to_plot_arg, rec_arg.y2 + sep_to_plot_arg])
ps = np.array([p1, p2, p4, p3, p1])
max_n = 300
max_log = np.log2(max_n + 1)
area_ratio = (max_n*(rec_arg.get_area()/max_area_arg))+1
line_w = np.log2(area_ratio)/max_log
# plt.plot(ps[:, 0], ps[:, 1], linewidth=0.1*line_w + 0.05, c='red')
max_n = 300
max_log = np.log2(max_n + 1)
area_ratio = (max_n*(rec_arg.get_area()/max_area_arg))+1
line_w = np.log2(area_ratio)/max_log
# plt.plot(ps[:, 0], ps[:, 1], linewidth=0.1*line_w + 0.05, c='red')
plt.plot(ps[:, 0], ps[:, 1], linewidth=0.05, c='red')
rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color='yellow', lw=0)
# rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color=next(cycol), lw=0)
# rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color=np.random.rand(3,), lw=0)
ax.add_patch(rect)
plt.plot(ps[:, 0], ps[:, 1], linewidth=0.05, c='red')
rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color='yellow', lw=0)
# rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color=next(cycol), lw=0)
# rect = matplotlib.patches.Rectangle((p1[0], p1[1]), p3[0] - p1[0], p2[1] - p1[1], color=np.random.rand(3,), lw=0)
ax.add_patch(rect)
| 39.891304 | 119 | 0.642779 | 692 | 3,670 | 3.16474 | 0.114162 | 0.050228 | 0.090411 | 0.09863 | 0.903653 | 0.903653 | 0.903653 | 0.903653 | 0.903653 | 0.903653 | 0 | 0.070441 | 0.172207 | 3,670 | 91 | 120 | 40.32967 | 0.650428 | 0.213896 | 0 | 0.762712 | 0 | 0 | 0.016348 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.067797 | false | 0 | 0.084746 | 0 | 0.152542 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
26d8ced72700b226b4f46522f66117ded62eab3f | 5,475 | py | Python | test/external_api/test_workspace.py | kbase/feeds | a2ed4cb88120aeb10a295919cb0fba85e13d462d | [
"MIT"
] | null | null | null | test/external_api/test_workspace.py | kbase/feeds | a2ed4cb88120aeb10a295919cb0fba85e13d462d | [
"MIT"
] | 48 | 2018-10-15T23:36:50.000Z | 2022-01-19T02:49:30.000Z | test/external_api/test_workspace.py | kbase/feeds | a2ed4cb88120aeb10a295919cb0fba85e13d462d | [
"MIT"
] | 3 | 2018-10-03T20:37:41.000Z | 2019-01-16T15:03:19.000Z | import pytest
from feeds.external_api.workspace import (
validate_narrative_id,
validate_workspace_id,
validate_workspace_ids,
get_workspace_name,
get_workspace_names,
get_narrative_name,
get_narrative_names
)
from feeds.exceptions import WorkspaceError
DUMMY_WS_INFO = [1, "some_name"]
DUMMY_WS_INFO2 = [2, "some_other_name"]
DUMMY_NARR_INFO = [1, "some_name", "some_owner", "timestamp", 5, "a", "n", "unlocked", {"narrative_nice_name": "Foo", "narrative": 1}]
DUMMY_NARR_INFO2 = [2, "some_other_name", "some_owner", "timestamp", 6, "a", "n", "unlocked", {"narrative_nice_name": "Bar", "narrative": 2}]
DUMMY_NARR_INFO_NO_NAME = [3, "some_name", "some_owner", "timestamp", 5, "a", "n", "unlocked", {"narrative": 1}]
def test_validate_narr_id(mock_valid_user_token, mock_workspace_info):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info(DUMMY_NARR_INFO)
assert validate_workspace_id(1, token)
def test_validate_narr_id_fail(mock_valid_user_token, mock_workspace_info_invalid):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_invalid(1)
assert validate_workspace_id(1, token) is False
def test_validate_narr_id_err(mock_valid_user_token, mock_workspace_info_error):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_error(1)
assert validate_workspace_id(1, token) is False
def test_validate_ws_id(mock_valid_user_token, mock_workspace_info):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info(DUMMY_WS_INFO)
assert validate_workspace_id(1, token)
def test_validate_ws_id_fail(mock_valid_user_token, mock_workspace_info_invalid):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_invalid(1)
assert validate_workspace_id(1, token) is False
def test_validate_ws_id_err(mock_valid_user_token, mock_workspace_info_error):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_error(1)
assert validate_workspace_id(1, token) is False
def test_validate_ws_ids(mock_valid_user_token, mock_workspace_info):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info(DUMMY_WS_INFO)
valids = validate_workspace_ids([1, 2], token)
std = {1: True, 2: False}
for i in [1, 2]:
assert valids.get(i) == std[i]
def test_validate_ws_ids_fail(mock_valid_user_token, mock_workspace_info_invalid):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_invalid(1)
valids = validate_workspace_ids([1], token)
assert valids[1] is False
def test_validate_ws_ids_err(mock_valid_user_token, mock_workspace_info_error):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_error(1)
valids = validate_workspace_ids([1], token)
assert valids[1] is None
def test_get_ws_name(mock_valid_user_token, mock_workspace_info):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info(DUMMY_WS_INFO)
assert get_workspace_name(DUMMY_WS_INFO[0], token) == DUMMY_WS_INFO[1]
def test_get_ws_name_fail(mock_valid_user_token, mock_workspace_info_invalid):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_invalid(DUMMY_WS_INFO[0])
assert get_workspace_name(DUMMY_WS_INFO[0], token) is None
def test_get_ws_name_err(mock_valid_user_token, mock_workspace_info_error):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_error(DUMMY_WS_INFO[0])
with pytest.raises(WorkspaceError) as e:
get_workspace_name(DUMMY_WS_INFO[0], token)
assert "Unable to find name for workspace id: {}".format(DUMMY_WS_INFO[0]) in str(e)
def test_get_ws_names(mock_valid_user_token, mock_workspace_info):
pass
def test_get_ws_names_fail(mock_valid_user_token, mock_workspace_info_invalid):
pass
def test_get_ws_names_err(mock_valid_user_token, mock_workspace_info_error):
pass
def test_get_narr_name(mock_valid_user_token, mock_workspace_info):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info(DUMMY_NARR_INFO)
assert get_narrative_name(DUMMY_NARR_INFO[0], token) == DUMMY_NARR_INFO[8]["narrative_nice_name"]
mock_workspace_info(DUMMY_NARR_INFO_NO_NAME)
assert get_narrative_name(DUMMY_NARR_INFO_NO_NAME[0], token) == "Untitled"
def test_get_narr_name_fail(mock_valid_user_token, mock_workspace_info_invalid):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_invalid(DUMMY_NARR_INFO[0])
assert get_narrative_name(DUMMY_NARR_INFO[0], token) is None
def test_get_narr_name_err(mock_valid_user_token, mock_workspace_info_error):
mock_valid_user_token("test_user", "Test User")
token = "foo"
mock_workspace_info_error(DUMMY_NARR_INFO[0])
with pytest.raises(WorkspaceError) as e:
get_narrative_name(DUMMY_NARR_INFO[0], token)
assert "Unable to find name for narrative id: {}".format(DUMMY_NARR_INFO[0]) in str(e)
def test_get_narr_names(mock_valid_user_token, mock_workspace_info):
pass
def test_get_narr_names_fail(mock_valid_user_token, mock_workspace_info_invalid):
pass
def test_get_narr_names_err(mock_valid_user_token, mock_workspace_info_error):
pass
| 39.673913 | 141 | 0.768584 | 851 | 5,475 | 4.452409 | 0.082256 | 0.12114 | 0.166007 | 0.171021 | 0.872262 | 0.820797 | 0.800475 | 0.779097 | 0.748482 | 0.669306 | 0 | 0.010544 | 0.133881 | 5,475 | 137 | 142 | 39.963504 | 0.788486 | 0 | 0 | 0.5 | 0 | 0 | 0.116368 | 0 | 0 | 0 | 0 | 0 | 0.140351 | 1 | 0.184211 | false | 0.052632 | 0.026316 | 0 | 0.210526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 8 |
f810948fc0ca22466324ada12d16d171ab9c4408 | 1,270 | py | Python | tests/grooves/test_spline.py | pyroll-project/pyroll-core | f59094d58c2f7493ddc6345b3afc4700ca259681 | [
"BSD-3-Clause"
] | null | null | null | tests/grooves/test_spline.py | pyroll-project/pyroll-core | f59094d58c2f7493ddc6345b3afc4700ca259681 | [
"BSD-3-Clause"
] | null | null | null | tests/grooves/test_spline.py | pyroll-project/pyroll-core | f59094d58c2f7493ddc6345b3afc4700ca259681 | [
"BSD-3-Clause"
] | null | null | null | import numpy as np
from pyroll.core import SplineGroove
points = [
(-2, 0),
(0, 0),
(1, 1),
(2, 2),
(5, 2),
(8, 2),
(9, 1),
(10, 0),
(13, 0)
]
def test_spline_with_usable_width():
g = SplineGroove(
points,
usable_width=9,
types=("oval", "swedish_oval"),
)
assert np.isclose(g.usable_width, 9)
assert np.isclose(g.cross_section.area, 16)
assert np.isclose(g.depth, 2)
assert np.isclose(g.local_depth(1), 2)
assert np.isclose(g.local_depth(5), 0)
assert np.isclose(g.local_depth(4), 1)
assert np.isclose(g.local_depth(-5), 0)
assert np.isclose(g.local_depth(-4), 1)
assert "oval" in g.types
assert "swedish_oval" in g.types
def test_spline_without_usable_width():
g = SplineGroove(
points,
types=("oval", "swedish_oval"),
)
assert np.isclose(g.usable_width, 10)
assert np.isclose(g.cross_section.area, 16)
assert np.isclose(g.depth, 2)
assert np.isclose(g.local_depth(1), 2)
assert np.isclose(g.local_depth(5), 0)
assert np.isclose(g.local_depth(4), 1)
assert np.isclose(g.local_depth(-5), 0)
assert np.isclose(g.local_depth(-4), 1)
assert "oval" in g.types
assert "swedish_oval" in g.types
| 23.518519 | 47 | 0.617323 | 201 | 1,270 | 3.766169 | 0.189055 | 0.169089 | 0.317041 | 0.338177 | 0.834875 | 0.755614 | 0.755614 | 0.755614 | 0.755614 | 0.755614 | 0 | 0.05123 | 0.231496 | 1,270 | 53 | 48 | 23.962264 | 0.724385 | 0 | 0 | 0.545455 | 0 | 0 | 0.050394 | 0 | 0 | 0 | 0 | 0 | 0.454545 | 1 | 0.045455 | false | 0 | 0.045455 | 0 | 0.090909 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
f891f7af63cf8cd2e536dc05f023b909c0f09869 | 8,344 | py | Python | python/test_world.py | carterbancroft/game-of-life | e9dfb0cbd2d6ceb2cb94a91db0de899ff9e45fa4 | [
"Unlicense"
] | null | null | null | python/test_world.py | carterbancroft/game-of-life | e9dfb0cbd2d6ceb2cb94a91db0de899ff9e45fa4 | [
"Unlicense"
] | null | null | null | python/test_world.py | carterbancroft/game-of-life | e9dfb0cbd2d6ceb2cb94a91db0de899ff9e45fa4 | [
"Unlicense"
] | null | null | null | # 3rd party
import unittest
# Package level
import world as world_helper
# Note that I'm not using cell_states here... If I change a state value that
# could break all these tests. However I think the current way reads better.
# Leaving it as is for now.
class GetNeighbors(unittest.TestCase):
def test_center_cell_surrounded_by_dead(self):
world = [
['.', '.', '.'],
['.', 'a', '.'],
['.', '.', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
expected = ['.', '.', '.', '.', '.', '.', '.', '.']
self.assertEqual(neighbors, expected)
def test_center_cell_some_alive(self):
world = [
['a', '.', 'a'],
['.', 'a', '.'],
['a', 'a', 'a'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
expected = ['a', '.', 'a', '.', '.', 'a', 'a', 'a']
self.assertEqual(neighbors, expected)
def test_top_left_cell(self):
world = [
['.', 'a', '.'],
['a', 'a', '.'],
['.', '.', '.'],
]
neighbors = world_helper.getNeighbors(world, 0, 0)
expected = ['a', 'a', 'a']
self.assertEqual(neighbors, expected)
def test_top_right_cell(self):
world = [
['a', '.', 'a'],
['a', '.', '.'],
['a', 'a', 'a'],
]
neighbors = world_helper.getNeighbors(world, 0, 2)
expected = ['.', '.', '.']
self.assertEqual(neighbors, expected)
def test_bottom_left_cell(self):
world = [
['.', '.', '.'],
['a', 'a', '.'],
['.', 'a', '.'],
]
neighbors = world_helper.getNeighbors(world, 2, 0)
expected = ['a', 'a', 'a']
self.assertEqual(neighbors, expected)
def test_bottom_right_cell(self):
world = [
['a', 'a', 'a'],
['a', '.', '.'],
['a', '.', 'a'],
]
neighbors = world_helper.getNeighbors(world, 2, 2)
expected = ['.', '.', '.']
self.assertEqual(neighbors, expected)
class DetermineCellState(unittest.TestCase):
def test_currently_alive_no_alive_neighbors(self):
world = [
['.', '.', '.'],
['.', 'a', '.'],
['.', '.', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, 'a')
self.assertEqual(new_state, 'd')
def test_currently_alive_one_alive_neighbor(self):
world = [
['a', '.', '.'],
['.', 'a', '.'],
['.', '.', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, 'a')
self.assertEqual(new_state, 'd')
def test_currently_alive_two_alive_neighbors(self):
world = [
['a', '.', '.'],
['.', 'a', 'a'],
['.', '.', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, 'a')
self.assertEqual(new_state, 'a')
def test_currently_alive_three_alive_neighbors(self):
world = [
['a', '.', '.'],
['.', 'a', 'a'],
['a', '.', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, 'a')
self.assertEqual(new_state, 'a')
def test_currently_alive_four_alive_neighbors(self):
world = [
['a', '.', '.'],
['.', 'a', 'a'],
['a', 'a', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, 'a')
self.assertEqual(new_state, 'd')
def test_currently_dead_one_alive_neighbor(self):
world = [
['a', '.', '.'],
['.', '.', '.'],
['.', '.', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, '.')
self.assertEqual(new_state, '.')
def test_currently_dead_two_alive_neighbors(self):
world = [
['a', '.', '.'],
['.', '.', '.'],
['.', 'a', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, '.')
self.assertEqual(new_state, '.')
def test_currently_dead_three_alive_neighbors(self):
world = [
['a', '.', '.'],
['.', '.', 'a'],
['.', 'a', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, '.')
self.assertEqual(new_state, 'b')
def test_currently_dead_four_alive_neighbors(self):
world = [
['a', '.', '.'],
['.', '.', 'a'],
['a', 'a', '.'],
]
neighbors = world_helper.getNeighbors(world, 1, 1)
new_state = world_helper.determineCellState(neighbors, '.')
self.assertEqual(new_state, '.')
class UpdateWorld(unittest.TestCase):
def test_block(self):
world = [
['.', '.', '.'],
['.', 'a', 'a'],
['.', 'a', 'a'],
]
world_helper.updateWorld(world)
expected = [
['.', '.', '.'],
['.', 'a', 'a'],
['.', 'a', 'a'],
]
self.assertEqual(world, expected)
def test_tub(self):
world = [
['.', 'a', '.'],
['a', '.', 'a'],
['.', 'a', '.'],
]
world_helper.updateWorld(world)
expected = [
['.', 'a', '.'],
['a', '.', 'a'],
['.', 'a', '.'],
]
self.assertEqual(world, expected)
def test_blinker(self):
world = [
['.', '.', '.'],
['a', 'a', 'a'],
['.', '.', '.'],
]
world_helper.updateWorld(world)
expected = [
['.', 'a', '.'],
['.', 'a', '.'],
['.', 'a', '.'],
]
self.assertEqual(world, expected)
def test_beacon(self):
world = [
['a', 'a', '.', '.'],
['a', 'a', '.', '.'],
['.', '.', 'a', 'a'],
['.', '.', 'a', 'a'],
]
world_helper.updateWorld(world)
expected = [
['a', 'a', '.', '.'],
['a', '.', '.', '.'],
['.', '.', '.', 'a'],
['.', '.', 'a', 'a'],
]
self.assertEqual(world, expected)
def test_beacon_2(self):
world = [
['a', 'a', '.', '.'],
['a', '.', '.', '.'],
['.', '.', '.', 'a'],
['.', '.', 'a', 'a'],
]
world_helper.updateWorld(world)
expected = [
['a', 'a', '.', '.'],
['a', 'a', '.', '.'],
['.', '.', 'a', 'a'],
['.', '.', 'a', 'a'],
]
self.assertEqual(world, expected)
def test_glider(self):
world = [
['.', '.', '.', '.', '.'],
['.', '.', 'a', '.', '.'],
['a', '.', 'a', '.', '.'],
['.', 'a', 'a', '.', '.'],
['.', '.', '.', '.', '.'],
]
world_helper.updateWorld(world)
expected = [
['.', '.', '.', '.', '.'],
['.', 'a', '.', '.', '.'],
['.', '.', 'a', 'a', '.'],
['.', 'a', 'a', '.', '.'],
['.', '.', '.', '.', '.'],
]
self.assertEqual(world, expected)
def test_glider_2(self):
world = [
['.', '.', '.', '.', '.'],
['.', 'a', '.', '.', '.'],
['.', '.', 'a', 'a', '.'],
['.', 'a', 'a', '.', '.'],
['.', '.', '.', '.', '.'],
]
world_helper.updateWorld(world)
expected = [
['.', '.', '.', '.', '.'],
['.', '.', 'a', '.', '.'],
['.', '.', '.', 'a', '.'],
['.', 'a', 'a', 'a', '.'],
['.', '.', '.', '.', '.'],
]
self.assertEqual(world, expected)
| 25.673846 | 76 | 0.385906 | 651 | 8,344 | 4.752688 | 0.121352 | 0.063995 | 0.067873 | 0.055591 | 0.864577 | 0.857143 | 0.833226 | 0.795087 | 0.784098 | 0.767615 | 0 | 0.006242 | 0.366371 | 8,344 | 324 | 77 | 25.753086 | 0.578967 | 0.023849 | 0 | 0.728395 | 0 | 0 | 0.049023 | 0 | 0 | 0 | 0 | 0 | 0.090535 | 1 | 0.090535 | false | 0 | 0.00823 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
e415c5d0326b49b5bd69aa0fab146d9a8b7035ea | 6,573 | py | Python | test/dungeons/TestIcePalace.py | compiling/ALttPEntranceRandomizer | cd062d2165e230f2298541afdaff9463c0e2feb1 | [
"MIT"
] | 42 | 2019-08-22T16:19:51.000Z | 2022-03-30T17:39:39.000Z | test/dungeons/TestIcePalace.py | compiling/ALttPEntranceRandomizer | cd062d2165e230f2298541afdaff9463c0e2feb1 | [
"MIT"
] | 48 | 2019-09-04T22:47:03.000Z | 2022-01-13T22:16:13.000Z | test/dungeons/TestIcePalace.py | compiling/ALttPEntranceRandomizer | cd062d2165e230f2298541afdaff9463c0e2feb1 | [
"MIT"
] | 35 | 2020-01-10T09:12:53.000Z | 2022-03-23T08:22:25.000Z | from test.dungeons.TestDungeon import TestDungeon
class TestIcePalace(TestDungeon):
def testIcePalace(self):
self.starting_regions = ['Ice Palace (Entrance)']
self.run_tests([
["Ice Palace - Big Key Chest", False, []],
["Ice Palace - Big Key Chest", False, [], ['Hammer']],
["Ice Palace - Big Key Chest", False, [], ['Progressive Glove']],
["Ice Palace - Big Key Chest", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Big Key Chest", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Big Key Chest", True, ['Progressive Glove', 'Fire Rod', 'Hammer', 'Hookshot', 'Small Key (Ice Palace)']],
["Ice Palace - Big Key Chest", True, ['Progressive Glove', 'Bombos', 'Progressive Sword', 'Hammer', 'Hookshot', 'Small Key (Ice Palace)']],
#@todo: Change from item randomizer - Right side key door is only in logic if big key is in there
#["Ice Palace - Big Key Chest", True, ['Progressive Glove', 'Cane of Byrna', 'Fire Rod', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Big Key Chest", True, ['Progressive Glove', 'Cane of Byrna', 'Bombos', 'Progressive Sword', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Big Key Chest", True, ['Progressive Glove', 'Cape', 'Fire Rod', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Big Key Chest", True, ['Progressive Glove', 'Cape', 'Bombos', 'Progressive Sword', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
["Ice Palace - Compass Chest", False, []],
["Ice Palace - Compass Chest", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Compass Chest", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Compass Chest", True, ['Fire Rod']],
["Ice Palace - Compass Chest", True, ['Bombos', 'Progressive Sword']],
["Ice Palace - Map Chest", False, []],
["Ice Palace - Map Chest", False, [], ['Hammer']],
["Ice Palace - Map Chest", False, [], ['Progressive Glove']],
["Ice Palace - Map Chest", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Map Chest", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Map Chest", True, ['Progressive Glove', 'Fire Rod', 'Hammer', 'Hookshot', 'Small Key (Ice Palace)']],
["Ice Palace - Map Chest", True, ['Progressive Glove', 'Bombos', 'Progressive Sword', 'Hammer', 'Hookshot', 'Small Key (Ice Palace)']],
#["Ice Palace - Map Chest", True, ['Progressive Glove', 'Cane of Byrna', 'Fire Rod', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Map Chest", True, ['Progressive Glove', 'Cane of Byrna', 'Bombos', 'Progressive Sword', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Map Chest", True, ['Progressive Glove', 'Cape', 'Fire Rod', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Map Chest", True, ['Progressive Glove', 'Cape', 'Bombos', 'Progressive Sword', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
["Ice Palace - Spike Room", False, []],
["Ice Palace - Spike Room", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Spike Room", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Spike Room", True, ['Fire Rod', 'Hookshot', 'Small Key (Ice Palace)']],
["Ice Palace - Spike Room", True, ['Bombos', 'Progressive Sword', 'Hookshot', 'Small Key (Ice Palace)']],
#["Ice Palace - Spike Room", True, ['Cape', 'Fire Rod', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Spike Room", True, ['Cape', 'Bombos', 'Progressive Sword', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Spike Room", True, ['Cane of Byrna', 'Fire Rod', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
#["Ice Palace - Spike Room", True, ['Cane of Byrna', 'Bombos', 'Progressive Sword', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
["Ice Palace - Freezor Chest", False, []],
["Ice Palace - Freezor Chest", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Freezor Chest", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Freezor Chest", True, ['Fire Rod']],
["Ice Palace - Freezor Chest", True, ['Bombos', 'Progressive Sword']],
["Ice Palace - Iced T Room", False, []],
["Ice Palace - Iced T Room", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Iced T Room", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Iced T Room", True, ['Fire Rod']],
["Ice Palace - Iced T Room", True, ['Bombos', 'Progressive Sword']],
["Ice Palace - Big Chest", False, []],
["Ice Palace - Big Chest", False, [], ['Big Key (Ice Palace)']],
["Ice Palace - Big Chest", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Big Chest", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Big Chest", True, ['Big Key (Ice Palace)', 'Fire Rod']],
["Ice Palace - Big Chest", True, ['Big Key (Ice Palace)', 'Bombos', 'Progressive Sword']],
["Ice Palace - Boss", False, []],
["Ice Palace - Boss", False, [], ['Hammer']],
["Ice Palace - Boss", False, [], ['Progressive Glove']],
["Ice Palace - Boss", False, [], ['Big Key (Ice Palace)']],
["Ice Palace - Boss", False, [], ['Fire Rod', 'Bombos']],
["Ice Palace - Boss", False, [], ['Fire Rod', 'Progressive Sword']],
["Ice Palace - Boss", True, ['Progressive Glove', 'Big Key (Ice Palace)', 'Fire Rod', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
["Ice Palace - Boss", True, ['Progressive Glove', 'Big Key (Ice Palace)', 'Fire Rod', 'Hammer', 'Cane of Somaria', 'Small Key (Ice Palace)']],
["Ice Palace - Boss", True, ['Progressive Glove', 'Big Key (Ice Palace)', 'Bombos', 'Progressive Sword', 'Hammer', 'Small Key (Ice Palace)', 'Small Key (Ice Palace)']],
["Ice Palace - Boss", True, ['Progressive Glove', 'Big Key (Ice Palace)', 'Bombos', 'Progressive Sword', 'Hammer', 'Cane of Somaria', 'Small Key (Ice Palace)']],
]) | 83.202532 | 183 | 0.55302 | 756 | 6,573 | 4.805556 | 0.074074 | 0.26507 | 0.145334 | 0.168456 | 0.926232 | 0.864299 | 0.785301 | 0.668318 | 0.56207 | 0.520782 | 0 | 0 | 0.249201 | 6,573 | 79 | 184 | 83.202532 | 0.73617 | 0.276282 | 0 | 0 | 0 | 0 | 0.52687 | 0 | 0 | 0 | 0 | 0.012658 | 0 | 1 | 0.017857 | false | 0.089286 | 0.017857 | 0 | 0.053571 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
e489ea56255ea0c45d2063087fe81aee3e4401e7 | 361 | py | Python | tests/internal/instance_type/test_instance_type_i3en_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/instance_type/test_instance_type_i3en_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | null | null | null | tests/internal/instance_type/test_instance_type_i3en_auto.py | frolovv/aws.ec2.compare | 582805823492f833d65c0441c4a14dce697c12aa | [
"Apache-2.0"
] | 1 | 2021-12-15T11:58:22.000Z | 2021-12-15T11:58:22.000Z |
# Testing module instance_type.i3en
import pytest
import ec2_compare.internal.instance_type.i3en
def test_get_internal_data_instance_type_i3en_get_instances_list():
assert len(ec2_compare.internal.instance_type.i3en.get_instances_list()) > 0
def test_get_internal_data_instance_type_i3en_get():
assert len(ec2_compare.internal.instance_type.i3en.get) > 0
| 36.1 | 78 | 0.853186 | 56 | 361 | 5.053571 | 0.339286 | 0.254417 | 0.339223 | 0.268551 | 0.826855 | 0.826855 | 0.614841 | 0.614841 | 0.614841 | 0 | 0 | 0.032836 | 0.072022 | 361 | 9 | 79 | 40.111111 | 0.81194 | 0.091413 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 10 |
e4de4129b64e74e87707e38325665a9f95c7b81b | 125 | py | Python | tadmap/data/__init__.py | rs239/tadmap | 0ad259b62ad2abbafa848c8fd1c3ba69aad4ebb4 | [
"MIT"
] | null | null | null | tadmap/data/__init__.py | rs239/tadmap | 0ad259b62ad2abbafa848c8fd1c3ba69aad4ebb4 | [
"MIT"
] | null | null | null | tadmap/data/__init__.py | rs239/tadmap | 0ad259b62ad2abbafa848c8fd1c3ba69aad4ebb4 | [
"MIT"
] | null | null | null |
from ._tadmap_data import tad_to_geneset_dict, tadlist_bed_format
__all__ = ['tad_to_geneset_dict', 'tadlist_bed_format']
| 20.833333 | 65 | 0.824 | 19 | 125 | 4.578947 | 0.631579 | 0.114943 | 0.275862 | 0.367816 | 0.735632 | 0.735632 | 0.735632 | 0 | 0 | 0 | 0 | 0 | 0.096 | 125 | 5 | 66 | 25 | 0.769912 | 0 | 0 | 0 | 0 | 0 | 0.300813 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
9020b77dbcf252fd1d038d2a20299d4d98276634 | 2,106 | py | Python | example_model/policy/mlp/discrete.py | SunandBean/tensorflow_RL | a248cbfb99b2041f6f7cc008fcad53fb83ac486e | [
"MIT"
] | 60 | 2019-01-29T14:13:00.000Z | 2020-11-24T09:08:05.000Z | example_model/policy/mlp/discrete.py | SunandBean/tensorflow_RL | a248cbfb99b2041f6f7cc008fcad53fb83ac486e | [
"MIT"
] | 2 | 2019-08-14T06:44:32.000Z | 2020-11-12T12:57:55.000Z | example_model/policy/mlp/discrete.py | SunandBean/tensorflow_RL | a248cbfb99b2041f6f7cc008fcad53fb83ac486e | [
"MIT"
] | 37 | 2019-01-22T05:19:34.000Z | 2021-04-12T02:27:50.000Z | import tensorflow as tf
import numpy as np
class MLPActor:
def __init__(self, name, state_size, output_size):
self.state_size = state_size
self.output_size = output_size
with tf.variable_scope(name):
self.input = tf.placeholder(dtype=tf.float32, shape=[None, self.state_size])
self.dense_1 = tf.layers.dense(inputs=self.input, units=256, activation=tf.nn.relu)
self.dense_2 = tf.layers.dense(inputs=self.dense_1, units=256, activation=tf.nn.relu)
self.dense_3 = tf.layers.dense(inputs=self.dense_2, units=256, activation=tf.nn.relu)
self.actor = tf.layers.dense(inputs=self.dense_3, units=self.output_size, activation=tf.nn.softmax)
self.scope = tf.get_variable_scope().name
def get_action_prob(self, obs):
return self.sess.run(self.act_probs, feed_dict={self.obs: obs})
def get_variables(self):
return tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, self.scope)
def get_trainable_variables(self):
return tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, self.scope)
class MLPCritic:
def __init__(self, name, state_size):
self.state_size = state_size
with tf.variable_scope(name):
self.input = tf.placeholder(dtype=tf.float32, shape=[None, self.state_size])
self.dense_1 = tf.layers.dense(inputs=self.input, units=256, activation=tf.nn.relu)
self.dense_2 = tf.layers.dense(inputs=self.dense_1, units=256, activation=tf.nn.relu)
self.dense_3 = tf.layers.dense(inputs=self.dense_2, units=256, activation=tf.nn.relu)
self.critic = tf.layers.dense(inputs=self.dense_3, units=1, activation=None)
self.scope = tf.get_variable_scope().name
def get_action_prob(self, obs):
return self.sess.run(self.act_probs, feed_dict={self.obs: obs})
def get_variables(self):
return tf.get_collection(tf.GraphKeys.GLOBAL_VARIABLES, self.scope)
def get_trainable_variables(self):
return tf.get_collection(tf.GraphKeys.TRAINABLE_VARIABLES, self.scope) | 44.808511 | 111 | 0.693732 | 307 | 2,106 | 4.563518 | 0.179153 | 0.077088 | 0.074233 | 0.108494 | 0.893647 | 0.893647 | 0.82798 | 0.82798 | 0.779443 | 0.779443 | 0 | 0.020504 | 0.189459 | 2,106 | 47 | 112 | 44.808511 | 0.800234 | 0 | 0 | 0.742857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.228571 | false | 0 | 0.057143 | 0.171429 | 0.514286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 10 |
9021a827ccf4ad84513e76d2fc9453f79542f0de | 116 | py | Python | glx/viewport/__init__.py | NeilGirdhar/glx | 643abc73e05f94ea56a00deb927a3978f01184f2 | [
"MIT"
] | 3 | 2018-04-18T02:42:36.000Z | 2020-09-06T15:48:17.000Z | glx/viewport/__init__.py | NeilGirdhar/glx | 643abc73e05f94ea56a00deb927a3978f01184f2 | [
"MIT"
] | 1 | 2020-07-12T22:36:45.000Z | 2020-07-13T14:20:32.000Z | glx/viewport/__init__.py | NeilGirdhar/glx | 643abc73e05f94ea56a00deb927a3978f01184f2 | [
"MIT"
] | 1 | 2021-04-27T14:53:34.000Z | 2021-04-27T14:53:34.000Z | from .bounded_ortho_view import *
from .ortho_projection import *
from .ortho_view import *
from .viewport import *
| 23.2 | 33 | 0.793103 | 16 | 116 | 5.5 | 0.4375 | 0.340909 | 0.340909 | 0.431818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.137931 | 116 | 4 | 34 | 29 | 0.88 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
903451af7174621a45745c378c0b85df8cd6c3f3 | 1,335 | py | Python | tests/extractions/test_scope_overlap_extract.py | dpasse/eeyore | 0420cf9ff6d3ddecfc716e62aa97a00443cb23cb | [
"MIT"
] | null | null | null | tests/extractions/test_scope_overlap_extract.py | dpasse/eeyore | 0420cf9ff6d3ddecfc716e62aa97a00443cb23cb | [
"MIT"
] | null | null | null | tests/extractions/test_scope_overlap_extract.py | dpasse/eeyore | 0420cf9ff6d3ddecfc716e62aa97a00443cb23cb | [
"MIT"
] | null | null | null | import os
import sys
sys.path.insert(0, os.path.abspath('src'))
from eeyore_nlp.extractions import ScopeOverlapExtract
from eeyore_nlp.models import Context
def test_extractor_1():
context = Context(
'Tom declined cancer treatment.',
['Tom', 'declined', 'cancer', 'treatment', '.']
)
context.add('scope1', ['', 'S1', 'S1', 'S1', 'S1'])
context.add('scope2', ['', '', 'S2', 'S2', ''])
relationships = ScopeOverlapExtract('scope1', 'scope2').evaluate(context)
assert relationships == ['', 'REL', 'REL', 'REL', 'REL']
def test_extractor_2():
context = Context(
'Tom declined cancer treatment.',
['Tom', 'declined', 'cancer', 'treatment', '.']
)
context.add('scope1', ['', 'S1', 'S1', '', ''])
context.add('scope2', ['', '', 'S2', 'S2', ''])
relationships = ScopeOverlapExtract('scope1', 'scope2').evaluate(context)
assert relationships == ['', 'REL', 'REL', 'REL', '']
def test_extractor_3():
context = Context(
'Tom declined cancer treatment.',
['Tom', 'declined', 'cancer', 'treatment', '.']
)
context.add('scope1', ['S1', 'S1', '', '', ''])
context.add('scope2', ['', '', '', 'S2', ''])
relationships = ScopeOverlapExtract('scope1', 'scope2').evaluate(context)
assert relationships == ['', '', '', '', '']
| 31.785714 | 77 | 0.570787 | 128 | 1,335 | 5.890625 | 0.265625 | 0.087533 | 0.135279 | 0.206897 | 0.80504 | 0.80504 | 0.758621 | 0.758621 | 0.758621 | 0.655172 | 0 | 0.026977 | 0.194757 | 1,335 | 41 | 78 | 32.560976 | 0.674419 | 0 | 0 | 0.4375 | 0 | 0 | 0.219476 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 1 | 0.09375 | false | 0 | 0.125 | 0 | 0.21875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
5f66cfb41836441811b5cf85f81c7c0c92d3f085 | 196 | py | Python | Python/Tests/TestData/Grammar/Delimiters.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 695 | 2019-05-06T23:49:37.000Z | 2022-03-30T01:56:00.000Z | Python/Tests/TestData/Grammar/Delimiters.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 1,672 | 2019-05-06T21:09:38.000Z | 2022-03-31T23:16:04.000Z | Python/Tests/TestData/Grammar/Delimiters.py | techkey/PTVS | 8355e67eedd8e915ca49bd38a2f36172696fd903 | [
"Apache-2.0"
] | 186 | 2019-05-13T03:17:37.000Z | 2022-03-31T16:24:05.000Z | 1(2)
1[2]
{1:2}
1, 2, 3
1[2:3]
1[2:3:4]
1[2::4]
1[::4]
1[...]
1[:,]
fob.oar
fob = 1
fob += 1
fob -= 1
fob *= 1
fob /= 1
fob //= 1
fob %= 1
fob &= 1
fob |= 1
fob ^= 1
fob >>= 1
fob <<= 1
fob **= 1
| 7.84 | 9 | 0.413265 | 51 | 196 | 1.588235 | 0.117647 | 0.641975 | 1.037037 | 1.185185 | 0.82716 | 0.82716 | 0.641975 | 0.641975 | 0.641975 | 0.641975 | 0 | 0.25 | 0.265306 | 196 | 24 | 10 | 8.166667 | 0.3125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
39b8411861a2d5e1ad1ad3807db1ca91f78a7888 | 2,399 | py | Python | cgatpipelines/tools/pipeline_docs/pipeline_metagenomeassembly/trackers/AssemblyCoverage.py | kevinrue/cgat-flow | 02b5a1867253c2f6fd6b4f3763e0299115378913 | [
"MIT"
] | 49 | 2015-04-13T16:49:25.000Z | 2022-03-29T10:29:14.000Z | cgatpipelines/tools/pipeline_docs/pipeline_metagenomeassembly/trackers/AssemblyCoverage.py | kevinrue/cgat-flow | 02b5a1867253c2f6fd6b4f3763e0299115378913 | [
"MIT"
] | 252 | 2015-04-08T13:23:34.000Z | 2019-03-18T21:51:29.000Z | cgatpipelines/tools/pipeline_docs/pipeline_metagenomeassembly/trackers/AssemblyCoverage.py | kevinrue/cgat-flow | 02b5a1867253c2f6fd6b4f3763e0299115378913 | [
"MIT"
] | 22 | 2015-05-21T00:37:52.000Z | 2019-09-25T05:04:27.000Z | from CGATReport.Tracker import *
import numpy as np
class AlignmentSummary(TrackerSQL):
'''
class to collect the alignment statistics from
picard
'''
def __call__(self, track, slice=None):
return self.getAll("""SELECT * FROM picard_stats_alignment_summary_metrics""")
class ReadsAligned(TrackerSQL):
'''
percent reads aligned
'''
def __call__(self, track, slice=None):
result = {}
for data in self.execute("""SELECT track, PCT_PF_READS_ALIGNED FROM picard_stats_alignment_summary_metrics"""):
result[data[0]] = data[1]
return result
class ReadsAlignedInPairs(TrackerSQL):
'''
percent reads aligned in pairs
'''
def __call__(self, track, slice=None):
result = {}
for data in self.execute("""SELECT track, PCT_READS_ALIGNED_IN_PAIRS FROM picard_stats_alignment_summary_metrics"""):
result[data[0]] = data[1]
return result
class MismatchRate(TrackerSQL):
'''
percent reads aligned in pairs
'''
def __call__(self, track, slice=None):
result = {}
for data in self.execute("""SELECT track, PF_MISMATCH_RATE FROM picard_stats_alignment_summary_metrics"""):
result[data[0]] = data[1]
return result
class InsertSizeSummary(TrackerSQL):
'''
class to collect the insert size statistics from
picard
'''
def __call__(self, track, slice=None):
return self.getAll("""SELECT * FROM picard_stats_insert_size_metrics""")
class CoverageSd(TrackerSQL):
'''
class to collect data on the standard deviation of base coverage
across contigs
'''
pattern = "(.*)_coverage_stats"
def __call__(self, track, slice=None):
result = []
for data in self.execute("""SELECT cov_sd FROM %(track)s_coverage_stats WHERE cov_sd > 0""" % locals()).fetchall():
result.append(data[0])
return np.log2(result)
class CoverageMean(TrackerSQL):
'''
class to collect data on the standard deviation of base coverage
across contigs
'''
pattern = "(.*)_coverage_stats"
def __call__(self, track, slice=None):
result = []
for data in self.execute("""SELECT cov_mean FROM %(track)s_coverage_stats WHERE cov_mean > 0""" % locals()).fetchall():
result.append(data[0])
return np.log2(result)
| 24.479592 | 127 | 0.644852 | 284 | 2,399 | 5.204225 | 0.242958 | 0.047361 | 0.052097 | 0.075778 | 0.807848 | 0.771313 | 0.755751 | 0.713802 | 0.713802 | 0.713802 | 0 | 0.006619 | 0.244268 | 2,399 | 97 | 128 | 24.731959 | 0.808605 | 0.147145 | 0 | 0.6 | 0 | 0 | 0.257796 | 0.134096 | 0 | 0 | 0 | 0 | 0 | 1 | 0.175 | false | 0 | 0.05 | 0.05 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 7 |
f2dca346ba9217dbbb6611a52cb732ad92453b66 | 210,704 | py | Python | CyberTK/talkFuncs.py | CyberTKR/Simple-LINELIB | 8596afb6b201b13675a0ed6314b3151f6bbf208b | [
"BSD-3-Clause"
] | 4 | 2022-02-20T11:27:29.000Z | 2022-03-05T00:50:05.000Z | CyberTK/talkFuncs.py | CyberTKR/Simple-LINELIB | 8596afb6b201b13675a0ed6314b3151f6bbf208b | [
"BSD-3-Clause"
] | null | null | null | CyberTK/talkFuncs.py | CyberTKR/Simple-LINELIB | 8596afb6b201b13675a0ed6314b3151f6bbf208b | [
"BSD-3-Clause"
] | null | null | null | #
# Autogenerated by Thrift Compiler (0.13.0)
#
# DO NOT EDIT UNLESS YOU ARE SURE THAT YOU KNOW WHAT YOU ARE DOING
#
# options string: py
#
from thrift.Thrift import TType, TMessageType, TFrozenDict, TException, TApplicationException
from thrift.protocol.TProtocol import TProtocolException
from thrift.TRecursive import fix_spec
import sys
import logging
from .aLLTypes import *
from thrift.Thrift import TProcessor
from thrift.transport import TTransport
all_structs = []
class Iface(object):
def acceptChatInvitation(self, request):
"""
Parameters:
- request
"""
pass
def acceptChatInvitationByTicket(self, request):
"""
Parameters:
- request
"""
pass
def blockContact(self, reqSeq, id):
"""
Parameters:
- reqSeq
- id
"""
pass
def getRecentMessagesV2(self, messageBoxId, messagesCount):
"""
Parameters:
- messageBoxId
- messagesCount
"""
pass
def cancelChatInvitation(self, request):
"""
Parameters:
- request
"""
pass
def createChat(self, request):
"""
Parameters:
- request
"""
pass
def deleteSelfFromChat(self, request):
"""
Parameters:
- request
"""
pass
def deleteOtherFromChat(self, request):
"""
Parameters:
- request
"""
pass
def fetchOperations(self, request):
"""
Parameters:
- request
"""
pass
def fetchOps(self, localRev, count, globalRev, individualRev):
"""
Parameters:
- localRev
- count
- globalRev
- individualRev
"""
pass
def findAndAddContactsByMid(self, reqSeq, mid, type, reference):
"""
Parameters:
- reqSeq
- mid
- type
- reference
"""
pass
def findAndAddContactsByUserid(self, reqSeq, searchId, reference):
"""
Parameters:
- reqSeq
- searchId
- reference
"""
pass
def findContactByUserid(self, userid):
"""
Parameters:
- userid
"""
pass
def findChatByTicket(self, request):
"""
Parameters:
- request
"""
pass
def getAllChatMids(self, request, syncReason):
"""
Parameters:
- request
- syncReason
"""
pass
def getProfile(self, syncReason):
"""
Parameters:
- syncReason
"""
pass
def getContact(self, id):
"""
Parameters:
- id
"""
pass
def getCountryWithRequestIp(self):
pass
def getServerTime(self):
pass
def getContacts(self, ids):
"""
Parameters:
- ids
"""
pass
def getAllContactIds(self, syncReason):
"""
Parameters:
- syncReason
"""
pass
def getChats(self, request):
"""
Parameters:
- request
"""
pass
def inviteIntoChat(self, request):
"""
Parameters:
- request
"""
pass
def reissueChatTicket(self, request):
"""
Parameters:
- request
"""
pass
def rejectChatInvitation(self, request):
"""
Parameters:
- request
"""
pass
def sendMessage(self, seq, message):
"""
Parameters:
- seq
- message
"""
pass
def unsendMessage(self, seq, messageId):
"""
Parameters:
- seq
- messageId
"""
pass
def updateChat(self, request):
"""
Parameters:
- request
"""
pass
def updateProfileAttribute(self, reqSeq, attr, value):
"""
Parameters:
- reqSeq
- attr
- value
"""
pass
class Client(Iface):
def __init__(self, iprot, oprot=None):
self._iprot = self._oprot = iprot
if oprot is not None:
self._oprot = oprot
self._seqid = 0
def acceptChatInvitation(self, request):
"""
Parameters:
- request
"""
self.send_acceptChatInvitation(request)
return self.recv_acceptChatInvitation()
def send_acceptChatInvitation(self, request):
self._oprot.writeMessageBegin('acceptChatInvitation', TMessageType.CALL, self._seqid)
args = acceptChatInvitation_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_acceptChatInvitation(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = acceptChatInvitation_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "acceptChatInvitation failed: unknown result")
def acceptChatInvitationByTicket(self, request):
"""
Parameters:
- request
"""
self.send_acceptChatInvitationByTicket(request)
return self.recv_acceptChatInvitationByTicket()
def send_acceptChatInvitationByTicket(self, request):
self._oprot.writeMessageBegin('acceptChatInvitationByTicket', TMessageType.CALL, self._seqid)
args = acceptChatInvitationByTicket_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_acceptChatInvitationByTicket(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = acceptChatInvitationByTicket_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "acceptChatInvitationByTicket failed: unknown result")
def blockContact(self, reqSeq, id):
"""
Parameters:
- reqSeq
- id
"""
self.send_blockContact(reqSeq, id)
self.recv_blockContact()
def send_blockContact(self, reqSeq, id):
self._oprot.writeMessageBegin('blockContact', TMessageType.CALL, self._seqid)
args = blockContact_args()
args.reqSeq = reqSeq
args.id = id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_blockContact(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = blockContact_result()
result.read(iprot)
iprot.readMessageEnd()
if result.e is not None:
raise result.e
return
def cancelChatInvitation(self, request):
"""
Parameters:
- request
"""
self.send_cancelChatInvitation(request)
return self.recv_cancelChatInvitation()
def send_cancelChatInvitation(self, request):
self._oprot.writeMessageBegin('cancelChatInvitation', TMessageType.CALL, self._seqid)
args = cancelChatInvitation_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_cancelChatInvitation(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = cancelChatInvitation_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "cancelChatInvitation failed: unknown result")
def createChat(self, request):
"""
Parameters:
- request
"""
self.send_createChat(request)
return self.recv_createChat()
def send_createChat(self, request):
self._oprot.writeMessageBegin('createChat', TMessageType.CALL, self._seqid)
args = createChat_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_createChat(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = createChat_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "createChat failed: unknown result")
def deleteSelfFromChat(self, request):
"""
Parameters:
- request
"""
self.send_deleteSelfFromChat(request)
return self.recv_deleteSelfFromChat()
def send_deleteSelfFromChat(self, request):
self._oprot.writeMessageBegin('deleteSelfFromChat', TMessageType.CALL, self._seqid)
args = deleteSelfFromChat_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_deleteSelfFromChat(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = deleteSelfFromChat_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "deleteSelfFromChat failed: unknown result")
def deleteOtherFromChat(self, request):
"""
Parameters:
- request
"""
self.send_deleteOtherFromChat(request)
return self.recv_deleteOtherFromChat()
def send_deleteOtherFromChat(self, request):
self._oprot.writeMessageBegin('deleteOtherFromChat', TMessageType.CALL, self._seqid)
args = deleteOtherFromChat_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_deleteOtherFromChat(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = deleteOtherFromChat_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "deleteOtherFromChat failed: unknown result")
def fetchOperations(self, request):
"""
Parameters:
- request
"""
self.send_fetchOperations(request)
return self.recv_fetchOperations()
def send_fetchOperations(self, request):
self._oprot.writeMessageBegin('fetchOperations', TMessageType.CALL, self._seqid)
args = fetchOperations_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_fetchOperations(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = fetchOperations_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "fetchOperations failed: unknown result")
def fetchOps(self, localRev, count, globalRev, individualRev):
"""
Parameters:
- localRev
- count
- globalRev
- individualRev
"""
self.send_fetchOps(localRev, count, globalRev, individualRev)
return self.recv_fetchOps()
def send_fetchOps(self, localRev, count, globalRev, individualRev):
self._oprot.writeMessageBegin('fetchOps', TMessageType.CALL, self._seqid)
args = fetchOps_args()
args.localRev = localRev
args.count = count
args.globalRev = globalRev
args.individualRev = individualRev
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_fetchOps(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = fetchOps_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "fetchOps failed: unknown result")
def findAndAddContactsByMid(self, reqSeq, mid, type, reference):
"""
Parameters:
- reqSeq
- mid
- type
- reference
"""
self.send_findAndAddContactsByMid(reqSeq, mid, type, reference)
return self.recv_findAndAddContactsByMid()
def send_findAndAddContactsByMid(self, reqSeq, mid, type, reference):
self._oprot.writeMessageBegin('findAndAddContactsByMid', TMessageType.CALL, self._seqid)
args = findAndAddContactsByMid_args()
args.reqSeq = reqSeq
args.mid = mid
args.type = type
args.reference = reference
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_findAndAddContactsByMid(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = findAndAddContactsByMid_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "findAndAddContactsByMid failed: unknown result")
def findAndAddContactsByUserid(self, reqSeq, searchId, reference):
"""
Parameters:
- reqSeq
- searchId
- reference
"""
self.send_findAndAddContactsByUserid(reqSeq, searchId, reference)
return self.recv_findAndAddContactsByUserid()
def send_findAndAddContactsByUserid(self, reqSeq, searchId, reference):
self._oprot.writeMessageBegin('findAndAddContactsByUserid', TMessageType.CALL, self._seqid)
args = findAndAddContactsByUserid_args()
args.reqSeq = reqSeq
args.searchId = searchId
args.reference = reference
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_findAndAddContactsByUserid(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = findAndAddContactsByUserid_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "findAndAddContactsByUserid failed: unknown result")
def findContactByUserid(self, userid):
"""
Parameters:
- userid
"""
self.send_findContactByUserid(userid)
return self.recv_findContactByUserid()
def send_findContactByUserid(self, userid):
self._oprot.writeMessageBegin('findContactByUserid', TMessageType.CALL, self._seqid)
args = findContactByUserid_args()
args.userid = userid
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_findContactByUserid(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = findContactByUserid_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "findContactByUserid failed: unknown result")
def findChatByTicket(self, request):
"""
Parameters:
- request
"""
self.send_findChatByTicket(request)
return self.recv_findChatByTicket()
def send_findChatByTicket(self, request):
self._oprot.writeMessageBegin('findChatByTicket', TMessageType.CALL, self._seqid)
args = findChatByTicket_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_findChatByTicket(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = findChatByTicket_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "findChatByTicket failed: unknown result")
def getAllChatMids(self, request, syncReason):
"""
Parameters:
- request
- syncReason
"""
self.send_getAllChatMids(request, syncReason)
return self.recv_getAllChatMids()
def send_getAllChatMids(self, request, syncReason):
self._oprot.writeMessageBegin('getAllChatMids', TMessageType.CALL, self._seqid)
args = getAllChatMids_args()
args.request = request
args.syncReason = syncReason
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getAllChatMids(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getAllChatMids_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getAllChatMids failed: unknown result")
def getProfile(self, syncReason):
"""
Parameters:
- syncReason
"""
self.send_getProfile(syncReason)
return self.recv_getProfile()
def send_getProfile(self, syncReason):
self._oprot.writeMessageBegin('getProfile', TMessageType.CALL, self._seqid)
args = getProfile_args()
args.syncReason = syncReason
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getProfile(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getProfile_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getProfile failed: unknown result")
def getContact(self, id):
"""
Parameters:
- id
"""
self.send_getContact(id)
return self.recv_getContact()
def send_getContact(self, id):
self._oprot.writeMessageBegin('getContact', TMessageType.CALL, self._seqid)
args = getContact_args()
args.id = id
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getContact(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getContact_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getContact failed: unknown result")
def getCountryWithRequestIp(self):
self.send_getCountryWithRequestIp()
return self.recv_getCountryWithRequestIp()
def send_getCountryWithRequestIp(self):
self._oprot.writeMessageBegin('getCountryWithRequestIp', TMessageType.CALL, self._seqid)
args = getCountryWithRequestIp_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getCountryWithRequestIp(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getCountryWithRequestIp_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getCountryWithRequestIp failed: unknown result")
def getServerTime(self):
self.send_getServerTime()
return self.recv_getServerTime()
def send_getServerTime(self):
self._oprot.writeMessageBegin('getServerTime', TMessageType.CALL, self._seqid)
args = getServerTime_args()
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getServerTime(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getServerTime_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getServerTime failed: unknown result")
def getRecentMessagesV2(self, messageBoxId, messagesCount):
"""
Parameters:
- messageBoxId
- messagesCount
"""
self.send_getRecentMessagesV2(messageBoxId, messagesCount)
return self.recv_getRecentMessagesV2()
def send_getRecentMessagesV2(self, messageBoxId, messagesCount):
self._oprot.writeMessageBegin('getRecentMessagesV2', TMessageType.CALL, self._seqid)
args = getRecentMessagesV2_args()
args.messageBoxId = messageBoxId
args.messagesCount = messagesCount
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getRecentMessagesV2(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getRecentMessagesV2_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getRecentMessagesV2 failed: unknown result")
def getContacts(self, ids):
"""
Parameters:
- ids
"""
self.send_getContacts(ids)
return self.recv_getContacts()
def send_getContacts(self, ids):
self._oprot.writeMessageBegin('getContacts', TMessageType.CALL, self._seqid)
args = getContacts_args()
args.ids = ids
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getContacts(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getContacts_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getContacts failed: unknown result")
def getAllContactIds(self, syncReason):
"""
Parameters:
- syncReason
"""
self.send_getAllContactIds(syncReason)
return self.recv_getAllContactIds()
def send_getAllContactIds(self, syncReason):
self._oprot.writeMessageBegin('getAllContactIds', TMessageType.CALL, self._seqid)
args = getAllContactIds_args()
args.syncReason = syncReason
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getAllContactIds(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getAllContactIds_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getAllContactIds failed: unknown result")
def getChats(self, request):
"""
Parameters:
- request
"""
self.send_getChats(request)
return self.recv_getChats()
def send_getChats(self, request):
self._oprot.writeMessageBegin('getChats', TMessageType.CALL, self._seqid)
args = getChats_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_getChats(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = getChats_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "getChats failed: unknown result")
def inviteIntoChat(self, request):
"""
Parameters:
- request
"""
self.send_inviteIntoChat(request)
return self.recv_inviteIntoChat()
def send_inviteIntoChat(self, request):
self._oprot.writeMessageBegin('inviteIntoChat', TMessageType.CALL, self._seqid)
args = inviteIntoChat_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_inviteIntoChat(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = inviteIntoChat_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "inviteIntoChat failed: unknown result")
def reissueChatTicket(self, request):
"""
Parameters:
- request
"""
self.send_reissueChatTicket(request)
return self.recv_reissueChatTicket()
def send_reissueChatTicket(self, request):
self._oprot.writeMessageBegin('reissueChatTicket', TMessageType.CALL, self._seqid)
args = reissueChatTicket_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_reissueChatTicket(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = reissueChatTicket_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "reissueChatTicket failed: unknown result")
def rejectChatInvitation(self, request):
"""
Parameters:
- request
"""
self.send_rejectChatInvitation(request)
return self.recv_rejectChatInvitation()
def send_rejectChatInvitation(self, request):
self._oprot.writeMessageBegin('rejectChatInvitation', TMessageType.CALL, self._seqid)
args = rejectChatInvitation_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_rejectChatInvitation(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = rejectChatInvitation_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "rejectChatInvitation failed: unknown result")
def sendMessage(self, seq, message):
"""
Parameters:
- seq
- message
"""
self.send_sendMessage(seq, message)
return self.recv_sendMessage()
def send_sendMessage(self, seq, message):
self._oprot.writeMessageBegin('sendMessage', TMessageType.CALL, self._seqid)
args = sendMessage_args()
args.seq = seq
args.message = message
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_sendMessage(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = sendMessage_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "sendMessage failed: unknown result")
def unsendMessage(self, seq, messageId):
"""
Parameters:
- seq
- messageId
"""
self.send_unsendMessage(seq, messageId)
self.recv_unsendMessage()
def send_unsendMessage(self, seq, messageId):
self._oprot.writeMessageBegin('unsendMessage', TMessageType.CALL, self._seqid)
args = unsendMessage_args()
args.seq = seq
args.messageId = messageId
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_unsendMessage(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = unsendMessage_result()
result.read(iprot)
iprot.readMessageEnd()
if result.e is not None:
raise result.e
return
def updateChat(self, request):
"""
Parameters:
- request
"""
self.send_updateChat(request)
return self.recv_updateChat()
def send_updateChat(self, request):
self._oprot.writeMessageBegin('updateChat', TMessageType.CALL, self._seqid)
args = updateChat_args()
args.request = request
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_updateChat(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = updateChat_result()
result.read(iprot)
iprot.readMessageEnd()
if result.success is not None:
return result.success
if result.e is not None:
raise result.e
raise TApplicationException(TApplicationException.MISSING_RESULT, "updateChat failed: unknown result")
def updateProfileAttribute(self, reqSeq, attr, value):
"""
Parameters:
- reqSeq
- attr
- value
"""
self.send_updateProfileAttribute(reqSeq, attr, value)
self.recv_updateProfileAttribute()
def send_updateProfileAttribute(self, reqSeq, attr, value):
self._oprot.writeMessageBegin('updateProfileAttribute', TMessageType.CALL, self._seqid)
args = updateProfileAttribute_args()
args.reqSeq = reqSeq
args.attr = attr
args.value = value
args.write(self._oprot)
self._oprot.writeMessageEnd()
self._oprot.trans.flush()
def recv_updateProfileAttribute(self):
iprot = self._iprot
(fname, mtype, rseqid) = iprot.readMessageBegin()
if mtype == TMessageType.EXCEPTION:
x = TApplicationException()
x.read(iprot)
iprot.readMessageEnd()
raise x
result = updateProfileAttribute_result()
result.read(iprot)
iprot.readMessageEnd()
if result.e is not None:
raise result.e
return
class Processor(Iface, TProcessor):
def __init__(self, handler):
self._handler = handler
self._processMap = {}
self._processMap["acceptChatInvitation"] = Processor.process_acceptChatInvitation
self._processMap["acceptChatInvitationByTicket"] = Processor.process_acceptChatInvitationByTicket
self._processMap["blockContact"] = Processor.process_blockContact
self._processMap["cancelChatInvitation"] = Processor.process_cancelChatInvitation
self._processMap["createChat"] = Processor.process_createChat
self._processMap["deleteSelfFromChat"] = Processor.process_deleteSelfFromChat
self._processMap["deleteOtherFromChat"] = Processor.process_deleteOtherFromChat
self._processMap["fetchOperations"] = Processor.process_fetchOperations
self._processMap["fetchOps"] = Processor.process_fetchOps
self._processMap["findAndAddContactsByMid"] = Processor.process_findAndAddContactsByMid
self._processMap["findAndAddContactsByUserid"] = Processor.process_findAndAddContactsByUserid
self._processMap["findContactByUserid"] = Processor.process_findContactByUserid
self._processMap["findChatByTicket"] = Processor.process_findChatByTicket
self._processMap["getAllChatMids"] = Processor.process_getAllChatMids
self._processMap["getProfile"] = Processor.process_getProfile
self._processMap["getContact"] = Processor.process_getContact
self._processMap["getCountryWithRequestIp"] = Processor.process_getCountryWithRequestIp
self._processMap["getServerTime"] = Processor.process_getServerTime
self._processMap["getContacts"] = Processor.process_getContacts
self._processMap["getAllContactIds"] = Processor.process_getAllContactIds
self._processMap["getChats"] = Processor.process_getChats
self._processMap["inviteIntoChat"] = Processor.process_inviteIntoChat
self._processMap["reissueChatTicket"] = Processor.process_reissueChatTicket
self._processMap["rejectChatInvitation"] = Processor.process_rejectChatInvitation
self._processMap["sendMessage"] = Processor.process_sendMessage
self._processMap["getRecentMessagesV2"] = Processor.process_getRecentMessagesV2
self._processMap["unsendMessage"] = Processor.process_unsendMessage
self._processMap["updateChat"] = Processor.process_updateChat
self._processMap["updateProfileAttribute"] = Processor.process_updateProfileAttribute
self._on_message_begin = None
def on_message_begin(self, func):
self._on_message_begin = func
def process(self, iprot, oprot):
(name, type, seqid) = iprot.readMessageBegin()
if self._on_message_begin:
self._on_message_begin(name, type, seqid)
if name not in self._processMap:
iprot.skip(TType.STRUCT)
iprot.readMessageEnd()
x = TApplicationException(TApplicationException.UNKNOWN_METHOD, 'Unknown function %s' % (name))
oprot.writeMessageBegin(name, TMessageType.EXCEPTION, seqid)
x.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
return
else:
self._processMap[name](self, seqid, iprot, oprot)
return True
def process_getRecentMessagesV2(self, seqid, iprot, oprot):
args = getRecentMessagesV2_args()
args.read(iprot)
iprot.readMessageEnd()
result = getRecentMessagesV2_result()
try:
result.success = self._handler.getRecentMessagesV2(args.messageBoxId, args.messagesCount)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getRecentMessagesV2", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_acceptChatInvitation(self, seqid, iprot, oprot):
args = acceptChatInvitation_args()
args.read(iprot)
iprot.readMessageEnd()
result = acceptChatInvitation_result()
try:
result.success = self._handler.acceptChatInvitation(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("acceptChatInvitation", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_acceptChatInvitationByTicket(self, seqid, iprot, oprot):
args = acceptChatInvitationByTicket_args()
args.read(iprot)
iprot.readMessageEnd()
result = acceptChatInvitationByTicket_result()
try:
result.success = self._handler.acceptChatInvitationByTicket(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("acceptChatInvitationByTicket", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_blockContact(self, seqid, iprot, oprot):
args = blockContact_args()
args.read(iprot)
iprot.readMessageEnd()
result = blockContact_result()
try:
self._handler.blockContact(args.reqSeq, args.id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("blockContact", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_cancelChatInvitation(self, seqid, iprot, oprot):
args = cancelChatInvitation_args()
args.read(iprot)
iprot.readMessageEnd()
result = cancelChatInvitation_result()
try:
result.success = self._handler.cancelChatInvitation(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("cancelChatInvitation", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_createChat(self, seqid, iprot, oprot):
args = createChat_args()
args.read(iprot)
iprot.readMessageEnd()
result = createChat_result()
try:
result.success = self._handler.createChat(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("createChat", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_deleteSelfFromChat(self, seqid, iprot, oprot):
args = deleteSelfFromChat_args()
args.read(iprot)
iprot.readMessageEnd()
result = deleteSelfFromChat_result()
try:
result.success = self._handler.deleteSelfFromChat(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("deleteSelfFromChat", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_deleteOtherFromChat(self, seqid, iprot, oprot):
args = deleteOtherFromChat_args()
args.read(iprot)
iprot.readMessageEnd()
result = deleteOtherFromChat_result()
try:
result.success = self._handler.deleteOtherFromChat(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("deleteOtherFromChat", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_fetchOperations(self, seqid, iprot, oprot):
args = fetchOperations_args()
args.read(iprot)
iprot.readMessageEnd()
result = fetchOperations_result()
try:
result.success = self._handler.fetchOperations(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except ThingsException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("fetchOperations", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_fetchOps(self, seqid, iprot, oprot):
args = fetchOps_args()
args.read(iprot)
iprot.readMessageEnd()
result = fetchOps_result()
try:
result.success = self._handler.fetchOps(args.localRev, args.count, args.globalRev, args.individualRev)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("fetchOps", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_findAndAddContactsByMid(self, seqid, iprot, oprot):
args = findAndAddContactsByMid_args()
args.read(iprot)
iprot.readMessageEnd()
result = findAndAddContactsByMid_result()
try:
result.success = self._handler.findAndAddContactsByMid(args.reqSeq, args.mid, args.type, args.reference)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("findAndAddContactsByMid", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_findAndAddContactsByUserid(self, seqid, iprot, oprot):
args = findAndAddContactsByUserid_args()
args.read(iprot)
iprot.readMessageEnd()
result = findAndAddContactsByUserid_result()
try:
result.success = self._handler.findAndAddContactsByUserid(args.reqSeq, args.searchId, args.reference)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("findAndAddContactsByUserid", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_findContactByUserid(self, seqid, iprot, oprot):
args = findContactByUserid_args()
args.read(iprot)
iprot.readMessageEnd()
result = findContactByUserid_result()
try:
result.success = self._handler.findContactByUserid(args.userid)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("findContactByUserid", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_findChatByTicket(self, seqid, iprot, oprot):
args = findChatByTicket_args()
args.read(iprot)
iprot.readMessageEnd()
result = findChatByTicket_result()
try:
result.success = self._handler.findChatByTicket(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("findChatByTicket", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getAllChatMids(self, seqid, iprot, oprot):
args = getAllChatMids_args()
args.read(iprot)
iprot.readMessageEnd()
result = getAllChatMids_result()
try:
result.success = self._handler.getAllChatMids(args.request, args.syncReason)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getAllChatMids", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getProfile(self, seqid, iprot, oprot):
args = getProfile_args()
args.read(iprot)
iprot.readMessageEnd()
result = getProfile_result()
try:
result.success = self._handler.getProfile(args.syncReason)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getProfile", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getContact(self, seqid, iprot, oprot):
args = getContact_args()
args.read(iprot)
iprot.readMessageEnd()
result = getContact_result()
try:
result.success = self._handler.getContact(args.id)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getContact", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getCountryWithRequestIp(self, seqid, iprot, oprot):
args = getCountryWithRequestIp_args()
args.read(iprot)
iprot.readMessageEnd()
result = getCountryWithRequestIp_result()
try:
result.success = self._handler.getCountryWithRequestIp()
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getCountryWithRequestIp", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getServerTime(self, seqid, iprot, oprot):
args = getServerTime_args()
args.read(iprot)
iprot.readMessageEnd()
result = getServerTime_result()
try:
result.success = self._handler.getServerTime()
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getServerTime", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getContacts(self, seqid, iprot, oprot):
args = getContacts_args()
args.read(iprot)
iprot.readMessageEnd()
result = getContacts_result()
try:
result.success = self._handler.getContacts(args.ids)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getContacts", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getAllContactIds(self, seqid, iprot, oprot):
args = getAllContactIds_args()
args.read(iprot)
iprot.readMessageEnd()
result = getAllContactIds_result()
try:
result.success = self._handler.getAllContactIds(args.syncReason)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getAllContactIds", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_getChats(self, seqid, iprot, oprot):
args = getChats_args()
args.read(iprot)
iprot.readMessageEnd()
result = getChats_result()
try:
result.success = self._handler.getChats(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("getChats", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_inviteIntoChat(self, seqid, iprot, oprot):
args = inviteIntoChat_args()
args.read(iprot)
iprot.readMessageEnd()
result = inviteIntoChat_result()
try:
result.success = self._handler.inviteIntoChat(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("inviteIntoChat", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_reissueChatTicket(self, seqid, iprot, oprot):
args = reissueChatTicket_args()
args.read(iprot)
iprot.readMessageEnd()
result = reissueChatTicket_result()
try:
result.success = self._handler.reissueChatTicket(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("reissueChatTicket", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_rejectChatInvitation(self, seqid, iprot, oprot):
args = rejectChatInvitation_args()
args.read(iprot)
iprot.readMessageEnd()
result = rejectChatInvitation_result()
try:
result.success = self._handler.rejectChatInvitation(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("rejectChatInvitation", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_sendMessage(self, seqid, iprot, oprot):
args = sendMessage_args()
args.read(iprot)
iprot.readMessageEnd()
result = sendMessage_result()
try:
result.success = self._handler.sendMessage(args.seq, args.message)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("sendMessage", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_unsendMessage(self, seqid, iprot, oprot):
args = unsendMessage_args()
args.read(iprot)
iprot.readMessageEnd()
result = unsendMessage_result()
try:
self._handler.unsendMessage(args.seq, args.messageId)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("unsendMessage", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_updateChat(self, seqid, iprot, oprot):
args = updateChat_args()
args.read(iprot)
iprot.readMessageEnd()
result = updateChat_result()
try:
result.success = self._handler.updateChat(args.request)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("updateChat", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
def process_updateProfileAttribute(self, seqid, iprot, oprot):
args = updateProfileAttribute_args()
args.read(iprot)
iprot.readMessageEnd()
result = updateProfileAttribute_result()
try:
self._handler.updateProfileAttribute(args.reqSeq, args.attr, args.value)
msg_type = TMessageType.REPLY
except TTransport.TTransportException:
raise
except TalkException as e:
msg_type = TMessageType.REPLY
result.e = e
except TApplicationException as ex:
logging.exception('TApplication exception in handler')
msg_type = TMessageType.EXCEPTION
result = ex
except Exception:
logging.exception('Unexpected exception in handler')
msg_type = TMessageType.EXCEPTION
result = TApplicationException(TApplicationException.INTERNAL_ERROR, 'Internal error')
oprot.writeMessageBegin("updateProfileAttribute", msg_type, seqid)
result.write(oprot)
oprot.writeMessageEnd()
oprot.trans.flush()
# HELPER FUNCTIONS AND STRUCTURES
class acceptChatInvitation_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = AcceptChatInvitationRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('acceptChatInvitation_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(acceptChatInvitation_args)
acceptChatInvitation_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [AcceptChatInvitationRequest, None], None, ), # 1
)
class acceptChatInvitation_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = AcceptChatInvitationResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('acceptChatInvitation_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(acceptChatInvitation_result)
acceptChatInvitation_result.thrift_spec = (
(0, TType.STRUCT, 'success', [AcceptChatInvitationResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class acceptChatInvitationByTicket_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = AcceptChatInvitationByTicketRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('acceptChatInvitationByTicket_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(acceptChatInvitationByTicket_args)
acceptChatInvitationByTicket_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [AcceptChatInvitationByTicketRequest, None], None, ), # 1
)
class acceptChatInvitationByTicket_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = AcceptChatInvitationByTicketResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('acceptChatInvitationByTicket_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(acceptChatInvitationByTicket_result)
acceptChatInvitationByTicket_result.thrift_spec = (
(0, TType.STRUCT, 'success', [AcceptChatInvitationByTicketResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class blockContact_args(object):
"""
Attributes:
- reqSeq
- id
"""
def __init__(self, reqSeq=None, id=None,):
self.reqSeq = reqSeq
self.id = id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.reqSeq = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.id = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('blockContact_args')
if self.reqSeq is not None:
oprot.writeFieldBegin('reqSeq', TType.I32, 1)
oprot.writeI32(self.reqSeq)
oprot.writeFieldEnd()
if self.id is not None:
oprot.writeFieldBegin('id', TType.STRING, 2)
oprot.writeString(self.id.encode('utf-8') if sys.version_info[0] == 2 else self.id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(blockContact_args)
blockContact_args.thrift_spec = (
None, # 0
(1, TType.I32, 'reqSeq', None, None, ), # 1
(2, TType.STRING, 'id', 'UTF8', None, ), # 2
)
class blockContact_result(object):
"""
Attributes:
- e
"""
def __init__(self, e=None,):
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('blockContact_result')
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(blockContact_result)
blockContact_result.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class cancelChatInvitation_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = CancelChatInvitationRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('cancelChatInvitation_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(cancelChatInvitation_args)
cancelChatInvitation_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [CancelChatInvitationRequest, None], None, ), # 1
)
class cancelChatInvitation_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = CancelChatInvitationResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('cancelChatInvitation_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(cancelChatInvitation_result)
cancelChatInvitation_result.thrift_spec = (
(0, TType.STRUCT, 'success', [CancelChatInvitationResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class createChat_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = CreateChatRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('createChat_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(createChat_args)
createChat_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [CreateChatRequest, None], None, ), # 1
)
class createChat_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = CreateChatResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('createChat_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(createChat_result)
createChat_result.thrift_spec = (
(0, TType.STRUCT, 'success', [CreateChatResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class deleteSelfFromChat_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = DeleteSelfFromChatRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('deleteSelfFromChat_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(deleteSelfFromChat_args)
deleteSelfFromChat_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [DeleteSelfFromChatRequest, None], None, ), # 1
)
class deleteSelfFromChat_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = DeleteSelfFromChatResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('deleteSelfFromChat_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(deleteSelfFromChat_result)
deleteSelfFromChat_result.thrift_spec = (
(0, TType.STRUCT, 'success', [DeleteSelfFromChatResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class deleteOtherFromChat_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = DeleteOtherFromChatRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('deleteOtherFromChat_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(deleteOtherFromChat_args)
deleteOtherFromChat_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [DeleteOtherFromChatRequest, None], None, ), # 1
)
class deleteOtherFromChat_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = DeleteOtherFromChatResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('deleteOtherFromChat_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(deleteOtherFromChat_result)
deleteOtherFromChat_result.thrift_spec = (
(0, TType.STRUCT, 'success', [DeleteOtherFromChatResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class fetchOperations_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = FetchOperationsRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('fetchOperations_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(fetchOperations_args)
fetchOperations_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [FetchOperationsRequest, None], None, ), # 1
)
class fetchOperations_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = FetchOperationsResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = ThingsException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('fetchOperations_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(fetchOperations_result)
fetchOperations_result.thrift_spec = (
(0, TType.STRUCT, 'success', [FetchOperationsResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [ThingsException, None], None, ), # 1
)
class fetchOps_args(object):
"""
Attributes:
- localRev
- count
- globalRev
- individualRev
"""
def __init__(self, localRev=None, count=None, globalRev=None, individualRev=None,):
self.localRev = localRev
self.count = count
self.globalRev = globalRev
self.individualRev = individualRev
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 2:
if ftype == TType.I64:
self.localRev = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.count = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.I64:
self.globalRev = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 5:
if ftype == TType.I64:
self.individualRev = iprot.readI64()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('fetchOps_args')
if self.localRev is not None:
oprot.writeFieldBegin('localRev', TType.I64, 2)
oprot.writeI64(self.localRev)
oprot.writeFieldEnd()
if self.count is not None:
oprot.writeFieldBegin('count', TType.I32, 3)
oprot.writeI32(self.count)
oprot.writeFieldEnd()
if self.globalRev is not None:
oprot.writeFieldBegin('globalRev', TType.I64, 4)
oprot.writeI64(self.globalRev)
oprot.writeFieldEnd()
if self.individualRev is not None:
oprot.writeFieldBegin('individualRev', TType.I64, 5)
oprot.writeI64(self.individualRev)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(fetchOps_args)
fetchOps_args.thrift_spec = (
None, # 0
None, # 1
(2, TType.I64, 'localRev', None, None, ), # 2
(3, TType.I32, 'count', None, None, ), # 3
(4, TType.I64, 'globalRev', None, None, ), # 4
(5, TType.I64, 'individualRev', None, None, ), # 5
)
class fetchOps_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype127, _size124) = iprot.readListBegin()
for _i128 in range(_size124):
_elem129 = Operation()
_elem129.read(iprot)
self.success.append(_elem129)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('fetchOps_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter130 in self.success:
iter130.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(fetchOps_result)
fetchOps_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [Operation, None], False), None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class findAndAddContactsByMid_args(object):
"""
Attributes:
- reqSeq
- mid
- type
- reference
"""
def __init__(self, reqSeq=None, mid=None, type=None, reference=None,):
self.reqSeq = reqSeq
self.mid = mid
self.type = type
self.reference = reference
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.reqSeq = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.mid = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.type = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 4:
if ftype == TType.STRING:
self.reference = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findAndAddContactsByMid_args')
if self.reqSeq is not None:
oprot.writeFieldBegin('reqSeq', TType.I32, 1)
oprot.writeI32(self.reqSeq)
oprot.writeFieldEnd()
if self.mid is not None:
oprot.writeFieldBegin('mid', TType.STRING, 2)
oprot.writeString(self.mid.encode('utf-8') if sys.version_info[0] == 2 else self.mid)
oprot.writeFieldEnd()
if self.type is not None:
oprot.writeFieldBegin('type', TType.I32, 3)
oprot.writeI32(self.type)
oprot.writeFieldEnd()
if self.reference is not None:
oprot.writeFieldBegin('reference', TType.STRING, 4)
oprot.writeString(self.reference.encode('utf-8') if sys.version_info[0] == 2 else self.reference)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findAndAddContactsByMid_args)
findAndAddContactsByMid_args.thrift_spec = (
None, # 0
(1, TType.I32, 'reqSeq', None, None, ), # 1
(2, TType.STRING, 'mid', 'UTF8', None, ), # 2
(3, TType.I32, 'type', None, None, ), # 3
(4, TType.STRING, 'reference', 'UTF8', None, ), # 4
)
class findAndAddContactsByMid_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.MAP:
self.success = {}
(_ktype132, _vtype133, _size131) = iprot.readMapBegin()
for _i135 in range(_size131):
_key136 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
_val137 = Contact()
_val137.read(iprot)
self.success[_key136] = _val137
iprot.readMapEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findAndAddContactsByMid_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.MAP, 0)
oprot.writeMapBegin(TType.STRING, TType.STRUCT, len(self.success))
for kiter138, viter139 in self.success.items():
oprot.writeString(kiter138.encode('utf-8') if sys.version_info[0] == 2 else kiter138)
viter139.write(oprot)
oprot.writeMapEnd()
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findAndAddContactsByMid_result)
findAndAddContactsByMid_result.thrift_spec = (
(0, TType.MAP, 'success', (TType.STRING, 'UTF8', TType.STRUCT, [Contact, None], False), None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class findAndAddContactsByUserid_args(object):
"""
Attributes:
- reqSeq
- searchId
- reference
"""
def __init__(self, reqSeq=None, searchId=None, reference=None,):
self.reqSeq = reqSeq
self.searchId = searchId
self.reference = reference
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.reqSeq = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.searchId = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.reference = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findAndAddContactsByUserid_args')
if self.reqSeq is not None:
oprot.writeFieldBegin('reqSeq', TType.I32, 1)
oprot.writeI32(self.reqSeq)
oprot.writeFieldEnd()
if self.searchId is not None:
oprot.writeFieldBegin('searchId', TType.STRING, 2)
oprot.writeString(self.searchId.encode('utf-8') if sys.version_info[0] == 2 else self.searchId)
oprot.writeFieldEnd()
if self.reference is not None:
oprot.writeFieldBegin('reference', TType.STRING, 3)
oprot.writeString(self.reference.encode('utf-8') if sys.version_info[0] == 2 else self.reference)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findAndAddContactsByUserid_args)
findAndAddContactsByUserid_args.thrift_spec = (
None, # 0
(1, TType.I32, 'reqSeq', None, None, ), # 1
(2, TType.STRING, 'searchId', 'UTF8', None, ), # 2
(3, TType.STRING, 'reference', 'UTF8', None, ), # 3
)
class findAndAddContactsByUserid_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.MAP:
self.success = {}
(_ktype141, _vtype142, _size140) = iprot.readMapBegin()
for _i144 in range(_size140):
_key145 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
_val146 = Contact()
_val146.read(iprot)
self.success[_key145] = _val146
iprot.readMapEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findAndAddContactsByUserid_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.MAP, 0)
oprot.writeMapBegin(TType.STRING, TType.STRUCT, len(self.success))
for kiter147, viter148 in self.success.items():
oprot.writeString(kiter147.encode('utf-8') if sys.version_info[0] == 2 else kiter147)
viter148.write(oprot)
oprot.writeMapEnd()
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findAndAddContactsByUserid_result)
findAndAddContactsByUserid_result.thrift_spec = (
(0, TType.MAP, 'success', (TType.STRING, 'UTF8', TType.STRUCT, [Contact, None], False), None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class findContactByUserid_args(object):
"""
Attributes:
- userid
"""
def __init__(self, userid=None,):
self.userid = userid
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 2:
if ftype == TType.STRING:
self.userid = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findContactByUserid_args')
if self.userid is not None:
oprot.writeFieldBegin('userid', TType.STRING, 2)
oprot.writeString(self.userid.encode('utf-8') if sys.version_info[0] == 2 else self.userid)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findContactByUserid_args)
findContactByUserid_args.thrift_spec = (
None, # 0
None, # 1
(2, TType.STRING, 'userid', 'UTF8', None, ), # 2
)
class findContactByUserid_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = Contact()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findContactByUserid_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findContactByUserid_result)
findContactByUserid_result.thrift_spec = (
(0, TType.STRUCT, 'success', [Contact, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class findChatByTicket_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = FindChatByTicketRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findChatByTicket_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findChatByTicket_args)
findChatByTicket_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [FindChatByTicketRequest, None], None, ), # 1
)
class findChatByTicket_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = FindChatByTicketResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('findChatByTicket_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(findChatByTicket_result)
findChatByTicket_result.thrift_spec = (
(0, TType.STRUCT, 'success', [FindChatByTicketResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getAllChatMids_args(object):
"""
Attributes:
- request
- syncReason
"""
def __init__(self, request=None, syncReason=None,):
self.request = request
self.syncReason = syncReason
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = GetAllChatMidsRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.syncReason = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getAllChatMids_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
if self.syncReason is not None:
oprot.writeFieldBegin('syncReason', TType.I32, 2)
oprot.writeI32(self.syncReason)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getAllChatMids_args)
getAllChatMids_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [GetAllChatMidsRequest, None], None, ), # 1
(2, TType.I32, 'syncReason', None, None, ), # 2
)
class getAllChatMids_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = GetAllChatMidsResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getAllChatMids_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getAllChatMids_result)
getAllChatMids_result.thrift_spec = (
(0, TType.STRUCT, 'success', [GetAllChatMidsResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getProfile_args(object):
"""
Attributes:
- syncReason
"""
def __init__(self, syncReason=None,):
self.syncReason = syncReason
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.syncReason = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getProfile_args')
if self.syncReason is not None:
oprot.writeFieldBegin('syncReason', TType.I32, 1)
oprot.writeI32(self.syncReason)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getProfile_args)
getProfile_args.thrift_spec = (
None, # 0
(1, TType.I32, 'syncReason', None, None, ), # 1
)
class getProfile_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = Profile()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getProfile_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getProfile_result)
getProfile_result.thrift_spec = (
(0, TType.STRUCT, 'success', [Profile, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getContact_args(object):
"""
Attributes:
- id
"""
def __init__(self, id=None,):
self.id = id
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 2:
if ftype == TType.STRING:
self.id = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getContact_args')
if self.id is not None:
oprot.writeFieldBegin('id', TType.STRING, 2)
oprot.writeString(self.id.encode('utf-8') if sys.version_info[0] == 2 else self.id)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getContact_args)
getContact_args.thrift_spec = (
None, # 0
None, # 1
(2, TType.STRING, 'id', 'UTF8', None, ), # 2
)
class getContact_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = Contact()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getContact_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getContact_result)
getContact_result.thrift_spec = (
(0, TType.STRUCT, 'success', [Contact, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getCountryWithRequestIp_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getCountryWithRequestIp_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getCountryWithRequestIp_args)
getCountryWithRequestIp_args.thrift_spec = (
)
class getCountryWithRequestIp_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRING:
self.success = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getCountryWithRequestIp_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRING, 0)
oprot.writeString(self.success.encode('utf-8') if sys.version_info[0] == 2 else self.success)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getCountryWithRequestIp_result)
getCountryWithRequestIp_result.thrift_spec = (
(0, TType.STRING, 'success', 'UTF8', None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getServerTime_args(object):
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getServerTime_args')
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getServerTime_args)
getServerTime_args.thrift_spec = (
)
class getServerTime_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.I64:
self.success = iprot.readI64()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getServerTime_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.I64, 0)
oprot.writeI64(self.success)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getServerTime_result)
getServerTime_result.thrift_spec = (
(0, TType.I64, 'success', None, None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getContacts_args(object):
"""
Attributes:
- ids
"""
def __init__(self, ids=None,):
self.ids = ids
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 2:
if ftype == TType.LIST:
self.ids = []
(_etype152, _size149) = iprot.readListBegin()
for _i153 in range(_size149):
_elem154 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.ids.append(_elem154)
iprot.readListEnd()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getContacts_args')
if self.ids is not None:
oprot.writeFieldBegin('ids', TType.LIST, 2)
oprot.writeListBegin(TType.STRING, len(self.ids))
for iter155 in self.ids:
oprot.writeString(iter155.encode('utf-8') if sys.version_info[0] == 2 else iter155)
oprot.writeListEnd()
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getContacts_args)
getContacts_args.thrift_spec = (
None, # 0
None, # 1
(2, TType.LIST, 'ids', (TType.STRING, 'UTF8', False), None, ), # 2
)
class getContacts_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype159, _size156) = iprot.readListBegin()
for _i160 in range(_size156):
_elem161 = Contact()
_elem161.read(iprot)
self.success.append(_elem161)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getContacts_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter162 in self.success:
iter162.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getContacts_result)
getContacts_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [Contact, None], False), None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getAllContactIds_args(object):
"""
Attributes:
- syncReason
"""
def __init__(self, syncReason=None,):
self.syncReason = syncReason
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.syncReason = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getAllContactIds_args')
if self.syncReason is not None:
oprot.writeFieldBegin('syncReason', TType.I32, 1)
oprot.writeI32(self.syncReason)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getAllContactIds_args)
getAllContactIds_args.thrift_spec = (
None, # 0
(1, TType.I32, 'syncReason', None, None, ), # 1
)
class getAllContactIds_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype166, _size163) = iprot.readListBegin()
for _i167 in range(_size163):
_elem168 = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
self.success.append(_elem168)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getAllContactIds_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRING, len(self.success))
for iter169 in self.success:
oprot.writeString(iter169.encode('utf-8') if sys.version_info[0] == 2 else iter169)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getAllContactIds_result)
getAllContactIds_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRING, 'UTF8', False), None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getChats_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = GetChatsRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getChats_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getChats_args)
getChats_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [GetChatsRequest, None], None, ), # 1
)
class getChats_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = GetChatsResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getChats_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getChats_result)
getChats_result.thrift_spec = (
(0, TType.STRUCT, 'success', [GetChatsResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class inviteIntoChat_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = InviteIntoChatRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('inviteIntoChat_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(inviteIntoChat_args)
inviteIntoChat_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [InviteIntoChatRequest, None], None, ), # 1
)
class inviteIntoChat_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = InviteIntoChatResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('inviteIntoChat_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(inviteIntoChat_result)
inviteIntoChat_result.thrift_spec = (
(0, TType.STRUCT, 'success', [InviteIntoChatResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class reissueChatTicket_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = ReissueChatTicketRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('reissueChatTicket_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(reissueChatTicket_args)
reissueChatTicket_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [ReissueChatTicketRequest, None], None, ), # 1
)
class reissueChatTicket_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = ReissueChatTicketResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('reissueChatTicket_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(reissueChatTicket_result)
reissueChatTicket_result.thrift_spec = (
(0, TType.STRUCT, 'success', [ReissueChatTicketResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class rejectChatInvitation_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = RejectChatInvitationRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('rejectChatInvitation_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(rejectChatInvitation_args)
rejectChatInvitation_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [RejectChatInvitationRequest, None], None, ), # 1
)
class rejectChatInvitation_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = RejectChatInvitationResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('rejectChatInvitation_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(rejectChatInvitation_result)
rejectChatInvitation_result.thrift_spec = (
(0, TType.STRUCT, 'success', [RejectChatInvitationResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class sendMessage_args(object):
"""
Attributes:
- seq
- message
"""
def __init__(self, seq=None, message=None,):
self.seq = seq
self.message = message
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.seq = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRUCT:
self.message = Message()
self.message.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('sendMessage_args')
if self.seq is not None:
oprot.writeFieldBegin('seq', TType.I32, 1)
oprot.writeI32(self.seq)
oprot.writeFieldEnd()
if self.message is not None:
oprot.writeFieldBegin('message', TType.STRUCT, 2)
self.message.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(sendMessage_args)
sendMessage_args.thrift_spec = (
None, # 0
(1, TType.I32, 'seq', None, None, ), # 1
(2, TType.STRUCT, 'message', [Message, None], None, ), # 2
)
class sendMessage_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = Message()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('sendMessage_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(sendMessage_result)
sendMessage_result.thrift_spec = (
(0, TType.STRUCT, 'success', [Message, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class unsendMessage_args(object):
"""
Attributes:
- seq
- messageId
"""
def __init__(self, seq=None, messageId=None,):
self.seq = seq
self.messageId = messageId
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.seq = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.STRING:
self.messageId = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('unsendMessage_args')
if self.seq is not None:
oprot.writeFieldBegin('seq', TType.I32, 1)
oprot.writeI32(self.seq)
oprot.writeFieldEnd()
if self.messageId is not None:
oprot.writeFieldBegin('messageId', TType.STRING, 2)
oprot.writeString(self.messageId.encode('utf-8') if sys.version_info[0] == 2 else self.messageId)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(unsendMessage_args)
unsendMessage_args.thrift_spec = (
None, # 0
(1, TType.I32, 'seq', None, None, ), # 1
(2, TType.STRING, 'messageId', 'UTF8', None, ), # 2
)
class unsendMessage_result(object):
"""
Attributes:
- e
"""
def __init__(self, e=None,):
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('unsendMessage_result')
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(unsendMessage_result)
unsendMessage_result.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class getRecentMessagesV2_args(object):
"""
Attributes:
- messageBoxId
- messagesCount
"""
def __init__(self, messageBoxId=None, messagesCount=None,):
self.messageBoxId = messageBoxId
self.messagesCount = messagesCount
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 2:
if ftype == TType.STRING:
self.messageBoxId = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.I32:
self.messagesCount = iprot.readI32()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getRecentMessagesV2_args')
if self.messageBoxId is not None:
oprot.writeFieldBegin('messageBoxId', TType.STRING, 2)
oprot.writeString(self.messageBoxId.encode('utf-8') if sys.version_info[0] == 2 else self.messageBoxId)
oprot.writeFieldEnd()
if self.messagesCount is not None:
oprot.writeFieldBegin('messagesCount', TType.I32, 3)
oprot.writeI32(self.messagesCount)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getRecentMessagesV2_args)
getRecentMessagesV2_args.thrift_spec = (
None, # 0
None, # 1
(2, TType.STRING, 'messageBoxId', 'UTF8', None, ), # 2
(3, TType.I32, 'messagesCount', None, None, ), # 3
)
class getRecentMessagesV2_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.LIST:
self.success = []
(_etype2069, _size2066) = iprot.readListBegin()
for _i2070 in range(_size2066):
_elem2071 = Message()
_elem2071.read(iprot)
self.success.append(_elem2071)
iprot.readListEnd()
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('getRecentMessagesV2_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.LIST, 0)
oprot.writeListBegin(TType.STRUCT, len(self.success))
for iter2072 in self.success:
iter2072.write(oprot)
oprot.writeListEnd()
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(getRecentMessagesV2_result)
getRecentMessagesV2_result.thrift_spec = (
(0, TType.LIST, 'success', (TType.STRUCT, [Message, None], False), None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class updateChat_args(object):
"""
Attributes:
- request
"""
def __init__(self, request=None,):
self.request = request
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.request = UpdateChatRequest()
self.request.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('updateChat_args')
if self.request is not None:
oprot.writeFieldBegin('request', TType.STRUCT, 1)
self.request.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(updateChat_args)
updateChat_args.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'request', [UpdateChatRequest, None], None, ), # 1
)
class updateChat_result(object):
"""
Attributes:
- success
- e
"""
def __init__(self, success=None, e=None,):
self.success = success
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 0:
if ftype == TType.STRUCT:
self.success = UpdateChatResponse()
self.success.read(iprot)
else:
iprot.skip(ftype)
elif fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('updateChat_result')
if self.success is not None:
oprot.writeFieldBegin('success', TType.STRUCT, 0)
self.success.write(oprot)
oprot.writeFieldEnd()
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(updateChat_result)
updateChat_result.thrift_spec = (
(0, TType.STRUCT, 'success', [UpdateChatResponse, None], None, ), # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
class updateProfileAttribute_args(object):
"""
Attributes:
- reqSeq
- attr
- value
"""
def __init__(self, reqSeq=None, attr=None, value=None,):
self.reqSeq = reqSeq
self.attr = attr
self.value = value
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.I32:
self.reqSeq = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 2:
if ftype == TType.I32:
self.attr = iprot.readI32()
else:
iprot.skip(ftype)
elif fid == 3:
if ftype == TType.STRING:
self.value = iprot.readString().decode('utf-8') if sys.version_info[0] == 2 else iprot.readString()
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('updateProfileAttribute_args')
if self.reqSeq is not None:
oprot.writeFieldBegin('reqSeq', TType.I32, 1)
oprot.writeI32(self.reqSeq)
oprot.writeFieldEnd()
if self.attr is not None:
oprot.writeFieldBegin('attr', TType.I32, 2)
oprot.writeI32(self.attr)
oprot.writeFieldEnd()
if self.value is not None:
oprot.writeFieldBegin('value', TType.STRING, 3)
oprot.writeString(self.value.encode('utf-8') if sys.version_info[0] == 2 else self.value)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(updateProfileAttribute_args)
updateProfileAttribute_args.thrift_spec = (
None, # 0
(1, TType.I32, 'reqSeq', None, None, ), # 1
(2, TType.I32, 'attr', None, None, ), # 2
(3, TType.STRING, 'value', 'UTF8', None, ), # 3
)
class updateProfileAttribute_result(object):
"""
Attributes:
- e
"""
def __init__(self, e=None,):
self.e = e
def read(self, iprot):
if iprot._fast_decode is not None and isinstance(iprot.trans, TTransport.CReadableTransport) and self.thrift_spec is not None:
iprot._fast_decode(self, iprot, [self.__class__, self.thrift_spec])
return
iprot.readStructBegin()
while True:
(fname, ftype, fid) = iprot.readFieldBegin()
if ftype == TType.STOP:
break
if fid == 1:
if ftype == TType.STRUCT:
self.e = TalkException()
self.e.read(iprot)
else:
iprot.skip(ftype)
else:
iprot.skip(ftype)
iprot.readFieldEnd()
iprot.readStructEnd()
def write(self, oprot):
if oprot._fast_encode is not None and self.thrift_spec is not None:
oprot.trans.write(oprot._fast_encode(self, [self.__class__, self.thrift_spec]))
return
oprot.writeStructBegin('updateProfileAttribute_result')
if self.e is not None:
oprot.writeFieldBegin('e', TType.STRUCT, 1)
self.e.write(oprot)
oprot.writeFieldEnd()
oprot.writeFieldStop()
oprot.writeStructEnd()
def validate(self):
return
def __repr__(self):
L = ['%s=%r' % (key, value)
for key, value in self.__dict__.items()]
return '%s(%s)' % (self.__class__.__name__, ', '.join(L))
def __eq__(self, other):
return isinstance(other, self.__class__) and self.__dict__ == other.__dict__
def __ne__(self, other):
return not (self == other)
all_structs.append(updateProfileAttribute_result)
updateProfileAttribute_result.thrift_spec = (
None, # 0
(1, TType.STRUCT, 'e', [TalkException, None], None, ), # 1
)
fix_spec(all_structs)
del all_structs
| 33.864352 | 134 | 0.588304 | 21,189 | 210,704 | 5.630138 | 0.014253 | 0.016136 | 0.029045 | 0.023387 | 0.851924 | 0.819015 | 0.790875 | 0.774789 | 0.768695 | 0.762718 | 0 | 0.00663 | 0.311332 | 210,704 | 6,221 | 135 | 33.869796 | 0.815513 | 0.016825 | 0 | 0.822418 | 1 | 0 | 0.038936 | 0.006488 | 0 | 0 | 0 | 0 | 0 | 1 | 0.113529 | false | 0.005954 | 0.001642 | 0.035722 | 0.21084 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
84200c2df5e5f11fa595de20313aa4f33f2b3d47 | 15,183 | py | Python | test/test_cam_get_protocol_v2.py | dondemonz/RestApi | 0459d2b8079b9f2abc50bf5e206625427c4a2dcf | [
"Apache-2.0"
] | null | null | null | test/test_cam_get_protocol_v2.py | dondemonz/RestApi | 0459d2b8079b9f2abc50bf5e206625427c4a2dcf | [
"Apache-2.0"
] | null | null | null | test/test_cam_get_protocol_v2.py | dondemonz/RestApi | 0459d2b8079b9f2abc50bf5e206625427c4a2dcf | [
"Apache-2.0"
] | null | null | null | import requests
from model.json_check import *
from model.input_data import *
import time
import datetime as dt
def test_GetV2CameraProtocolCode200(fix):
data = 2
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S") # почему иногда использовал "%Y-%m-%d %H:%M:%S", а иногда "%Y%m%dT%H%M%S" - непонятно
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
p = dt.datetime.now()
endtime = (p.strftime("%Y%m%dT%H%M%S"))
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime + "&stop_time=" + endtime + "&max_count=3", auth=auth)
user_resp_code = "200"
time.sleep(3)
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
def test_GetV2CameraProtocolCode200WithoutEndTime(fix):
data = 1
m = dt.datetime.now()
# print(m)
# print(m.strftime("%Y-%m-%d %H:%M:%S"))
# v = m.time()
# print(v)
starttime = m.strftime("%Y%m%dT%H%M%S")
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime + "&max_count=1", auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
def test_GetV2CameraProtocolCode200WithoutEndTimeAndMaxCount(fix):
data = 2
m = dt.datetime.now()
# print(m)
# print(m.strftime("%Y-%m-%d %H:%M:%S"))
# v = m.time()
# print(v)
starttime = m.strftime("%Y%m%dT%H%M%S")
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
def test_GetV2CameraProtocolCode400():
data = "Missed required parameter:start_time"
response = requests.get(url="http://"+slave_ip+":"+restPort+"/api/v1/cameras/"+camId+"/protocol", auth=auth)
user_resp_code = "400"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["message"]
assert data == n
def test_GetV2CameraProtocolCode400IncorrectTime():
data = "Incorrect parameter:start_time, value:20151119T1848032"
response = requests.get(url="http://"+slave_ip+":"+restPort+"/api/v1/cameras/"+camId+"/protocol?start_time=20151119T1848032", auth=auth)
user_resp_code = "400"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["message"]
assert data == n
def test_GetV2CameraProtocolCode401():
data = {"name": camName}
response = requests.put(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"", headers=headers, data=json.dumps(dict(data)), auth=("", ""))
user_resp_code = "401"
assert str(response.status_code) == user_resp_code
# Тесты фильтров для протокола
def test_GetV2CameraProtocolCode200EventFilterEmpty(fix):
data = 0
fix.send_event(message=("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<0>,EVENT.type.count<0>,EVENT.id.count<0>,EVENT.rule.count<0>").encode("utf-8"))
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
time.sleep(1)
fix.connect_to_dll()
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
def test_GetV2CameraProtocolCode200EventFilterAllowAll(fix):
data = 2
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<1>,EVENT.type.count<1>,EVENT.id.count<1>,EVENT.rule.count<1>,EVENT.rule.0<1>,EVENT.id.0<>,EVENT.type.0<CAM>,EVENT.action.0<>").encode("utf-8")))
time.sleep(3)
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
time.sleep(1)
fix.connect_to_dll()
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
def test_GetV2CameraProtocolCode200EventFilterAllowId(fix):
data = 2
data2 = 0
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<1>,EVENT.type.count<1>,EVENT.id.count<1>,EVENT.rule.count<1>,EVENT.rule.0<1>,EVENT.id.0<"+camId+">,EVENT.type.0<CAM>,EVENT.action.0<>").encode("utf-8")))
time.sleep(3)
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
time.sleep(1)
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
response1 = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId2+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response1.status_code) == user_resp_code
body1 = json.dumps(response1.json())
data3 = json.loads(body1)
n2 = data3["data"]["actual_count"]
assert data2 == n2
def test_GetV2CameraProtocolCode200EventFilterAllowEvent(fix):
data = 1
data2 = 1
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<1>,EVENT.type.count<1>,EVENT.id.count<1>,EVENT.rule.count<1>,EVENT.rule.0<1>,EVENT.id.0<>,EVENT.type.0<CAM>,EVENT.action.0<ARMED>").encode("utf-8")))
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
response1 = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId2+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response1.status_code) == user_resp_code
body1 = json.dumps(response1.json())
data3 = json.loads(body1)
n2 = data3["data"]["actual_count"]
assert data2 == n2
def test_GetV2CameraProtocolCode200EventFilterAllowIdForbidAll(fix):
data = 2
data2 = 0
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<2>,EVENT.type.count<2>,EVENT.id.count<2>,EVENT.rule.count<2>,EVENT.rule.0<1>,EVENT.id.0<"+camId+">,EVENT.type.0<CAM>,EVENT.action.0<>,EVENT.rule.1<0>,EVENT.id.1<>,EVENT.type.1<CAM>,EVENT.action.1<>").encode("utf-8")))
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
response1 = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId2+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response1.status_code) == user_resp_code
body1 = json.dumps(response1.json())
data3 = json.loads(body1)
n2 = data3["data"]["actual_count"]
assert data2 == n2
def test_GetV2CameraProtocolCode200EventFilterAllowAllForbidId(fix):
data = 0
data2 = 2
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<2>,EVENT.type.count<2>,EVENT.id.count<2>,EVENT.rule.count<2>,EVENT.rule.0<0>,EVENT.id.0<"+camId+">,EVENT.type.0<CAM>,EVENT.action.0<>,EVENT.rule.1<1>,EVENT.id.1<>,EVENT.type.1<CAM>,EVENT.action.1<>").encode("utf-8")))
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
response1 = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId2+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response1.status_code) == user_resp_code
body1 = json.dumps(response1.json())
data3 = json.loads(body1)
n2 = data3["data"]["actual_count"]
assert data2 == n2
def test_GetV2CameraProtocolCode200EventFilterAllowEventForbidAll(fix):
data = 1
data2 = 1
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<2>,EVENT.type.count<2>,EVENT.id.count<2>,EVENT.rule.count<2>,EVENT.rule.0<1>,EVENT.id.0<>,EVENT.type.0<CAM>,EVENT.action.0<DISARMED>,EVENT.rule.1<0>,EVENT.id.1<>,EVENT.type.1<CAM>,EVENT.action.1<>").encode("utf-8")))
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
assert data == n
response1 = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId2+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response1.status_code) == user_resp_code
body1 = json.dumps(response1.json())
data3 = json.loads(body1)
n2 = data3["data"]["actual_count"]
assert data2 == n2
def test_GetV2CameraProtocolCode200EventFilterAllowAllForbidEvent(fix):
data = 1
data2 = 1
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<"+objId+">,parent_id<1>,EVENT.action.count<2>,EVENT.type.count<2>,EVENT.id.count<2>,EVENT.rule.count<2>,EVENT.rule.0<0>,EVENT.id.0<>,EVENT.type.0<CAM>,EVENT.action.0<DISARMED>,EVENT.rule.1<1>,EVENT.id.1<>,EVENT.type.1<CAM>,EVENT.action.1<>").encode("utf-8")))
time.sleep(1)
m = dt.datetime.now()
starttime = m.strftime("%Y-%m-%d %H:%M:%S")
fix.send_react(("CAM|"+camId+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|ARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId+"|DISARM").encode("utf-8"))
time.sleep(1)
fix.send_react(("CAM|"+camId2+"|DISARM").encode("utf-8"))
time.sleep(1)
response = requests.get(url="http://"+slave_ip+":"+restPort+"/api/v2/cameras/"+camId+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response.status_code) == user_resp_code
body = json.dumps(response.json())
data1 = json.loads(body)
n = data1["data"]["actual_count"]
print(data1)
assert data == n
time.sleep(1)
response1 = requests.get(url="http://" + slave_ip + ":"+restPort+"/api/v2/cameras/"+camId2+"/protocol?start_time=" + starttime, auth=auth)
user_resp_code = "200"
assert str(response1.status_code) == user_resp_code
body1 = json.dumps(response1.json())
data3 = json.loads(body1)
n2 = data3["data"]["actual_count"]
assert data2 == n2
# после последнего теста выключаем фильтр из реста
# fix.send_event(message=("CORE||UPDATE_OBJECT|objtype<REST_API>,objid<" + objId + ">,parent_id<" + slave + ">,event_filter_id<>").encode("utf-8"))
# и обновить фильтр
fix.send_event(message=(("CORE||UPDATE_OBJECT|objtype<EVENT_FILTER>,objid<" + objId + ">,parent_id<1>,EVENT.action.count<1>,EVENT.type.count<1>,EVENT.id.count<1>,EVENT.rule.count<1>,EVENT.rule.0<1>,EVENT.id.0<>,EVENT.type.0<>,EVENT.action.0<>").encode("utf-8")))
| 46.289634 | 338 | 0.654087 | 2,210 | 15,183 | 4.377828 | 0.063801 | 0.031835 | 0.045478 | 0.05354 | 0.870491 | 0.869457 | 0.862429 | 0.859225 | 0.850853 | 0.846822 | 0 | 0.03713 | 0.136139 | 15,183 | 327 | 339 | 46.431193 | 0.700518 | 0.030626 | 0 | 0.857143 | 0 | 0.038328 | 0.282411 | 0.170113 | 0 | 0 | 0 | 0 | 0.135889 | 1 | 0.04878 | false | 0 | 0.017422 | 0 | 0.066202 | 0.003484 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8428642731da604aa54f8cef1afdf4b05806ad87 | 108 | py | Python | lemon/default/app/web.py | InsaneMiner/Salt | b61c5f931fe4b6fa652e8fbfb59b30dbaaf9ed18 | [
"MIT"
] | 6 | 2020-11-22T11:42:55.000Z | 2022-01-09T12:29:30.000Z | lemon/default/app/web.py | InsaneMiner/Salt | b61c5f931fe4b6fa652e8fbfb59b30dbaaf9ed18 | [
"MIT"
] | 1 | 2020-11-21T00:05:40.000Z | 2020-11-22T21:58:54.000Z | lemon/default/app/web.py | InsaneMiner/Salt | b61c5f931fe4b6fa652e8fbfb59b30dbaaf9ed18 | [
"MIT"
] | 2 | 2021-06-05T04:19:04.000Z | 2021-06-05T04:28:08.000Z | import lemon.libs.lemon
def main(object):
return lemon.libs.lemon.render_template(object,"default.html") | 36 | 66 | 0.787037 | 16 | 108 | 5.25 | 0.6875 | 0.214286 | 0.333333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 108 | 3 | 66 | 36 | 0.848485 | 0 | 0 | 0 | 0 | 0 | 0.110092 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
842976c5f2ceed9bcc04f5533603253b153e7f00 | 25,106 | py | Python | code/utils/imgs/create_rgb_datasets_imgs.py | Robert-xiaoqiang/S3Net | c5e4e6a67efb8bf84e2e8d28cf36ed34c376e4a6 | [
"Unlicense"
] | 9 | 2021-12-08T13:08:56.000Z | 2022-03-05T08:38:25.000Z | code/utils/imgs/create_rgb_datasets_imgs.py | Robert-xiaoqiang/S3Net | c5e4e6a67efb8bf84e2e8d28cf36ed34c376e4a6 | [
"Unlicense"
] | 1 | 2022-01-02T06:18:05.000Z | 2022-01-02T06:50:25.000Z | code/utils/imgs/create_rgb_datasets_imgs.py | Robert-xiaoqiang/S3Net | c5e4e6a67efb8bf84e2e8d28cf36ed34c376e4a6 | [
"Unlicense"
] | null | null | null | import os
import numpy as np
from PIL import Image
import torch
from torch.utils.data import Dataset
from torchvision import transforms
from utils.joint_transforms import Compose, JointResize, RandomHorizontallyFlip, RandomRotate
from utils.misc import construct_print
mean_rgb = np.array([0.447, 0.407, 0.386])
std_rgb = np.array([0.244, 0.250, 0.253])
def _get_ext(path_list):
ext_list = list(set([os.path.splitext(p)[1] for p in path_list]))
if len(ext_list) != 1:
if '.png' in ext_list:
ext = '.png'
elif '.jpg' in ext_list:
ext = '.jpg'
elif '.bmp' in ext_list:
ext = '.bmp'
else:
raise NotImplementedError
construct_print(f"数据文件夹中包含多种扩展名,这里仅使用{ext}")
elif len(ext_list) == 1:
ext = ext_list[0]
else:
raise NotImplementedError
return ext
def _make_unlabeled_dataset(root, split):
img_path = os.path.join(root, split + '_images')
depth_path = os.path.join(root, split + '_depth')
img_list = os.listdir(img_path)
depth_list = os.listdir(depth_path)
img_ext = _get_ext(img_list)
depth_ext = _get_ext(depth_list)
# relative path to main file name
img_list = [os.path.splitext(f)[0] for f in img_list if f.endswith(img_ext)]
return [ (os.path.join(img_path, img_name + img_ext), \
os.path.join(depth_path, img_name + depth_ext)) for img_name in img_list ]
def _make_dataset(root, split):
img_path = os.path.join(root, split + '_images')
depth_path = os.path.join(root, split + '_depth')
mask_path = os.path.join(root, split + '_masks')
img_list = os.listdir(img_path)
depth_list = os.listdir(depth_path)
mask_list = os.listdir(mask_path)
img_ext = _get_ext(img_list)
depth_ext = _get_ext(depth_list)
mask_ext = _get_ext(mask_list)
img_list = [os.path.splitext(f)[0] for f in mask_list if f.endswith(mask_ext)]
return [(os.path.join(img_path, img_name + img_ext),
os.path.join(depth_path, img_name + depth_ext),
os.path.join(mask_path, img_name + mask_ext),
)
for img_name in img_list]
def _make_fdp_dataset(root):
img_path = os.path.join(root, 'RGB')
depth_path = os.path.join(root, 'depth')
mask_path = os.path.join(root, 'GT')
img_list = os.listdir(img_path)
depth_list = os.listdir(depth_path)
mask_list = os.listdir(mask_path)
img_ext = _get_ext(img_list)
depth_ext = _get_ext(depth_list)
mask_ext = _get_ext(mask_list)
img_list = [os.path.splitext(f)[0] for f in mask_list if f.endswith(mask_ext)]
return [(os.path.join(img_path, img_name + img_ext),
os.path.join(depth_path, img_name + depth_ext),
os.path.join(mask_path, img_name + mask_ext),
)
for img_name in img_list]
def _read_list_from_file(list_filepath):
img_list = []
with open(list_filepath, mode='r', encoding='utf-8') as openedfile:
line = openedfile.readline()
while line:
img_list.append(line.split()[0])
line = openedfile.readline()
return img_list
def _make_test_dataset_from_list(list_filepath, prefix=('.jpg', '.png')):
img_list = _read_list_from_file(list_filepath)
return [(os.path.join(os.path.join(os.path.dirname(img_path), 'test_images'),
os.path.basename(img_path) + prefix[0]),
os.path.join(os.path.join(os.path.dirname(img_path), 'test_masks'),
os.path.basename(img_path) + prefix[1]))
for img_path in img_list]
class TestImageFolder(Dataset):
def __init__(self, root, in_size, prefix):
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will test on it.")
self.imgs = _make_dataset(root, split = 'test')
elif os.path.isfile(root):
construct_print(f"{root} is a list of images, we will use these paths to read the "
f"corresponding image")
self.imgs = _make_test_dataset_from_list(root, prefix=prefix)
else:
raise NotImplementedError
self.test_img_trainsform = transforms.Compose([
# 输入的如果是一个tuple,则按照数据缩放,但是如果是一个数字,则按比例缩放到短边等于该值
transforms.Resize((in_size, in_size)),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
])
self.test_depth_trainsform = transforms.Compose([
transforms.Resize((in_size, in_size)),
transforms.ToTensor()
])
def __getitem__(self, index):
img_path, depth_path, mask_path = self.imgs[index]
img = Image.open(img_path).convert('RGB')
depth = Image.open(depth_path).convert('L')
depth = np.asarray(depth)
depth = (depth - np.min(depth)) / (np.max(depth) - np.min(depth) + 1.0e-6) * 255.0
depth = Image.fromarray(depth.astype(np.uint8)) # 255 -> [0, 1] automatically !!!!
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
img = self.test_img_trainsform(img).float()
depth = self.test_depth_trainsform(depth).float()
# depth = (depth - torch.min(depth)) / (torch.max(depth) - torch.min(depth) + torch.tensor(1.0e-6)) * torch.tensor(255.0)
return img, depth, mask_path, img_name
def __len__(self):
return len(self.imgs)
class TestFDPImageFolder(Dataset):
def __init__(self, root, in_size, prefix):
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will test on it.")
self.imgs = _make_fdp_dataset(root)
elif os.path.isfile(root):
raise NotImplementedError
else:
raise NotImplementedError
self.test_img_trainsform = transforms.Compose([
# 输入的如果是一个tuple,则按照数据缩放,但是如果是一个数字,则按比例缩放到短边等于该值
transforms.Resize((in_size, in_size)),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
])
self.test_depth_trainsform = transforms.Compose([
transforms.Resize((in_size, in_size)),
transforms.ToTensor()
])
def __getitem__(self, index):
img_path, depth_path, mask_path = self.imgs[index]
img = Image.open(img_path).convert('RGB')
depth = Image.open(depth_path).convert('L')
depth = np.asarray(depth)
# depth = (depth - np.min(depth)) / (np.max(depth) - np.min(depth) + 1.0e-6) * 255.0
depth = Image.fromarray(depth.astype(np.uint8)) # 255 -> [0, 1] automatically !!!!
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
img = self.test_img_trainsform(img).float()
depth = self.test_depth_trainsform(depth).float()
return img, depth, mask_path, img_name
def __len__(self):
return len(self.imgs)
class TestUnlabeledImageFolder(Dataset):
def __init__(self, root, in_size, prefix):
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will test on it.")
self.imgs = _make_unlabeled_dataset(root, split = 'test')
elif os.path.isfile(root):
raise NotImplementedError
else:
raise NotImplementedError
self.test_img_trainsform = transforms.Compose([
transforms.Resize((in_size, in_size)),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
])
self.test_depth_trainsform = transforms.Compose([
transforms.Resize((in_size, in_size)),
transforms.ToTensor()
])
def __getitem__(self, index):
img_path, depth_path= self.imgs[index]
img = Image.open(img_path).convert('RGB')
depth = Image.open(depth_path).convert('L')
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
img = self.test_img_trainsform(img).float()
depth = self.test_depth_trainsform(depth).float()
depth = (depth - torch.min(depth)) / (torch.max(depth) - torch.min(depth) + torch.tensor(1.0e-6)) * torch.tensor(255.0)
return img, depth, img_name
def __len__(self):
return len(self.imgs)
class TestWithRotationImageFolder(Dataset):
def __init__(self, root, in_size, prefix, rotations = (0, 90, 180, 270)):
self.rotations = rotations
self.times = len(rotations)
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will test on it.")
self.imgs = _make_dataset(root, split = 'test')
elif os.path.isfile(root):
raise NotImplementedError
else:
raise NotImplementedError
self.test_img_trainsform = transforms.Compose([
# 输入的如果是一个tuple,则按照数据缩放,但是如果是一个数字,则按比例缩放到短边等于该值
transforms.Resize((in_size, in_size)),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
])
self.test_depth_trainsform = transforms.Compose([
transforms.Resize((in_size, in_size)),
transforms.ToTensor()
])
def __getitem__(self, index):
rotate_index = np.random.randint(self.times)
img_path, depth_path, mask_path = self.imgs[index]
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
img = Image.open(img_path).convert('RGB')
depth = Image.open(depth_path).convert('L')
depth = np.asarray(depth)
depth = (depth - np.min(depth)) / (np.max(depth) - np.min(depth) + 1.0e-6) * 255.0
depth = Image.fromarray(depth.astype(np.uint8)) # 255 -> [0, 1] automatically !!!!
img = img.rotate(self.rotations[rotate_index])
depth = depth.rotate(self.rotations[rotate_index])
img = self.test_img_trainsform(img).float()
depth = self.test_depth_trainsform(depth).float()
# depth = (depth - torch.min(depth)) / (torch.max(depth) - torch.min(depth) + torch.tensor(1.0e-6)) * torch.tensor(255.0)
rotate_label = torch.tensor(rotate_index).long()
return img, depth, mask_path, rotate_label, img_name
def __len__(self):
return len(self.imgs)
class TestWithRotationFDPImageFolder(Dataset):
def __init__(self, root, in_size, prefix, rotations = (0, 90, 180, 270)):
self.rotations = rotations
self.times = len(rotations)
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will test on it.")
self.imgs = _make_fdp_dataset(root)
elif os.path.isfile(root):
raise NotImplementedError
else:
raise NotImplementedError
self.test_img_trainsform = transforms.Compose([
# 输入的如果是一个tuple,则按照数据缩放,但是如果是一个数字,则按比例缩放到短边等于该值
transforms.Resize((in_size, in_size)),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225])
])
self.test_depth_trainsform = transforms.Compose([
transforms.Resize((in_size, in_size)),
transforms.ToTensor()
])
def __getitem__(self, index):
rotate_index = index % self.times
# rotate_index = np.random.randint(self.times)
img_path, depth_path, mask_path = self.imgs[index]
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
img = Image.open(img_path).convert('RGB')
depth = Image.open(depth_path).convert('L')
depth = np.asarray(depth)
depth = (depth - np.min(depth)) / (np.max(depth) - np.min(depth) + 1.0e-6) * 255.0
depth = Image.fromarray(depth.astype(np.uint8)) # 255 -> [0, 1] automatically !!!!
img = img.rotate(self.rotations[rotate_index])
depth = depth.rotate(self.rotations[rotate_index])
img = self.test_img_trainsform(img).float()
depth = self.test_depth_trainsform(depth).float()
# depth = (depth - torch.min(depth)) / (torch.max(depth) - torch.min(depth) + torch.tensor(1.0e-6)) * torch.tensor(255.0)
rotate_label = torch.tensor(rotate_index).long()
return img, depth, mask_path, rotate_label, img_name, img_path
def __len__(self):
return len(self.imgs)
def _make_train_dataset_from_list(list_filepath, prefix=('.jpg', '.png')):
# list_filepath = '/home/lart/Datasets/RGBDSaliency/FinalSet/rgbd_train_jw.lst'
img_list = _read_list_from_file(list_filepath)
return [(os.path.join(os.path.join(os.path.dirname(img_path), 'train_images'),
os.path.basename(img_path) + prefix[0]),
os.path.join(os.path.join(os.path.dirname(img_path), 'train_masks'),
os.path.basename(img_path) + prefix[1]))
for img_path in img_list]
class TrainImageFolder(Dataset):
def __init__(self, root, in_size, prefix, use_bigt=False):
self.use_bigt = use_bigt
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will train on it.")
self.imgs = _make_fdp_dataset(root)
elif os.path.isfile(root):
construct_print(f"{root} is a list of images, we will use these paths to read the "
f"corresponding image")
self.imgs = _make_train_dataset_from_list(root, prefix=prefix)
else:
raise NotImplementedError
self.train_joint_transform = Compose([
JointResize(in_size),
RandomHorizontallyFlip(),
RandomRotate(10)
])
self.train_img_transform = transforms.Compose([
transforms.ColorJitter(0.1, 0.1, 0.1),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406],
[0.229, 0.224, 0.225]) # 处理的是Tensor
])
self.train_depth_transform = transforms.ToTensor()
self.train_mask_transform = transforms.ToTensor()
def __getitem__(self, index):
img_path, depth_path, mask_path = self.imgs[index]
img = Image.open(img_path)
depth = Image.open(depth_path)
mask = Image.open(mask_path)
if len(img.split()) != 3:
img = img.convert('RGB')
if len(depth.split()) == 3:
depth = depth.convert('L')
if len(mask.split()) == 3:
mask = mask.convert('L')
img, depth, mask = self.train_joint_transform(img, depth, mask)
mask = self.train_mask_transform(mask).float()
img = self.train_img_transform(img).float()
depth = self.train_depth_transform(depth).float()
###########################################
# depth 255 normalized already(NJUD + NLPR)
# depth = (depth - torch.min(depth)) / (torch.max(depth) - torch.min(depth) + torch.tensor(1.0e-6))
###########################################
if self.use_bigt:
mask = mask.ge(0.5).float() # 二值化
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
return img, depth, mask, img_name
def __len__(self):
return len(self.imgs)
class TrainMTImageFolder(Dataset):
def __init__(self, root, unlabeled_root, in_size, prefix, use_bigt=False):
self.in_size = in_size
self.use_bigt = use_bigt
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will train on it.")
self.imgs = _make_dataset(root, split = 'train')
elif os.path.isfile(root):
construct_print(f"{root} is a list of images, we will use these paths to read the "
f"corresponding image")
self.imgs = _make_train_dataset_from_list(root, prefix=prefix)
else:
raise NotImplementedError
if os.path.isdir(unlabeled_root):
construct_print(f"{unlabeled_root} is an image folder, we will conduct MT on it.")
self.unlabeled_imgs = _make_unlabeled_dataset(unlabeled_root, split = 'train')
elif os.path.isfile(unlabeled_root):
raise NotImplementedError
else:
raise NotImplementedError
self.train_joint_transform = Compose([
JointResize(in_size),
RandomHorizontallyFlip(),
RandomRotate(10)
])
self.train_img_transform = transforms.Compose([
transforms.ColorJitter(0.1, 0.1, 0.1),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406],
[0.229, 0.224, 0.225]) # 处理的是Tensor
])
self.train_depth_transform = transforms.ToTensor()
self.train_mask_transform = transforms.ToTensor()
def __getitem__(self, index):
if index < len(self.imgs):
img_path, depth_path, mask_path = self.imgs[index]
img = Image.open(img_path)
depth = Image.open(depth_path)
mask = Image.open(mask_path)
if len(img.split()) != 3:
img = img.convert('RGB')
if len(depth.split()) == 3:
depth = depth.convert('L')
if len(mask.split()) == 3:
mask = mask.convert('L')
img, depth, mask = self.train_joint_transform(img, depth, mask)
mask = self.train_mask_transform(mask).float()
img = self.train_img_transform(img).float()
depth = self.train_depth_transform(depth).float()
# depth = (depth - torch.min(depth)) / (torch.max(depth) - torch.min(depth) + torch.tensor(1.0e-6))
if self.use_bigt:
mask = mask.ge(0.5).float() # 二值化
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
return img, depth, mask, img_name
else:
index -= len(self.imgs)
# without unlabeled_mask_path
unlabeled_img_path, unlabeled_depth_path = self.unlabeled_imgs[index]
unlabeled_img = Image.open(unlabeled_img_path)
unlabeled_depth = Image.open(unlabeled_depth_path)
if len(unlabeled_img.split()) != 3:
unlabeled_img = unlabeled_img.convert('RGB')
if len(unlabeled_depth.split()) == 3:
unlabeled_depth = unlabeled_depth.convert('L')
unlabeled_img, unlabeled_depth = self.train_joint_transform(unlabeled_img, unlabeled_depth)
unlabeled_img = self.train_img_transform(unlabeled_img).float()
unlabeled_depth = self.train_depth_transform(unlabeled_depth).float()
unlabeled_depth = (unlabeled_depth - torch.min(unlabeled_depth)) / (torch.max(unlabeled_depth) - torch.min(unlabeled_depth) + torch.tensor(1.0e-6)) * torch.tensor(255.0)
unlabeled_mask = torch.zeros((1, self.in_size, self.in_size)).float()
unlabeled_img_name = (unlabeled_img_path.split(os.sep)[-1]).split('.')[0]
return unlabeled_img, unlabeled_depth, unlabeled_mask, unlabeled_img_name
def __len__(self):
return len(self.imgs) + len(self.unlabeled_imgs)
def get_primary_secondary_indices(self):
return np.arange(len(self.imgs)), np.arange(len(self.imgs), len(self.unlabeled_imgs))
class TrainSSImageFolder(Dataset):
def __init__(self, root, unlabeled_root, in_size, prefix, is_labeled_rotation,
use_bigt=False, rotations = (0, 90, 180, 270)):
self.in_size = in_size
self.is_labeled_rotation = is_labeled_rotation
self.use_bigt = use_bigt
self.rotations = rotations
self.times = len(rotations)
if os.path.isdir(root):
construct_print(f"{root} is an image folder, we will train on it.")
self.imgs = _make_fdp_dataset(root)
elif os.path.isfile(root):
construct_print(f"{root} is a list of images, we will use these paths to read the "
f"corresponding image")
self.imgs = _make_train_dataset_from_list(root, prefix=prefix)
else:
raise NotImplementedError
if os.path.isdir(unlabeled_root):
construct_print(f"{unlabeled_root} is an image folder, we will conduct SS (also SSMT) on it.")
self.unlabeled_imgs = _make_unlabeled_dataset(unlabeled_root, split = 'train')
elif os.path.isfile(unlabeled_root):
raise NotImplementedError
else:
raise NotImplementedError
self.train_joint_transform = JointResize(in_size)
self.train_img_transform = transforms.Compose([
transforms.ColorJitter(0.1, 0.1, 0.1),
transforms.ToTensor(),
transforms.Normalize([0.485, 0.456, 0.406],
[0.229, 0.224, 0.225]) # 处理的是Tensor
])
self.train_depth_transform = transforms.ToTensor()
self.train_mask_transform = transforms.ToTensor()
def __getitem__(self, index):
# main_index, rotate_index = index // self.times, index % self.times
main_index = index
rotate_index = np.random.randint(self.times)
if main_index < len(self.imgs):
img_path, depth_path, mask_path = self.imgs[main_index]
img = Image.open(img_path)
depth = Image.open(depth_path)
mask = Image.open(mask_path)
if len(img.split()) != 3:
img = img.convert('RGB')
if len(depth.split()) == 3:
depth = depth.convert('L')
if len(mask.split()) == 3:
mask = mask.convert('L')
img, depth, mask = self.train_joint_transform(img, depth, mask)
if self.is_labeled_rotation:
img = img.rotate(self.rotations[rotate_index])
depth = depth.rotate(self.rotations[rotate_index])
mask = mask.rotate(self.rotations[rotate_index])
mask = self.train_mask_transform(mask).float()
img = self.train_img_transform(img).float()
depth = self.train_depth_transform(depth).float()
# depth = (depth - torch.min(depth)) / (torch.max(depth) - torch.min(depth) + torch.tensor(1.0e-6))
if self.use_bigt:
mask = mask.ge(0.5).float() # 二值化
# rotate_label = torch.zeros((self.times), dtype = torch.int64).scatter_(0, torch.LongTensor([ rotate_index ]), torch.LongTensor([ 1 ])).long()
rotate_label = torch.tensor(rotate_index).long()
img_name = (img_path.split(os.sep)[-1]).split('.')[0]
return img, depth, mask, rotate_label, img_name
else:
main_index -= len(self.imgs)
# without unlabeled_mask_path
unlabeled_img_path, unlabeled_depth_path = self.unlabeled_imgs[main_index]
unlabeled_img = Image.open(unlabeled_img_path)
unlabeled_depth = Image.open(unlabeled_depth_path)
if len(unlabeled_img.split()) != 3:
unlabeled_img = unlabeled_img.convert('RGB')
if len(unlabeled_depth.split()) == 3:
unlabeled_depth = unlabeled_depth.convert('L')
unlabeled_img, unlabeled_depth = self.train_joint_transform(unlabeled_img, unlabeled_depth)
unlabeled_img = unlabeled_img.rotate(self.rotations[rotate_index])
unlabeled_depth = unlabeled_depth.rotate(self.rotations[rotate_index])
unlabeled_img = self.train_img_transform(unlabeled_img).float()
unlabeled_depth = self.train_depth_transform(unlabeled_depth).float()
unlabeled_depth = (unlabeled_depth - torch.min(unlabeled_depth)) / (torch.max(unlabeled_depth) - torch.min(unlabeled_depth) + torch.tensor(1.0e-6)) * torch.tensor(255.0)
unlabeled_mask = torch.zeros((1, self.in_size, self.in_size)).float() # dummy
# unlabeled_rotate_label = torch.zeros((self.times), dtype = torch.int64).scatter_(0, torch.LongTensor([ rotate_index ]), torch.LongTensor([ 1 ]))
unlabeled_rotate_label = torch.tensor(rotate_index).long()
unlabeled_img_name = (unlabeled_img_path.split(os.sep)[-1]).split('.')[0]
return unlabeled_img, unlabeled_depth, unlabeled_mask, unlabeled_rotate_label, unlabeled_img_name
def __len__(self):
return (len(self.imgs) + len(self.unlabeled_imgs))
# return (len(self.imgs) + len(self.unlabeled_imgs)) * self.times
def get_primary_secondary_indices(self):
return np.arange(len(self.imgs)), np.arange(len(self.imgs), len(self.unlabeled_imgs))
# return np.arange(len(self.imgs) * self.times), np.arange(len(self.imgs) * self.times, len(self.unlabeled_imgs) * self.times)
| 43.586806 | 181 | 0.605393 | 3,167 | 25,106 | 4.573098 | 0.06473 | 0.0232 | 0.016571 | 0.018366 | 0.911413 | 0.906649 | 0.884554 | 0.869986 | 0.853483 | 0.851412 | 0 | 0.023593 | 0.26898 | 25,106 | 575 | 182 | 43.662609 | 0.765542 | 0.076277 | 0 | 0.803922 | 0 | 0 | 0.046595 | 0.00104 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071895 | false | 0 | 0.017429 | 0.021786 | 0.165577 | 0.034858 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0813385bc973fa337b1bb9a19c29bd3f1550f664 | 2,450 | py | Python | U-GAT-IT/lib/data/dataset.py | gcinbis/deep-generative-models-spring20 | d377bd63d5e79539477cca47c71462e5cc12adfa | [
"MIT"
] | 18 | 2020-07-06T10:47:26.000Z | 2021-05-30T11:43:17.000Z | U-GAT-IT/lib/data/dataset.py | gcinbis/deep-generative-models-course-projects | d377bd63d5e79539477cca47c71462e5cc12adfa | [
"MIT"
] | 1 | 2022-03-12T00:39:12.000Z | 2022-03-12T00:39:12.000Z | U-GAT-IT/lib/data/dataset.py | gcinbis/deep-generative-models-spring20 | d377bd63d5e79539477cca47c71462e5cc12adfa | [
"MIT"
] | 2 | 2020-07-13T20:46:44.000Z | 2020-10-01T13:15:25.000Z | from PIL import Image
from pathlib import Path
from lib.base import BaseDataset
class ImageFoldersAB(BaseDataset):
def __init__(self, root, dataset_dir, width, height, mean, std, mode, **kwargs):
super().__init__(width, height, mean, std, mode)
self.root = Path(root).resolve()
self.dataset_dir = self.root / dataset_dir
self.files = self.get_files()
self.transforms = self.train_transform if mode == 'train' else self.test_transform
def get_files(self):
domainA_imgs = list((self.dataset_dir / f'{self.mode}A').glob("*.*"))
domainB_imgs = list((self.dataset_dir / f'{self.mode}B').glob("*.*"))
return list(zip(domainA_imgs, domainB_imgs))
def _load_data_(self, index):
img_A_path, img_B_path = self.files[index][0], self.files[index][1]
img_A = Image.open(img_A_path).convert("RGB")
img_B = Image.open(img_B_path).convert('RGB')
return img_A, img_B
def __getitem__(self, index):
img_A, img_B = self._load_data_(index)
img_A = self.transforms(img_A)
img_B = self.transforms(img_B)
return {'A': img_A, 'B': img_B}
class ImageMemoryAB(BaseDataset):
def __init__(self, root, dataset_dir, width, height, mean, std, mode, **kwargs):
super().__init__(width, height, mean, std, mode)
self.root = Path(root).resolve()
self.dataset_dir = self.root / dataset_dir
self.files = self.get_files()
self.transforms = self.train_transform if mode == 'train' else self.test_transform
self.dataset = self.load_all_images()
def get_files(self):
domainA_imgs = list((self.dataset_dir / f'{self.mode}A').glob("*.*"))
domainB_imgs = list((self.dataset_dir / f'{self.mode}B').glob("*.*"))
return list(zip(domainA_imgs, domainB_imgs))
def _load_data_(self, index):
img_A_path, img_B_path = self.files[index][0], self.files[index][1]
img_A = Image.open(img_A_path).convert("RGB")
img_B = Image.open(img_B_path).convert('RGB')
return img_A, img_B
def load_all_images(self):
dataset = []
for i in range(len(self.files)):
dataset.append(self._load_data_(i))
return dataset
def __getitem__(self, index):
img_A, img_B = self.dataset[index]
img_A = self.transforms(img_A)
img_B = self.transforms(img_B)
return {'A': img_A, 'B': img_B} | 38.888889 | 90 | 0.636735 | 353 | 2,450 | 4.11898 | 0.175637 | 0.044017 | 0.057772 | 0.033012 | 0.839752 | 0.839752 | 0.839752 | 0.839752 | 0.839752 | 0.799175 | 0 | 0.002103 | 0.223673 | 2,450 | 63 | 91 | 38.888889 | 0.762355 | 0 | 0 | 0.745098 | 0 | 0 | 0.035088 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0 | 0.058824 | 0 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
f2358e6ca265bca4c790b56c5e678943fa675cc7 | 35 | py | Python | data_scraper/utils/__init__.py | tg970/co-big-game-tags | fa7f8b0d2ecc28c2706b65e8313baeee6a35e69d | [
"MIT"
] | null | null | null | data_scraper/utils/__init__.py | tg970/co-big-game-tags | fa7f8b0d2ecc28c2706b65e8313baeee6a35e69d | [
"MIT"
] | null | null | null | data_scraper/utils/__init__.py | tg970/co-big-game-tags | fa7f8b0d2ecc28c2706b65e8313baeee6a35e69d | [
"MIT"
] | null | null | null | from . import read_pdf as read_pdf
| 17.5 | 34 | 0.8 | 7 | 35 | 3.714286 | 0.714286 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 35 | 1 | 35 | 35 | 0.896552 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
f2bbf88f5f993f019945def0992402303d7fcbe4 | 143 | py | Python | virtual/lib/python3.6/site-packages/tests/custom_reviews/views.py | Eccie-K/Awards | 05bedf7c8aba4168d25715197d5bf3ad3e712ff8 | [
"MIT"
] | 15 | 2019-02-16T12:17:30.000Z | 2022-03-27T20:11:49.000Z | virtual/lib/python3.6/site-packages/tests/custom_reviews/views.py | Eccie-K/Awards | 05bedf7c8aba4168d25715197d5bf3ad3e712ff8 | [
"MIT"
] | 7 | 2019-05-10T08:27:14.000Z | 2021-04-26T15:19:06.000Z | virtual/lib/python3.6/site-packages/tests/custom_reviews/views.py | Eccie-K/Awards | 05bedf7c8aba4168d25715197d5bf3ad3e712ff8 | [
"MIT"
] | 8 | 2019-11-07T21:05:10.000Z | 2021-08-03T06:59:37.000Z | from django.http import HttpResponse
def custom_submit_review(request):
return HttpResponse("Hello from the custom submit review view.")
| 23.833333 | 68 | 0.797203 | 19 | 143 | 5.894737 | 0.736842 | 0.214286 | 0.321429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.13986 | 143 | 5 | 69 | 28.6 | 0.910569 | 0 | 0 | 0 | 0 | 0 | 0.286713 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 8 |
4b506e84ec853a256a58189246f136ddb8cd3a36 | 31,895 | py | Python | OnCruiseRoutines/CTD_Vis/ncprocessing.py | shaunwbell/AtSeaPrograms | 839ee4dc1cf7a85bce1de82b04379c6d1670c414 | [
"MIT"
] | 1 | 2016-08-03T17:03:10.000Z | 2016-08-03T17:03:10.000Z | OnCruiseRoutines/CTD_Vis/ncprocessing.py | shaunwbell/AtSeaPrograms | 839ee4dc1cf7a85bce1de82b04379c6d1670c414 | [
"MIT"
] | 2 | 2016-07-25T17:25:55.000Z | 2016-10-05T22:29:43.000Z | OnCruiseRoutines/CTD_Vis/ncprocessing.py | shaunwbell/AtSeaPrograms | 839ee4dc1cf7a85bce1de82b04379c6d1670c414 | [
"MIT"
] | 1 | 2020-01-15T06:22:23.000Z | 2020-01-15T06:22:23.000Z | #!/usr/bin/env
"""
ncprossessing.py
Seabird CNV only
Built using Anaconda packaged Python:
"""
from __future__ import absolute_import
# Standard library.
import datetime, os
# Scientific stack.
from netCDF4 import Dataset
# User library
from OnCruiseRoutines.EPICNetCDF import SBE_Epiclibrary
from OnCruiseRoutines.EPICNetCDF import epic_key_codes as ekc
__author__ = 'Shaun Bell'
__email__ = 'shaun.bell@noaa.gov'
__created__ = datetime.datetime(2014, 01, 13)
__modified__ = datetime.datetime(2014, 10, 10)
__version__ = "0.2.0"
__status__ = "Development"
"""-------------------------------NCFile Creation--------------------------------------"""
"""-------------------------------EPIC Standard----------------------------------------"""
class CTD_NC(object):
""" Class instance to generate a NetCDF file.
Assumes data format and information ingested is a dataframe object from ctd.py
Standards
---------
EPICNetCDF (PMEL) Standards
Usage
-----
Order of routines matters and no error checking currently exists
ToDo: Error Checking
Use this to create a nc file with all default values
ncinstance = CTD_NC()
ncinstance.file_create()
ncinstance.sbeglobal_atts()
ncinstance.PMELglobal_atts()
ncinstance.dimension_init()
ncinstance.variable_init()
ncinstance.add_coord_data()
ncinstance.add_data()
ncinstance.close()
"""
nc_format = 'NETCDF3_CLASSIC'
nc_read = 'w'
def __init__(self, savefile='ncfiles/test.nc', data=None):
"""data is a pandas dataframe"""
self.data = data
self.savefile = savefile
def file_create(self):
rootgrpID = Dataset(self.savefile, CTD_NC.nc_read, format=CTD_NC.nc_format)
self.rootgrpID = rootgrpID
return ( rootgrpID )
def sbeglobal_atts(self, coord_system="GEOGRAPHICAL", Water_Mass="G", featureType="Profile"):
"""
Assumptions
-----------
Format of DataFrame.name = 'dy1309l1_ctd001'
seabird related global attributes found in DataFrame.header list
"""
self.rootgrpID.CREATION_DATE = datetime.datetime.utcnow().strftime("%B %d, %Y %H:%M UTC")
self.rootgrpID.CRUISE = self.data.name.split('_')[0]
self.rootgrpID.CAST = self.data.name.split('_')[-1]
self.rootgrpID.INST_TYPE = self.data.header[0]
self.rootgrpID.DATA_TYPE = 'CTD'
self.rootgrpID.DATA_CMNT = self.data.header[1].replace('hex','cnv')
self.rootgrpID.COORD_SYSTEM = coord_system
self.rootgrpID.WATER_MASS = Water_Mass
self.rootgrpID.featureType = featureType
def PMELglobal_atts(self, Barometer=9999, Wind_Dir=999, Wind_Speed=99,
Air_Temp=99.9, Water_Depth=9999, Prog_Cmnt='',
Edit_Cmnt='', Station_Name='', sfc_extend='', Station_No=''):
"""
Assumptions
-----------
Format of DataFrame.name = 'dy1309l1_ctd001'
seabird related global attributes found in DataFrame.header list
Options
-------
Todo
-----
Retrieve PMEL header information from '@' comments in .cnv file or from separate
header.txt file
"""
#From PMELheader
self.rootgrpID.BAROMETER = Barometer
self.rootgrpID.WIND_DIR = Wind_Dir
self.rootgrpID.WIND_SPEED = Wind_Speed
self.rootgrpID.AIR_TEMP = Air_Temp
self.rootgrpID.WATER_DEPTH = Water_Depth
self.rootgrpID.STATION_NAME = Station_Name
self.rootgrpID.STATION_NO = Station_No
self.rootgrpID.EPIC_FILE_GENERATOR = 'ncprossessing.py V' + __version__
self.rootgrpID.PROG_CMNT01 = Prog_Cmnt
self.rootgrpID.EDIT_CMNT01 = Edit_Cmnt
self.rootgrpID.SFC_EXTEND = sfc_extend
pass
def dimension_init(self):
"""
Assumes
-------
Dimensions will be 'time', 'depth', 'lat', 'lon'
Todo
----
User defined dimensions
"""
self.dim_vars = ['time', 'dep', 'lat', 'lon']
self.rootgrpID.createDimension( self.dim_vars[0], 1 ) #time
self.rootgrpID.createDimension( self.dim_vars[1], self.data.shape[0] ) #depth
self.rootgrpID.createDimension( self.dim_vars[2], 1 ) #lat
self.rootgrpID.createDimension( self.dim_vars[3], 1 ) #lon
def variable_init(self):
"""data.columns.values is a list of all parameters in seabird file.
We need to match these to EPIC key codes. These can be found in the EPICNetCDF folder
Usage:
------
from EPICnetCDF import SBE_Epiclibrary
from EPICnetCDF import epic_key_codes as ekc
ekcl = ekc.EpicKeyCodes()
ekcl.epic_dic_call(SBE_Epiclibrary.SBE_EPIC['sal00'])
#will return
#['S ', 'SALINITY (PSU) ', 'sal', 'PSU', ' ', 'Practical Salinity Units']
DataFrame.columns.values[0]
"""
ekcl = ekc.EpicKeyCodes()
self.epicvars = {}
self.sbe2epic = {}
# get list of only epic variables in sbe file
for pname in self.data.columns.values:
try:
self.epicvars[pname] = ekcl.epic_dic_call(SBE_Epiclibrary.SBE_EPIC[pname])
self.sbe2epic[pname] = SBE_Epiclibrary.SBE_EPIC[pname]
print pname
except KeyError:
print "%s is not in the SBE_Epiclibrary and will not be added to the .nc file" % pname
#build record variable attributes
rec_vars, rec_var_name, rec_var_longname = [], [], []
rec_var_generic_name, rec_var_FORTRAN, rec_var_units, rec_var_epic = [], [], [], []
# for each epic variable, build required metainformation from epic.key file
# temperatures should always be first
for i, k in enumerate(sorted(self.epicvars.keys())):
kname = self.epicvars[k]
if kname is None: #dave K designated variables which aren't in epic.key of form -4084 for key 84
print "Variables in .cnv file %s using identifier %s" % (k, self.sbe2epic[k].split('_')[-1])
kname = ekcl.epic_dic_call( self.sbe2epic[k][-2:] )
rec_vars.append('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k].split('-')[-1])))
elif (kname[0].strip().lower()) is not '': #no variables without Epic Keys
print "Variables in .cnv file %s" % ('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k])))
rec_vars.append('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k])))
else:
print "No EPICkey. Variables in .cnv file %s - %s skipped" % (k, self.sbe2epic[k].split('_')[-1])
continue
rec_var_name.append( kname[0].strip() )
rec_var_longname.append( kname[1].strip() )
rec_var_generic_name.append( kname[2].strip() )
rec_var_units.append( kname[3].strip() )
rec_var_FORTRAN.append( kname[4].strip() )
rec_var_epic.append( int(self.sbe2epic[k].split('_')[-1]) )
rec_vars = ['time','time2','dep','lat','lon'] + rec_vars
rec_var_name = ['', '', '', '', ''] + rec_var_name
rec_var_longname = ['', '', '', '', ''] + rec_var_longname
rec_var_generic_name = ['', '', '', '', ''] + rec_var_generic_name
rec_var_FORTRAN = ['', '', '', '', ''] + rec_var_FORTRAN
rec_var_units = ['True Julian Day', 'msec since 0:00 GMT','dbar','degree_north','degree_west'] + rec_var_units
rec_var_type= ['i4', 'i4'] + ['f4' for spot in rec_vars[2:]]
rec_var_strtype= ['EVEN', 'EVEN', 'EVEN', 'EVEN', 'EVEN']
rec_epic_code = [624,624,1,500,501] + rec_var_epic
var_class = []
var_class.append(self.rootgrpID.createVariable(rec_vars[0], rec_var_type[0], self.dim_vars[0]))#time1
var_class.append(self.rootgrpID.createVariable(rec_vars[1], rec_var_type[1], self.dim_vars[0]))#time2
var_class.append(self.rootgrpID.createVariable(rec_vars[2], rec_var_type[2], self.dim_vars[1]))#depth
var_class.append(self.rootgrpID.createVariable(rec_vars[3], rec_var_type[3], self.dim_vars[2]))#lat
var_class.append(self.rootgrpID.createVariable(rec_vars[4], rec_var_type[4], self.dim_vars[3]))#lon
for i, v in enumerate(rec_vars[5:]): #1D coordinate variables
var_class.append(self.rootgrpID.createVariable(rec_vars[i+5], rec_var_type[i+5], self.dim_vars))
### add variable attributes
for i, v in enumerate(var_class): #4dimensional for all vars
print ("Adding Variable {0}").format(v)#
v.setncattr('name',rec_var_name[i])
v.long_name = rec_var_longname[i]
v.generic_name = rec_var_generic_name[i]
v.FORTRAN_format = rec_var_FORTRAN[i]
v.units = rec_var_units[i]
if (i <= 4) :
#no type indicator for non dimensional variables
v.type = rec_var_strtype[i]
v.epic_code = rec_epic_code[i]
self.var_class = var_class
self.rec_vars = rec_vars
def add_coord_data(self, pressure_var='prDM', latitude=None, longitude=None, time1=None, time2=None, CastLog=False):
""" """
self.var_class[0][:] = time1
self.var_class[1][:] = time2
if not CastLog:
self.var_class[2][:] = self.data[pressure_var].values
try:
self.var_class[3][:] = self.data.latitude
self.var_class[4][:] = -1 * self.data.longitude #PMEL standard direction
except:
pass
else:
self.var_class[2][:] = self.data[pressure_var].values
self.var_class[3][:] = latitude
self.var_class[4][:] = -1 * longitude #PMEL standard direction W is +
def add_data(self):
""" """
ekcl = ekc.EpicKeyCodes()
for k in self.data.columns.values:
try:
kname = self.epicvars[k]
if kname is None: #dave K designated variables which aren't in epic.key of form -4084 for key 84
kname = ekcl.epic_dic_call( self.sbe2epic[k][-2:] )
temp = ('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k].split('-')[-1])))
elif (kname[0].strip().lower()) is not '': #no variables without Epic Keys
temp = ('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k])))
else:
continue
except:
continue
di = self.rec_vars.index(temp)
self.var_class[di][:] = self.data[k].values
def add_history(self, new_history):
"""Adds timestamp (UTC time) and history to existing information"""
self.History = self.History + ' ' + datetime.datetime.utcnow().strftime("%B %d, %Y %H:%M UTC")\
+ ' ' + new_history + '\n'
def close(self):
self.rootgrpID.close()
class CTD_IPHC(object):
""" Class instance to generate a NetCDF file.
Assumes data format and information ingested is a dataframe object from ctd.py
Standards
---------
EPICNetCDF (PMEL) Standards
Usage
-----
Order of routines matters and no error checking currently exists
ToDo: Error Checking
Use this to create a nc file with all default values
ncinstance = CTD_IPHC()
ncinstance.file_create()
ncinstance.sbeglobal_atts()
ncinstance.PMELglobal_atts()
ncinstance.dimension_init()
ncinstance.variable_init()
ncinstance.add_coord_data()
ncinstance.add_data()
ncinstance.close()
"""
nc_format = 'NETCDF3_CLASSIC'
nc_read = 'w'
def __init__(self, savefile='ncfiles/test.nc', data=None):
"""data is a pandas dataframe"""
self.data = data
self.savefile = savefile
def file_create(self):
rootgrpID = Dataset(self.savefile, CTD_IPHC.nc_read, format=CTD_IPHC.nc_format)
self.rootgrpID = rootgrpID
return ( rootgrpID )
def sbeglobal_atts(self, cruise='', cast='', coord_system="GEOGRAPHICAL", Water_Mass="G", featureType="Profile"):
"""
Assumptions
-----------
Format of DataFrame.name = 'dy1309l1_ctd001'
seabird related global attributes found in DataFrame.header list
"""
self.rootgrpID.CREATION_DATE = datetime.datetime.utcnow().strftime("%B %d, %Y %H:%M UTC")
self.rootgrpID.CRUISE = cruise
self.rootgrpID.CAST = cast
self.rootgrpID.INST_TYPE = self.data.header[0]
self.rootgrpID.DATA_TYPE = 'CTD'
self.rootgrpID.DATA_CMNT = self.data.header[1].replace('hex','cnv')
self.rootgrpID.COORD_SYSTEM = coord_system
self.rootgrpID.WATER_MASS = Water_Mass
self.rootgrpID.featureType = featureType
def PMELglobal_atts(self, Barometer=9999, Wind_Dir=999, Wind_Speed=99,
Air_Temp=99.9, Water_Depth=9999, Prog_Cmnt='', Edit_Cmnt='', Station_Name='', sfc_extend=''):
"""
Assumptions
-----------
Format of DataFrame.name = 'dy1309l1_ctd001'
seabird related global attributes found in DataFrame.header list
Options
-------
Todo
-----
Retrieve PMEL header information from '@' comments in .cnv file or from separate
header.txt file
"""
#From PMELheader
self.rootgrpID.BAROMETER = Barometer
self.rootgrpID.WIND_DIR = Wind_Dir
self.rootgrpID.WIND_SPEED = Wind_Speed
self.rootgrpID.AIR_TEMP = Air_Temp
self.rootgrpID.WATER_DEPTH = Water_Depth
self.rootgrpID.STATION_NAME = Station_Name
self.rootgrpID.EPIC_FILE_GENERATOR = 'ncprossessing.py V' + __version__
self.rootgrpID.PROG_CMNT01 = Prog_Cmnt
self.rootgrpID.EDIT_CMNT01 = Edit_Cmnt
self.rootgrpID.SFC_EXTEND = sfc_extend
def IPHC_atts(self, vslcde=None, setno=None, stnno=None, trpno=None, region=None):
self.rootgrpID.VSLCDE = vslcde
self.rootgrpID.SETNO = setno
self.rootgrpID.STNNO = stnno
self.rootgrpID.TRPNO = trpno
self.rootgrpID.REGION = region
def dimension_init(self):
"""
Assumes
-------
Dimensions will be 'time', 'depth', 'lat', 'lon'
Todo
----
User defined dimensions
"""
self.dim_vars = ['time', 'dep', 'lat', 'lon']
self.rootgrpID.createDimension( self.dim_vars[0], 1 ) #time
self.rootgrpID.createDimension( self.dim_vars[1], self.data.shape[0] ) #depth
self.rootgrpID.createDimension( self.dim_vars[2], 1 ) #lat
self.rootgrpID.createDimension( self.dim_vars[3], 1 ) #lon
def variable_init(self):
"""data.columns.values is a list of all parameters in seabird file.
We need to match these to EPIC key codes. These can be found in the EPICNetCDF folder
Usage:
------
from EPICnetCDF import SBE_Epiclibrary
from EPICnetCDF import epic_key_codes as ekc
ekcl = ekc.EpicKeyCodes()
ekcl.epic_dic_call(SBE_Epiclibrary.SBE_EPIC['sal00'])
#will return
#['S ', 'SALINITY (PSU) ', 'sal', 'PSU', ' ', 'Practical Salinity Units']
DataFrame.columns.values[0]
"""
ekcl = ekc.EpicKeyCodes()
self.epicvars = {}
self.sbe2epic = {}
# get list of only epic variables in sbe file
for pname in self.data.columns.values:
try:
self.epicvars[pname] = ekcl.epic_dic_call(SBE_Epiclibrary.SBE_EPIC[pname])
self.sbe2epic[pname] = SBE_Epiclibrary.SBE_EPIC[pname]
print pname
except KeyError:
print "%s is not in the SBE_Epiclibrary and will not be added to the .nc file" % pname
#build record variable attributes
rec_vars, rec_var_name, rec_var_longname = [], [], []
rec_var_generic_name, rec_var_FORTRAN, rec_var_units, rec_var_epic = [], [], [], []
# for each epic variable, build required metainformation from epic.key file
# temperatures should always be first
for i, k in enumerate(sorted(self.epicvars.keys())):
kname = self.epicvars[k]
if kname is None: #dave K designated variables which aren't in epic.key of form -4084 for key 84
print "Variables in .cnv file %s using identifier %s" % (k, self.sbe2epic[k].split('_')[-1])
kname = ekcl.epic_dic_call( self.sbe2epic[k][-2:] )
rec_vars.append('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k].split('-')[-1])))
elif (kname[0].strip().lower()) is not '': #no variables without Epic Keys
print "Variables in .cnv file %s" % ('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k])))
rec_vars.append('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k])))
else:
print "No EPICkey. Variables in .cnv file %s - %s skipped" % (k, self.sbe2epic[k].split('_')[-1])
continue
rec_var_name.append( kname[0].strip() )
rec_var_longname.append( kname[1].strip() )
rec_var_generic_name.append( kname[2].strip() )
rec_var_units.append( kname[3].strip() )
rec_var_FORTRAN.append( kname[4].strip() )
rec_var_epic.append( int(self.sbe2epic[k].split('_')[-1]) )
rec_vars = ['time','time2','dep','lat','lon'] + rec_vars
rec_var_name = ['', '', '', '', ''] + rec_var_name
rec_var_longname = ['', '', '', '', ''] + rec_var_longname
rec_var_generic_name = ['', '', '', '', ''] + rec_var_generic_name
rec_var_FORTRAN = ['', '', '', '', ''] + rec_var_FORTRAN
rec_var_units = ['True Julian Day', 'msec since 0:00 GMT','dbar','degree_north','degree_west'] + rec_var_units
rec_var_type= ['i4', 'i4'] + ['f4' for spot in rec_vars[2:]]
rec_var_strtype= ['EVEN', 'EVEN', 'EVEN', 'EVEN', 'EVEN']
rec_epic_code = [624,624,1,500,501] + rec_var_epic
var_class = []
var_class.append(self.rootgrpID.createVariable(rec_vars[0], rec_var_type[0], self.dim_vars[0]))#time1
var_class.append(self.rootgrpID.createVariable(rec_vars[1], rec_var_type[1], self.dim_vars[0]))#time2
var_class.append(self.rootgrpID.createVariable(rec_vars[2], rec_var_type[2], self.dim_vars[1]))#depth
var_class.append(self.rootgrpID.createVariable(rec_vars[3], rec_var_type[3], self.dim_vars[2]))#lat
var_class.append(self.rootgrpID.createVariable(rec_vars[4], rec_var_type[4], self.dim_vars[3]))#lon
for i, v in enumerate(rec_vars[5:]): #1D coordinate variables
var_class.append(self.rootgrpID.createVariable(rec_vars[i+5], rec_var_type[i+5], self.dim_vars))
### add variable attributes
for i, v in enumerate(var_class): #4dimensional for all vars
print ("Adding Variable {0}").format(v)#
v.setncattr('name',rec_var_name[i])
v.long_name = rec_var_longname[i]
v.generic_name = rec_var_generic_name[i]
v.FORTRAN_format = rec_var_FORTRAN[i]
v.units = rec_var_units[i]
if (i <= 4) :
#no type indicator for non dimensional variables
v.type = rec_var_strtype[i]
v.epic_code = rec_epic_code[i]
self.var_class = var_class
self.rec_vars = rec_vars
def add_coord_data(self, pressure_var='prDM', latitude=None, longitude=None, time1=None, time2=None, CastLog=False):
""" """
self.var_class[0][:] = time1
self.var_class[1][:] = time2
if not CastLog:
self.var_class[2][:] = self.data[pressure_var].values
self.var_class[3][:] = self.data.latitude
self.var_class[4][:] = -1 * self.data.longitude #PMEL standard direction
else:
self.var_class[2][:] = self.data[pressure_var].values
self.var_class[3][:] = latitude
self.var_class[4][:] = -1 * longitude #PMEL standard direction W is +
def add_data(self):
""" """
ekcl = ekc.EpicKeyCodes()
for k in self.data.columns.values:
try:
kname = self.epicvars[k]
if kname is None: #dave K designated variables which aren't in epic.key of form -4084 for key 84
kname = ekcl.epic_dic_call( self.sbe2epic[k][-2:] )
temp = ('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k].split('-')[-1])))
elif (kname[0].strip().lower()) is not '': #no variables without Epic Keys
temp = ('_'.join((kname[0].strip().strip('\\'), self.sbe2epic[k])))
else:
continue
except:
continue
di = self.rec_vars.index(temp)
self.var_class[di][:] = self.data[k].values
def add_history(self, new_history):
"""Adds timestamp (UTC time) and history to existing information"""
self.History = self.History + ' ' + datetime.datetime.utcnow().strftime("%B %d, %Y %H:%M UTC")\
+ ' ' + new_history + '\n'
def close(self):
self.rootgrpID.close()
"""-----------------------------CF V1.6 / COARDS Standards-----------------------------"""
class CF_CTD_NC(object):
"""
Class instance to generate a NetCDF file.
Assumes data format and information ingested is a dataframe object from ctd.py
Standards
---------
CF V1.6 / COARDS Standards
Usage
-----
Order of routines matters and no error checking currently exists
ToDo: Error Checking
Use this to create a nc file with all default values
ncinstance = CF_CTD_NC()
ncinstance.file_create()
ncinstance.sbeglobal_atts()
ncinstance.PMELglobal_atts()
ncinstance.dimension_init()
ncinstance.variable_init()
ncinstance.add_coord_data()
ncinstance.add_data()
ncinstance.close()
"""
nc_format = 'NETCDF3_CLASSIC'
nc_read = 'w'
def __init__(self, savefile='ncfiles/test.nc', data=None):
"""data is a pandas dataframe"""
self.data = data
self.savefile = savefile
def file_create(self):
rootgrpID = Dataset(self.savefile, CTD_NC.nc_read, format=CTD_NC.nc_format)
self.rootgrpID = rootgrpID
return ( rootgrpID )
def sbeglobal_atts(self, coord_system="GEOGRAPHICAL", Water_Mass="G"):
"""
Assumptions
-----------
Format of DataFrame.name = 'dy1309l1_ctd001'
seabird related global attributes found in DataFrame.header list
"""
self.rootgrpID.creation_date = datetime.datetime.utcnow().strftime("%B %d, %Y %H:%M UTC")
self.rootgrpID.cruiseID = self.data.name.split('_')[0]
self.rootgrpID.castID = self.data.name.split('_')[-1]
self.rootgrpID.instrument_type = self.data.header[0]
self.rootgrpID.data_type = 'CTD'
self.rootgrpID.data_comment = self.data.header[0].replace('hex','cnv')
self.rootgrpID.coordinate_system = coord_system
self.rootgrpID.water_mass = Water_Mass
def PMELglobal_atts(self, Barometer=9999, Wind_Dir=999, Wind_Speed=99,
Air_Temp=99.9, Water_Depth=9999, Prog_Cmnt='', Edit_Cmnt='', Station_Name='', sfc_extend=''):
"""
Assumptions
-----------
Format of DataFrame.name = 'dy1309l1_ctd001'
seabird related global attributes found in DataFrame.header list
Todo
----
Either ingest .cnv files with '@' comment codes, or a seperate text file
"""
#From PMELheader
self.rootgrpID.barometer = Barometer
self.rootgrpID.wind_direction = Wind_Dir
self.rootgrpID.wind_speed = Wind_Speed
self.rootgrpID.air_temperature = Air_Temp
self.rootgrpID.water_depth = Water_Depth
self.rootgrpID.station_name = Station_Name
self.rootgrpID.ingest_software = 'ncprossessing.py V' + __version__
self.rootgrpID.processing_level = 'a0'
self.rootgrpID.history = ''
self.rootgrpID.conventions = 'COARDS'
self.rootgrpID.surface_extend = sfc_extend
pass
def dimension_init(self):
"""
Assumes
-------
Dimensions will be 'time', 'depth', 'lat', 'lon'
Todo
----
User defined dimensions
"""
self.dim_vars = ['time', 'dep', 'lat', 'lon']
self.rootgrpID.createDimension( self.dim_vars[0], 1 ) #time
self.rootgrpID.createDimension( self.dim_vars[1], self.data.shape[0] ) #depth
self.rootgrpID.createDimension( self.dim_vars[2], 1 ) #lat
self.rootgrpID.createDimension( self.dim_vars[3], 1 ) #lon
def variable_init(self):
"""data.columns.values is a list of all parameters in seabird file.
We need to match these to EPIC key codes. These can be found in the EPICNetCDF folder
Usage:
------
from EPICnetCDF import SBE_Epiclibrary
from EPICnetCDF import epic_key_codes as ekc
ekcl = ekc.EpicKeyCodes()
ekcl.epic_dic_call(SBE_Epiclibrary.SBE_EPIC['sal00'])
#will return
#['S ', 'SALINITY (PSU) ', 'sal', 'PSU', ' ', 'Practical Salinity Units']
DataFrame.columns.values[0]
"""
ekcl = ekc.EpicKeyCodes()
self.epicvars = {}
self.sbe2epic = {}
# get list of only epic variables in sbe file
for pname in self.data.columns.values:
try:
self.epicvars[pname] = ekcl.epic_dic_call(SBE_Epiclibrary.SBE_EPIC[pname])
self.sbe2epic[pname] = SBE_Epiclibrary.SBE_EPIC[pname]
except KeyError:
print "%s is not in the SBE_Epiclibrary and will not be added to the .nc file" % pname
#build record variable attributes
rec_vars, rec_var_name, rec_var_longname = [], [], []
rec_var_generic_name, rec_var_missing, rec_var_units, rec_epic_code = [], [], [], []
# for each epic variable, build required metainformation from epic.key file
for i, k in enumerate(self.epicvars.keys()):
kname = self.epicvars[k]
if kname is None: # variables not in epic.key but given epic like codes
# these are often secondary instruments
kname = ekcl.epic_dic_call(self.sbe2epic[k][-2:])
print "Variables in .cnv file %s listed as secondary" % ( k )
rec_vars.append( k.replace('/','per') )
rec_var_name.append( kname[0].strip() )
rec_var_longname.append( kname[1].strip() )
rec_var_generic_name.append( kname[2].strip() )
rec_var_units.append( kname[3].strip() )
rec_epic_code.append( self.sbe2epic[k] )
else:
print "Variables in .cnv file %s" % ( k )
rec_vars.append( k.replace('/','per') )
rec_var_name.append( kname[0].strip() )
rec_var_longname.append( kname[1].strip() )
rec_var_generic_name.append( kname[2].strip() )
rec_var_units.append( kname[3].strip() )
rec_epic_code.append( self.sbe2epic[k] )
#hard coded variables are expected coordinate variables
rec_vars = ['time','dep','lat','lon'] + rec_vars
rec_var_name = ['', '', '', ''] + rec_var_name
rec_var_longname = ['', '', '', ''] + rec_var_longname
rec_var_generic_name = ['', '', '', ''] + rec_var_generic_name
rec_var_units = ['Days Since 01 01 0001 00:00:00','dbar','degrees_north','degrees_east'] + rec_var_units
rec_var_type= ['f8'] + ['f4' for spot in rec_vars[1:]]
rec_epic_code = ['','1','500','501'] + rec_epic_code
rec_var_missing = [-9999. for spot in rec_vars]
var_class = []
var_class.append(self.rootgrpID.createVariable(rec_vars[0], rec_var_type[0], self.dim_vars[0]))#time1
var_class.append(self.rootgrpID.createVariable(rec_vars[1], rec_var_type[1], self.dim_vars[1]))#depth
var_class.append(self.rootgrpID.createVariable(rec_vars[2], rec_var_type[2], self.dim_vars[2]))#lat
var_class.append(self.rootgrpID.createVariable(rec_vars[3], rec_var_type[3], self.dim_vars[3]))#lon
for i, v in enumerate(rec_vars[4:]): #1D coordinate variables
var_class.append(self.rootgrpID.createVariable(rec_vars[i+4], rec_var_type[i+4], self.dim_vars))
### add variable attributes
for i, v in enumerate(var_class): #4dimensional for all vars
print ("Adding Variable {0}").format(v)#
v.setncattr('name',rec_var_name[i])
v.long_name = rec_var_longname[i]
v.generic_name = rec_var_generic_name[i]
v.units = rec_var_units[i]
v.historic_epic_code = rec_epic_code[i]
v.missing_value = rec_var_missing[i]
self.var_class = var_class
self.rec_vars = rec_vars
def add_coord_data(self, pressure_var='prDM', latitude=None, longitude=None, time=None, CastLog=False):
self.var_class[0][:] = time
if not CastLog:
self.var_class[1][:] = self.data[pressure_var].values
self.var_class[2][:] = self.data.latitude
self.var_class[3][:] = self.data.longitude # +/- East/West
else:
self.var_class[1][:] = self.data[pressure_var].values
self.var_class[2][:] = latitude
self.var_class[3][:] = longitude # +/- East/West
def add_data(self):
try:
for pname in self.data.columns.values:
di = self.rec_vars.index(pname.replace('/','per'))
self.var_class[di][:] = self.data[pname].values
except ValueError:
print "%s is not in epic library and will not be added" % (pname)
def add_history(self, new_history):
"""Adds timestamp (UTC time) and history to existing information"""
self.history = self.history + ' ' + datetime.datetime.utcnow().strftime("%B %d, %Y %H:%M UTC")\
+ ' ' + new_history + '\n'
def close(self):
self.rootgrpID.close()
| 39.425216 | 120 | 0.572065 | 3,807 | 31,895 | 4.598109 | 0.094563 | 0.036675 | 0.020109 | 0.017481 | 0.922194 | 0.91054 | 0.898315 | 0.88569 | 0.88569 | 0.881005 | 0 | 0.019961 | 0.297884 | 31,895 | 808 | 121 | 39.47401 | 0.761722 | 0.054805 | 0 | 0.8 | 0 | 0 | 0.062225 | 0 | 0 | 0 | 0 | 0.011139 | 0 | 0 | null | null | 0.0075 | 0.0125 | null | null | 0.0425 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 9 |
4b939014fc56efe5c00d17e56044e645433b2316 | 214 | py | Python | database_sanitizer/sanitizers/constant.py | sharescape/python-database-sanitizer | 560bf402e3896285980abb21a74d5be8d2da1698 | [
"MIT"
] | 37 | 2018-05-07T13:07:25.000Z | 2022-02-07T18:58:10.000Z | database_sanitizer/sanitizers/constant.py | sharescape/python-database-sanitizer | 560bf402e3896285980abb21a74d5be8d2da1698 | [
"MIT"
] | 31 | 2018-04-27T13:16:28.000Z | 2021-12-10T10:08:00.000Z | database_sanitizer/sanitizers/constant.py | sharescape/python-database-sanitizer | 560bf402e3896285980abb21a74d5be8d2da1698 | [
"MIT"
] | 15 | 2018-05-04T12:28:12.000Z | 2022-02-17T09:27:58.000Z | def sanitize_null(value):
return None
def sanitize_empty_json_dict(value):
return '{}'
def sanitize_empty_json_list(value):
return '[]'
def sanitize_invalid_django_password(value):
return '!'
| 14.266667 | 44 | 0.714953 | 27 | 214 | 5.296296 | 0.481481 | 0.307692 | 0.223776 | 0.27972 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.17757 | 214 | 14 | 45 | 15.285714 | 0.8125 | 0 | 0 | 0 | 0 | 0 | 0.023364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0.125 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 8 |
4ba9ddc9bf36ec932e189464d30a0d04c967f812 | 1,694 | py | Python | gse_infra_configuration/network_manager/api.py | cynpna/gs-engine | 6137d3c53621cfa044a90822c18bfceea16caa0a | [
"Apache-2.0"
] | 13 | 2020-10-14T07:45:08.000Z | 2021-10-01T08:19:56.000Z | gse_infra_configuration/network_manager/api.py | cynpna/gs-engine | 6137d3c53621cfa044a90822c18bfceea16caa0a | [
"Apache-2.0"
] | null | null | null | gse_infra_configuration/network_manager/api.py | cynpna/gs-engine | 6137d3c53621cfa044a90822c18bfceea16caa0a | [
"Apache-2.0"
] | 17 | 2020-11-09T05:16:42.000Z | 2021-12-28T08:04:33.000Z | from apps.common.api_wrapper import ApiWrapper
class NicApi(ApiWrapper):
def __init__(self):
super(NicApi, self).__init__(api_url='/apis/k8s.cni.cncf.io/v1',kind='network-attachment-definitions')
def read_namespaced_nic(self, name, namespace, **kwargs):
return self.read_namespaced_obj(name, namespace, **kwargs)
def list_namespaced_nic(self, namespace, **kwargs):
return self.list_namespaced_obj(namespace, **kwargs)
def create_namespaced_nic(self, namespace, body, **kwargs):
return self.create_namespaced_obj(namespace, body, **kwargs)
def delete_namespaced_nic(self, name, namespace, **kwargs):
return self.delete_namespaced_obj(name, namespace, **kwargs)
def delete_collection_namespaced_nic(self, namespace, **kwargs):
return self.delete_collection_namespaced_obj(namespace, **kwargs)
class CiliumPlcyApi(ApiWrapper):
def __init__(self):
super(CiliumPlcyApi, self).__init__(api_url='/apis/cilium.io/v2',kind='ciliumnetworkpolicies')
def read_namespaced_plcy(self, name, namespace, **kwargs):
return self.read_namespaced_obj(name, namespace, **kwargs)
def list_namespaced_plcy(self, namespace, **kwargs):
return self.list_namespaced_obj(namespace, **kwargs)
def create_namespaced_plcy(self, namespace, body, **kwargs):
return self.create_namespaced_obj(namespace, body, **kwargs)
def delete_namespaced_plcy(self, name, namespace, **kwargs):
return self.delete_namespaced_obj(name, namespace, **kwargs)
def delete_collection_namespaced_plcy(self, namespace, **kwargs):
return self.delete_collection_namespaced_obj(namespace, **kwargs) | 43.435897 | 110 | 0.736128 | 203 | 1,694 | 5.832512 | 0.206897 | 0.202703 | 0.135135 | 0.168919 | 0.839527 | 0.765203 | 0.765203 | 0.753378 | 0.724662 | 0.724662 | 0 | 0.002082 | 0.149351 | 1,694 | 39 | 111 | 43.435897 | 0.81957 | 0 | 0 | 0.444444 | 0 | 0 | 0.054867 | 0.044248 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0 | 0.037037 | 0.37037 | 0.925926 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 11 |
4b1370ed1cc6bfab4ce4b87141b813175214db0c | 108 | py | Python | blocksequence/__init__.py | LaurentGoderre/block-sequence | 6f9b8da3a9b567bf71c33f551b49991ae0605da0 | [
"MIT"
] | 3 | 2019-06-25T19:45:38.000Z | 2021-03-13T12:39:09.000Z | blocksequence/__init__.py | StatCan/block-sequence | 6f9b8da3a9b567bf71c33f551b49991ae0605da0 | [
"MIT"
] | null | null | null | blocksequence/__init__.py | StatCan/block-sequence | 6f9b8da3a9b567bf71c33f551b49991ae0605da0 | [
"MIT"
] | null | null | null | from .algorithms.blockorder import *
from .algorithms.edgeorder import *
from .algorithms.evolution import * | 36 | 36 | 0.814815 | 12 | 108 | 7.333333 | 0.5 | 0.477273 | 0.454545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101852 | 108 | 3 | 37 | 36 | 0.907216 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
d99b21e38c39189cec2cb312520598cabb252930 | 32,909 | py | Python | pypureclient/flashblade/FB_2_2/api/directory_services_api.py | Flav-STOR-WL/py-pure-client | 03b889c997d90380ac5d6380ca5d5432792d3e89 | [
"BSD-2-Clause"
] | 14 | 2018-12-07T18:30:27.000Z | 2022-02-22T09:12:33.000Z | pypureclient/flashblade/FB_2_2/api/directory_services_api.py | Flav-STOR-WL/py-pure-client | 03b889c997d90380ac5d6380ca5d5432792d3e89 | [
"BSD-2-Clause"
] | 28 | 2019-09-17T21:03:52.000Z | 2022-03-29T22:07:35.000Z | pypureclient/flashblade/FB_2_2/api/directory_services_api.py | Flav-STOR-WL/py-pure-client | 03b889c997d90380ac5d6380ca5d5432792d3e89 | [
"BSD-2-Clause"
] | 15 | 2020-06-11T15:50:08.000Z | 2022-03-21T09:27:25.000Z | # coding: utf-8
"""
FlashBlade REST API
A lightweight client for FlashBlade REST API 2.2, developed by Pure Storage, Inc. (http://www.purestorage.com/).
OpenAPI spec version: 2.2
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from __future__ import absolute_import
import re
# python 2 and python 3 compatibility library
import six
from typing import List, Optional
from .. import models
class DirectoryServicesApi(object):
def __init__(self, api_client):
self.api_client = api_client
def api22_directory_services_get_with_http_info(
self,
continuation_token=None, # type: str
filter=None, # type: str
ids=None, # type: List[str]
limit=None, # type: int
names=None, # type: List[str]
offset=None, # type: int
sort=None, # type: List[str]
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.DirectoryServiceGetResponse
"""GET directory-services
List directory service configuration information for the array.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api22_directory_services_get_with_http_info(async_req=True)
>>> result = thread.get()
:param str continuation_token: An opaque token used to iterate over a collection. The token to use on the next request is returned in the `continuation_token` field of the result.
:param str filter: Exclude resources that don't match the specified criteria.
:param list[str] ids: A comma-separated list of resource IDs. If after filtering, there is not at least one resource that matches each of the elements of `ids`, then an error is returned. This cannot be provided together with the `name` or `names` query parameters.
:param int limit: Limit the size of the response to the specified number of resources. A `limit` of `0` can be used to get the number of resources without getting all of the resources. It will be returned in the `total_item_count` field. If a client asks for a page size larger than the maximum number, the request is still valid. In that case the server just returns the maximum number of items, disregarding the client's page size request.
:param list[str] names: A comma-separated list of resource names. If there is not at least one resource that matches each of the elements of `names`, then an error is returned.
:param int offset: The offset of the first resource to return from a collection.
:param list[str] sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name). NOTE: If you provide a sort you will not get a `continuation_token` in the response.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: DirectoryServiceGetResponse
If the method is called asynchronously,
returns the request thread.
"""
if ids is not None:
if not isinstance(ids, list):
ids = [ids]
if names is not None:
if not isinstance(names, list):
names = [names]
if sort is not None:
if not isinstance(sort, list):
sort = [sort]
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
if 'limit' in params and params['limit'] < 1:
raise ValueError("Invalid value for parameter `limit` when calling `api22_directory_services_get`, must be a value greater than or equal to `1`")
if 'offset' in params and params['offset'] < 0:
raise ValueError("Invalid value for parameter `offset` when calling `api22_directory_services_get`, must be a value greater than or equal to `0`")
collection_formats = {}
path_params = {}
query_params = []
if 'continuation_token' in params:
query_params.append(('continuation_token', params['continuation_token']))
if 'filter' in params:
query_params.append(('filter', params['filter']))
if 'ids' in params:
query_params.append(('ids', params['ids']))
collection_formats['ids'] = 'csv'
if 'limit' in params:
query_params.append(('limit', params['limit']))
if 'names' in params:
query_params.append(('names', params['names']))
collection_formats['names'] = 'csv'
if 'offset' in params:
query_params.append(('offset', params['offset']))
if 'sort' in params:
query_params.append(('sort', params['sort']))
collection_formats['sort'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = ['AuthorizationHeader']
return self.api_client.call_api(
'/api/2.2/directory-services', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DirectoryServiceGetResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
def api22_directory_services_patch_with_http_info(
self,
directory_service=None, # type: models.DirectoryService
ids=None, # type: List[str]
names=None, # type: List[str]
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.DirectoryServiceResponse
"""PATCH directory-services
Modifies and tests the directory service configuration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api22_directory_services_patch_with_http_info(directory_service, async_req=True)
>>> result = thread.get()
:param DirectoryService directory_service: (required)
:param list[str] ids: A comma-separated list of resource IDs. If after filtering, there is not at least one resource that matches each of the elements of `ids`, then an error is returned. This cannot be provided together with the `name` or `names` query parameters.
:param list[str] names: A comma-separated list of resource names. If there is not at least one resource that matches each of the elements of `names`, then an error is returned.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: DirectoryServiceResponse
If the method is called asynchronously,
returns the request thread.
"""
if ids is not None:
if not isinstance(ids, list):
ids = [ids]
if names is not None:
if not isinstance(names, list):
names = [names]
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
# verify the required parameter 'directory_service' is set
if directory_service is None:
raise TypeError("Missing the required parameter `directory_service` when calling `api22_directory_services_patch`")
collection_formats = {}
path_params = {}
query_params = []
if 'ids' in params:
query_params.append(('ids', params['ids']))
collection_formats['ids'] = 'csv'
if 'names' in params:
query_params.append(('names', params['names']))
collection_formats['names'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'directory_service' in params:
body_params = params['directory_service']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = ['AuthorizationHeader']
return self.api_client.call_api(
'/api/2.2/directory-services', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DirectoryServiceResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
def api22_directory_services_roles_get_with_http_info(
self,
continuation_token=None, # type: str
ids=None, # type: List[str]
filter=None, # type: str
limit=None, # type: int
offset=None, # type: int
role_ids=None, # type: List[str]
role_names=None, # type: List[str]
sort=None, # type: List[str]
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.DirectoryServiceRolesGetResponse
"""GET directory-service/roles
Return array's RBAC group configuration settings for manageability.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api22_directory_services_roles_get_with_http_info(async_req=True)
>>> result = thread.get()
:param str continuation_token: An opaque token used to iterate over a collection. The token to use on the next request is returned in the `continuation_token` field of the result.
:param list[str] ids: A comma-separated list of resource IDs. If after filtering, there is not at least one resource that matches each of the elements of `ids`, then an error is returned. This cannot be provided together with the `role_names` or `role_ids` query parameters.
:param str filter: Exclude resources that don't match the specified criteria.
:param int limit: Limit the size of the response to the specified number of resources. A `limit` of `0` can be used to get the number of resources without getting all of the resources. It will be returned in the `total_item_count` field. If a client asks for a page size larger than the maximum number, the request is still valid. In that case the server just returns the maximum number of items, disregarding the client's page size request.
:param int offset: The offset of the first resource to return from a collection.
:param list[str] role_ids: A comma-separated list of role_ids. If after filtering, there is not at least one resource that matches each of the elements of `role_ids`, then an error is returned. This cannot be provided together with the `ids` or `role_names` query parameters.
:param list[str] role_names: A comma-separated list of role_names. If there is not at least one resource that matches each of the elements of `role_names`, then an error is returned. This cannot be provided together with the `ids` or `role_ids` query parameters.
:param list[str] sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name). NOTE: If you provide a sort you will not get a `continuation_token` in the response.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: DirectoryServiceRolesGetResponse
If the method is called asynchronously,
returns the request thread.
"""
if ids is not None:
if not isinstance(ids, list):
ids = [ids]
if role_ids is not None:
if not isinstance(role_ids, list):
role_ids = [role_ids]
if role_names is not None:
if not isinstance(role_names, list):
role_names = [role_names]
if sort is not None:
if not isinstance(sort, list):
sort = [sort]
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
if 'limit' in params and params['limit'] < 1:
raise ValueError("Invalid value for parameter `limit` when calling `api22_directory_services_roles_get`, must be a value greater than or equal to `1`")
if 'offset' in params and params['offset'] < 0:
raise ValueError("Invalid value for parameter `offset` when calling `api22_directory_services_roles_get`, must be a value greater than or equal to `0`")
collection_formats = {}
path_params = {}
query_params = []
if 'continuation_token' in params:
query_params.append(('continuation_token', params['continuation_token']))
if 'ids' in params:
query_params.append(('ids', params['ids']))
collection_formats['ids'] = 'csv'
if 'filter' in params:
query_params.append(('filter', params['filter']))
if 'limit' in params:
query_params.append(('limit', params['limit']))
if 'offset' in params:
query_params.append(('offset', params['offset']))
if 'role_ids' in params:
query_params.append(('role_ids', params['role_ids']))
collection_formats['role_ids'] = 'csv'
if 'role_names' in params:
query_params.append(('role_names', params['role_names']))
collection_formats['role_names'] = 'csv'
if 'sort' in params:
query_params.append(('sort', params['sort']))
collection_formats['sort'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = ['AuthorizationHeader']
return self.api_client.call_api(
'/api/2.2/directory-services/roles', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DirectoryServiceRolesGetResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
def api22_directory_services_roles_patch_with_http_info(
self,
directory_service_roles=None, # type: models.DirectoryServiceRole
ids=None, # type: List[str]
role_ids=None, # type: List[str]
role_names=None, # type: List[str]
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.DirectoryServiceRolesResponse
"""PATCH directory-service/roles
Update an RBAC group configuration setting for manageability.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api22_directory_services_roles_patch_with_http_info(directory_service_roles, async_req=True)
>>> result = thread.get()
:param DirectoryServiceRole directory_service_roles: (required)
:param list[str] ids: A comma-separated list of resource IDs. If after filtering, there is not at least one resource that matches each of the elements of `ids`, then an error is returned. This cannot be provided together with the `role_names` or `role_ids` query parameters.
:param list[str] role_ids: A comma-separated list of role_ids. If after filtering, there is not at least one resource that matches each of the elements of `role_ids`, then an error is returned. This cannot be provided together with the `ids` or `role_names` query parameters.
:param list[str] role_names: A comma-separated list of role_names. If there is not at least one resource that matches each of the elements of `role_names`, then an error is returned. This cannot be provided together with the `ids` or `role_ids` query parameters.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: DirectoryServiceRolesResponse
If the method is called asynchronously,
returns the request thread.
"""
if ids is not None:
if not isinstance(ids, list):
ids = [ids]
if role_ids is not None:
if not isinstance(role_ids, list):
role_ids = [role_ids]
if role_names is not None:
if not isinstance(role_names, list):
role_names = [role_names]
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
# verify the required parameter 'directory_service_roles' is set
if directory_service_roles is None:
raise TypeError("Missing the required parameter `directory_service_roles` when calling `api22_directory_services_roles_patch`")
collection_formats = {}
path_params = {}
query_params = []
if 'ids' in params:
query_params.append(('ids', params['ids']))
collection_formats['ids'] = 'csv'
if 'role_ids' in params:
query_params.append(('role_ids', params['role_ids']))
collection_formats['role_ids'] = 'csv'
if 'role_names' in params:
query_params.append(('role_names', params['role_names']))
collection_formats['role_names'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'directory_service_roles' in params:
body_params = params['directory_service_roles']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = ['AuthorizationHeader']
return self.api_client.call_api(
'/api/2.2/directory-services/roles', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='DirectoryServiceRolesResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
def api22_directory_services_test_get_with_http_info(
self,
filter=None, # type: str
ids=None, # type: List[str]
limit=None, # type: int
names=None, # type: List[str]
sort=None, # type: List[str]
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.TestResultGetResponse
"""GET directory-services/test
Test the configured directory services on the array.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api22_directory_services_test_get_with_http_info(async_req=True)
>>> result = thread.get()
:param str filter: Exclude resources that don't match the specified criteria.
:param list[str] ids: A comma-separated list of resource IDs. If after filtering, there is not at least one resource that matches each of the elements of `ids`, then an error is returned. This cannot be provided together with the `name` or `names` query parameters.
:param int limit: Limit the size of the response to the specified number of resources. A `limit` of `0` can be used to get the number of resources without getting all of the resources. It will be returned in the `total_item_count` field. If a client asks for a page size larger than the maximum number, the request is still valid. In that case the server just returns the maximum number of items, disregarding the client's page size request.
:param list[str] names: A comma-separated list of resource names. If there is not at least one resource that matches each of the elements of `names`, then an error is returned.
:param list[str] sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name). NOTE: If you provide a sort you will not get a `continuation_token` in the response.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: TestResultGetResponse
If the method is called asynchronously,
returns the request thread.
"""
if ids is not None:
if not isinstance(ids, list):
ids = [ids]
if names is not None:
if not isinstance(names, list):
names = [names]
if sort is not None:
if not isinstance(sort, list):
sort = [sort]
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
if 'limit' in params and params['limit'] < 1:
raise ValueError("Invalid value for parameter `limit` when calling `api22_directory_services_test_get`, must be a value greater than or equal to `1`")
collection_formats = {}
path_params = {}
query_params = []
if 'filter' in params:
query_params.append(('filter', params['filter']))
if 'ids' in params:
query_params.append(('ids', params['ids']))
collection_formats['ids'] = 'csv'
if 'limit' in params:
query_params.append(('limit', params['limit']))
if 'names' in params:
query_params.append(('names', params['names']))
collection_formats['names'] = 'csv'
if 'sort' in params:
query_params.append(('sort', params['sort']))
collection_formats['sort'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = ['AuthorizationHeader']
return self.api_client.call_api(
'/api/2.2/directory-services/test', 'GET',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TestResultGetResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
def api22_directory_services_test_patch_with_http_info(
self,
filter=None, # type: str
ids=None, # type: List[str]
names=None, # type: List[str]
sort=None, # type: List[str]
directory_service=None, # type: models.DirectoryService
async_req=False, # type: bool
_return_http_data_only=False, # type: bool
_preload_content=True, # type: bool
_request_timeout=None, # type: Optional[int]
):
# type: (...) -> models.TestResultResponse
"""PATCH directory-service/test
Test the configured directory services on the array. Optionally, provide modifications which will be used to perform the tests, but will not be applied to the current configuration.
This method makes a synchronous HTTP request by default. To make an
asynchronous HTTP request, please pass async_req=True
>>> thread = api.api22_directory_services_test_patch_with_http_info(async_req=True)
>>> result = thread.get()
:param str filter: Exclude resources that don't match the specified criteria.
:param list[str] ids: A comma-separated list of resource IDs. If after filtering, there is not at least one resource that matches each of the elements of `ids`, then an error is returned. This cannot be provided together with the `name` or `names` query parameters.
:param list[str] names: A comma-separated list of resource names. If there is not at least one resource that matches each of the elements of `names`, then an error is returned.
:param list[str] sort: Sort the response by the specified fields (in descending order if '-' is appended to the field name). NOTE: If you provide a sort you will not get a `continuation_token` in the response.
:param DirectoryService directory_service: An optional directory service configuration that, if provided, will be used to overwrite aspects of the existing directory service objects when performing tests.
:param bool async_req: Request runs in separate thread and method returns multiprocessing.pool.ApplyResult.
:param bool _return_http_data_only: Returns only data field.
:param bool _preload_content: Response is converted into objects.
:param int _request_timeout: Total request timeout in seconds.
It can also be a tuple of (connection time, read time) timeouts.
:return: TestResultResponse
If the method is called asynchronously,
returns the request thread.
"""
if ids is not None:
if not isinstance(ids, list):
ids = [ids]
if names is not None:
if not isinstance(names, list):
names = [names]
if sort is not None:
if not isinstance(sort, list):
sort = [sort]
params = {k: v for k, v in six.iteritems(locals()) if v is not None}
# Convert the filter into a string
if params.get('filter'):
params['filter'] = str(params['filter'])
if params.get('sort'):
params['sort'] = [str(_x) for _x in params['sort']]
collection_formats = {}
path_params = {}
query_params = []
if 'filter' in params:
query_params.append(('filter', params['filter']))
if 'ids' in params:
query_params.append(('ids', params['ids']))
collection_formats['ids'] = 'csv'
if 'names' in params:
query_params.append(('names', params['names']))
collection_formats['names'] = 'csv'
if 'sort' in params:
query_params.append(('sort', params['sort']))
collection_formats['sort'] = 'csv'
header_params = {}
form_params = []
local_var_files = {}
body_params = None
if 'directory_service' in params:
body_params = params['directory_service']
# HTTP header `Accept`
header_params['Accept'] = self.api_client.select_header_accept(
['application/json'])
# HTTP header `Content-Type`
header_params['Content-Type'] = self.api_client.select_header_content_type(
['application/json'])
# Authentication setting
auth_settings = ['AuthorizationHeader']
return self.api_client.call_api(
'/api/2.2/directory-services/test', 'PATCH',
path_params,
query_params,
header_params,
body=body_params,
post_params=form_params,
files=local_var_files,
response_type='TestResultResponse',
auth_settings=auth_settings,
async_req=async_req,
_return_http_data_only=_return_http_data_only,
_preload_content=_preload_content,
_request_timeout=_request_timeout,
collection_formats=collection_formats,
)
| 48.681953 | 449 | 0.641101 | 4,125 | 32,909 | 4.942303 | 0.065939 | 0.016874 | 0.034188 | 0.027027 | 0.923334 | 0.917644 | 0.909109 | 0.899299 | 0.89611 | 0.891794 | 0 | 0.00293 | 0.274028 | 32,909 | 675 | 450 | 48.754074 | 0.850404 | 0.422134 | 0 | 0.897674 | 0 | 0.011628 | 0.147413 | 0.035125 | 0 | 0 | 0 | 0 | 0 | 1 | 0.016279 | false | 0 | 0.011628 | 0 | 0.044186 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
d9b43d592a68f4087084a6e1598f93d04ceab84f | 49 | py | Python | archivo_1.py | anggelomos/Workshop-github | 84ef732ff713a13737d6c7b140414aeccfc8bbe1 | [
"MIT"
] | null | null | null | archivo_1.py | anggelomos/Workshop-github | 84ef732ff713a13737d6c7b140414aeccfc8bbe1 | [
"MIT"
] | null | null | null | archivo_1.py | anggelomos/Workshop-github | 84ef732ff713a13737d6c7b140414aeccfc8bbe1 | [
"MIT"
] | null | null | null | def funcion():
return 2+2
| 2.227273 | 14 | 0.367347 | 5 | 49 | 3.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090909 | 0.55102 | 49 | 21 | 15 | 2.333333 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | true | 0 | 0 | 0.5 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 8 |
8a0d35f25a92e9eae9fddc032bc0b997530bba47 | 23,649 | py | Python | plugins/insta_cmds.py | TamilBots/instagram | 51ee6548398ca9dd5c7d026ffa2ad9aac543506f | [
"MIT"
] | 1 | 2021-09-04T12:10:21.000Z | 2021-09-04T12:10:21.000Z | plugins/insta_cmds.py | TamilBots/instagram | 51ee6548398ca9dd5c7d026ffa2ad9aac543506f | [
"MIT"
] | null | null | null | plugins/insta_cmds.py | TamilBots/instagram | 51ee6548398ca9dd5c7d026ffa2ad9aac543506f | [
"MIT"
] | 1 | 2021-09-04T10:19:19.000Z | 2021-09-04T10:19:19.000Z | #MIT License
#Copyright (c) 2021 Sathishzus
#Permission is hereby granted, free of charge, to any person obtaining a copy
#of this software and associated documentation files (the "Software"), to deal
#in the Software without restriction, including without limitation the rights
#to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
#copies of the Software, and to permit persons to whom the Software is
#furnished to do so, subject to the following conditions:
#The above copyright notice and this permission notice shall be included in all
#copies or substantial portions of the Software.
#THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
#IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
#FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
#AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
#LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
#OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
#SOFTWARE.
from pyrogram.types import InlineKeyboardButton, InlineKeyboardMarkup
from pyrogram import Client, filters
from config import Config
from instaloader import Profile
from pyrogram.errors.exceptions.bad_request_400 import MessageTooLong
import os
from utils import *
USER=Config.USER
OWNER=Config.OWNER
HOME_TEXT_OWNER=Config.HOME_TEXT_OWNER
HELP=Config.HELP
HOME_TEXT=Config.HOME_TEXT
session=f"./{USER}"
STATUS=Config.STATUS
insta = Config.L
buttons=InlineKeyboardMarkup(
[
[
InlineKeyboardButton("👨🏼💻Developer", url='https://t.me/TamilBotZ'),
InlineKeyboardButton("🤖Other Bot", url="https://t.me/TamiliniBot")
],
[
InlineKeyboardButton("🔗Source Code", url="https://github.com/TamilBots/instagram"),
InlineKeyboardButton("🧩Deploy Own Bot", url="https://heroku.com/deploy?template=https://github.com/TamilBots/instagram")
],
[
InlineKeyboardButton("👨🏼🦯How To Use?", callback_data="help"),
InlineKeyboardButton("⚙️Update Channel", url="https://t.me/TamilBots")
]
]
)
@Client.on_message(filters.command("posts") & filters.private)
async def post(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
await bot.send_message(
message.from_user.id,
f"What type of post do you want to download?.",
reply_markup=InlineKeyboardMarkup(
[
[
InlineKeyboardButton("Photos", callback_data=f"photos#{username}"),
InlineKeyboardButton("Videos", callback_data=f"video#{username}")
]
]
)
)
@Client.on_message(filters.command("igtv") & filters.private)
async def igtv(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
m=await message.reply_text(f"Fetching IGTV from <code>@{username}</code>")
profile = Profile.from_username(insta.context, username)
igtvcount = profile.igtvcount
await m.edit(
text = f"Do you Want to download all IGTV posts?\nThere are {igtvcount} posts.",
reply_markup=InlineKeyboardMarkup(
[
[
InlineKeyboardButton("Yes", callback_data=f"yesigtv#{username}"),
InlineKeyboardButton("No", callback_data=f"no#{username}")
]
]
)
)
@Client.on_message(filters.command("followers") & filters.private)
async def followers(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
profile = Profile.from_username(insta.context, username)
name=profile.full_name
m=await message.reply_text(f"Fetching Followers list of <code>@{username}</code>")
chat_id=message.from_user.id
f = profile.get_followers()
followers=f"**Followers List for {name}**\n\n"
for p in f:
followers += f"\n[{p.username}](www.instagram.com/{p.username})"
try:
await m.delete()
await bot.send_message(chat_id=chat_id, text=followers)
except MessageTooLong:
followers=f"**Followers List for {name}**\n\n"
f = profile.get_followers()
for p in f:
followers += f"\nName: {p.username} : Link to Profile: www.instagram.com/{p.username}"
text_file = open(f"{username}'s followers.txt", "w")
text_file.write(followers)
text_file.close()
await bot.send_document(chat_id=chat_id, document=f"./{username}'s followers.txt", caption=f"{name}'s followers\n\nA Project By [Tamil Bots](https://t.me/TamilBots)")
os.remove(f"./{username}'s followers.txt")
@Client.on_message(filters.command("followees") & filters.private)
async def followees(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
profile = Profile.from_username(insta.context, username)
name=profile.full_name
m=await message.reply_text(f"Fetching Followees list of <code>@{username}</code>")
chat_id=message.from_user.id
f = profile.get_followees()
followees=f"**Followees List for {name}**\n\n"
for p in f:
followees += f"\n[{p.username}](www.instagram.com/{p.username})"
try:
await m.delete()
await bot.send_message(chat_id=chat_id, text=followees)
except MessageTooLong:
followees=f"**Followees List for {name}**\n\n"
f = profile.get_followees()
for p in f:
followees += f"\nName: {p.username} : Link to Profile: www.instagram.com/{p.username}"
text_file = open(f"{username}'s followees.txt", "w")
text_file.write(followees)
text_file.close()
await bot.send_document(chat_id=chat_id, document=f"./{username}'s followees.txt", caption=f"{name}'s followees\n\nA Project By [Tamil Bots](https://t.me/TamilBots)")
os.remove(f"./{username}'s followees.txt")
@Client.on_message(filters.command("fans") & filters.private)
async def fans(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
profile = Profile.from_username(insta.context, username)
name=profile.full_name
m=await message.reply_text(f"Fetching list of followees of <code>@{username}</code> who follows <code>@{username}</code>.")
chat_id=message.from_user.id
f = profile.get_followers()
fl = profile.get_followees()
flist=[]
fmlist=[]
for fn in f:
u=fn.username
flist.append(u)
for fm in fl:
n=fm.username
fmlist.append(n)
fans = [value for value in fmlist if value in flist]
print(len(fans))
followers=f"**Fans List for {name}**\n\n"
for p in fans:
followers += f"\n[{p}](www.instagram.com/{p})"
try:
await m.delete()
await bot.send_message(chat_id=chat_id, text=followers)
except MessageTooLong:
followers=f"**Fans List for {name}**\n\n"
for p in fans:
followers += f"\nName: {p} : Link to Profile: www.instagram.com/{p}"
text_file = open(f"{username}'s fans.txt", "w")
text_file.write(followers)
text_file.close()
await bot.send_document(chat_id=chat_id, document=f"./{username}'s fans.txt", caption=f"{name}'s fans\n\nA Project By [Tamil Bots](https://t.me/TamilBots)")
os.remove(f"./{username}'s fans.txt")
@Client.on_message(filters.command("notfollowing") & filters.private)
async def nfans(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
profile = Profile.from_username(insta.context, username)
name=profile.full_name
m=await message.reply_text(f"Fetching list of followees of <code>@{username}</code> who is <b>not</b> following <code>@{username}</code>.")
chat_id=message.from_user.id
f = profile.get_followers()
fl = profile.get_followees()
flist=[]
fmlist=[]
for fn in f:
u=fn.username
flist.append(u)
for fm in fl:
n=fm.username
fmlist.append(n)
fans = list(set(fmlist) - set(flist))
print(len(fans))
followers=f"**Followees of <code>@{username}</code> who is <b>not</b> following <code>@{username}</code>**\n\n"
for p in fans:
followers += f"\n[{p}](www.instagram.com/{p})"
try:
await m.delete()
await bot.send_message(chat_id=chat_id, text=followers)
except MessageTooLong:
followers=f"Followees of <code>@{username}</code> who is <b>not</b> following <code>@{username}</code>\n\n"
for p in fans:
followers += f"\nName: {p} : Link to Profile: www.instagram.com/{p}"
text_file = open(f"{username}'s Non_followers.txt", "w")
text_file.write(followers)
text_file.close()
await bot.send_document(chat_id=chat_id, document=f"./{username}'s Non_followers.txt", caption=f"{name}'s Non_followers\n\nA Project By [Tamil Bots](https://t.me/TamilBots)")
os.remove(f"./{username}'s Non_followers.txt")
@Client.on_message(filters.command("feed") & filters.private)
async def feed(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
count=None
if " " in text:
cmd, count = text.split(' ')
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
m=await message.reply_text(f"Fetching Posts in Your Feed.")
chat_id=message.from_user.id
dir=f"{chat_id}/{username}"
await m.edit("Starting Downloading..\nThis may take longer time Depending upon number of posts.")
if count:
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-posts",
"--no-captions",
"--no-video-thumbnails",
"--login", USER,
"--sessionfile", session,
"--dirname-pattern", dir,
":feed",
"--count", count
]
else:
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-posts",
"--no-captions",
"--no-video-thumbnails",
"--login", USER,
"--sessionfile", session,
"--dirname-pattern", dir,
":feed"
]
await download_insta(command, m, dir)
await upload(m, bot, chat_id, dir)
@Client.on_message(filters.command("saved") & filters.private)
async def saved(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
count=None
if " " in text:
cmd, count = text.split(' ')
m=await message.reply_text(f"Fetching your Saved Posts.")
chat_id=message.from_user.id
dir=f"{chat_id}/{username}"
await m.edit("Starting Downloading..\nThis may take longer time Depending upon number of posts.")
if count:
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-posts",
"--no-captions",
"--no-video-thumbnails",
"--login", USER,
"-f", session,
"--dirname-pattern", dir,
":saved",
"--count", count
]
else:
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-posts",
"--no-captions",
"--no-video-thumbnails",
"--login", USER,
"-f", session,
"--dirname-pattern", dir,
":saved"
]
await download_insta(command, m, dir)
await upload(m, bot, chat_id, dir)
@Client.on_message(filters.command("tagged") & filters.private)
async def tagged(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
m=await message.reply_text(f"Fetching the posts in which <code>@{username}</code> is tagged.")
chat_id=message.from_user.id
dir=f"{chat_id}/{username}"
await m.edit("Starting Downloading..\nThis may take longer time Depending upon number of posts.")
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-posts",
"--tagged",
"--no-captions",
"--no-video-thumbnails",
"--login", USER,
"-f", session,
"--dirname-pattern", dir,
"--", username
]
await download_insta(command, m, dir)
await upload(m, bot, chat_id, dir)
@Client.on_message(filters.command("story") & filters.private)
async def story(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
text=message.text
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
m=await message.reply_text(f"Fetching stories of <code>@{username}</code>")
chat_id=message.from_user.id
dir=f"{chat_id}/{username}"
await m.edit("Starting Downloading..\nThis may take longer time Depending upon number of posts.")
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-posts",
"--stories",
"--no-captions",
"--no-video-thumbnails",
"--login", USER,
"-f", session,
"--dirname-pattern", dir,
"--", username
]
await download_insta(command, m, dir)
await upload(m, bot, chat_id, dir)
@Client.on_message(filters.command("stories") & filters.private)
async def stories(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
m=await message.reply_text(f"Fetching stories of all your followees")
chat_id=message.from_user.id
dir=f"{chat_id}/{username}"
await m.edit("Starting Downloading..\nThis may take longer time Depending upon number of posts.")
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-captions",
"--no-posts",
"--no-video-thumbnails",
"--login", USER,
"-f", session,
"--dirname-pattern", dir,
":stories"
]
await download_insta(command, m, dir)
await upload(m, bot, chat_id, dir)
@Client.on_message(filters.command("highlights") & filters.private)
async def highlights(bot, message):
if str(message.from_user.id) != OWNER:
await message.reply_text(
HOME_TEXT.format(message.from_user.first_name, message.from_user.id, USER, USER, USER, OWNER),
reply_markup=buttons,
disable_web_page_preview=True
)
return
username=USER
if 1 not in STATUS:
await message.reply_text("You Must Login First /login ")
return
text=message.text
if " " in text:
cmd, username = text.split(' ')
profile = Profile.from_username(insta.context, username)
is_followed = yes_or_no(profile.followed_by_viewer)
type = acc_type(profile.is_private)
if type == "🔒Private🔒" and is_followed == "No":
await message.reply_text("Sorry!\nI can't fetch details from that account.\nSince its a Private account and you are not following <code>@{username}</code>.")
return
m=await message.reply_text(f"Fetching highlights from profile <code>@{username}</code>")
chat_id=message.from_user.id
dir=f"{chat_id}/{username}"
await m.edit("Starting Downloading..\nThis may take longer time Depending upon number of posts.")
command = [
"instaloader",
"--no-metadata-json",
"--no-compress-json",
"--no-profile-pic",
"--no-posts",
"--highlights",
"--no-captions",
"--no-video-thumbnails",
"--login", USER,
"-f", session,
"--dirname-pattern", dir,
"--", username
]
await download_insta(command, m, dir)
await upload(m, bot, chat_id, dir)
| 37.8384 | 182 | 0.622436 | 3,052 | 23,649 | 4.716252 | 0.09633 | 0.035918 | 0.048979 | 0.064193 | 0.817146 | 0.805127 | 0.77838 | 0.767334 | 0.759414 | 0.751911 | 0 | 0.001073 | 0.251258 | 23,649 | 624 | 183 | 37.899038 | 0.810075 | 0.04423 | 0 | 0.754955 | 0 | 0.034234 | 0.263139 | 0.044233 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.012613 | 0 | 0.072072 | 0.003604 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8a109ffa2f1746fb2385bc35888c37629dd96961 | 880,155 | py | Python | tests/EVM/test_EVMADDMOD.py | ivanpustogarov/manticore | f17410b8427ddbd5d751d8824bdf10ce33c9f3ce | [
"Apache-2.0"
] | null | null | null | tests/EVM/test_EVMADDMOD.py | ivanpustogarov/manticore | f17410b8427ddbd5d751d8824bdf10ce33c9f3ce | [
"Apache-2.0"
] | null | null | null | tests/EVM/test_EVMADDMOD.py | ivanpustogarov/manticore | f17410b8427ddbd5d751d8824bdf10ce33c9f3ce | [
"Apache-2.0"
] | 1 | 2018-08-12T17:29:11.000Z | 2018-08-12T17:29:11.000Z |
import struct
import unittest
import json
from manticore.platforms import evm
from manticore.core import state
from manticore.core.smtlib import Operators, ConstraintSet
import os
class EVMTest_ADDMOD(unittest.TestCase):
_multiprocess_can_split_ = True
maxDiff=None
def _execute(self, new_vm):
last_returned = None
last_exception = None
try:
new_vm.execute()
except evm.Stop as e:
last_exception = "STOP"
except evm.NotEnoughGas:
last_exception = "OOG"
except evm.StackUnderflow:
last_exception = "INSUFICIENT STACK"
except evm.InvalidOpcode:
last_exception = "INVALID"
except evm.SelfDestruct:
last_exception = "SUICIDED"
except evm.Return as e:
last_exception = "RETURN"
last_returned = e.data
except evm.Revert:
last_exception = "REVERT"
return last_exception, last_returned
def test_ADDMOD_1(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_2(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_3(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_4(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819952])
def test_ADDMOD_5(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_6(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_7(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_8(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_9(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_10(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_11(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_12(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_13(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819952])
def test_ADDMOD_14(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_15(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_16(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_17(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_18(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_19(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_20(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_21(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [2])
def test_ADDMOD_22(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819953])
def test_ADDMOD_23(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301264])
def test_ADDMOD_24(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_25(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_26(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_27(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718913])
def test_ADDMOD_28(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819952])
def test_ADDMOD_29(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819952])
def test_ADDMOD_30(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819953])
def test_ADDMOD_31(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [115792089237316195423570985008687907853269984665640564039457584007913129639904])
def test_ADDMOD_32(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [61514547407324228818772085785865451047049679353621549645961841504203850121215])
def test_ADDMOD_33(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819968])
def test_ADDMOD_34(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819984])
def test_ADDMOD_35(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564820000])
def test_ADDMOD_36(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504350043516790537761646130706531776516538464538864])
def test_ADDMOD_37(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_38(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_39(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301264])
def test_ADDMOD_40(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [61514547407324228818772085785865451047049679353621549645961841504203850121215])
def test_ADDMOD_41(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [7237005577332262213973186563042994240829374041602535252466099000494570602526])
def test_ADDMOD_42(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301279])
def test_ADDMOD_43(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301295])
def test_ADDMOD_44(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301311])
def test_ADDMOD_45(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281527586710570232449627116313036034012829185020175])
def test_ADDMOD_46(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_47(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_48(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_49(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819968])
def test_ADDMOD_50(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301279])
def test_ADDMOD_51(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_52(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_53(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_54(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718928])
def test_ADDMOD_55(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_56(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_57(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_58(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564819984])
def test_ADDMOD_59(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301295])
def test_ADDMOD_60(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_61(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_62(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_63(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718944])
def test_ADDMOD_64(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_65(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_66(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_67(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504343953926634992332820282019728792003956564820000])
def test_ADDMOD_68(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301311])
def test_ADDMOD_69(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_70(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_71(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [96])
def test_ADDMOD_72(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718960])
def test_ADDMOD_73(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_74(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_75(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718913])
def test_ADDMOD_76(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [57896044618658097711785492504350043516790537761646130706531776516538464538864])
def test_ADDMOD_77(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281527586710570232449627116313036034012829185020175])
def test_ADDMOD_78(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718928])
def test_ADDMOD_79(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718944])
def test_ADDMOD_80(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718960])
def test_ADDMOD_81(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [12179180311090857651697373605969025163799437824])
def test_ADDMOD_82(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_83(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_84(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_85(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_86(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_87(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_88(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_89(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_90(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_91(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_92(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_93(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_94(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_95(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_96(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_97(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_98(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_99(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_100(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_101(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_102(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_103(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_104(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_105(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_106(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_107(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_108(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_109(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_110(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_111(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_112(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_113(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_114(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_115(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_116(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_117(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_118(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_119(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_120(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_121(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_122(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_123(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_124(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_125(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_126(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_127(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_128(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_129(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_130(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_131(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_132(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_133(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_134(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_135(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_136(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_137(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_138(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_139(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_140(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_141(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_142(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_143(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_144(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_145(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_146(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_147(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_148(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_149(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_150(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_151(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_152(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_153(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_154(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_155(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_156(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_157(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_158(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_159(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_160(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_161(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_162(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_163(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_164(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_165(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_166(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_167(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_168(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_169(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_170(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_171(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_172(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_173(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_174(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_175(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_176(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_177(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_178(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_179(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_180(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_181(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_182(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_183(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_184(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_185(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_186(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_187(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_188(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_189(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_190(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_191(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_192(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_193(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_194(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_195(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_196(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_197(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_198(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_199(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_200(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_201(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_202(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_203(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_204(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_205(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_206(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_207(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_208(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_209(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_210(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_211(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_212(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_213(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_214(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_215(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_216(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_217(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_218(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_219(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_220(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_221(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_222(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_223(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_224(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_225(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_226(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_227(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_228(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_229(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_230(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_231(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_232(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_233(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_234(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_235(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_236(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_237(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_238(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_239(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_240(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_241(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_242(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_243(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_244(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [62])
def test_ADDMOD_245(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_246(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_247(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_248(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301294])
def test_ADDMOD_249(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_250(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [63])
def test_ADDMOD_251(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [79])
def test_ADDMOD_252(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718943])
def test_ADDMOD_253(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_254(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_255(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_256(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_257(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_258(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_259(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_260(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_261(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_262(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_263(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_264(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [2])
def test_ADDMOD_265(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_266(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301264])
def test_ADDMOD_267(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_268(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_269(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_270(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718913])
def test_ADDMOD_271(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_272(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_273(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_274(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_275(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_276(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_277(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_278(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_279(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_280(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301294])
def test_ADDMOD_281(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_282(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301264])
def test_ADDMOD_283(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301263])
def test_ADDMOD_284(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [7237005577332262213973186563042994240829374041602535252466099000494570602526])
def test_ADDMOD_285(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301279])
def test_ADDMOD_286(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301295])
def test_ADDMOD_287(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301311])
def test_ADDMOD_288(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281527586710570232449627116313036034012829185020175])
def test_ADDMOD_289(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_290(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_291(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_292(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_293(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301279])
def test_ADDMOD_294(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_295(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_296(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_297(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718928])
def test_ADDMOD_298(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [63])
def test_ADDMOD_299(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_300(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_301(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_302(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301295])
def test_ADDMOD_303(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_304(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_305(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_306(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718944])
def test_ADDMOD_307(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [79])
def test_ADDMOD_308(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_309(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_310(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_311(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301311])
def test_ADDMOD_312(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_313(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_314(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [96])
def test_ADDMOD_315(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718960])
def test_ADDMOD_316(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718943])
def test_ADDMOD_317(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_318(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718913])
def test_ADDMOD_319(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_320(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281527586710570232449627116313036034012829185020175])
def test_ADDMOD_321(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718928])
def test_ADDMOD_322(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718944])
def test_ADDMOD_323(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718960])
def test_ADDMOD_324(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [12179180311090857651697373605969025163799437824])
def test_ADDMOD_325(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300301])
def test_ADDMOD_326(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300782])
def test_ADDMOD_327(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300783])
def test_ADDMOD_328(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300526])
def test_ADDMOD_329(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300782])
def test_ADDMOD_330(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300798])
def test_ADDMOD_331(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300814])
def test_ADDMOD_332(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300830])
def test_ADDMOD_333(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718431])
def test_ADDMOD_334(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300782])
def test_ADDMOD_335(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_336(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_337(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301007])
def test_ADDMOD_338(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_339(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_340(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_341(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_342(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_343(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300783])
def test_ADDMOD_344(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_345(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [2])
def test_ADDMOD_346(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301008])
def test_ADDMOD_347(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_348(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_349(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_350(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_351(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718913])
def test_ADDMOD_352(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300526])
def test_ADDMOD_353(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301007])
def test_ADDMOD_354(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301008])
def test_ADDMOD_355(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300751])
def test_ADDMOD_356(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301007])
def test_ADDMOD_357(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301023])
def test_ADDMOD_358(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301039])
def test_ADDMOD_359(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301055])
def test_ADDMOD_360(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718656])
def test_ADDMOD_361(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300782])
def test_ADDMOD_362(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_363(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_364(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301007])
def test_ADDMOD_365(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_366(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_367(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_368(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_369(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_370(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300798])
def test_ADDMOD_371(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_372(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_373(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301023])
def test_ADDMOD_374(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_375(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_376(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_377(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_378(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718928])
def test_ADDMOD_379(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300814])
def test_ADDMOD_380(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_381(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_382(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301039])
def test_ADDMOD_383(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_384(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_385(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_386(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_387(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718944])
def test_ADDMOD_388(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285300830])
def test_ADDMOD_389(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_390(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_391(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [3618502788666131106986593281521497120414687020801267626233049500247285301055])
def test_ADDMOD_392(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_393(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_394(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_395(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [96])
def test_ADDMOD_396(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718960])
def test_ADDMOD_397(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718431])
def test_ADDMOD_398(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_399(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718913])
def test_ADDMOD_400(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718656])
def test_ADDMOD_401(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718912])
def test_ADDMOD_402(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718928])
def test_ADDMOD_403(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718944])
def test_ADDMOD_404(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [6089590155545428825848686802984512581899718960])
def test_ADDMOD_405(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [12179180311090857651697373605969025163799437824])
def test_ADDMOD_406(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_407(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_408(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_409(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_410(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_411(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_412(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_413(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_414(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_415(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_416(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_417(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_418(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_419(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_420(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_421(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_422(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_423(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_424(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_425(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_426(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [2])
def test_ADDMOD_427(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_428(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_429(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_430(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_431(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_432(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_433(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_434(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_435(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_436(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_437(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_438(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_439(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_440(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_441(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_442(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_443(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_444(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_445(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_446(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_447(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_448(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_449(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_450(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_451(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_452(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_453(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_454(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_455(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_456(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_457(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_458(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_459(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_460(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_461(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_462(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_463(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_464(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_465(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_466(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_467(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_468(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_469(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_470(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_471(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_472(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_473(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_474(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_475(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_476(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_477(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_478(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_479(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_480(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_481(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_482(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_483(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_484(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_485(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_486(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_487(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [30])
def test_ADDMOD_488(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_489(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_490(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_491(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_492(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_493(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_494(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_495(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_496(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_497(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_498(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_499(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_500(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_501(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_502(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_503(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_504(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_505(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_506(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_507(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [2])
def test_ADDMOD_508(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_509(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_510(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_511(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_512(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_513(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_514(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_515(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_516(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_517(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_518(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_519(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_520(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_521(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_522(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_523(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_524(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_525(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_526(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_527(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [30])
def test_ADDMOD_528(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_529(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_530(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_531(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_532(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_533(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_534(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_535(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_536(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_537(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_538(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_539(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_540(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_541(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_542(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_543(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_544(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_545(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_546(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_547(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_548(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_549(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_550(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_551(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_552(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_553(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_554(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_555(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_556(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_557(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_558(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_559(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_560(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_561(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_562(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_563(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_564(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_565(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_566(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_567(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_568(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [30])
def test_ADDMOD_569(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_570(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_571(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_572(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_573(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_574(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_575(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_576(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_577(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_578(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_579(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_580(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_581(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_582(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_583(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_584(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_585(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_586(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_587(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_588(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [2])
def test_ADDMOD_589(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_590(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_591(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_592(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_593(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_594(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_595(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_596(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_597(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_598(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_599(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_600(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_601(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_602(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_603(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_604(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [14])
def test_ADDMOD_605(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_606(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_607(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_608(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [46])
def test_ADDMOD_609(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_610(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_611(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_612(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_613(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_614(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_615(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_616(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_617(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_618(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_619(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_620(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_621(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_622(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_623(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_624(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_625(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_626(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_627(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_628(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_629(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_630(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_631(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_632(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_633(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_634(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_635(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [47])
def test_ADDMOD_636(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_637(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_638(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_639(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_640(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [31])
def test_ADDMOD_641(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_642(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_643(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_644(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [15])
def test_ADDMOD_645(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_646(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_647(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_648(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_649(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [649037107316853453566312041152510])
def test_ADDMOD_650(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576255])
def test_ADDMOD_651(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576256])
def test_ADDMOD_652(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [486777830487640090174734030864367])
def test_ADDMOD_653(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [334659758460252561995129646219278])
def test_ADDMOD_654(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576271])
def test_ADDMOD_655(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576287])
def test_ADDMOD_656(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576303])
def test_ADDMOD_657(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576255])
def test_ADDMOD_658(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576255])
def test_ADDMOD_659(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_660(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_661(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288112])
def test_ADDMOD_662(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643023])
def test_ADDMOD_663(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_664(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_665(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_666(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_667(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576256])
def test_ADDMOD_668(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_669(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [2])
def test_ADDMOD_670(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288113])
def test_ADDMOD_671(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643024])
def test_ADDMOD_672(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_673(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_674(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_675(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_676(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [486777830487640090174734030864367])
def test_ADDMOD_677(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288112])
def test_ADDMOD_678(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288113])
def test_ADDMOD_679(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576224])
def test_ADDMOD_680(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [172400481631039198603551635931135])
def test_ADDMOD_681(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288128])
def test_ADDMOD_682(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288144])
def test_ADDMOD_683(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288160])
def test_ADDMOD_684(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288112])
def test_ADDMOD_685(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [334659758460252561995129646219278])
def test_ADDMOD_686(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643023])
def test_ADDMOD_687(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643024])
def test_ADDMOD_688(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [172400481631039198603551635931135])
def test_ADDMOD_689(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [20282409603651670423947251286046])
def test_ADDMOD_690(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643039])
def test_ADDMOD_691(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643055])
def test_ADDMOD_692(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643071])
def test_ADDMOD_693(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643023])
def test_ADDMOD_694(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576271])
def test_ADDMOD_695(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_696(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [17])
def test_ADDMOD_697(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288128])
def test_ADDMOD_698(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643039])
def test_ADDMOD_699(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_700(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_701(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_702(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_703(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576287])
def test_ADDMOD_704(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_705(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [33])
def test_ADDMOD_706(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288144])
def test_ADDMOD_707(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643055])
def test_ADDMOD_708(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_709(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_710(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_711(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_712(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576303])
def test_ADDMOD_713(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_714(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [49])
def test_ADDMOD_715(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288160])
def test_ADDMOD_716(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643071])
def test_ADDMOD_717(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [64])
def test_ADDMOD_718(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [80])
def test_ADDMOD_719(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [96])
def test_ADDMOD_720(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_721(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(115792089237316195423570985008687907853269984665640564039457584007913129639935)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [324518553658426726783156020576255])
def test_ADDMOD_722(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(0)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
def test_ADDMOD_723(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(1)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [1])
def test_ADDMOD_724(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(57896044618658097711785492504343953926634992332820282019728792003956564819952)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [162259276829213363391578010288112])
def test_ADDMOD_725(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(3618502788666131106986593281521497120414687020801267626233049500247285301263)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [10141204801825835211973625643023])
def test_ADDMOD_726(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(16)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [16])
def test_ADDMOD_727(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(32)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [32])
def test_ADDMOD_728(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(48)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [48])
def test_ADDMOD_729(self):
#Make the constraint store
constraints = ConstraintSet()
#make the ethereum world state
world = evm.EVMWorld(constraints)
address=0x222222222222222222222222222222222222200
caller=origin=0x111111111111111111111111111111111111100
price=0
value=10000
bytecode='\x08'
data = 'AAAAAAAAAAAAAAAAAAAAAAAAAAAAAA'
header = { 'coinbase': 0,
'timestamp': 0,
'number': 0,
'difficulty': 0,
'gaslimit': 0,
}
gas = 1000000
new_vm = evm.EVM(constraints, address, data, caller, value, bytecode, gas=gas, world=world)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
new_vm._push(6089590155545428825848686802984512581899718912)
last_exception, last_returned = self._execute(new_vm)
self.assertEqual(last_exception, None)
self.assertEqual(new_vm.pc, 1)
self.assertEqual(new_vm.stack, [0])
if __name__ == '__main__':
unittest.main()
| 41.555949 | 124 | 0.56638 | 70,825 | 880,155 | 6.87342 | 0.012326 | 0.052433 | 0.040433 | 0.0599 | 0.993498 | 0.993498 | 0.993498 | 0.993498 | 0.993498 | 0.993498 | 0 | 0.274747 | 0.358172 | 880,155 | 21,179 | 125 | 41.557911 | 0.587004 | 0.044726 | 0 | 0.914485 | 0 | 0 | 0.065212 | 0.026057 | 0 | 0 | 0.071221 | 0 | 0.124765 | 1 | 0.041645 | false | 0 | 0.000399 | 0 | 0.042273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
8a2041f5ed2fd6df3f2b9ca0901dd63c2aa61ef5 | 183 | py | Python | form2fit/code/ml/models/__init__.py | jettan/form2fit | 15db0a1e3359f0bba3cbcac1b67942cbe19c66a6 | [
"MIT"
] | 82 | 2019-11-01T00:56:37.000Z | 2021-12-13T03:06:53.000Z | form2fit/code/ml/models/__init__.py | jettan/form2fit | 15db0a1e3359f0bba3cbcac1b67942cbe19c66a6 | [
"MIT"
] | 5 | 2019-11-14T08:05:25.000Z | 2020-10-12T09:02:18.000Z | form2fit/code/ml/models/__init__.py | jettan/form2fit | 15db0a1e3359f0bba3cbcac1b67942cbe19c66a6 | [
"MIT"
] | 21 | 2019-11-01T08:14:56.000Z | 2022-02-17T07:35:09.000Z | from form2fit.code.ml.models.correspondence import CorrespondenceNet
from form2fit.code.ml.models.placement import PlacementNet
from form2fit.code.ml.models.suction import SuctionNet
| 45.75 | 68 | 0.868852 | 24 | 183 | 6.625 | 0.5 | 0.226415 | 0.301887 | 0.339623 | 0.45283 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017544 | 0.065574 | 183 | 3 | 69 | 61 | 0.912281 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
8a284902b988f5980198e8aa64e9f46cb1018e76 | 4,163 | py | Python | library/source1/bsp/lumps/game_lump.py | anderlli0053/SourceIO | 3c0c4839939ce698439987ac52154f89ee2f5341 | [
"MIT"
] | 199 | 2019-04-02T02:30:58.000Z | 2022-03-30T21:29:49.000Z | library/source1/bsp/lumps/game_lump.py | anderlli0053/SourceIO | 3c0c4839939ce698439987ac52154f89ee2f5341 | [
"MIT"
] | 113 | 2019-03-03T19:36:25.000Z | 2022-03-31T19:44:05.000Z | library/source1/bsp/lumps/game_lump.py | anderlli0053/SourceIO | 3c0c4839939ce698439987ac52154f89ee2f5341 | [
"MIT"
] | 38 | 2019-05-15T16:49:30.000Z | 2022-03-22T03:40:43.000Z | from typing import List
from .. import Lump, lump_tag
from ..datatypes.game_lump_header import GameLumpHeader, VindictusGameLumpHeader
from ..datatypes.gamelumps.detail_prop_lump import DetailPropLump
from ..datatypes.gamelumps.static_prop_lump import StaticPropLump
from . import SteamAppId
from . import ByteIO
@lump_tag(35, 'LUMP_GAME_LUMP')
class GameLump(Lump):
def __init__(self, bsp, lump_id):
super().__init__(bsp, lump_id)
self.lump_count = 0
self.game_lumps_info: List[GameLumpHeader] = []
self.game_lumps = {}
def parse(self):
reader = self.reader
self.lump_count = reader.read_uint32()
for _ in range(self.lump_count):
lump = GameLumpHeader(self, self._bsp).parse(reader)
if not lump.id:
continue
self.game_lumps_info.append(lump)
for lump in self.game_lumps_info:
relative_offset = lump.offset - self._lump.offset
print(f'GLump "{lump.id}" offset: {relative_offset} size: {lump.size} ')
with reader.save_current_pos():
reader.seek(relative_offset)
if lump.flags == 1:
curr_index = self.game_lumps_info.index(lump)
if curr_index + 1 != len(self.game_lumps_info):
next_offset = self.game_lumps_info[curr_index + 1].offset - self._lump.offset
else:
next_offset = self._lump.size
compressed_size = next_offset - relative_offset
buffer = reader.read(compressed_size)
game_lump_reader = Lump.decompress_lump(ByteIO(buffer))
else:
game_lump_reader = ByteIO(reader.read(lump.size))
pass # TODO
if lump.id == 'sprp':
game_lump = StaticPropLump(lump)
game_lump.parse(game_lump_reader)
self.game_lumps[lump.id] = game_lump
elif lump.id == 'dprp':
detail_lump = DetailPropLump(lump)
detail_lump.parse(game_lump_reader)
self.game_lumps[lump.id] = detail_lump
return self
@lump_tag(35, 'LUMP_GAME_LUMP', steam_id=SteamAppId.VINDICTUS)
class VGameLump(Lump):
def __init__(self, bsp, lump_id):
super().__init__(bsp, lump_id)
self.lump_count = 0
self.game_lumps_info: List[GameLumpHeader] = []
self.game_lumps = {}
def parse(self):
reader = self.reader
self.lump_count = reader.read_uint32()
for _ in range(self.lump_count):
lump = VindictusGameLumpHeader(self, self._bsp).parse(reader)
if not lump.id:
continue
self.game_lumps_info.append(lump)
for lump in self.game_lumps_info:
relative_offset = lump.offset - self._lump.offset
print(f'GLump "{lump.id}" offset: {relative_offset} size: {lump.size} ')
with reader.save_current_pos():
reader.seek(relative_offset)
if lump.flags == 1:
curr_index = self.game_lumps_info.index(lump)
if curr_index + 1 != len(self.game_lumps_info):
next_offset = self.game_lumps_info[curr_index + 1].offset - self._lump.offset
else:
next_offset = self._lump.size
compressed_size = next_offset - relative_offset
buffer = reader.read(compressed_size)
game_lump_reader = Lump.decompress_lump(ByteIO(buffer))
else:
game_lump_reader = ByteIO(reader.read(lump.size))
pass # TODO
if lump.id == 'sprp':
game_lump = StaticPropLump(lump)
game_lump.parse(game_lump_reader)
self.game_lumps[lump.id] = game_lump
elif lump.id == 'dprp':
detail_lump = DetailPropLump(lump)
detail_lump.parse(game_lump_reader)
self.game_lumps[lump.id] = detail_lump
return self
| 41.217822 | 101 | 0.580591 | 477 | 4,163 | 4.773585 | 0.15304 | 0.063241 | 0.102767 | 0.089592 | 0.846728 | 0.846728 | 0.828283 | 0.828283 | 0.828283 | 0.828283 | 0 | 0.005741 | 0.330531 | 4,163 | 100 | 102 | 41.63 | 0.811267 | 0.002162 | 0 | 0.853933 | 0 | 0 | 0.040472 | 0 | 0 | 0 | 0 | 0.01 | 0 | 1 | 0.044944 | false | 0.022472 | 0.078652 | 0 | 0.168539 | 0.022472 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8a6709ff930303d60a3c26bc55b0abd0c9e60928 | 7,286 | py | Python | tests/data/test_board.py | Tyler-Yates/crossword-creator | c8358e97d8ccf2567f7ebc9bfe4bc7deb45ff590 | [
"MIT"
] | null | null | null | tests/data/test_board.py | Tyler-Yates/crossword-creator | c8358e97d8ccf2567f7ebc9bfe4bc7deb45ff590 | [
"MIT"
] | null | null | null | tests/data/test_board.py | Tyler-Yates/crossword-creator | c8358e97d8ccf2567f7ebc9bfe4bc7deb45ff590 | [
"MIT"
] | null | null | null | from application.data.board import Board
from tests.data.test_word_manager import TestWordManager
class TestBoard:
def setup_method(self):
# Create a word manager that accepts any word
self.word_manager = TestWordManager()
def test_valid_board_1(self):
tiles = [
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
[None, None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
invalid_points = board.board_is_valid_crossword()
assert set() == invalid_points
def test_valid_board_2(self):
tiles = [
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["d", None, None, None, None],
["y", None, None, None, None],
]
board = Board("test", 5, TestWordManager({"daddy", "dad", "bad", "as"}))
board._set_board(tiles)
invalid_points = board.board_is_valid_crossword()
assert set() == invalid_points
def test_invalid_board_invalid_word(self):
tiles = [
["d", None, "b", "a", "d"],
["a", None, "a", "z", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
[None, None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
invalid_points = board.board_is_valid_crossword()
# Invalid because the two instances of "az" are not valid words
assert {(0, 3), (1, 2), (1, 3)} == invalid_points
def test_invalid_board_not_connected(self):
tiles = [
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
[None, None, "d", "a", "dad"],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
invalid_points = board.board_is_valid_crossword()
# Invalid because not all tiles are connected
assert {(4, 2), (4, 3), (4, 4)} == invalid_points
def test_shift_down_valid(self):
tiles = [
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
[None, None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert board.shift_board_down()
expected_tiles = [
[None, None, None, None, None],
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
]
assert expected_tiles == board.board
def test_shift_down_invalid(self):
tiles = [
[None, None, None, None, None],
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert not board.shift_board_down()
# Tiles should not have moved
assert tiles == board.board
def test_shift_up_valid(self):
tiles = [
[None, None, None, None, None],
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert board.shift_board_up()
expected_tiles = [
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
[None, None, None, None, None],
]
assert expected_tiles == board.board
def test_shift_up_invalid(self):
tiles = [
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
[None, None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert not board.shift_board_up()
# Tiles should not have moved
assert tiles == board.board
def test_shift_right_valid(self):
tiles = [
["d", None, "b", "a", None],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
["x", None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert board.shift_board_right()
expected_tiles = [
[None, "d", None, "b", "a"],
[None, "a", None, "a", "s"],
[None, "d", "a", "d", None],
[None, "s", None, None, None],
[None, "x", None, None, None],
]
assert expected_tiles == board.board
def test_shift_right_invalid(self):
tiles = [
[None, None, None, None, None],
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert not board.shift_board_right()
# Tiles should not have moved
assert tiles == board.board
def test_shift_left_valid(self):
tiles = [
[None, "d", None, "b", "a"],
[None, "a", None, "a", "s"],
[None, "d", "a", "d", None],
[None, "s", None, None, None],
[None, "x", None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert board.shift_board_left()
expected_tiles = [
["d", None, "b", "a", None],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
["x", None, None, None, None],
]
assert expected_tiles == board.board
def test_shift_left_invalid(self):
tiles = [
[None, None, None, None, None],
["d", None, "b", "a", "d"],
["a", None, "a", "s", None],
["d", "a", "d", None, None],
["s", None, None, None, None],
]
board = Board("test", 5, TestWordManager({"dads", "dad", "bad", "as"}))
board._set_board(tiles)
assert not board.shift_board_left()
# Tiles should not have moved
assert tiles == board.board
| 31.678261 | 80 | 0.458962 | 854 | 7,286 | 3.786885 | 0.078454 | 0.306741 | 0.304267 | 0.277056 | 0.862709 | 0.862709 | 0.843847 | 0.832715 | 0.832715 | 0.804886 | 0 | 0.00545 | 0.345183 | 7,286 | 229 | 81 | 31.816594 | 0.672396 | 0.035822 | 0 | 0.728814 | 0 | 0 | 0.053442 | 0 | 0 | 0 | 0 | 0 | 0.112994 | 1 | 0.073446 | false | 0 | 0.011299 | 0 | 0.090395 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
8a7c1b6d3b637ff1388935024e20023218550508 | 18,516 | py | Python | tests/test_pwhash.py | carlosonunez/pynacl | 56f3b9bff23c6e5761a806aaf1222ce5428ddc43 | [
"Apache-2.0"
] | null | null | null | tests/test_pwhash.py | carlosonunez/pynacl | 56f3b9bff23c6e5761a806aaf1222ce5428ddc43 | [
"Apache-2.0"
] | 9 | 2021-01-05T06:55:00.000Z | 2021-11-06T19:27:33.000Z | tests/test_pwhash.py | carlosonunez/pynacl | 56f3b9bff23c6e5761a806aaf1222ce5428ddc43 | [
"Apache-2.0"
] | 1 | 2021-12-23T12:31:23.000Z | 2021-12-23T12:31:23.000Z | # Copyright 2013 Donald Stufft and individual contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import binascii
import json
import os
import sys
import unicodedata as ud
from hypothesis import given, settings
from hypothesis.strategies import integers, text
import pytest
import nacl.bindings
import nacl.encoding
import nacl.exceptions as exc
import nacl.pwhash
_all_unicode = "".join(chr(i) for i in range(sys.maxunicode))
PASSWD_CHARS = "".join(
c
for c in _all_unicode
if (
ud.category(c).startswith("L")
or ud.category(c).startswith("N")
or ud.category(c) == "Zs"
)
)
# Select Letters, number representations and spacing characters
def argon2i_modular_crypt_ref():
DATA = "modular_crypt_argon2i_hashes.json"
path = os.path.join(os.path.dirname(__file__), "data", DATA)
jvectors = json.load(open(path))
vectors = [
(x["pwhash"], x["passwd"]) for x in jvectors if x["mode"] == "crypt"
]
return vectors
def argon2i_raw_ref():
DATA = "raw_argon2i_hashes.json"
path = os.path.join(os.path.dirname(__file__), "data", DATA)
jvectors = json.load(open(path))
vectors = [
(
x["dgst_len"],
x["passwd"],
x["salt"],
x["iters"],
x["maxmem"],
x["pwhash"],
)
for x in jvectors
if x["mode"] == "raw"
]
return vectors
def argon2id_modular_crypt_ref():
DATA = "modular_crypt_argon2id_hashes.json"
path = os.path.join(os.path.dirname(__file__), "data", DATA)
jvectors = json.load(open(path))
vectors = [
(x["pwhash"], x["passwd"])
for x in jvectors
if (x["mode"] == "crypt" and x["construct"] == "argon2id")
]
return vectors
def argon2id_raw_ref():
DATA = "raw_argon2id_hashes.json"
path = os.path.join(os.path.dirname(__file__), "data", DATA)
jvectors = json.load(open(path))
vectors = [
(
x["dgst_len"],
x["passwd"],
x["salt"],
x["iters"],
x["maxmem"],
x["pwhash"],
)
for x in jvectors
if (x["mode"] == "raw" and x["construct"] == "argon2id")
]
return vectors
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("size", "password", "salt", "opslimit", "memlimit", "expected"),
[
(
32,
b"The quick brown fox jumps over the lazy dog.",
b"ef537f25c895bfa782526529a9b63d97",
20000,
(2 ** 20) * 100,
(
b"\x10e>\xc8A8\x11\xde\x07\xf1\x0f\x98"
b"EG\xe6}V]\xd4yN\xae\xd3P\x87yP\x1b\xc7+n*"
),
),
],
)
def test_kdf_scryptsalsa208sha256(
size, password, salt, opslimit, memlimit, expected
):
res = nacl.pwhash.kdf_scryptsalsa208sha256(
size, password, salt, opslimit, memlimit
)
assert res == expected
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("password",), [(b"The quick brown fox jumps over the lazy dog.",)]
)
def test_scryptsalsa208sha256_random(password):
h1 = nacl.pwhash.scryptsalsa208sha256_str(password)
h2 = nacl.pwhash.scryptsalsa208sha256_str(password)
assert h1 != h2
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("password",), [(b"The quick brown fox jumps over the lazy dog.",)]
)
def test_scryptsalsa208sha256_verify(password):
assert nacl.pwhash.verify_scryptsalsa208sha256(
nacl.pwhash.scryptsalsa208sha256_str(password), password
)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("password",), [(b"The quick brown fox jumps over the lazy dog.",)]
)
def test_scryptsalsa208sha256_verify_incorrect(password):
with pytest.raises(exc.InvalidkeyError):
nacl.pwhash.verify_scryptsalsa208sha256(
nacl.pwhash.scryptsalsa208sha256_str(password),
password.replace(b"dog", b"cat"),
)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("size", "password", "salt", "opslimit", "memlimit"),
[
(
32,
b"The quick brown fox jumps over the lazy dog.",
b"ef537f25c895bfa782526529a9",
20000,
(2 ** 20) * 100,
),
],
)
def test_wrong_salt_length(size, password, salt, opslimit, memlimit):
with pytest.raises(exc.ValueError):
nacl.pwhash.kdf_scryptsalsa208sha256(
size, password, salt, opslimit, memlimit
)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("passwd_hash", "password"),
[
(
b"Too short (and wrong) hash",
b"a password",
)
],
)
def test_wrong_hash_length(passwd_hash, password):
with pytest.raises(exc.ValueError):
nacl.pwhash.verify_scryptsalsa208sha256(passwd_hash, password)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("size", "password", "salt", "opslimit", "memlimit"),
[
(
32,
b"The quick brown fox jumps over the lazy dog.",
b"ef537f25c895bfa782526529a9b6",
20000,
(2 ** 20) * 100,
),
],
)
def test_kdf_wrong_salt_length(size, password, salt, opslimit, memlimit):
with pytest.raises(exc.ValueError):
nacl.pwhash.kdf_scryptsalsa208sha256(
size, password, salt, opslimit, memlimit
)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("passwd_hash", "password"),
[
(
b"Too short (and wrong) hash",
b"another password",
)
],
)
def test_str_verify_wrong_hash_length(passwd_hash, password):
with pytest.raises(exc.ValueError):
nacl.pwhash.verify_scryptsalsa208sha256(passwd_hash, password)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("size", "password", "salt", "opslimit", "memlimit", "expected"),
[
(
32,
b"The quick brown fox jumps over the lazy dog.",
b"ef537f25c895bfa782526529a9b63d97",
20000,
(2 ** 20) * 100,
(
b"\x10e>\xc8A8\x11\xde\x07\xf1\x0f\x98"
b"EG\xe6}V]\xd4yN\xae\xd3P\x87yP\x1b\xc7+n*"
),
),
],
)
def test_scrypt_kdf(size, password, salt, opslimit, memlimit, expected):
res = nacl.pwhash.scrypt.kdf(size, password, salt, opslimit, memlimit)
assert res == expected
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("password",), [(b"The quick brown fox jumps over the lazy dog.",)]
)
def test_scrypt_random(password):
h1 = nacl.pwhash.scrypt.str(password)
h2 = nacl.pwhash.scrypt.str(password)
assert h1 != h2
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("password",), [(b"The quick brown fox jumps over the lazy dog.",)]
)
def test_scrypt_verify(password):
assert nacl.pwhash.scrypt.verify(
nacl.pwhash.scrypt.str(password), password
)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("password",), [(b"The quick brown fox jumps over the lazy dog.",)]
)
def test_scrypt_verify_incorrect(password):
with pytest.raises(exc.InvalidkeyError):
nacl.pwhash.scrypt.verify(
nacl.pwhash.scrypt.str(password), password.replace(b"dog", b"cat")
)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("size", "password", "salt", "opslimit", "memlimit"),
[
(
32,
b"The quick brown fox jumps over the lazy dog.",
b"ef537f25c895bfa782526529a9",
20000,
(2 ** 20) * 100,
),
],
)
def test_wrong_scrypt_salt_length(size, password, salt, opslimit, memlimit):
with pytest.raises(exc.ValueError):
nacl.pwhash.scrypt.kdf(size, password, salt, opslimit, memlimit)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("passwd_hash", "password"),
[
(
b"Too short (and wrong) hash",
b"a password",
)
],
)
def test_wrong_scrypt_hash_length(passwd_hash, password):
with pytest.raises(exc.ValueError):
nacl.pwhash.scrypt.verify(passwd_hash, password)
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("size", "password", "salt", "opslimit", "memlimit"),
[
(
32,
b"The quick brown fox jumps over the lazy dog.",
b"ef537f25c895bfa782526529a9b6",
20000,
(2 ** 20) * 100,
),
],
)
def test_scrypt_kdf_wrong_salt_length(
size, password, salt, opslimit, memlimit
):
with pytest.raises(exc.ValueError):
nacl.pwhash.scrypt.kdf(size, password, salt, opslimit, memlimit)
@pytest.mark.parametrize(
("opslimit", "memlimit", "n", "r", "p"),
[
(32768, 2 * (2 ** 20), 10, 8, 1),
(32768, 8 * (2 ** 10), 3, 8, 128),
(65536, (2 ** 20) * 2, 11, 8, 1),
(262144, (2 ** 20) * 2, 11, 8, 4),
(2 * (2 ** 20), 2 * (2 ** 20), 11, 8, 32),
],
)
def test_variable_limits(opslimit, memlimit, n, r, p):
rn, rr, rp = nacl.bindings.nacl_bindings_pick_scrypt_params(
opslimit, memlimit
)
assert rn == n
assert rr == r
assert rp == p
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@pytest.mark.parametrize(
("passwd_hash", "password"),
[
(
b"Too short (and wrong) hash",
b"another password",
)
],
)
def test_scrypt_str_verify_wrong_hash_length(passwd_hash, password):
with pytest.raises(exc.ValueError):
nacl.pwhash.scrypt.verify(passwd_hash, password)
@pytest.mark.parametrize(
("password_hash", "password"),
argon2i_modular_crypt_ref() + argon2id_modular_crypt_ref(),
)
def test_str_verify_argon2_ref(password_hash, password):
pw_hash = password_hash.encode("ascii")
pw = password.encode("ascii")
res = nacl.pwhash.argon2id.verify(pw_hash, pw)
assert res is True
@pytest.mark.parametrize(
("password_hash", "password"),
argon2i_modular_crypt_ref() + argon2id_modular_crypt_ref(),
)
def test_str_verify_argon2_ref_fail(password_hash, password):
pw_hash = password_hash.encode("ascii")
pw = ("a" + password).encode("ascii")
with pytest.raises(exc.InvalidkeyError):
nacl.pwhash.argon2id.verify(pw_hash, pw)
@given(
text(alphabet=PASSWD_CHARS, min_size=5, max_size=20),
integers(min_value=4, max_value=6),
integers(min_value=1024 * 1024, max_value=16 * 1024 * 1024),
)
@settings(deadline=None, max_examples=20)
def test_argon2i_str_and_verify(password, ops, mem):
_psw = password.encode("utf-8")
pw_hash = nacl.pwhash.argon2i.str(_psw, opslimit=ops, memlimit=mem)
res = nacl.pwhash.argon2i.verify(pw_hash, _psw)
assert res is True
@given(
text(alphabet=PASSWD_CHARS, min_size=5, max_size=20),
integers(min_value=1, max_value=4),
integers(min_value=1024 * 1024, max_value=16 * 1024 * 1024),
)
@settings(deadline=None, max_examples=20)
def test_argon2id_str_and_verify(password, ops, mem):
_psw = password.encode("utf-8")
pw_hash = nacl.pwhash.argon2id.str(_psw, opslimit=ops, memlimit=mem)
res = nacl.pwhash.argon2id.verify(pw_hash, _psw)
assert res is True
@given(
text(alphabet=PASSWD_CHARS, min_size=5, max_size=20),
integers(min_value=4, max_value=6),
integers(min_value=1024 * 1024, max_value=16 * 1024 * 1024),
)
@settings(deadline=None, max_examples=20)
def test_argon2i_str_and_verify_fail(password, ops, mem):
_psw = password.encode("utf-8")
pw_hash = nacl.pwhash.argon2i.str(_psw, opslimit=ops, memlimit=mem)
with pytest.raises(exc.InvalidkeyError):
nacl.pwhash.argon2i.verify(pw_hash, b"A" + _psw)
@given(text(alphabet=PASSWD_CHARS, min_size=5, max_size=20))
@settings(deadline=None, max_examples=5)
def test_pwhash_str_and_verify(password):
_psw = password.encode("utf-8")
a2i_hash = nacl.pwhash.argon2i.str(
_psw,
opslimit=nacl.pwhash.argon2i.OPSLIMIT_INTERACTIVE,
memlimit=nacl.pwhash.argon2i.MEMLIMIT_INTERACTIVE,
)
a2i_res = nacl.pwhash.verify(a2i_hash, _psw)
assert a2i_res is True
a2id_hash = nacl.pwhash.argon2id.str(
_psw,
opslimit=nacl.pwhash.argon2id.OPSLIMIT_INTERACTIVE,
memlimit=nacl.pwhash.argon2id.MEMLIMIT_INTERACTIVE,
)
a2id_res = nacl.pwhash.verify(a2id_hash, _psw)
assert a2id_res is True
@pytest.mark.skipif(
not nacl.pwhash.scrypt.AVAILABLE, reason="Requires full build of libsodium"
)
@given(text(alphabet=PASSWD_CHARS, min_size=5, max_size=20))
@settings(deadline=None, max_examples=5)
def test_pwhash_scrypt_str_and_verify(password):
_psw = password.encode("utf-8")
scrypt_hash = nacl.pwhash.scrypt.str(
_psw,
opslimit=nacl.pwhash.scrypt.OPSLIMIT_INTERACTIVE,
memlimit=nacl.pwhash.scrypt.MEMLIMIT_INTERACTIVE,
)
scrypt_res = nacl.pwhash.verify(scrypt_hash, _psw)
assert scrypt_res is True
def test_invalid_modular_scrypt_prefix():
psw = b"always invalid password"
invalid_modular_hash = b"$invalid_prefix$"
with pytest.raises(exc.InvalidkeyError):
nacl.pwhash.verify(invalid_modular_hash, psw)
def test_crypt_prefix_error():
psw = b"always invalid password"
invalid_modular_hash = b"$invalid_prefix$"
with pytest.raises(exc.CryptPrefixError):
nacl.pwhash.verify(invalid_modular_hash, psw)
@pytest.mark.parametrize(
("dk_size", "password", "salt", "iters", "mem_kb", "pwhash"),
argon2i_raw_ref(),
)
def test_argon2i_kdf(dk_size, password, salt, iters, mem_kb, pwhash):
dk = nacl.pwhash.argon2i.kdf(
dk_size,
password.encode("utf-8"),
salt.encode("utf-8"),
iters,
1024 * mem_kb,
)
ref = binascii.unhexlify(pwhash)
assert dk == ref
@pytest.mark.parametrize(
("dk_size", "password", "salt", "iters", "mem_kb", "pwhash"),
argon2id_raw_ref(),
)
def test_argon2_kdf_alg_argon2id(
dk_size, password, salt, iters, mem_kb, pwhash
):
dk = nacl.pwhash.argon2id.kdf(
dk_size,
password.encode("utf-8"),
salt.encode("utf-8"),
iters,
1024 * mem_kb,
)
ref = binascii.unhexlify(pwhash)
assert dk == ref
raising_argon2_parameters = [
# wrong salt length:
(20, "aPassword", 3 * "salt", 3, 256),
# too short output:
(15, "aPassword", 4 * "salt", 4, 256),
# too long output:
(0xFFFFFFFF + 1, "aPassword", 4 * "salt", 4, 256),
# too high interation count:
(20, "aPassword", 4 * "salt", 0xFFFFFFFF + 1, 256),
# too low memory usage:
(20, "aPassword", 4 * "salt", 4, 2),
# too high memory usage:
(20, "aPassword", 4 * "salt", 4, 0xFFFFFFFF + 1),
]
@pytest.mark.parametrize(
("dk_size", "password", "salt", "iters", "mem_kb"),
raising_argon2_parameters
+ [
# too low iteration count:
(20, "aPassword", 4 * "salt", 1, 256),
],
)
def test_argon2i_kdf_invalid_parms(dk_size, password, salt, iters, mem_kb):
with pytest.raises(exc.ValueError):
nacl.pwhash.argon2i.kdf(
dk_size,
password.encode("utf-8"),
salt.encode("utf-8"),
iters,
1024 * mem_kb,
)
@pytest.mark.parametrize(
("dk_size", "password", "salt", "iters", "mem_kb"),
raising_argon2_parameters
+ [
# too low iteration count:
(20, "aPassword", 4 * "salt", 0, 256),
],
)
def test_argon2id_kdf_invalid_parms(dk_size, password, salt, iters, mem_kb):
with pytest.raises(exc.ValueError):
nacl.pwhash.argon2id.kdf(
dk_size,
password.encode("utf-8"),
salt.encode("utf-8"),
iters,
1024 * mem_kb,
)
def test_check_limits_for_unknown_algorithm():
from nacl.bindings.crypto_pwhash import _check_argon2_limits_alg
with pytest.raises(exc.TypeError):
_check_argon2_limits_alg(4, 1024, -1)
@pytest.mark.skipif(
nacl.pwhash.scrypt.AVAILABLE, reason="Requires minimal build of libsodium"
)
def test_scryptsalsa208sha256_unavailable():
empty = b""
with pytest.raises(exc.UnavailableError):
nacl.pwhash.kdf_scryptsalsa208sha256(0, empty, empty)
with pytest.raises(exc.UnavailableError):
nacl.pwhash.scryptsalsa208sha256_str(empty)
with pytest.raises(exc.UnavailableError):
nacl.pwhash.verify_scryptsalsa208sha256(empty, empty)
@pytest.mark.skipif(
nacl.pwhash.scrypt.AVAILABLE, reason="Requires minimal build of libsodium"
)
def test_scrypt_unavailable():
empty = b""
with pytest.raises(exc.UnavailableError):
nacl.pwhash.scrypt.kdf(0, empty, empty)
with pytest.raises(exc.UnavailableError):
nacl.pwhash.scrypt.str(empty)
with pytest.raises(exc.UnavailableError):
nacl.pwhash.scrypt.verify(empty, empty)
| 28.93125 | 79 | 0.643768 | 2,307 | 18,516 | 5.02124 | 0.120503 | 0.063881 | 0.049724 | 0.037724 | 0.816212 | 0.778747 | 0.761654 | 0.729023 | 0.725829 | 0.691989 | 0 | 0.046181 | 0.225805 | 18,516 | 639 | 80 | 28.976526 | 0.761911 | 0.044286 | 0 | 0.544423 | 0 | 0 | 0.154133 | 0.024897 | 0 | 0 | 0.001698 | 0 | 0.032136 | 1 | 0.069943 | false | 0.198488 | 0.024575 | 0 | 0.102079 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
8a8724d9a3c05fdcc2d8e3ce75246b669e8fab3a | 135,846 | py | Python | python2/smb/base.py | smartfile/pysmb | 1e78986909bfaf8f64b4f95a92e3768bf3db92bc | [
"Zlib"
] | 1 | 2015-04-28T19:16:15.000Z | 2015-04-28T19:16:15.000Z | python2/smb/base.py | smartfile/pysmb | 1e78986909bfaf8f64b4f95a92e3768bf3db92bc | [
"Zlib"
] | null | null | null | python2/smb/base.py | smartfile/pysmb | 1e78986909bfaf8f64b4f95a92e3768bf3db92bc | [
"Zlib"
] | null | null | null |
import logging, binascii, time, hmac
from datetime import datetime
from smb_constants import *
from smb2_constants import *
from smb_structs import *
from smb2_structs import *
from nmb.base import NMBSession
from utils import convertFILETIMEtoEpoch
import ntlm, securityblob
try:
import hashlib
sha256 = hashlib.sha256
except ImportError:
from utils import sha256
class NotReadyError(Exception):
"""Raised when SMB connection is not ready (i.e. not authenticated or authentication failed)"""
pass
class NotConnectedError(Exception):
"""Raised when underlying SMB connection has been disconnected or not connected yet"""
pass
class SMBTimeout(Exception):
"""Raised when a timeout has occurred while waiting for a response or for a SMB/CIFS operation to complete."""
pass
def _convert_to_unicode(string):
if not isinstance(string, unicode):
string = unicode(string, "utf-8")
return string
class SMB(NMBSession):
"""
This class represents a "connection" to the remote SMB/CIFS server.
It is not meant to be used directly in an application as it does not have any network transport implementations.
For application use, please refer to
- L{SMBProtocol.SMBProtocolFactory<smb.SMBProtocol>} if you are using Twisted framework
In [MS-CIFS], this class will contain attributes of Client, Client.Connection and Client.Session abstract data models.
References:
===========
- [MS-CIFS]: 3.2.1
"""
log = logging.getLogger('SMB.SMB')
SIGN_NEVER = 0
SIGN_WHEN_SUPPORTED = 1
SIGN_WHEN_REQUIRED = 2
def __init__(self, username, password, my_name, remote_name, domain = '', use_ntlm_v2 = True, sign_options = SIGN_WHEN_REQUIRED, is_direct_tcp = False):
NMBSession.__init__(self, my_name, remote_name, is_direct_tcp = is_direct_tcp)
self.username = _convert_to_unicode(username)
self.password = _convert_to_unicode(password)
self.domain = _convert_to_unicode(domain)
self.sign_options = sign_options
self.is_direct_tcp = is_direct_tcp
self.use_ntlm_v2 = use_ntlm_v2 #: Similar to LMAuthenticationPolicy and NTAuthenticationPolicy as described in [MS-CIFS] 3.2.1.1
self.smb_message = SMBMessage()
self.is_using_smb2 = False #: Are we communicating using SMB2 protocol? self.smb_message will be a SMB2Message instance if this flag is True
self.pending_requests = { } #: MID mapped to _PendingRequest instance
self.connected_trees = { } #: Share name mapped to TID
self.next_rpc_call_id = 1 #: Next RPC callID value. Not used directly in SMB message. Usually encapsulated in sub-commands under SMB_COM_TRANSACTION or SMB_COM_TRANSACTION2 messages
self.has_negotiated = False
self.has_authenticated = False
self.is_signing_active = False #: True if the remote server accepts message signing. All outgoing messages will be signed. Simiar to IsSigningActive as described in [MS-CIFS] 3.2.1.2
self.signing_session_key = None #: Session key for signing packets, if signing is active. Similar to SigningSessionKey as described in [MS-CIFS] 3.2.1.2
self.signing_challenge_response = None #: Contains the challenge response for signing, if signing is active. Similar to SigningChallengeResponse as described in [MS-CIFS] 3.2.1.2
self.mid = 0
self.uid = 0
self.next_signing_id = 2 #: Similar to ClientNextSendSequenceNumber as described in [MS-CIFS] 3.2.1.2
# SMB1 and SMB2 attributes
# Note that the interpretations of the values may differ between SMB1 and SMB2 protocols
self.capabilities = 0
self.security_mode = 0 #: Initialized from the SecurityMode field of the SMB_COM_NEGOTIATE message
# SMB1 attributes
# Most of the following attributes will be initialized upon receipt of SMB_COM_NEGOTIATE message from server (via self._updateServerInfo_SMB1 method)
self.use_plaintext_authentication = False #: Similar to PlaintextAuthenticationPolicy in in [MS-CIFS] 3.2.1.1
self.max_raw_size = 0
self.max_buffer_size = 0 #: Similar to MaxBufferSize as described in [MS-CIFS] 3.2.1.1
self.max_mpx_count = 0 #: Similar to MaxMpxCount as described in [MS-CIFS] 3.2.1.1
# SMB2 attributes
self.max_read_size = 0 #: Similar to MaxReadSize as described in [MS-SMB2] 2.2.4
self.max_write_size = 0 #: Similar to MaxWriteSize as described in [MS-SMB2] 2.2.4
self.max_transact_size = 0 #: Similar to MaxTransactSize as described in [MS-SMB2] 2.2.4
self.session_id = 0 #: Similar to SessionID as described in [MS-SMB2] 2.2.4. This will be set in _updateState_SMB2 method
self._setupSMB1Methods()
self.log.info('Authetication with remote machine "%s" for user "%s" will be using NTLM %s authentication (%s extended security)',
self.remote_name, self.username,
(self.use_ntlm_v2 and 'v2') or 'v1',
(SUPPORT_EXTENDED_SECURITY and 'with') or 'without')
#
# NMBSession Methods
#
def onNMBSessionOK(self):
self._sendSMBMessage(SMBMessage(ComNegotiateRequest()))
def onNMBSessionFailed(self):
pass
def onNMBSessionMessage(self, flags, data):
while True:
try:
i = self.smb_message.decode(data)
except SMB2ProtocolHeaderError:
self.log.info('Now switching over to SMB2 protocol communication')
self.is_using_smb2 = True
self.mid = 0 # Must reset messageID counter, or else remote SMB2 server will disconnect
self._setupSMB2Methods()
self.smb_message = self._klassSMBMessage()
i = self.smb_message.decode(data)
next_message_offset = 0
if self.is_using_smb2:
next_message_offset = self.smb_message.next_command_offset
if i > 0:
if not self.is_using_smb2:
self.log.debug('Received SMB message "%s" (command:0x%2X flags:0x%02X flags2:0x%04X TID:%d UID:%d)',
SMB_COMMAND_NAMES.get(self.smb_message.command, '<unknown>'),
self.smb_message.command, self.smb_message.flags, self.smb_message.flags2, self.smb_message.tid, self.smb_message.uid)
else:
self.log.debug('Received SMB2 message "%s" (command:0x%04X flags:0x%04x)',
SMB2_COMMAND_NAMES.get(self.smb_message.command, '<unknown>'),
self.smb_message.command, self.smb_message.flags)
if self._updateState(self.smb_message):
# We need to create a new instance instead of calling reset() because the instance could be captured in the message history.
self.smb_message = self._klassSMBMessage()
if next_message_offset > 0:
data = data[next_message_offset:]
else:
break
#
# Public Methods for Overriding in Subclasses
#
def onAuthOK(self):
pass
def onAuthFailed(self):
pass
#
# Protected Methods
#
def _setupSMB1Methods(self):
self._klassSMBMessage = SMBMessage
self._updateState = self._updateState_SMB1
self._updateServerInfo = self._updateServerInfo_SMB1
self._handleNegotiateResponse = self._handleNegotiateResponse_SMB1
self._sendSMBMessage = self._sendSMBMessage_SMB1
self._handleSessionChallenge = self._handleSessionChallenge_SMB1
self._listShares = self._listShares_SMB1
self._listPath = self._listPath_SMB1
self._listSnapshots = self._listSnapshots_SMB1
self._getAttributes = self._getAttributes_SMB1
self._retrieveFile = self._retrieveFile_SMB1
self._retrieveFileFromOffset = self._retrieveFileFromOffset_SMB1
self._storeFile = self._storeFile_SMB1
self._storeFileFromOffset = self._storeFileFromOffset_SMB1
self._deleteFiles = self._deleteFiles_SMB1
self._createDirectory = self._createDirectory_SMB1
self._deleteDirectory = self._deleteDirectory_SMB1
self._rename = self._rename_SMB1
self._echo = self._echo_SMB1
def _setupSMB2Methods(self):
self._klassSMBMessage = SMB2Message
self._updateState = self._updateState_SMB2
self._updateServerInfo = self._updateServerInfo_SMB2
self._handleNegotiateResponse = self._handleNegotiateResponse_SMB2
self._sendSMBMessage = self._sendSMBMessage_SMB2
self._handleSessionChallenge = self._handleSessionChallenge_SMB2
self._listShares = self._listShares_SMB2
self._listPath = self._listPath_SMB2
self._listSnapshots = self._listSnapshots_SMB2
self._getAttributes = self._getAttributes_SMB2
self._retrieveFile = self._retrieveFile_SMB2
self._retrieveFileFromOffset = self._retrieveFileFromOffset_SMB2
self._storeFile = self._storeFile_SMB2
self._storeFileFromOffset = self._storeFileFromOffset_SMB2
self._deleteFiles = self._deleteFiles_SMB2
self._createDirectory = self._createDirectory_SMB2
self._deleteDirectory = self._deleteDirectory_SMB2
self._rename = self._rename_SMB2
self._echo = self._echo_SMB2
def _getNextRPCCallID(self):
self.next_rpc_call_id += 1
return self.next_rpc_call_id
#
# SMB2 Methods Family
#
def _sendSMBMessage_SMB2(self, smb_message):
if smb_message.mid == 0:
smb_message.mid = self._getNextMID_SMB2()
if smb_message.command != SMB2_COM_NEGOTIATE and smb_message.command != SMB2_COM_ECHO:
smb_message.session_id = self.session_id
if self.is_signing_active:
smb_message.flags |= SMB2_FLAGS_SIGNED
raw_data = smb_message.encode()
smb_message.signature = hmac.new(self.signing_session_key, raw_data, sha256).digest()[:16]
smb_message.raw_data = smb_message.encode()
self.log.debug('MID is %d. Signature is %s. Total raw message is %d bytes', smb_message.mid, binascii.hexlify(smb_message.signature), len(smb_message.raw_data))
else:
smb_message.raw_data = smb_message.encode()
self.sendNMBMessage(smb_message.raw_data)
def _getNextMID_SMB2(self):
self.mid += 1
return self.mid
def _updateState_SMB2(self, message):
if message.isReply:
if message.command == SMB2_COM_NEGOTIATE:
if message.status == 0:
self.has_negotiated = True
self.log.info('SMB2 dialect negotiation successful')
self._updateServerInfo(message.payload)
self._handleNegotiateResponse(message)
else:
raise ProtocolError('Unknown status value (0x%08X) in SMB2_COM_NEGOTIATE' % message.status,
message.raw_data, message)
elif message.command == SMB2_COM_SESSION_SETUP:
if message.status == 0:
self.session_id = message.session_id
try:
result = securityblob.decodeAuthResponseSecurityBlob(message.payload.security_blob)
if result == securityblob.RESULT_ACCEPT_COMPLETED:
self.has_authenticated = True
self.log.info('Authentication (on SMB2) successful!')
self.onAuthOK()
else:
raise ProtocolError('SMB2_COM_SESSION_SETUP status is 0 but security blob negResult value is %d' % result, message.raw_data, message)
except securityblob.BadSecurityBlobError, ex:
raise ProtocolError(str(ex), message.raw_data, message)
elif message.status == 0xc0000016: # STATUS_MORE_PROCESSING_REQUIRED
self.session_id = message.session_id
try:
result, ntlm_token = securityblob.decodeChallengeSecurityBlob(message.payload.security_blob)
if result == securityblob.RESULT_ACCEPT_INCOMPLETE:
self._handleSessionChallenge(message, ntlm_token)
except ( securityblob.BadSecurityBlobError, securityblob.UnsupportedSecurityProvider ), ex:
raise ProtocolError(str(ex), message.raw_data, message)
elif message.status == 0xc000006d: # STATUS_LOGON_FAILURE
self.has_authenticated = False
self.log.info('Authentication (on SMB2) failed. Please check username and password.')
self.onAuthFailed()
else:
raise ProtocolError('Unknown status value (0x%08X) in SMB_COM_SESSION_SETUP_ANDX (with extended security)' % message.status,
message.raw_data, message)
req = self.pending_requests.pop(message.mid, None)
if req:
req.callback(message, **req.kwargs)
return True
def _updateServerInfo_SMB2(self, payload):
self.capabilities = payload.capabilities
self.security_mode = payload.security_mode
self.max_transact_size = payload.max_transact_size
self.max_read_size = payload.max_read_size
self.max_write_size = payload.max_write_size
self.use_plaintext_authentication = False # SMB2 never allows plaintext authentication
def _handleNegotiateResponse_SMB2(self, message):
ntlm_data = ntlm.generateNegotiateMessage()
blob = securityblob.generateNegotiateSecurityBlob(ntlm_data)
self._sendSMBMessage(SMB2Message(SMB2SessionSetupRequest(blob)))
def _handleSessionChallenge_SMB2(self, message, ntlm_token):
server_challenge, server_flags, server_info = ntlm.decodeChallengeMessage(ntlm_token)
self.log.info('Performing NTLMv2 authentication (on SMB2) with server challenge "%s"', binascii.hexlify(server_challenge))
if self.use_ntlm_v2:
self.log.info('Performing NTLMv2 authentication (on SMB2) with server challenge "%s"', binascii.hexlify(server_challenge))
nt_challenge_response, lm_challenge_response, session_key = ntlm.generateChallengeResponseV2(self.password,
self.username,
server_challenge,
server_info,
self.domain)
else:
self.log.info('Performing NTLMv1 authentication (on SMB2) with server challenge "%s"', binascii.hexlify(server_challenge))
nt_challenge_response, lm_challenge_response, session_key = ntlm.generateChallengeResponseV1(self.password, server_challenge, True)
ntlm_data = ntlm.generateAuthenticateMessage(server_flags,
nt_challenge_response,
lm_challenge_response,
session_key,
self.username,
self.domain)
if self.log.isEnabledFor(logging.DEBUG):
self.log.debug('NT challenge response is "%s" (%d bytes)', binascii.hexlify(nt_challenge_response), len(nt_challenge_response))
self.log.debug('LM challenge response is "%s" (%d bytes)', binascii.hexlify(lm_challenge_response), len(lm_challenge_response))
blob = securityblob.generateAuthSecurityBlob(ntlm_data)
self._sendSMBMessage(SMB2Message(SMB2SessionSetupRequest(blob)))
if self.security_mode & SMB2_NEGOTIATE_SIGNING_REQUIRED:
self.log.info('Server requires all SMB messages to be signed')
self.is_signing_active = (self.sign_options != SMB.SIGN_NEVER)
elif self.security_mode & SMB2_NEGOTIATE_SIGNING_ENABLED:
self.log.info('Server supports SMB signing')
self.is_signing_active = (self.sign_options == SMB.SIGN_WHEN_SUPPORTED)
else:
self.is_signing_active = False
if self.is_signing_active:
self.log.info("SMB signing activated. All SMB messages will be signed.")
self.signing_session_key = (session_key + '\0'*16)[:16]
if self.capabilities & CAP_EXTENDED_SECURITY:
self.signing_challenge_response = None
else:
self.signing_challenge_response = blob
else:
self.log.info("SMB signing deactivated. SMB messages will NOT be signed.")
def _listShares_SMB2(self, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = 'IPC$'
messages_history = [ ]
def connectSrvSvc(tid):
m = SMB2Message(SMB2CreateRequest('srvsvc',
file_attributes = 0,
access_mask = FILE_READ_DATA | FILE_WRITE_DATA | FILE_APPEND_DATA | FILE_READ_EA | FILE_WRITE_EA | READ_CONTROL | FILE_READ_ATTRIBUTES | FILE_WRITE_ATTRIBUTES | SYNCHRONIZE,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_NON_DIRECTORY_FILE | FILE_OPEN_NO_RECALL,
create_disp = FILE_OPEN))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectSrvSvcCB, errback)
messages_history.append(m)
def connectSrvSvcCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
call_id = self._getNextRPCCallID()
# The data_bytes are binding call to Server Service RPC using DCE v1.1 RPC over SMB. See [MS-SRVS] and [C706]
# If you wish to understand the meanings of the byte stream, I would suggest you use a recent version of WireShark to packet capture the stream
data_bytes = \
binascii.unhexlify("""05 00 0b 03 10 00 00 00 74 00 00 00""".replace(' ', '')) + \
struct.pack('<I', call_id) + \
binascii.unhexlify("""
b8 10 b8 10 00 00 00 00 02 00 00 00 00 00 01 00
c8 4f 32 4b 70 16 d3 01 12 78 5a 47 bf 6e e1 88
03 00 00 00 04 5d 88 8a eb 1c c9 11 9f e8 08 00
2b 10 48 60 02 00 00 00 01 00 01 00 c8 4f 32 4b
70 16 d3 01 12 78 5a 47 bf 6e e1 88 03 00 00 00
2c 1c b7 6c 12 98 40 45 03 00 00 00 00 00 00 00
01 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2WriteRequest(create_message.payload.fid, data_bytes, 0))
m.tid = create_message.tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, rpcBindCB, errback, fid = create_message.payload.fid)
messages_history.append(m)
else:
errback(OperationFailure('Failed to list shares: Unable to locate Server Service RPC endpoint', messages_history))
def rpcBindCB(trans_message, **kwargs):
messages_history.append(trans_message)
if trans_message.status == 0:
m = SMB2Message(SMB2ReadRequest(kwargs['fid'], read_len = 1024, read_offset = 0))
m.tid = trans_message.tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, rpcReadCB, errback, fid = kwargs['fid'])
messages_history.append(m)
else:
closeFid(trans_message.tid, kwargs['fid'], error = 'Failed to list shares: Unable to read from Server Service RPC endpoint')
def rpcReadCB(read_message, **kwargs):
messages_history.append(read_message)
if read_message.status == 0:
call_id = self._getNextRPCCallID()
padding = ''
remote_name = '\\\\' + self.remote_name
server_len = len(remote_name) + 1
server_bytes_len = server_len * 2
if server_len % 2 != 0:
padding = '\0\0'
server_bytes_len += 2
# The data bytes are the RPC call to NetrShareEnum (Opnum 15) at Server Service RPC.
# If you wish to understand the meanings of the byte stream, I would suggest you use a recent version of WireShark to packet capture the stream
data_bytes = \
binascii.unhexlify("""05 00 00 03 10 00 00 00""".replace(' ', '')) + \
struct.pack('<HHI', 72+server_bytes_len, 0, call_id) + \
binascii.unhexlify("""4c 00 00 00 00 00 0f 00 00 00 02 00""".replace(' ', '')) + \
struct.pack('<III', server_len, 0, server_len) + \
(remote_name + '\0').encode('UTF-16LE') + padding + \
binascii.unhexlify("""
01 00 00 00 01 00 00 00 04 00 02 00 00 00 00 00
00 00 00 00 ff ff ff ff 08 00 02 00 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2IoctlRequest(kwargs['fid'], 0x0011C017, flags = 0x01, max_out_size = 8196, in_data = data_bytes))
m.tid = read_message.tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, listShareResultsCB, errback, fid = kwargs['fid'])
messages_history.append(m)
else:
closeFid(read_message.tid, kwargs['fid'], error = 'Failed to list shares: Unable to bind to Server Service RPC endpoint')
def listShareResultsCB(result_message, **kwargs):
messages_history.append(result_message)
if result_message.status == 0:
# The payload.data_bytes will contain the results of the RPC call to NetrShareEnum (Opnum 15) at Server Service RPC.
data_bytes = result_message.payload.out_data
if ord(data_bytes[3]) & 0x02 == 0:
sendReadRequest(result_message.tid, kwargs['fid'], data_bytes)
else:
decodeResults(result_message.tid, kwargs['fid'], data_bytes)
else:
closeFid(result_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to list shares: Unable to retrieve shared device list', messages_history))
def decodeResults(tid, fid, data_bytes):
shares_count = struct.unpack('<I', data_bytes[36:40])[0]
results = [ ] # A list of SharedDevice instances
offset = 36 + 12 # You need to study the byte stream to understand the meaning of these constants
for i in range(0, shares_count):
results.append(SharedDevice(struct.unpack('<I', data_bytes[offset+4:offset+8])[0], None, None))
offset += 12
for i in range(0, shares_count):
max_length, _, length = struct.unpack('<III', data_bytes[offset:offset+12])
offset += 12
results[i].name = unicode(data_bytes[offset:offset+length*2-2], 'UTF-16LE')
if length % 2 != 0:
offset += (length * 2 + 2)
else:
offset += (length * 2)
max_length, _, length = struct.unpack('<III', data_bytes[offset:offset+12])
offset += 12
results[i].comments = unicode(data_bytes[offset:offset+length*2-2], 'UTF-16LE')
if length % 2 != 0:
offset += (length * 2 + 2)
else:
offset += (length * 2)
closeFid(tid, fid)
callback(results)
def sendReadRequest(tid, fid, data_bytes):
read_count = min(4280, self.max_read_size)
m = SMB2Message(SMB2ReadRequest(fid, 0, read_count))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, readCB, errback,
fid = fid, data_bytes = data_bytes)
def readCB(read_message, **kwargs):
messages_history.append(read_message)
if read_message.status == 0:
data_len = read_message.payload.data_length
data_bytes = read_message.payload.data
if ord(data_bytes[3]) & 0x02 == 0:
sendReadRequest(read_message.tid, kwargs['fid'], kwargs['data_bytes'] + data_bytes[24:data_len-24])
else:
decodeResults(read_message.tid, kwargs['fid'], kwargs['data_bytes'] + data_bytes[24:data_len-24])
else:
closeFid(read_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to list shares: Unable to retrieve shared device list', messages_history))
def closeFid(tid, fid, results = None, error = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, closeCB, errback, results = results, error = error)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['results'] is not None:
callback(kwargs['results'])
elif kwargs['error'] is not None:
errback(OperationFailure(kwargs['error'], messages_history))
if not self.connected_trees.has_key(path):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[path] = connect_message.tid
connectSrvSvc(connect_message.tid)
else:
errback(OperationFailure('Failed to list shares: Unable to connect to IPC$', messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), path )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = path)
messages_history.append(m)
else:
connectSrvSvc(self.connected_trees[path])
def _listPath_SMB2(self, service_name, path, callback, errback, search, pattern, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
results = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 18 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
00 00 00 00 10 00 04 00 00 00 18 00 00 00 00 00
51 46 69 64 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = 0,
access_mask = FILE_READ_DATA | FILE_READ_EA | FILE_READ_ATTRIBUTES | SYNCHRONIZE,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_DIRECTORY_FILE,
create_disp = FILE_OPEN,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, createCB, errback)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
sendQuery(create_message.tid, create_message.payload.fid, '')
else:
errback(OperationFailure('Failed to list %s on %s: Unable to open directory' % ( path, service_name ), messages_history))
def sendQuery(tid, fid, data_buf):
m = SMB2Message(SMB2QueryDirectoryRequest(fid, pattern,
info_class = 0x03, # FileBothDirectoryInformation
flags = 0,
output_buf_len = self.max_transact_size))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, queryCB, errback, fid = fid, data_buf = data_buf)
messages_history.append(m)
def queryCB(query_message, **kwargs):
messages_history.append(query_message)
if query_message.status == 0:
data_buf = decodeQueryStruct(kwargs['data_buf'] + query_message.payload.data)
sendQuery(query_message.tid, kwargs['fid'], data_buf)
elif query_message.status == 0x80000006L: # STATUS_NO_MORE_FILES
closeFid(query_message.tid, kwargs['fid'], results = results)
else:
closeFid(query_message.tid, kwargs['fid'], error = query_message.status)
def decodeQueryStruct(data_bytes):
# SMB_FIND_FILE_BOTH_DIRECTORY_INFO structure. See [MS-CIFS]: 2.2.8.1.7 and [MS-SMB]: 2.2.8.1.1
info_format = '<IIQQQQQQIIIBB24s'
info_size = struct.calcsize(info_format)
data_length = len(data_bytes)
offset = 0
while offset < data_length:
if offset + info_size > data_length:
return data_bytes[offset:]
next_offset, _, \
create_time, last_access_time, last_write_time, last_attr_change_time, \
file_size, alloc_size, file_attributes, filename_length, ea_size, \
short_name_length, _, short_name = struct.unpack(info_format, data_bytes[offset:offset+info_size])
offset2 = offset + info_size
if offset2 + filename_length > data_length:
return data_bytes[offset:]
filename = data_bytes[offset2:offset2+filename_length].decode('UTF-16LE')
short_name = short_name.decode('UTF-16LE')
results.append(SharedFile(convertFILETIMEtoEpoch(create_time), convertFILETIMEtoEpoch(last_access_time),
convertFILETIMEtoEpoch(last_write_time), convertFILETIMEtoEpoch(last_attr_change_time),
file_size, alloc_size, file_attributes, short_name, filename))
if next_offset:
offset += next_offset
else:
break
return ''
def closeFid(tid, fid, results = None, error = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, closeCB, errback, results = results, error = error)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['results'] is not None:
callback(kwargs['results'])
elif kwargs['error'] is not None:
errback(OperationFailure('Failed to list %s on %s: Query failed with errorcode 0x%08x' % ( path, service_name, kwargs['error'] ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to list %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _getAttributes_SMB2(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 18 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
00 00 00 00 10 00 04 00 00 00 18 00 00 00 00 00
51 46 69 64 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = 0,
access_mask = FILE_READ_DATA | FILE_READ_EA | FILE_READ_ATTRIBUTES | SYNCHRONIZE,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = 0,
create_disp = FILE_OPEN,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, createCB, errback)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
p = create_message.payload
info = SharedFile(p.create_time, p.lastaccess_time, p.lastwrite_time, p.change_time,
p.file_size, p.allocation_size, p.file_attributes,
unicode(path), unicode(path))
closeFid(create_message.tid, p.fid, info = info)
else:
errback(OperationFailure('Failed to get attributes for %s on %s: Unable to open remote file object' % ( path, service_name ), messages_history))
def closeFid(tid, fid, info = None, error = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, closeCB, errback, info = info, error = error)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['info'] is not None:
callback(kwargs['info'])
elif kwargs['error'] is not None:
errback(OperationFailure('Failed to get attributes for %s on %s: Query failed with errorcode 0x%08x' % ( path, service_name, kwargs['error'] ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to get attributes for %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _retrieveFile_SMB2(self, service_name, path, file_obj, callback, errback, timeout = 30):
return self._retrieveFileFromOffset(service_name, path, file_obj, callback, errback, 0L, -1L, timeout)
def _retrieveFileFromOffset_SMB2(self, service_name, path, file_obj, callback, errback, starting_offset, max_length, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
results = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 18 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
00 00 00 00 10 00 04 00 00 00 18 00 00 00 00 00
51 46 69 64 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = 0,
access_mask = FILE_READ_DATA | FILE_READ_EA | FILE_READ_ATTRIBUTES | READ_CONTROL | SYNCHRONIZE,
share_access = FILE_SHARE_READ,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_SEQUENTIAL_ONLY | FILE_NON_DIRECTORY_FILE,
create_disp = FILE_OPEN,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, createCB, errback, tid = tid)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
m = SMB2Message(SMB2QueryInfoRequest(create_message.payload.fid,
flags = 0,
additional_info = 0,
info_type = SMB2_INFO_FILE,
file_info_class = 0x16, # FileStreamInformation [MS-FSCC] 2.4
input_buf = '',
output_buf_len = 4096))
m.tid = create_message.tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, infoCB, errback,
fid = create_message.payload.fid, file_attributes = create_message.payload.file_attributes)
messages_history.append(m)
else:
errback(OperationFailure('Failed to list %s on %s: Unable to open file' % ( path, service_name ), messages_history))
def infoCB(info_message, **kwargs):
messages_history.append(info_message)
if info_message.status == 0:
file_len = struct.unpack('<Q', info_message.payload.data[8:16])[0]
if max_length == 0 or starting_offset > file_len:
closeFid(info_message.tid, kwargs['fid'])
callback(( file_obj, kwargs['file_attributes'], 0 )) # Note that this is a tuple of 3-elements
else:
remaining_len = max_length
if remaining_len < 0:
remaining_len = file_len
if starting_offset + remaining_len > file_len:
remaining_len = file_len - starting_offset
sendRead(info_message.tid, kwargs['fid'], starting_offset, remaining_len, 0, kwargs['file_attributes'])
else:
errback(OperationFailure('Failed to list %s on %s: Unable to retrieve information on file' % ( path, service_name ), messages_history))
def sendRead(tid, fid, offset, remaining_len, read_len, file_attributes):
read_count = min(self.max_read_size, remaining_len)
m = SMB2Message(SMB2ReadRequest(fid, offset, read_count))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, readCB, errback,
fid = fid, offset = offset,
remaining_len = remaining_len,
read_len = read_len,
file_attributes = file_attributes)
def readCB(read_message, **kwargs):
# To avoid crazy memory usage when retrieving large files, we do not save every read_message in messages_history.
if read_message.status == 0:
data_len = read_message.payload.data_length
file_obj.write(read_message.payload.data)
remaining_len = kwargs['remaining_len'] - data_len
if remaining_len > 0:
sendRead(read_message.tid, kwargs['fid'], kwargs['offset'] + data_len, remaining_len, kwargs['read_len'] + data_len, kwargs['file_attributes'])
else:
closeFid(read_message.tid, kwargs['fid'], ret = ( file_obj, kwargs['file_attributes'], kwargs['read_len'] + data_len ))
else:
messages_history.append(read_message)
closeFid(read_message.tid, kwargs['fid'], error = read_message.status)
def closeFid(tid, fid, ret = None, error = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, closeCB, errback, ret = ret, error = error)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['ret'] is not None:
callback(kwargs['ret'])
elif kwargs['error'] is not None:
errback(OperationFailure('Failed to retrieve %s on %s: Read failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to retrieve %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _storeFile_SMB2(self, service_name, path, file_obj, callback, errback, timeout = 30):
self._storeFileFromOffset_SMB2(service_name, path, file_obj, callback, errback, 0L, timeout)
def _storeFileFromOffset_SMB2(self, service_name, path, file_obj, callback, errback, starting_offset, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 20 00 00 00 10 00 04 00
00 00 18 00 08 00 00 00 41 6c 53 69 00 00 00 00
85 62 00 00 00 00 00 00 18 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
00 00 00 00 10 00 04 00 00 00 18 00 00 00 00 00
51 46 69 64 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = ATTR_ARCHIVE,
access_mask = FILE_READ_DATA | FILE_WRITE_DATA | FILE_APPEND_DATA | FILE_READ_ATTRIBUTES | FILE_WRITE_ATTRIBUTES | FILE_READ_EA | FILE_WRITE_EA | WRITE_DAC | READ_CONTROL | SYNCHRONIZE,
share_access = 0,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_SEQUENTIAL_ONLY | FILE_NON_DIRECTORY_FILE,
create_disp = FILE_OVERWRITE_IF,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, createCB, errback, tid = tid)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
sendWrite(create_message.tid, create_message.payload.fid, starting_offset)
else:
errback(OperationFailure('Failed to store %s on %s: Unable to open file' % ( path, service_name ), messages_history))
def sendWrite(tid, fid, offset):
write_count = self.max_write_size
data = file_obj.read(write_count)
data_len = len(data)
if data_len > 0:
m = SMB2Message(SMB2WriteRequest(fid, data, offset))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, writeCB, errback, fid = fid, offset = offset+data_len)
else:
closeFid(tid, fid, offset = offset)
def writeCB(write_message, **kwargs):
# To avoid crazy memory usage when saving large files, we do not save every write_message in messages_history.
if write_message.status == 0:
sendWrite(write_message.tid, kwargs['fid'], kwargs['offset'])
else:
messages_history.append(write_message)
closeFid(write_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to store %s on %s: Write failed' % ( path, service_name ), messages_history))
def closeFid(tid, fid, error = None, offset = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, closeCB, errback, fid = fid, offset = offset, error = error)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['offset'] is not None:
callback(( file_obj, kwargs['offset'] )) # Note that this is a tuple of 2-elements
elif kwargs['error'] is not None:
errback(OperationFailure('Failed to store %s on %s: Write failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to store %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _deleteFiles_SMB2(self, service_name, path_file_pattern, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path_file_pattern.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 18 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
00 00 00 00 10 00 04 00 00 00 18 00 00 00 00 00
51 46 69 64 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = 0,
access_mask = DELETE | FILE_READ_ATTRIBUTES,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_NON_DIRECTORY_FILE,
create_disp = FILE_OPEN,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, createCB, errback, tid = tid)
messages_history.append(m)
def createCB(open_message, **kwargs):
messages_history.append(open_message)
if open_message.status == 0:
sendDelete(open_message.tid, open_message.payload.fid)
else:
errback(OperationFailure('Failed to delete %s on %s: Unable to open file' % ( path, service_name ), messages_history))
def sendDelete(tid, fid):
m = SMB2Message(SMB2SetInfoRequest(fid,
additional_info = 0,
info_type = SMB2_INFO_FILE,
file_info_class = 0x0d, # SMB2_FILE_DISPOSITION_INFO
data = '\x01'))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, deleteCB, errback, fid = fid)
messages_history.append(m)
def deleteCB(delete_message, **kwargs):
messages_history.append(delete_message)
if delete_message.status == 0:
closeFid(delete_message.tid, kwargs['fid'], status = 0)
else:
closeFid(delete_message.tid, kwargs['fid'], status = delete_message.status)
def closeFid(tid, fid, status = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, closeCB, errback, status = status)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['status'] == 0:
callback(path_file_pattern)
else:
errback(OperationFailure('Failed to delete %s on %s: Delete failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to delete %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _createDirectory_SMB2(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 18 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
00 00 00 00 10 00 04 00 00 00 18 00 00 00 00 00
51 46 69 64 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = 0,
access_mask = FILE_READ_DATA | FILE_WRITE_DATA | FILE_READ_EA | FILE_WRITE_EA | FILE_READ_ATTRIBUTES | FILE_WRITE_ATTRIBUTES | WRITE_DAC | READ_CONTROL | DELETE | SYNCHRONIZE,
share_access = 0,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_DIRECTORY_FILE | FILE_SYNCHRONOUS_IO_NONALERT,
create_disp = FILE_CREATE,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, createCB, errback)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
closeFid(create_message.tid, create_message.payload.fid)
else:
errback(OperationFailure('Failed to create directory %s on %s: Create failed' % ( path, service_name ), messages_history))
def closeFid(tid, fid):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, closeCB, errback)
messages_history.append(m)
def closeCB(close_message, **kwargs):
callback(path)
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to create directory %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _deleteDirectory_SMB2(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 18 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
00 00 00 00 10 00 04 00 00 00 18 00 00 00 00 00
51 46 69 64 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = 0,
access_mask = DELETE | FILE_READ_ATTRIBUTES,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_DIRECTORY_FILE,
create_disp = FILE_OPEN,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, createCB, errback, tid = tid)
messages_history.append(m)
def createCB(open_message, **kwargs):
messages_history.append(open_message)
if open_message.status == 0:
sendDelete(open_message.tid, open_message.payload.fid)
else:
errback(OperationFailure('Failed to delete %s on %s: Unable to open directory' % ( path, service_name ), messages_history))
def sendDelete(tid, fid):
m = SMB2Message(SMB2SetInfoRequest(fid,
additional_info = 0,
info_type = SMB2_INFO_FILE,
file_info_class = 0x0d, # SMB2_FILE_DISPOSITION_INFO
data = '\x01'))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, deleteCB, errback, fid = fid)
messages_history.append(m)
def deleteCB(delete_message, **kwargs):
messages_history.append(delete_message)
if delete_message.status == 0:
closeFid(delete_message.tid, kwargs['fid'], status = 0)
else:
closeFid(delete_message.tid, kwargs['fid'], status = delete_message.status)
def closeFid(tid, fid, status = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, closeCB, errback, status = status)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['status'] == 0:
callback(path)
else:
errback(OperationFailure('Failed to delete %s on %s: Delete failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to delete %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _rename_SMB2(self, service_name, old_path, new_path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
messages_history = [ ]
new_path = new_path.replace('/', '\\')
if new_path.startswith('\\'):
new_path = new_path[1:]
if new_path.endswith('\\'):
new_path = new_path[:-1]
old_path = old_path.replace('/', '\\')
if old_path.startswith('\\'):
old_path = old_path[1:]
if old_path.endswith('\\'):
old_path = old_path[:-1]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
20 00 00 00 10 00 04 00 00 00 18 00 08 00 00 00
4d 78 41 63 00 00 00 00 00 00 00 00 ff 01 1f 00
00 00 00 00 10 00 04 00 00 00 18 00 20 00 00 00
51 46 69 64 00 00 00 00 14 e7 01 00 00 00 50 00
30 e0 4c 0b 80 fa ff ff 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(old_path,
file_attributes = 0,
access_mask = DELETE | FILE_READ_ATTRIBUTES | SYNCHRONIZE,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE | FILE_SHARE_DELETE,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_SYNCHRONOUS_IO_NONALERT,
create_disp = FILE_OPEN,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, createCB, errback, tid = tid)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
sendRename(create_message.tid, create_message.payload.fid)
else:
errback(OperationFailure('Failed to rename %s on %s: Unable to open file/directory' % ( old_path, service_name ), messages_history))
def sendRename(tid, fid):
data = '\x00'*16 + struct.pack('<I', len(new_path)*2) + new_path.encode('UTF-16LE')
m = SMB2Message(SMB2SetInfoRequest(fid,
additional_info = 0,
info_type = SMB2_INFO_FILE,
file_info_class = 0x0a, # SMB2_FILE_RENAME_INFO
data = data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, renameCB, errback, fid = fid)
messages_history.append(m)
def renameCB(rename_message, **kwargs):
messages_history.append(rename_message)
if rename_message.status == 0:
closeFid(rename_message.tid, kwargs['fid'], status = 0)
else:
closeFid(rename_message.tid, kwargs['fid'], status = rename_message.status)
def closeFid(tid, fid, status = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, closeCB, errback, status = status)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['status'] == 0:
callback(( old_path, new_path ))
else:
errback(OperationFailure('Failed to rename %s on %s: Rename failed' % ( old_path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to rename %s on %s: Unable to connect to shared device' % ( old_path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _listSnapshots_SMB2(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
def sendCreate(tid):
create_context_data = binascii.unhexlify("""
28 00 00 00 10 00 04 00 00 00 18 00 10 00 00 00
44 48 6e 51 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 10 00 04 00
00 00 18 00 00 00 00 00 4d 78 41 63 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMB2Message(SMB2CreateRequest(path,
file_attributes = 0,
access_mask = FILE_READ_DATA | FILE_READ_ATTRIBUTES | SYNCHRONIZE,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE,
oplock = SMB2_OPLOCK_LEVEL_NONE,
impersonation = SEC_IMPERSONATE,
create_options = FILE_SYNCHRONOUS_IO_NONALERT,
create_disp = FILE_OPEN,
create_context_data = create_context_data))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, createCB, errback, tid = tid)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if create_message.status == 0:
sendEnumSnapshots(create_message.tid, create_message.payload.fid)
else:
errback(OperationFailure('Failed to list snapshots %s on %s: Unable to open file/directory' % ( old_path, service_name ), messages_history))
def sendEnumSnapshots(tid, fid):
m = SMB2Message(SMB2IoctlRequest(fid,
ctlcode = 0x00144064, # FSCTL_SRV_ENUMERATE_SNAPSHOTS
flags = 0x0001,
in_data = ''))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, enumSnapshotsCB, errback, tid = tid, fid = fid)
messages_history.append(m)
def enumSnapshotsCB(enum_message, **kwargs):
messages_history.append(enum_message)
if enum_message.status == 0:
results = [ ]
snapshots_count = struct.unpack('<I', enum_message.payload.out_data[4:8])[0]
for i in range(0, snapshots_count):
s = enum_message.payload.out_data[12+i*50:12+48+i*50].decode('UTF-16LE')
results.append(datetime(*map(int, ( s[5:9], s[10:12], s[13:15], s[16:18], s[19:21], s[22:24] ))))
closeFid(kwargs['tid'], kwargs['fid'], results = results)
else:
closeFid(kwargs['tid'], kwargs['fid'], status = enum_message.status)
def closeFid(tid, fid, status = None, results = None):
m = SMB2Message(SMB2CloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, closeCB, errback, status = status, results = results)
messages_history.append(m)
def closeCB(close_message, **kwargs):
if kwargs['results'] is not None:
callback(kwargs['results'])
else:
errback(OperationFailure('Failed to list snapshots %s on %s: List failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if connect_message.status == 0:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to list snapshots %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMB2Message(SMB2TreeConnectRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name )))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _echo_SMB2(self, data, callback, errback, timeout = 30):
messages_history = [ ]
def echoCB(echo_message, **kwargs):
messages_history.append(echo_message)
if echo_message.status == 0:
callback(data)
else:
errback(OperationFailure('Echo failed', messages_history))
m = SMB2Message(SMB2EchoRequest())
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, echoCB, errback)
messages_history.append(m)
#
# SMB1 Methods Family
#
def _sendSMBMessage_SMB1(self, smb_message):
if smb_message.mid == 0:
smb_message.mid = self._getNextMID_SMB1()
smb_message.uid = self.uid
if self.is_signing_active:
smb_message.flags2 |= SMB_FLAGS2_SMB_SECURITY_SIGNATURE
# Increment the next_signing_id as described in [MS-CIFS] 3.2.4.1.3
smb_message.security = self.next_signing_id
self.next_signing_id += 2 # All our defined messages currently have responses, so always increment by 2
raw_data = smb_message.encode()
md = ntlm.MD5(self.signing_session_key)
if self.signing_challenge_response:
md.update(self.signing_challenge_response)
md.update(raw_data)
signature = md.digest()[:8]
self.log.debug('MID is %d. Signing ID is %d. Signature is %s. Total raw message is %d bytes', smb_message.mid, smb_message.security, binascii.hexlify(signature), len(raw_data))
smb_message.raw_data = raw_data[:14] + signature + raw_data[22:]
else:
smb_message.raw_data = smb_message.encode()
self.sendNMBMessage(smb_message.raw_data)
def _getNextMID_SMB1(self):
self.mid += 1
if self.mid >= 0xFFFF: # MID cannot be 0xFFFF. [MS-CIFS]: 2.2.1.6.2
# We don't use MID of 0 as MID can be reused for SMB_COM_TRANSACTION2_SECONDARY messages
# where if mid=0, _sendSMBMessage will re-assign new MID values again
self.mid = 1
return self.mid
def _updateState_SMB1(self, message):
if message.isReply:
if message.command == SMB_COM_NEGOTIATE:
if not message.status.hasError:
self.has_negotiated = True
self.log.info('SMB dialect negotiation successful (ExtendedSecurity:%s)', message.hasExtendedSecurity)
self._updateServerInfo(message.payload)
self._handleNegotiateResponse(message)
else:
raise ProtocolError('Unknown status value (0x%08X) in SMB_COM_NEGOTIATE' % message.status.internal_value,
message.raw_data, message)
elif message.command == SMB_COM_SESSION_SETUP_ANDX:
if message.hasExtendedSecurity:
if not message.status.hasError:
try:
result = securityblob.decodeAuthResponseSecurityBlob(message.payload.security_blob)
if result == securityblob.RESULT_ACCEPT_COMPLETED:
self.has_authenticated = True
self.log.info('Authentication (with extended security) successful!')
self.onAuthOK()
else:
raise ProtocolError('SMB_COM_SESSION_SETUP_ANDX status is 0 but security blob negResult value is %d' % result, message.raw_data, message)
except securityblob.BadSecurityBlobError, ex:
raise ProtocolError(str(ex), message.raw_data, message)
elif message.status.internal_value == 0xc0000016: # STATUS_MORE_PROCESSING_REQUIRED
try:
result, ntlm_token = securityblob.decodeChallengeSecurityBlob(message.payload.security_blob)
if result == securityblob.RESULT_ACCEPT_INCOMPLETE:
self._handleSessionChallenge(message, ntlm_token)
except ( securityblob.BadSecurityBlobError, securityblob.UnsupportedSecurityProvider ), ex:
raise ProtocolError(str(ex), message.raw_data, message)
elif message.status.internal_value == 0xc000006d: # STATUS_LOGON_FAILURE
self.has_authenticated = False
self.log.info('Authentication (with extended security) failed. Please check username and password. You may need to enable/disable NTLMv2 authentication.')
self.onAuthFailed()
else:
raise ProtocolError('Unknown status value (0x%08X) in SMB_COM_SESSION_SETUP_ANDX (with extended security)' % message.status.internal_value,
message.raw_data, message)
else:
if message.status.internal_value == 0:
self.has_authenticated = True
self.log.info('Authentication (without extended security) successful!')
self.onAuthOK()
else:
self.has_authenticated = False
self.log.info('Authentication (without extended security) failed. Please check username and password')
self.onAuthFailed()
elif message.command == SMB_COM_TREE_CONNECT_ANDX:
try:
req = self.pending_requests[message.mid]
except KeyError:
pass
else:
if not message.status.hasError:
self.connected_trees[req.kwargs['path']] = message.tid
req = self.pending_requests.pop(message.mid, None)
if req:
req.callback(message, **req.kwargs)
return True
def _updateServerInfo_SMB1(self, payload):
self.capabilities = payload.capabilities
self.security_mode = payload.security_mode
self.max_raw_size = payload.max_raw_size
self.max_buffer_size = payload.max_buffer_size
self.max_mpx_count = payload.max_mpx_count
self.use_plaintext_authentication = not bool(payload.security_mode & NEGOTIATE_ENCRYPT_PASSWORDS)
if self.use_plaintext_authentication:
self.log.warning('Remote server only supports plaintext authentication. Your password can be stolen easily over the network.')
def _handleSessionChallenge_SMB1(self, message, ntlm_token):
assert message.hasExtendedSecurity
if message.uid and not self.uid:
self.log.debug('SMB uid is now %d', message.uid)
self.uid = message.uid
server_challenge, server_flags, server_info = ntlm.decodeChallengeMessage(ntlm_token)
if self.use_ntlm_v2:
self.log.info('Performing NTLMv2 authentication (with extended security) with server challenge "%s"', binascii.hexlify(server_challenge))
nt_challenge_response, lm_challenge_response, session_key = ntlm.generateChallengeResponseV2(self.password,
self.username,
server_challenge,
server_info,
self.domain)
else:
self.log.info('Performing NTLMv1 authentication (with extended security) with server challenge "%s"', binascii.hexlify(server_challenge))
nt_challenge_response, lm_challenge_response, session_key = ntlm.generateChallengeResponseV1(self.password, server_challenge, True)
ntlm_data = ntlm.generateAuthenticateMessage(server_flags,
nt_challenge_response,
lm_challenge_response,
session_key,
self.username,
self.domain)
if self.log.isEnabledFor(logging.DEBUG):
self.log.debug('NT challenge response is "%s" (%d bytes)', binascii.hexlify(nt_challenge_response), len(nt_challenge_response))
self.log.debug('LM challenge response is "%s" (%d bytes)', binascii.hexlify(lm_challenge_response), len(lm_challenge_response))
blob = securityblob.generateAuthSecurityBlob(ntlm_data)
self._sendSMBMessage(SMBMessage(ComSessionSetupAndxRequest__WithSecurityExtension(0, blob)))
if self.security_mode & NEGOTIATE_SECURITY_SIGNATURES_REQUIRE:
self.log.info('Server requires all SMB messages to be signed')
self.is_signing_active = (self.sign_options != SMB.SIGN_NEVER)
elif self.security_mode & NEGOTIATE_SECURITY_SIGNATURES_ENABLE:
self.log.info('Server supports SMB signing')
self.is_signing_active = (self.sign_options == SMB.SIGN_WHEN_SUPPORTED)
else:
self.is_signing_active = False
if self.is_signing_active:
self.log.info("SMB signing activated. All SMB messages will be signed.")
self.signing_session_key = session_key
if self.capabilities & CAP_EXTENDED_SECURITY:
self.signing_challenge_response = None
else:
self.signing_challenge_response = blob
else:
self.log.info("SMB signing deactivated. SMB messages will NOT be signed.")
def _handleNegotiateResponse_SMB1(self, message):
if message.uid and not self.uid:
self.log.debug('SMB uid is now %d', message.uid)
self.uid = message.uid
if message.hasExtendedSecurity or message.payload.supportsExtendedSecurity:
ntlm_data = ntlm.generateNegotiateMessage()
blob = securityblob.generateNegotiateSecurityBlob(ntlm_data)
self._sendSMBMessage(SMBMessage(ComSessionSetupAndxRequest__WithSecurityExtension(message.payload.session_key, blob)))
else:
nt_password, _, _ = ntlm.generateChallengeResponseV1(self.password, message.payload.challenge, False)
self.log.info('Performing NTLMv1 authentication (without extended security) with challenge "%s" and hashed password of "%s"',
binascii.hexlify(message.payload.challenge),
binascii.hexlify(nt_password))
self._sendSMBMessage(SMBMessage(ComSessionSetupAndxRequest__NoSecurityExtension(message.payload.session_key,
self.username,
nt_password,
True,
message.payload.domain)))
def _listShares_SMB1(self, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = 'IPC$'
messages_history = [ ]
def connectSrvSvc(tid):
m = SMBMessage(ComNTCreateAndxRequest('\\srvsvc',
flags = NT_CREATE_REQUEST_EXTENDED_RESPONSE,
access_mask = READ_CONTROL | FILE_WRITE_ATTRIBUTES | FILE_READ_ATTRIBUTES | FILE_WRITE_EA | FILE_READ_EA | FILE_APPEND_DATA | FILE_WRITE_DATA | FILE_READ_DATA,
share_access = FILE_SHARE_READ | FILE_SHARE_WRITE,
create_disp = FILE_OPEN,
create_options = FILE_OPEN_NO_RECALL | FILE_NON_DIRECTORY_FILE,
impersonation = SEC_IMPERSONATE,
security_flags = 0))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectSrvSvcCB, errback)
messages_history.append(m)
def connectSrvSvcCB(create_message, **kwargs):
messages_history.append(create_message)
if not create_message.status.hasError:
call_id = self._getNextRPCCallID()
# See [MS-CIFS]: 2.2.5.6.1 for more information on TRANS_TRANSACT_NMPIPE (0x0026) parameters
setup_bytes = struct.pack('<HH', 0x0026, create_message.payload.fid)
# The data_bytes are binding call to Server Service RPC using DCE v1.1 RPC over SMB. See [MS-SRVS] and [C706]
# If you wish to understand the meanings of the byte stream, I would suggest you use a recent version of WireShark to packet capture the stream
data_bytes = \
binascii.unhexlify("""05 00 0b 03 10 00 00 00 48 00 00 00""".replace(' ', '')) + \
struct.pack('<I', call_id) + \
binascii.unhexlify("""
b8 10 b8 10 00 00 00 00 01 00 00 00 00 00 01 00
c8 4f 32 4b 70 16 d3 01 12 78 5a 47 bf 6e e1 88
03 00 00 00 04 5d 88 8a eb 1c c9 11 9f e8 08 00
2b 10 48 60 02 00 00 00""".replace(' ', '').replace('\n', ''))
m = SMBMessage(ComTransactionRequest(max_params_count = 0,
max_data_count = 4280,
max_setup_count = 0,
data_bytes = data_bytes,
setup_bytes = setup_bytes))
m.tid = create_message.tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, rpcBindCB, errback, fid = create_message.payload.fid)
messages_history.append(m)
else:
errback(OperationFailure('Failed to list shares: Unable to locate Server Service RPC endpoint', messages_history))
def rpcBindCB(trans_message, **kwargs):
messages_history.append(trans_message)
if not trans_message.status.hasError:
call_id = self._getNextRPCCallID()
padding = ''
server_len = len(self.remote_name) + 1
server_bytes_len = server_len * 2
if server_len % 2 != 0:
padding = '\0\0'
server_bytes_len += 2
# See [MS-CIFS]: 2.2.5.6.1 for more information on TRANS_TRANSACT_NMPIPE (0x0026) parameters
setup_bytes = struct.pack('<HH', 0x0026, kwargs['fid'])
# The data bytes are the RPC call to NetrShareEnum (Opnum 15) at Server Service RPC.
# If you wish to understand the meanings of the byte stream, I would suggest you use a recent version of WireShark to packet capture the stream
data_bytes = \
binascii.unhexlify("""05 00 00 03 10 00 00 00""".replace(' ', '')) + \
struct.pack('<HHI', 72+server_bytes_len, 0, call_id) + \
binascii.unhexlify("""4c 00 00 00 00 00 0f 00 00 00 02 00""".replace(' ', '')) + \
struct.pack('<III', server_len, 0, server_len) + \
(self.remote_name + '\0').encode('UTF-16LE') + padding + \
binascii.unhexlify("""
01 00 00 00 01 00 00 00 04 00 02 00 00 00 00 00
00 00 00 00 ff ff ff ff 08 00 02 00 00 00 00 00
""".replace(' ', '').replace('\n', ''))
m = SMBMessage(ComTransactionRequest(max_params_count = 0,
max_data_count = 4280,
max_setup_count = 0,
data_bytes = data_bytes,
setup_bytes = setup_bytes))
m.tid = trans_message.tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, listShareResultsCB, errback, fid = kwargs['fid'])
messages_history.append(m)
else:
closeFid(trans_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to list shares: Unable to bind to Server Service RPC endpoint', messages_history))
def listShareResultsCB(result_message, **kwargs):
messages_history.append(result_message)
if not result_message.status.hasError:
# The payload.data_bytes will contain the results of the RPC call to NetrShareEnum (Opnum 15) at Server Service RPC.
data_bytes = result_message.payload.data_bytes
if ord(data_bytes[3]) & 0x02 == 0:
sendReadRequest(result_message.tid, kwargs['fid'], data_bytes)
else:
decodeResults(result_message.tid, kwargs['fid'], data_bytes)
else:
closeFid(result_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to list shares: Unable to retrieve shared device list', messages_history))
def decodeResults(tid, fid, data_bytes):
shares_count = struct.unpack('<I', data_bytes[36:40])[0]
results = [ ] # A list of SharedDevice instances
offset = 36 + 12 # You need to study the byte stream to understand the meaning of these constants
for i in range(0, shares_count):
results.append(SharedDevice(struct.unpack('<I', data_bytes[offset+4:offset+8])[0], None, None))
offset += 12
for i in range(0, shares_count):
max_length, _, length = struct.unpack('<III', data_bytes[offset:offset+12])
offset += 12
results[i].name = unicode(data_bytes[offset:offset+length*2-2], 'UTF-16LE')
if length % 2 != 0:
offset += (length * 2 + 2)
else:
offset += (length * 2)
max_length, _, length = struct.unpack('<III', data_bytes[offset:offset+12])
offset += 12
results[i].comments = unicode(data_bytes[offset:offset+length*2-2], 'UTF-16LE')
if length % 2 != 0:
offset += (length * 2 + 2)
else:
offset += (length * 2)
closeFid(tid, fid)
callback(results)
def sendReadRequest(tid, fid, data_bytes):
read_count = min(4280, self.max_raw_size - 2)
m = SMBMessage(ComReadAndxRequest(fid = fid,
offset = 0,
max_return_bytes_count = read_count,
min_return_bytes_count = read_count))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, readCB, errback, fid = fid, data_bytes = data_bytes)
def readCB(read_message, **kwargs):
messages_history.append(read_message)
if not read_message.status.hasError:
data_len = read_message.payload.data_length
data_bytes = read_message.payload.data
if ord(data_bytes[3]) & 0x02 == 0:
sendReadRequest(read_message.tid, kwargs['fid'], kwargs['data_bytes'] + data_bytes[24:data_len-24])
else:
decodeResults(read_message.tid, kwargs['fid'], kwargs['data_bytes'] + data_bytes[24:data_len-24])
else:
closeFid(read_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to list shares: Unable to retrieve shared device list', messages_history))
def closeFid(tid, fid):
m = SMBMessage(ComCloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
messages_history.append(m)
if not self.connected_trees.has_key(path):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[path] = connect_message.tid
connectSrvSvc(connect_message.tid)
else:
errback(OperationFailure('Failed to list shares: Unable to connect to IPC$', messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), path ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = path)
messages_history.append(m)
else:
connectSrvSvc(self.connected_trees[path])
def _listPath_SMB1(self, service_name, path, callback, errback, search, pattern, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if not path.endswith('\\'):
path += '\\'
messages_history = [ ]
results = [ ]
def sendFindFirst(tid):
setup_bytes = struct.pack('<H', 0x0001) # TRANS2_FIND_FIRST2 sub-command. See [MS-CIFS]: 2.2.6.2.1
params_bytes = \
struct.pack('<HHHHI',
search, # SearchAttributes
100, # SearchCount
0x0006, # Flags: SMB_FIND_CLOSE_AT_EOS | SMB_FIND_RETURN_RESUME_KEYS
0x0104, # InfoLevel: SMB_FIND_FILE_BOTH_DIRECTORY_INFO
0x0000) # SearchStorageType
params_bytes += (path + pattern).encode('UTF-16LE')
m = SMBMessage(ComTransaction2Request(max_params_count = 10,
max_data_count = 16644,
max_setup_count = 0,
params_bytes = params_bytes,
setup_bytes = setup_bytes))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, findFirstCB, errback)
messages_history.append(m)
def decodeFindStruct(data_bytes):
# SMB_FIND_FILE_BOTH_DIRECTORY_INFO structure. See [MS-CIFS]: 2.2.8.1.7 and [MS-SMB]: 2.2.8.1.1
info_format = '<IIQQQQQQIIIBB24s'
info_size = struct.calcsize(info_format)
data_length = len(data_bytes)
offset = 0
while offset < data_length:
if offset + info_size > data_length:
return data_bytes[offset:]
next_offset, _, \
create_time, last_access_time, last_write_time, last_attr_change_time, \
file_size, alloc_size, file_attributes, filename_length, ea_size, \
short_name_length, _, short_name = struct.unpack(info_format, data_bytes[offset:offset+info_size])
offset2 = offset + info_size
if offset2 + filename_length > data_length:
return data_bytes[offset:]
filename = data_bytes[offset2:offset2+filename_length].decode('UTF-16LE')
short_name = short_name.decode('UTF-16LE')
results.append(SharedFile(convertFILETIMEtoEpoch(create_time), convertFILETIMEtoEpoch(last_access_time),
convertFILETIMEtoEpoch(last_write_time), convertFILETIMEtoEpoch(last_attr_change_time),
file_size, alloc_size, file_attributes, short_name, filename))
if next_offset:
offset += next_offset
else:
break
return ''
def findFirstCB(find_message, **kwargs):
messages_history.append(find_message)
if not find_message.status.hasError:
if not kwargs.has_key('total_count'):
# TRANS2_FIND_FIRST2 response. [MS-CIFS]: 2.2.6.2.2
sid, search_count, end_of_search, _, last_name_offset = struct.unpack('<HHHHH', find_message.payload.params_bytes[:10])
kwargs.update({ 'sid': sid, 'end_of_search': end_of_search, 'last_name_offset': last_name_offset, 'data_buf': '' })
else:
sid, end_of_search, last_name_offset = kwargs['sid'], kwargs['end_of_search'], kwargs['last_name_offset']
send_next = True
if find_message.payload.data_bytes:
d = decodeFindStruct(kwargs['data_buf'] + find_message.payload.data_bytes)
if not kwargs.has_key('data_count'):
if len(find_message.payload.data_bytes) != find_message.payload.total_data_count:
kwargs.update({ 'data_count': len(find_message.payload.data_bytes),
'total_count': find_message.payload.total_data_count,
'data_buf': d,
})
send_next = False
else:
kwargs['data_count'] += len(find_message.payload.data_bytes)
kwargs['total_count'] = min(find_message.payload.total_data_count, kwargs['total_count'])
kwargs['data_buf'] = d
if kwargs['data_count'] != kwargs['total_count']:
send_next = False
if not send_next:
self.pending_requests[find_message.mid] = _PendingRequest(find_message.mid, expiry_time, findFirstCB, errback, **kwargs)
elif end_of_search:
callback(results)
else:
sendFindNext(find_message.tid, sid, last_name_offset)
else:
errback(OperationFailure('Failed to list %s on %s: Unable to retrieve file list' % ( path, service_name ), messages_history))
def sendFindNext(tid, sid, resume_key):
setup_bytes = struct.pack('<H', 0x0002) # TRANS2_FIND_NEXT2 sub-command. See [MS-CIFS]: 2.2.6.3.1
params_bytes = \
struct.pack('<HHHIH',
sid, # SID
100, # SearchCount
0x0104, # InfoLevel: SMB_FIND_FILE_BOTH_DIRECTORY_INFO
resume_key, # ResumeKey
0x000a) # Flags: SMB_FIND_RETURN_RESUME_KEYS | SMB_FIND_CLOSE_AT_EOS | SMB_FIND_RETURN_RESUME_KEYS
params_bytes += pattern.encode('UTF-16LE')
m = SMBMessage(ComTransaction2Request(max_params_count = 10,
max_data_count = 16644,
max_setup_count = 0,
params_bytes = params_bytes,
setup_bytes = setup_bytes))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, findNextCB, errback, sid = sid)
messages_history.append(m)
def findNextCB(find_message, **kwargs):
messages_history.append(find_message)
if not find_message.status.hasError:
if not kwargs.has_key('total_count'):
# TRANS2_FIND_NEXT2 response. [MS-CIFS]: 2.2.6.3.2
search_count, end_of_search, _, last_name_offset = struct.unpack('<HHHH', find_message.payload.params_bytes[:8])
kwargs.update({ 'end_of_search': end_of_search, 'last_name_offset': last_name_offset, 'data_buf': '' })
else:
end_of_search, last_name_offset = kwargs['end_of_search'], kwargs['last_name_offset']
send_next = True
if find_message.payload.data_bytes:
d = decodeFindStruct(kwargs['data_buf'] + find_message.payload.data_bytes)
if not kwargs.has_key('data_count'):
if len(find_message.payload.data_bytes) != find_message.payload.total_data_count:
kwargs.update({ 'data_count': len(find_message.payload.data_bytes),
'total_count': find_message.payload.total_data_count,
'data_buf': d,
})
send_next = False
else:
kwargs['data_count'] += len(find_message.payload.data_bytes)
kwargs['total_count'] = min(find_message.payload.total_data_count, kwargs['total_count'])
kwargs['data_buf'] = d
if kwargs['data_count'] != kwargs['total_count']:
send_next = False
if not send_next:
self.pending_requests[find_message.mid] = _PendingRequest(find_message.mid, expiry_time, findNextCB, errback, **kwargs)
elif end_of_search:
callback(results)
else:
sendFindNext(find_message.tid, kwargs['sid'], last_name_offset)
else:
errback(OperationFailure('Failed to list %s on %s: Unable to retrieve file list' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendFindFirst(connect_message.tid)
else:
errback(OperationFailure('Failed to list %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendFindFirst(self.connected_trees[service_name])
def _getAttributes_SMB1(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if path.startswith('\\'):
path = path[1:]
if path.endswith('\\'):
path = path[:-1]
messages_history = [ ]
def sendQuery(tid):
setup_bytes = struct.pack('<H', 0x0005) # TRANS2_QUERY_PATH_INFORMATION sub-command. See [MS-CIFS]: 2.2.6.6.1
params_bytes = \
struct.pack('<HI',
0x0107, # SMB_QUERY_FILE_ALL_INFO ([MS-CIFS] 2.2.2.3.3)
0x0000) # Reserved
params_bytes += (path + '\0').encode('UTF-16LE')
m = SMBMessage(ComTransaction2Request(max_params_count = 2,
max_data_count = 65535,
max_setup_count = 0,
params_bytes = params_bytes,
setup_bytes = setup_bytes))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, queryCB, errback)
messages_history.append(m)
def queryCB(query_message, **kwargs):
messages_history.append(query_message)
if not query_message.status.hasError:
info_format = '<QQQQIIQQ'
info_size = struct.calcsize(info_format)
create_time, last_access_time, last_write_time, last_attr_change_time, \
file_attributes, _, alloc_size, file_size = struct.unpack(info_format, query_message.payload.data_bytes[:info_size])
info = SharedFile(create_time, last_access_time, last_write_time, last_attr_change_time,
file_size, alloc_size, file_attributes, unicode(path), unicode(path))
callback(info)
else:
errback(OperationFailure('Failed to get attributes for %s on %s: Read failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendQuery(connect_message.tid)
else:
errback(OperationFailure('Failed to get attributes for %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendQuery(self.connected_trees[service_name])
def _retrieveFile_SMB1(self, service_name, path, file_obj, callback, errback, timeout = 30):
return self._retrieveFileFromOffset(service_name, path, file_obj, callback, errback, 0L, -1L, timeout)
def _retrieveFileFromOffset_SMB1(self, service_name, path, file_obj, callback, errback, starting_offset, max_length, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
path = path.replace('/', '\\')
messages_history = [ ]
def sendOpen(tid):
m = SMBMessage(ComOpenAndxRequest(filename = path,
access_mode = 0x0040, # Sharing mode: Deny nothing to others
open_mode = 0x0001, # Failed if file does not exist
search_attributes = SMB_FILE_ATTRIBUTE_HIDDEN | SMB_FILE_ATTRIBUTE_SYSTEM,
timeout = timeout * 1000))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, openCB, errback)
messages_history.append(m)
def openCB(open_message, **kwargs):
messages_history.append(open_message)
if not open_message.status.hasError:
if max_length == 0:
closeFid(open_message.tid, open_message.payload.fid)
callback(( file_obj, open_message.payload.file_attributes, 0L ))
else:
sendRead(open_message.tid, open_message.payload.fid, starting_offset, open_message.payload.file_attributes, 0L, max_length)
else:
errback(OperationFailure('Failed to retrieve %s on %s: Unable to open file' % ( path, service_name ), messages_history))
def sendRead(tid, fid, offset, file_attributes, read_len, remaining_len):
read_count = self.max_raw_size - 2
m = SMBMessage(ComReadAndxRequest(fid = fid,
offset = offset,
max_return_bytes_count = read_count,
min_return_bytes_count = min(0xFFFF, read_count)))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, readCB, errback, fid = fid, offset = offset, file_attributes = file_attributes,
read_len = read_len, remaining_len = remaining_len)
def readCB(read_message, **kwargs):
# To avoid crazy memory usage when retrieving large files, we do not save every read_message in messages_history.
if not read_message.status.hasError:
read_len = kwargs['read_len']
remaining_len = kwargs['remaining_len']
data_len = read_message.payload.data_length
if max_length > 0:
if data_len > remaining_len:
file_obj.write(read_message.payload.data[:remaining_len])
read_len += remaining_len
remaining_len = 0
else:
file_obj.write(read_message.payload.data)
remaining_len -= data_len
read_len += data_len
else:
file_obj.write(read_message.payload.data)
read_len += data_len
if (max_length > 0 and remaining_len <= 0) or data_len < (self.max_raw_size - 2):
closeFid(read_message.tid, kwargs['fid'])
callback(( file_obj, kwargs['file_attributes'], read_len )) # Note that this is a tuple of 3-elements
else:
sendRead(read_message.tid, kwargs['fid'], kwargs['offset']+data_len, kwargs['file_attributes'], read_len, remaining_len)
else:
messages_history.append(read_message)
closeFid(read_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to retrieve %s on %s: Read failed' % ( path, service_name ), messages_history))
def closeFid(tid, fid):
m = SMBMessage(ComCloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
messages_history.append(m)
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendOpen(connect_message.tid)
else:
errback(OperationFailure('Failed to retrieve %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendOpen(self.connected_trees[service_name])
def _storeFile_SMB1(self, service_name, path, file_obj, callback, errback, timeout = 30):
self._storeFileFromOffset_SMB1(service_name, path, file_obj, callback, errback, 0L, timeout)
def _storeFileFromOffset_SMB1(self, service_name, path, file_obj, callback, errback, starting_offset, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
path = path.replace('/', '\\')
messages_history = [ ]
def sendOpen(tid):
m = SMBMessage(ComOpenAndxRequest(filename = path,
access_mode = 0x0041, # Sharing mode: Deny nothing to others + Open for writing
open_mode = 0x0011, # Create file if file does not exist. Overwrite if exists.
search_attributes = SMB_FILE_ATTRIBUTE_HIDDEN | SMB_FILE_ATTRIBUTE_SYSTEM,
timeout = timeout * 1000))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, openCB, errback)
messages_history.append(m)
def openCB(open_message, **kwargs):
messages_history.append(open_message)
if not open_message.status.hasError:
sendWrite(open_message.tid, open_message.payload.fid, starting_offset)
else:
errback(OperationFailure('Failed to store %s on %s: Unable to open file' % ( path, service_name ), messages_history))
def sendWrite(tid, fid, offset):
# For message signing, the total SMB message size must be not exceed the max_buffer_size. Non-message signing does not have this limitation
write_count = min((self.is_signing_active and (self.max_buffer_size-64)) or self.max_raw_size, 0xFFFF-1) # Need to minus 1 byte from 0xFFFF because of the first NULL byte in the ComWriteAndxRequest message data
data_bytes = file_obj.read(write_count)
data_len = len(data_bytes)
if data_len > 0:
m = SMBMessage(ComWriteAndxRequest(fid = fid, offset = offset, data_bytes = data_bytes))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, writeCB, errback, fid = fid, offset = offset+data_len)
else:
closeFid(tid, fid)
callback(( file_obj, offset )) # Note that this is a tuple of 2-elements
def writeCB(write_message, **kwargs):
# To avoid crazy memory usage when saving large files, we do not save every write_message in messages_history.
if not write_message.status.hasError:
sendWrite(write_message.tid, kwargs['fid'], kwargs['offset'])
else:
messages_history.append(write_message)
closeFid(write_message.tid, kwargs['fid'])
errback(OperationFailure('Failed to store %s on %s: Write failed' % ( path, service_name ), messages_history))
def closeFid(tid, fid):
m = SMBMessage(ComCloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
messages_history.append(m)
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendOpen(connect_message.tid)
else:
errback(OperationFailure('Failed to store %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendOpen(self.connected_trees[service_name])
def _deleteFiles_SMB1(self, service_name, path_file_pattern, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
path = path_file_pattern.replace('/', '\\')
messages_history = [ ]
def sendDelete(tid):
m = SMBMessage(ComDeleteRequest(filename_pattern = path,
search_attributes = SMB_FILE_ATTRIBUTE_HIDDEN | SMB_FILE_ATTRIBUTE_SYSTEM))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, deleteCB, errback)
messages_history.append(m)
def deleteCB(delete_message, **kwargs):
messages_history.append(delete_message)
if not delete_message.status.hasError:
callback(path_file_pattern)
else:
errback(OperationFailure('Failed to store %s on %s: Delete failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendDelete(connect_message.tid)
else:
errback(OperationFailure('Failed to delete %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendDelete(self.connected_trees[service_name])
def _createDirectory_SMB1(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
path = path.replace('/', '\\')
messages_history = [ ]
def sendCreate(tid):
m = SMBMessage(ComCreateDirectoryRequest(path))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, createCB, errback)
messages_history.append(m)
def createCB(create_message, **kwargs):
messages_history.append(create_message)
if not create_message.status.hasError:
callback(path)
else:
errback(OperationFailure('Failed to create directory %s on %s: Create failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendCreate(connect_message.tid)
else:
errback(OperationFailure('Failed to create directory %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendCreate(self.connected_trees[service_name])
def _deleteDirectory_SMB1(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
path = path.replace('/', '\\')
messages_history = [ ]
def sendDelete(tid):
m = SMBMessage(ComDeleteDirectoryRequest(path))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, deleteCB, errback)
messages_history.append(m)
def deleteCB(delete_message, **kwargs):
messages_history.append(delete_message)
if not delete_message.status.hasError:
callback(path)
else:
errback(OperationFailure('Failed to delete directory %s on %s: Delete failed' % ( path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendDelete(connect_message.tid)
else:
errback(OperationFailure('Failed to delete %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendDelete(self.connected_trees[service_name])
def _rename_SMB1(self, service_name, old_path, new_path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
new_path = new_path.replace('/', '\\')
old_path = old_path.replace('/', '\\')
messages_history = [ ]
def sendRename(tid):
m = SMBMessage(ComRenameRequest(old_path = old_path,
new_path = new_path,
search_attributes = SMB_FILE_ATTRIBUTE_HIDDEN | SMB_FILE_ATTRIBUTE_SYSTEM))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, renameCB, errback)
messages_history.append(m)
def renameCB(rename_message, **kwargs):
messages_history.append(rename_message)
if not rename_message.status.hasError:
callback(( old_path, new_path )) # Note that this is a tuple of 2-elements
else:
errback(OperationFailure('Failed to rename %s on %s: Rename failed' % ( old_path, service_name ), messages_history))
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendRename(connect_message.tid)
else:
errback(OperationFailure('Failed to rename %s on %s: Unable to connect to shared device' % ( old_path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendRename(self.connected_trees[service_name])
def _listSnapshots_SMB1(self, service_name, path, callback, errback, timeout = 30):
if not self.has_authenticated:
raise NotReadyError('SMB connection not authenticated')
expiry_time = time.time() + timeout
path = path.replace('/', '\\')
if not path.endswith('\\'):
path += '\\'
messages_history = [ ]
results = [ ]
def sendOpen(tid):
m = SMBMessage(ComOpenAndxRequest(filename = path,
access_mode = 0x0040, # Sharing mode: Deny nothing to others
open_mode = 0x0001, # Failed if file does not exist
search_attributes = 0,
timeout = timeout * 1000))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, openCB, errback)
messages_history.append(m)
def openCB(open_message, **kwargs):
messages_history.append(open_message)
if not open_message.status.hasError:
sendEnumSnapshots(open_message.tid, open_message.payload.fid)
else:
errback(OperationFailure('Failed to list snapshots %s on %s: Unable to open path' % ( path, service_name ), messages_history))
def sendEnumSnapshots(tid, fid):
# [MS-CIFS]: 2.2.7.2
# [MS-SMB]: 2.2.7.2.1
setup_bytes = struct.pack('<IHBB',
0x00144064, # [MS-SMB]: 2.2.7.2.1
fid, # FID
0x01, # IsFctl
0) # IsFlags
m = SMBMessage(ComNTTransactRequest(function = 0x0002, # NT_TRANSACT_IOCTL. [MS-CIFS]: 2.2.7.2.1
max_params_count = 0,
max_data_count = 0xFFFF,
max_setup_count = 0,
setup_bytes = setup_bytes))
m.tid = tid
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, enumSnapshotsCB, errback, tid = tid, fid = fid)
messages_history.append(m)
def enumSnapshotsCB(enum_message, **kwargs):
messages_history.append(enum_message)
if not enum_message.status.hasError:
results = [ ]
snapshots_count = struct.unpack('<I', enum_message.payload.data_bytes[4:8])[0]
for i in range(0, snapshots_count):
s = enum_message.payload.data_bytes[12+i*50:12+48+i*50].decode('UTF-16LE')
results.append(datetime(*map(int, ( s[5:9], s[10:12], s[13:15], s[16:18], s[19:21], s[22:24] ))))
closeFid(kwargs['tid'], kwargs['fid'])
callback(results)
else:
closeFid(kwargs['tid'], kwargs['fid'])
errback(OperationFailure('Failed to list snapshots %s on %s: Unable to list snapshots on path' % ( path, service_name ), messages_history))
def closeFid(tid, fid):
m = SMBMessage(ComCloseRequest(fid))
m.tid = tid
self._sendSMBMessage(m)
messages_history.append(m)
if not self.connected_trees.has_key(service_name):
def connectCB(connect_message, **kwargs):
messages_history.append(connect_message)
if not connect_message.status.hasError:
self.connected_trees[service_name] = connect_message.tid
sendOpen(connect_message.tid)
else:
errback(OperationFailure('Failed to list snapshots %s on %s: Unable to connect to shared device' % ( path, service_name ), messages_history))
m = SMBMessage(ComTreeConnectAndxRequest(r'\\%s\%s' % ( self.remote_name.upper(), service_name ), SERVICE_ANY, ''))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, expiry_time, connectCB, errback, path = service_name)
messages_history.append(m)
else:
sendOpen(self.connected_trees[service_name])
def _echo_SMB1(self, data, callback, errback, timeout = 30):
messages_history = [ ]
def echoCB(echo_message, **kwargs):
messages_history.append(echo_message)
if not echo_message.status.hasError:
callback(echo_message.payload.data)
else:
errback(OperationFailure('Echo failed', messages_history))
m = SMBMessage(ComEchoRequest(echo_data = data))
self._sendSMBMessage(m)
self.pending_requests[m.mid] = _PendingRequest(m.mid, int(time.time()) + timeout, echoCB, errback)
messages_history.append(m)
class SharedDevice:
"""
Contains information about a single shared device on the remote server.
"""
# The following constants are taken from [MS-SRVS]: 2.2.2.4
# They are used to identify the type of shared resource from the results from the NetrShareEnum in Server Service RPC
DISK_TREE = 0x00
PRINT_QUEUE = 0x01
COMM_DEVICE = 0x02
IPC = 0x03
def __init__(self, type, name, comments):
self._type = type
self.name = name #: An unicode string containing the name of the shared device
self.comments = comments #: An unicode string containing the user description of the shared device
@property
def type(self):
"""
Returns one of the following integral constants.
- SharedDevice.DISK_TREE
- SharedDevice.PRINT_QUEUE
- SharedDevice.COMM_DEVICE
- SharedDevice.IPC
"""
return self._type & 0xFFFF
@property
def isSpecial(self):
"""
Returns True if this shared device is a special share reserved for interprocess communication (IPC$)
or remote administration of the server (ADMIN$). Can also refer to administrative shares such as
C$, D$, E$, and so forth
"""
return bool(self._type & 0x80000000)
@property
def isTemporary(self):
"""
Returns True if this is a temporary share that is not persisted for creation each time the file server initializes.
"""
return bool(self._type & 0x40000000)
def __unicode__(self):
return u'Shared device: %s (type:0x%02x comments:%s)' % (self.name, self.type, self.comments )
class SharedFile:
"""
Contain information about a file/folder entry that is shared on the shared device.
As an application developer, you should not need to instantiate a *SharedFile* instance directly in your application.
These *SharedFile* instances are usually returned via a call to *listPath* method in :doc:`smb.SMBProtocol.SMBProtocolFactory<smb_SMBProtocolFactory>`.
If you encounter *SharedFile* instance where its short_name attribute is empty but the filename attribute contains a short name which does not correspond
to any files/folders on your remote shared device, it could be that the original filename on the file/folder entry on the shared device contains
one of these prohibited characters: "\/[]:+|<>=;?,* (see [MS-CIFS]: 2.2.1.1.1 for more details).
"""
def __init__(self, create_time, last_access_time, last_write_time, last_attr_change_time, file_size, alloc_size, file_attributes, short_name, filename):
self.create_time = create_time #: Float value in number of seconds since 1970-01-01 00:00:00 to the time of creation of this file resource on the remote server
self.last_access_time = last_access_time #: Float value in number of seconds since 1970-01-01 00:00:00 to the time of last access of this file resource on the remote server
self.last_write_time = last_write_time #: Float value in number of seconds since 1970-01-01 00:00:00 to the time of last modification of this file resource on the remote server
self.last_attr_change_time = last_attr_change_time #: Float value in number of seconds since 1970-01-01 00:00:00 to the time of last attribute change of this file resource on the remote server
self.file_size = file_size #: File size in number of bytes
self.alloc_size = alloc_size #: Total number of bytes allocated to store this file
self.file_attributes = file_attributes #: A SMB_EXT_FILE_ATTR integer value. See [MS-CIFS]: 2.2.1.2.3
self.short_name = short_name #: Unicode string containing the short name of this file (usually in 8.3 notation)
self.filename = filename #: Unicode string containing the long filename of this file. Each OS has a limit to the length of this file name. On Windows, it is 256 characters.
@property
def isDirectory(self):
"""A convenient property to return True if this file resource is a directory on the remote server"""
return bool(self.file_attributes & ATTR_DIRECTORY)
@property
def isReadOnly(self):
"""A convenient property to return True if this file resource is read-only on the remote server"""
return bool(self.file_attributes & ATTR_READONLY)
def __unicode__(self):
return u'Shared file: %s (FileSize:%d bytes, isDirectory:%s)' % ( self.filename, self.file_size, self.isDirectory )
class _PendingRequest:
def __init__(self, mid, expiry_time, callback, errback, **kwargs):
self.mid = mid
self.expiry_time = expiry_time
self.callback = callback
self.errback = errback
self.kwargs = kwargs
| 53.88576 | 231 | 0.580415 | 14,968 | 135,846 | 5.058458 | 0.05799 | 0.028211 | 0.031698 | 0.028317 | 0.808202 | 0.7832 | 0.764129 | 0.743789 | 0.728957 | 0.716265 | 0 | 0.037626 | 0.333643 | 135,846 | 2,520 | 232 | 53.907143 | 0.7988 | 0.056012 | 0 | 0.722886 | 0 | 0.001911 | 0.091426 | 0.000963 | 0 | 0 | 0.002388 | 0 | 0.000478 | 0 | null | null | 0.010511 | 0.005733 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
0a3368c2700ffadefc8f2ee4c27a226488a28c04 | 15,271 | py | Python | pybind/nos/v7_1_0/interface/hundredgigabitethernet/rmon/collection/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/nos/v7_1_0/interface/hundredgigabitethernet/rmon/collection/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | null | null | null | pybind/nos/v7_1_0/interface/hundredgigabitethernet/rmon/collection/__init__.py | shivharis/pybind | 4e1c6d54b9fd722ccec25546ba2413d79ce337e6 | [
"Apache-2.0"
] | 1 | 2021-11-05T22:15:42.000Z | 2021-11-05T22:15:42.000Z |
from operator import attrgetter
import pyangbind.lib.xpathhelper as xpathhelper
from pyangbind.lib.yangtypes import RestrictedPrecisionDecimalType, RestrictedClassType, TypedListType
from pyangbind.lib.yangtypes import YANGBool, YANGListType, YANGDynClass, ReferenceType
from pyangbind.lib.base import PybindBase
from decimal import Decimal
from bitarray import bitarray
import __builtin__
import ether_stats_entry
import history_control_entry
class collection(PybindBase):
"""
This class was auto-generated by the PythonClass plugin for PYANG
from YANG module brocade-interface - based on the path /interface/hundredgigabitethernet/rmon/collection. Each member element of
the container is represented as a class variable - with a specific
YANG type.
"""
__slots__ = ('_pybind_generated_by', '_path_helper', '_yang_name', '_rest_name', '_extmethods', '__ether_stats_entry','__history_control_entry',)
_yang_name = 'collection'
_rest_name = 'collection'
_pybind_generated_by = 'container'
def __init__(self, *args, **kwargs):
path_helper_ = kwargs.pop("path_helper", None)
if path_helper_ is False:
self._path_helper = False
elif path_helper_ is not None and isinstance(path_helper_, xpathhelper.YANGPathHelper):
self._path_helper = path_helper_
elif hasattr(self, "_parent"):
path_helper_ = getattr(self._parent, "_path_helper", False)
self._path_helper = path_helper_
else:
self._path_helper = False
extmethods = kwargs.pop("extmethods", None)
if extmethods is False:
self._extmethods = False
elif extmethods is not None and isinstance(extmethods, dict):
self._extmethods = extmethods
elif hasattr(self, "_parent"):
extmethods = getattr(self._parent, "_extmethods", None)
self._extmethods = extmethods
else:
self._extmethods = False
self.__history_control_entry = YANGDynClass(base=YANGListType("history_control_index",history_control_entry.history_control_entry, yang_name="history-control-entry", rest_name="history", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='history-control-index', extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}), is_container='list', yang_name="history-control-entry", rest_name="history", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)
self.__ether_stats_entry = YANGDynClass(base=YANGListType("ether_stats_index",ether_stats_entry.ether_stats_entry, yang_name="ether-stats-entry", rest_name="stats", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='ether-stats-index', extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}), is_container='list', yang_name="ether-stats-entry", rest_name="stats", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)
load = kwargs.pop("load", None)
if args:
if len(args) > 1:
raise TypeError("cannot create a YANG container with >1 argument")
all_attr = True
for e in self._pyangbind_elements:
if not hasattr(args[0], e):
all_attr = False
break
if not all_attr:
raise ValueError("Supplied object did not have the correct attributes")
for e in self._pyangbind_elements:
nobj = getattr(args[0], e)
if nobj._changed() is False:
continue
setmethod = getattr(self, "_set_%s" % e)
if load is None:
setmethod(getattr(args[0], e))
else:
setmethod(getattr(args[0], e), load=load)
def _path(self):
if hasattr(self, "_parent"):
return self._parent._path()+[self._yang_name]
else:
return [u'interface', u'hundredgigabitethernet', u'rmon', u'collection']
def _rest_path(self):
if hasattr(self, "_parent"):
if self._rest_name:
return self._parent._rest_path()+[self._rest_name]
else:
return self._parent._rest_path()
else:
return [u'interface', u'HundredGigabitEthernet', u'rmon', u'collection']
def _get_ether_stats_entry(self):
"""
Getter method for ether_stats_entry, mapped from YANG variable /interface/hundredgigabitethernet/rmon/collection/ether_stats_entry (list)
"""
return self.__ether_stats_entry
def _set_ether_stats_entry(self, v, load=False):
"""
Setter method for ether_stats_entry, mapped from YANG variable /interface/hundredgigabitethernet/rmon/collection/ether_stats_entry (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_ether_stats_entry is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_ether_stats_entry() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("ether_stats_index",ether_stats_entry.ether_stats_entry, yang_name="ether-stats-entry", rest_name="stats", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='ether-stats-index', extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}), is_container='list', yang_name="ether-stats-entry", rest_name="stats", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """ether_stats_entry must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("ether_stats_index",ether_stats_entry.ether_stats_entry, yang_name="ether-stats-entry", rest_name="stats", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='ether-stats-index', extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}), is_container='list', yang_name="ether-stats-entry", rest_name="stats", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)""",
})
self.__ether_stats_entry = t
if hasattr(self, '_set'):
self._set()
def _unset_ether_stats_entry(self):
self.__ether_stats_entry = YANGDynClass(base=YANGListType("ether_stats_index",ether_stats_entry.ether_stats_entry, yang_name="ether-stats-entry", rest_name="stats", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='ether-stats-index', extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}), is_container='list', yang_name="ether-stats-entry", rest_name="stats", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'stats', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_stats'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)
def _get_history_control_entry(self):
"""
Getter method for history_control_entry, mapped from YANG variable /interface/hundredgigabitethernet/rmon/collection/history_control_entry (list)
"""
return self.__history_control_entry
def _set_history_control_entry(self, v, load=False):
"""
Setter method for history_control_entry, mapped from YANG variable /interface/hundredgigabitethernet/rmon/collection/history_control_entry (list)
If this variable is read-only (config: false) in the
source YANG file, then _set_history_control_entry is considered as a private
method. Backends looking to populate this variable should
do so via calling thisObj._set_history_control_entry() directly.
"""
if hasattr(v, "_utype"):
v = v._utype(v)
try:
t = YANGDynClass(v,base=YANGListType("history_control_index",history_control_entry.history_control_entry, yang_name="history-control-entry", rest_name="history", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='history-control-index', extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}), is_container='list', yang_name="history-control-entry", rest_name="history", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)
except (TypeError, ValueError):
raise ValueError({
'error-string': """history_control_entry must be of a type compatible with list""",
'defined-type': "list",
'generated-type': """YANGDynClass(base=YANGListType("history_control_index",history_control_entry.history_control_entry, yang_name="history-control-entry", rest_name="history", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='history-control-index', extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}), is_container='list', yang_name="history-control-entry", rest_name="history", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)""",
})
self.__history_control_entry = t
if hasattr(self, '_set'):
self._set()
def _unset_history_control_entry(self):
self.__history_control_entry = YANGDynClass(base=YANGListType("history_control_index",history_control_entry.history_control_entry, yang_name="history-control-entry", rest_name="history", parent=self, is_container='list', user_ordered=False, path_helper=self._path_helper, yang_keys='history-control-index', extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}), is_container='list', yang_name="history-control-entry", rest_name="history", parent=self, path_helper=self._path_helper, extmethods=self._extmethods, register_paths=True, extensions={u'tailf-common': {u'info': u'RMON ether History statistics collection', u'cli-no-key-completion': None, u'cli-suppress-mode': None, u'cli-suppress-list-no': None, u'cli-full-no': None, u'alt-name': u'history', u'cli-compact-syntax': None, u'cli-suppress-key-abbreviation': None, u'callpoint': u'rmon_history'}}, namespace='urn:brocade.com:mgmt:brocade-rmon', defining_module='brocade-rmon', yang_type='list', is_config=True)
ether_stats_entry = __builtin__.property(_get_ether_stats_entry, _set_ether_stats_entry)
history_control_entry = __builtin__.property(_get_history_control_entry, _set_history_control_entry)
_pyangbind_elements = {'ether_stats_entry': ether_stats_entry, 'history_control_entry': history_control_entry, }
| 95.44375 | 1,243 | 0.742649 | 2,233 | 15,271 | 4.873712 | 0.081057 | 0.035284 | 0.047046 | 0.070569 | 0.826243 | 0.800882 | 0.777727 | 0.777727 | 0.773132 | 0.773132 | 0 | 0.000442 | 0.110143 | 15,271 | 159 | 1,244 | 96.044025 | 0.800427 | 0.087617 | 0 | 0.381818 | 0 | 0.018182 | 0.457381 | 0.179378 | 0 | 0 | 0 | 0 | 0 | 1 | 0.081818 | false | 0 | 0.090909 | 0 | 0.309091 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0a57e246bc35375ea9869d57511288c9d8a3f2dd | 183 | py | Python | tests/malib/environments/dummy/__init__.py | wwxFromTju/malib | 7cd2a4af55cf1f56da8854e26ea7a4f3782ceea2 | [
"MIT"
] | 6 | 2021-05-19T10:25:36.000Z | 2021-12-27T03:30:33.000Z | tests/malib/environments/dummy/__init__.py | wwxFromTju/malib | 7cd2a4af55cf1f56da8854e26ea7a4f3782ceea2 | [
"MIT"
] | 1 | 2021-05-29T04:51:37.000Z | 2021-05-30T06:18:10.000Z | tests/malib/environments/dummy/__init__.py | ying-wen/malib_deprecated | 875338b81c4d87064ad31201f461ef742db05f25 | [
"MIT"
] | 1 | 2021-05-31T16:16:12.000Z | 2021-05-31T16:16:12.000Z | # Created by yingwen at 2019-03-13
from tests.malib.environments.dummy.dummy_box_env import DummyBoxEnv
from tests.malib.environments.dummy.dummy_discrate_env import DummyDiscreteEnv
| 45.75 | 78 | 0.863388 | 27 | 183 | 5.703704 | 0.666667 | 0.116883 | 0.181818 | 0.337662 | 0.467532 | 0.467532 | 0 | 0 | 0 | 0 | 0 | 0.047337 | 0.076503 | 183 | 3 | 79 | 61 | 0.863905 | 0.174863 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0a5cd03f04fb4aaa6068d0ac58deaaf924fbc5c8 | 354 | py | Python | highway_env/envs/__init__.py | AlexanderDavid/Powerlaw-Highway-Env | e3e3b6277e0a75e4dcbc7988a9cb144137328d22 | [
"MIT"
] | 3 | 2021-04-23T03:30:57.000Z | 2021-04-27T19:32:41.000Z | highway_env/envs/__init__.py | AlexanderDavid/Powerlaw-Highway-Env | e3e3b6277e0a75e4dcbc7988a9cb144137328d22 | [
"MIT"
] | null | null | null | highway_env/envs/__init__.py | AlexanderDavid/Powerlaw-Highway-Env | e3e3b6277e0a75e4dcbc7988a9cb144137328d22 | [
"MIT"
] | null | null | null | from highway_env.envs.highway_env import *
from highway_env.envs.merge_env import *
from highway_env.envs.parking_env import *
from highway_env.envs.summon_env import *
from highway_env.envs.roundabout_env import *
from highway_env.envs.two_way_env import *
from highway_env.envs.intersection_env import *
from highway_env.envs.lane_keeping_env import *
| 39.333333 | 47 | 0.841808 | 58 | 354 | 4.827586 | 0.241379 | 0.321429 | 0.4 | 0.514286 | 0.675 | 0.675 | 0 | 0 | 0 | 0 | 0 | 0 | 0.090395 | 354 | 8 | 48 | 44.25 | 0.869565 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
0a64525f807f2212ee049a5468ce485b3e01e228 | 12,957 | py | Python | sdk/purview/azure-purview-catalog/azure/purview/catalog/rest/discovery/_request_builders.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | [
"MIT"
] | 3 | 2020-06-23T02:25:27.000Z | 2021-09-07T18:48:11.000Z | sdk/purview/azure-purview-catalog/azure/purview/catalog/rest/discovery/_request_builders.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | [
"MIT"
] | 510 | 2019-07-17T16:11:19.000Z | 2021-08-02T08:38:32.000Z | sdk/purview/azure-purview-catalog/azure/purview/catalog/rest/discovery/_request_builders.py | rsdoherty/azure-sdk-for-python | 6bba5326677468e6660845a703686327178bb7b1 | [
"MIT"
] | 5 | 2019-09-04T12:51:37.000Z | 2020-09-16T07:28:40.000Z | # coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from typing import TYPE_CHECKING
from azure.core.pipeline.transport._base import _format_url_section
from azure.purview.catalog.core.rest import HttpRequest
from msrest import Serializer
if TYPE_CHECKING:
# pylint: disable=unused-import,ungrouped-imports
from typing import Any, Dict, IO, List, Optional, Union
_SERIALIZER = Serializer()
def build_query_request(
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Gets data using search.
See https://aka.ms/azsdk/python/protocol/quickstart for how to incorporate this request builder into your code flow.
:keyword json: An object specifying the search criteria.
:paramtype json: Any
:keyword content: An object specifying the search criteria.
:paramtype content: Any
:return: Returns an :class:`~azure.purview.catalog.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/protocol/quickstart for how to incorporate this response into your code flow.
:rtype: ~azure.purview.catalog.core.rest.HttpRequest
Example:
.. code-block:: python
# JSON input template you can fill out and use as your `json` input.
json = {
"facets": [
{
"count": "int (optional)",
"facet": "str (optional)",
"sort": "object (optional)"
}
],
"filter": "object (optional)",
"keywords": "str (optional)",
"limit": "int (optional)",
"offset": "int (optional)",
"taxonomySetting": {
"assetTypes": [
"str (optional)"
],
"facet": {
"count": "int (optional)",
"facet": "str (optional)",
"sort": "object (optional)"
}
}
}
# response body for status code(s): 200
response_body == {
"@search.count": "int (optional)",
"@search.facets": {
"assetType": [
{
"count": "int (optional)",
"value": "str (optional)"
}
],
"classification": [
{
"count": "int (optional)",
"value": "str (optional)"
}
],
"classificationCategory": [
{
"count": "int (optional)",
"value": "str (optional)"
}
],
"contactId": [
{
"count": "int (optional)",
"value": "str (optional)"
}
],
"fileExtension": [
{
"count": "int (optional)",
"value": "str (optional)"
}
],
"label": [
{
"count": "int (optional)",
"value": "str (optional)"
}
],
"term": [
{
"count": "int (optional)",
"value": "str (optional)"
}
]
},
"value": [
{
"@search.highlights": {
"description": [
"str (optional)"
],
"entityType": [
"str (optional)"
],
"id": [
"str (optional)"
],
"name": [
"str (optional)"
],
"qualifiedName": [
"str (optional)"
]
},
"@search.score": "float (optional)",
"@search.text": "str (optional)",
"assetType": [
"str (optional)"
],
"classification": [
"str (optional)"
],
"contact": [
{
"contactType": "str (optional)",
"id": "str (optional)",
"info": "str (optional)"
}
],
"description": "str (optional)",
"entityType": "str (optional)",
"id": "str (optional)",
"label": [
"str (optional)"
],
"name": "str (optional)",
"owner": "str (optional)",
"qualifiedName": "str (optional)",
"term": [
{
"glossaryName": "str (optional)",
"guid": "str (optional)",
"name": "str (optional)"
}
]
}
]
}
"""
content_type = kwargs.pop("content_type", None)
api_version = "2021-05-01-preview"
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/search/query')
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_suggest_request(
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Get search suggestions by query criteria.
See https://aka.ms/azsdk/python/protocol/quickstart for how to incorporate this request builder into your code flow.
:keyword json: An object specifying the suggest criteria.
:paramtype json: Any
:keyword content: An object specifying the suggest criteria.
:paramtype content: Any
:return: Returns an :class:`~azure.purview.catalog.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/protocol/quickstart for how to incorporate this response into your code flow.
:rtype: ~azure.purview.catalog.core.rest.HttpRequest
Example:
.. code-block:: python
# JSON input template you can fill out and use as your `json` input.
json = {
"filter": "object (optional)",
"keywords": "str (optional)",
"limit": "int (optional)"
}
# response body for status code(s): 200
response_body == {
"value": [
{
"@search.score": "float (optional)",
"@search.text": "str (optional)",
"assetType": [
"str (optional)"
],
"classification": [
"str (optional)"
],
"contact": [
{
"contactType": "str (optional)",
"id": "str (optional)",
"info": "str (optional)"
}
],
"description": "str (optional)",
"entityType": "str (optional)",
"id": "str (optional)",
"label": [
"str (optional)"
],
"name": "str (optional)",
"owner": "str (optional)",
"qualifiedName": "str (optional)",
"term": [
{
"glossaryName": "str (optional)",
"guid": "str (optional)",
"name": "str (optional)"
}
]
}
]
}
"""
content_type = kwargs.pop("content_type", None)
api_version = "2021-05-01-preview"
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/search/suggest')
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
def build_auto_complete_request(
**kwargs # type: Any
):
# type: (...) -> HttpRequest
"""Get auto complete options.
See https://aka.ms/azsdk/python/protocol/quickstart for how to incorporate this request builder into your code flow.
:keyword json: An object specifying the autocomplete criteria.
:paramtype json: Any
:keyword content: An object specifying the autocomplete criteria.
:paramtype content: Any
:return: Returns an :class:`~azure.purview.catalog.core.rest.HttpRequest` that you will pass to the client's `send_request` method.
See https://aka.ms/azsdk/python/protocol/quickstart for how to incorporate this response into your code flow.
:rtype: ~azure.purview.catalog.core.rest.HttpRequest
Example:
.. code-block:: python
# JSON input template you can fill out and use as your `json` input.
json = {
"filter": "object (optional)",
"keywords": "str (optional)",
"limit": "int (optional)"
}
# response body for status code(s): 200
response_body == {
"value": [
{
"queryPlusText": "str (optional)",
"text": "str (optional)"
}
]
}
"""
content_type = kwargs.pop("content_type", None)
api_version = "2021-05-01-preview"
accept = "application/json"
# Construct URL
url = kwargs.pop("template_url", '/search/autocomplete')
# Construct parameters
query_parameters = kwargs.pop("params", {}) # type: Dict[str, Any]
query_parameters['api-version'] = _SERIALIZER.query("api_version", api_version, 'str')
# Construct headers
header_parameters = kwargs.pop("headers", {}) # type: Dict[str, Any]
if content_type is not None:
header_parameters['Content-Type'] = _SERIALIZER.header("content_type", content_type, 'str')
header_parameters['Accept'] = _SERIALIZER.header("accept", accept, 'str')
return HttpRequest(
method="POST",
url=url,
params=query_parameters,
headers=header_parameters,
**kwargs
)
| 36.705382 | 135 | 0.439454 | 1,011 | 12,957 | 5.555885 | 0.183976 | 0.101834 | 0.028485 | 0.028663 | 0.84992 | 0.83817 | 0.79206 | 0.769628 | 0.769628 | 0.733488 | 0 | 0.004678 | 0.439068 | 12,957 | 352 | 136 | 36.809659 | 0.768162 | 0.716678 | 0 | 0.761194 | 0 | 0 | 0.171988 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.044776 | false | 0 | 0.074627 | 0 | 0.164179 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6a573005f451407ac03ead1344c8899c4ff1eb1e | 79,534 | py | Python | python/paddle/nn/functional/pooling.py | RangeKing/Paddle | 2d87300809ae75d76f5b0b457d8112cb88dc3e27 | [
"Apache-2.0"
] | 8 | 2016-08-15T07:02:27.000Z | 2016-08-24T09:34:00.000Z | python/paddle/nn/functional/pooling.py | RangeKing/Paddle | 2d87300809ae75d76f5b0b457d8112cb88dc3e27 | [
"Apache-2.0"
] | null | null | null | python/paddle/nn/functional/pooling.py | RangeKing/Paddle | 2d87300809ae75d76f5b0b457d8112cb88dc3e27 | [
"Apache-2.0"
] | null | null | null | # Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# TODO: define pooling functions
from ...fluid.layers import utils, LayerHelper
from ...tensor.manipulation import unsqueeze, squeeze
from ...fluid.data_feeder import check_type, check_variable_and_dtype
from paddle import _C_ops
from paddle import in_dynamic_mode
from paddle.fluid.framework import _in_legacy_dygraph
from paddle.fluid.framework import in_dygraph_mode
__all__ = []
def _is_list_or_tuple(input):
return isinstance(input, (list, tuple))
def _check_input(x, dimension):
if len(x.shape) != dimension:
raise ValueError(
"Excepted Input X is {}-D tensor, but received {}-D {}".format(
dimension, len(x.shape), type(x)))
def _check_instance(x, x_name, types=(int, float)):
if not isinstance(x, types):
raise ValueError("Excepted {} type for {} but received type: {}. ".
format(types, x_name, type(x)))
def _check_value_limitation(x, x_name, min_limit=1e-3):
def _check_value(x, x_name, min_limit=1e-3):
if isinstance(x, int) and min_limit is not None and x < min_limit:
raise ValueError(
"Excepted the input {} to be greater than {} but received x: {}. ".
format(x_name, min_limit, x))
for ele in x:
_check_value(ele, x_name)
def _zero_padding_in_batch_and_channel(padding, channel_last):
if channel_last:
return list(padding[0]) == [0, 0] and list(padding[-1]) == [0, 0]
else:
return list(padding[0]) == [0, 0] and list(padding[1]) == [0, 0]
def _exclude_padding_in_batch_and_channel(padding, channel_last):
padding_ = padding[1:-1] if channel_last else padding[2:]
padding_ = [elem for pad_a_dim in padding_ for elem in pad_a_dim]
return padding_
def _channel_last(data_format, num_dims):
if num_dims == 1:
if data_format not in ['NCL', 'NLC']:
raise ValueError(
"Attr(data_format) should be 'NCL' or 'NLC'. Received "
"Attr(data_format): %s" % str(data_format))
else:
return True if data_format == "NLC" else False
if num_dims == 2:
if data_format not in ['NCHW', 'NHWC']:
raise ValueError(
"Attr(data_format) should be 'NCHW' or 'NHWC'. Received "
"Attr(data_format): %s" % str(data_format))
else:
return True if data_format == "NHWC" else False
if num_dims == 3:
if data_format not in ['NCDHW', 'NDHWC']:
raise ValueError(
"Attr(data_format) should be 'NCDHW' or 'NDHWC'. Received "
"Attr(data_format): %s" % str(data_format))
else:
return True if data_format == "NDHWC" else False
def _update_padding_nd(padding, num_dims, channel_last=False, ceil_mode=False):
if isinstance(padding, str):
padding = padding.upper()
if padding not in ["SAME", "VALID"]:
raise ValueError(
"Unknown padding: '{}'. It can only be 'SAME' or 'VALID'.".
format(padding))
if padding == "VALID":
if ceil_mode != False:
raise ValueError(
"When Attr(padding) is \"VALID\", Attr(ceil_mode) must be False. "
"Received ceil_mode: True.")
padding_algorithm = "VALID"
padding = [0] * num_dims
else:
padding_algorithm = "SAME"
padding = [0] * num_dims
elif _is_list_or_tuple(padding):
# for padding like
# [(pad_before, pad_after), (pad_before, pad_after), ...]
# padding for batch_dim and channel_dim included
if len(padding) == 2 + num_dims and _is_list_or_tuple(padding[0]):
if not _zero_padding_in_batch_and_channel(padding, channel_last):
raise ValueError(
"Non-zero padding({}) in the batch or channel dimensions "
"is not supported.".format(padding))
padding_algorithm = "EXPLICIT"
padding = _exclude_padding_in_batch_and_channel(padding,
channel_last)
if utils._is_symmetric_padding(padding, num_dims):
padding = padding[0::2]
# for padding like [pad_before, pad_after, pad_before, pad_after, ...]
elif len(padding) == 2 * num_dims and isinstance(padding[0], int):
padding_algorithm = "EXPLICIT"
padding = utils.convert_to_list(padding, 2 * num_dims, 'padding')
if utils._is_symmetric_padding(padding, num_dims):
padding = padding[0::2]
# for padding like [pad_d1, pad_d2, ...]
elif len(padding) == num_dims and isinstance(padding[0], int):
padding_algorithm = "EXPLICIT"
padding = utils.convert_to_list(padding, num_dims, 'padding')
else:
raise ValueError("Invalid padding: {}".format(padding))
# for integer padding
else:
padding_algorithm = "EXPLICIT"
padding = utils.convert_to_list(padding, num_dims, 'padding')
return padding, padding_algorithm
def _expand_low_nd_padding(padding):
#1d to 2d fake input
if len(padding) == 2:
padding = [0] * 2 + padding
elif len(padding) == 1:
padding = [0] + padding
else:
raise ValueError(
"The size of padding's dimmention should be 1 or 2. But got padding={}".
format(padding))
return padding
def avg_pool1d(x,
kernel_size,
stride=None,
padding=0,
exclusive=True,
ceil_mode=False,
name=None):
"""
This API implements average pooling 1d operation,
See more details in :ref:`api_nn_pooling_AvgPool1d` .
Args:
x (Tensor): The input tensor of pooling operator which is a 3-D tensor with
shape [N, C, L]. where `N` is batch size, `C` is the number of channels,
`L` is the length of the feature. The data type is float32 or float64.
kernel_size (int|list|tuple): The pool kernel size. If pool kernel size is a tuple or list,
it must contain an integer.
stride (int|list|tuple): The pool stride size. If pool stride size is a tuple or list,
it must contain an integer.
padding (string|int|list|tuple): The padding size. Padding could be in one of the following forms.
1. A string in ['valid', 'same'].
2. An int, which means the feature map is zero padded by size of `padding` on every sides.
3. A list[int] or tuple(int) whose length is 1, which means the feature map is zero padded by the size of `padding[0]` on every sides.
4. A list[int] or tuple(int) whose length is 2. It has the form [pad_before, pad_after].
5. A list or tuple of pairs of integers. It has the form [[pad_before, pad_after], [pad_before, pad_after], ...]. Note that, the batch dimension and channel dimension should be [0,0] or (0,0).
The default value is 0.
exclusive (bool): Whether to exclude padding points in average pooling
mode, default is `True`.
ceil_mode (bool): ${ceil_mode_comment}Whether to use the ceil function to calculate output height and width.
If it is set to False, the floor function will be used. The default value is False.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of pooling result. The data type is same as input tensor.
Raises:
ValueError: If `padding` is a string, but not "SAME" or "VALID".
ValueError: If `padding` is "VALID", but `ceil_mode` is True.
ValueError: If `padding` is a list or tuple but its length is greater than 1.
ShapeError: If the input is not a 3-D tensor.
ShapeError: If the output's shape calculated is not greater than 0.
Examples:
.. code-block:: python
import paddle
import paddle.nn.functional as F
import numpy as np
data = paddle.to_tensor(np.random.uniform(-1, 1, [1, 3, 32]).astype(np.float32))
out = F.avg_pool1d(data, kernel_size=2, stride=2, padding=0)
# out shape: [1, 3, 16]
"""
"""NCL to NCHW"""
data_format = "NCHW"
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float32', 'float64'], 'avg_pool1d')
_check_input(x, 3)
x = unsqueeze(x, [2])
kernel_size = utils.convert_to_list(kernel_size, 1, 'kernel_size')
kernel_size = [1] + kernel_size
if stride is None:
stride = kernel_size
else:
stride = utils.convert_to_list(stride, 1, 'pool_stride')
stride = [1] + stride
_check_value_limitation(kernel_size, "kernel_size", min_limit=1e-3)
_check_value_limitation(stride, "stride", min_limit=1e-3)
channel_last = _channel_last("NCL", 1)
padding, padding_algorithm = _update_padding_nd(
padding, 1, channel_last=channel_last, ceil_mode=ceil_mode)
# use 2d to implenment 1d should expand padding in advance.
padding = _expand_low_nd_padding(padding)
if in_dynamic_mode():
output = _C_ops.pool2d(
x, 'pooling_type', 'avg', 'ksize', kernel_size, 'global_pooling',
False, 'strides', stride, 'paddings', padding, 'padding_algorithm',
padding_algorithm, 'use_cudnn', True, 'ceil_mode', ceil_mode,
'use_mkldnn', False, 'exclusive', exclusive, 'data_format',
data_format)
return squeeze(output, [2])
op_type = 'pool2d'
helper = LayerHelper(op_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
helper.append_op(
type=op_type,
inputs={"X": x},
outputs={"Out": pool_out},
attrs={
"pooling_type": 'avg',
"ksize": kernel_size,
"global_pooling": False,
"strides": stride,
"paddings": padding,
"padding_algorithm": padding_algorithm,
"use_cudnn": True,
"ceil_mode": ceil_mode,
"use_mkldnn": False,
"exclusive": exclusive,
"data_format": data_format,
})
return squeeze(pool_out, [2])
def avg_pool2d(x,
kernel_size,
stride=None,
padding=0,
ceil_mode=False,
exclusive=True,
divisor_override=None,
data_format="NCHW",
name=None):
"""
This API implements average pooling 2d operation.
See more details in :ref:`api_nn_pooling_AvgPool2d` .
Args:
x (Tensor): The input tensor of pooling operator which is a 4-D tensor with
shape [N, C, H, W]. The format of input tensor is `"NCHW"` or
`"NHWC"`, where `N` is batch size, `C` is the number of channels,
`H` is the height of the feature, and `W` is the width of the
feature. The data type if float32 or float64.
kernel_size (int|list|tuple): The pool kernel size. If it is a tuple or list,
it must contain two integers, (kernel_size_Height, kernel_size_Width).
Otherwise, the pool kernel size will be a square of an int.
stride (int|list|tuple): The stride size. If it is a tuple or list,
it must contain two integers, (stride_Height, stride_Width).
Otherwise, the stride size will be a square of an int.
padding (string|int|list|tuple): The padding size. Padding could be in one of the following forms.
1. A string in ['valid', 'same'].
2. An int, which means the feature map is zero padded by size of `padding` on every sides.
3. A list[int] or tuple(int) whose length is 2, [pad_height, pad_weight] whose value means the padding size of each dimension.
4. A list[int] or tuple(int) whose length is 4. [pad_height_top, pad_height_bottom, pad_width_left, pad_width_right] whose value means the padding size of each side.
5. A list or tuple of pairs of integers. It has the form [[pad_before, pad_after], [pad_before, pad_after], ...]. Note that, the batch dimension and channel dimension should be [0,0] or (0,0).
The default value is 0.
ceil_mode (bool): when True, will use `ceil` instead of `floor` to compute the output shape
exclusive (bool): Whether to exclude padding points in average pooling
mode, default is `true`.
divisor_override (float): if specified, it will be used as divisor, otherwise kernel_size will be used. Default None.
data_format (string): The data format of the input and output data. An optional string from: `"NCHW"`, `"NHWC"`.
The default is `"NCHW"`. When it is `"NCHW"`, the data is stored in the order of:
`[batch_size, input_channels, input_height, input_width]`.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of pooling result. The data type is same as input tensor.
Raises:
ValueError: If `padding` is a string, but not "SAME" or "VALID".
ValueError: If `padding` is "VALID", but `ceil_mode` is True.
ShapeError: If the output's shape calculated is not greater than 0.
Examples:
.. code-block:: python
import paddle
import paddle.nn.functional as F
import numpy as np
# avg pool2d
x = paddle.to_tensor(np.random.uniform(-1, 1, [1, 3, 32, 32]).astype(np.float32))
out = F.avg_pool2d(x,
kernel_size=2,
stride=2, padding=0)
# out.shape [1, 3, 16, 16]
"""
kernel_size = utils.convert_to_list(kernel_size, 2, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = utils.convert_to_list(stride, 2, 'pool_stride')
_check_value_limitation(kernel_size, "kernel_size", min_limit=1e-3)
_check_value_limitation(stride, "stride", min_limit=1e-3)
channel_last = _channel_last(data_format, 2)
padding, padding_algorithm = _update_padding_nd(
padding, 2, channel_last, ceil_mode=ceil_mode)
if in_dygraph_mode() or _in_legacy_dygraph():
if in_dygraph_mode():
output = _C_ops.final_state_pool2d(
x, kernel_size, stride, padding, ceil_mode, exclusive,
data_format, 'avg', False, False, padding_algorithm)
else:
output = _C_ops.pool2d(
x, 'pooling_type', 'avg', 'ksize', kernel_size,
'global_pooling', False, 'padding_algorithm', padding_algorithm,
'strides', stride, 'paddings', padding, 'use_cudnn', True,
'ceil_mode', ceil_mode, 'use_mkldnn', False, 'exclusive',
exclusive, 'data_format', data_format)
if divisor_override is None:
return output
else:
_check_instance(divisor_override, "divisor_override")
return output * (kernel_size[0] * kernel_size[1]) / divisor_override
op_type = 'pool2d'
helper = LayerHelper(op_type, **locals())
check_variable_and_dtype(x, 'x', ['float32', 'float64'], 'avg_pool2d')
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
helper.append_op(
type=op_type,
inputs={"X": x},
outputs={"Out": pool_out},
attrs={
"pooling_type": "avg",
"ksize": kernel_size,
"global_pooling": False,
"strides": stride,
"paddings": padding,
"padding_algorithm": padding_algorithm,
"use_cudnn": True,
"ceil_mode": ceil_mode,
"use_mkldnn": False,
"exclusive": exclusive,
"data_format": data_format,
})
if divisor_override is None:
return pool_out
else:
_check_instance(divisor_override, "divisor_override")
return pool_out * (kernel_size[0] * kernel_size[1]) / divisor_override
def avg_pool3d(x,
kernel_size,
stride=None,
padding=0,
ceil_mode=False,
exclusive=True,
divisor_override=None,
data_format="NCDHW",
name=None):
"""
This API implements average pooling 3d operation.
See more details in :ref:`api_nn_pooling_AvgPool3d` .
Args:
x (Tensor): The input tensor of pooling operator, which is a 5-D tensor with
shape [N, C, D, H, W], where `N` represents the batch size, `C` represents
the number of channels, `D`, `H` and `W` represent the depth, height and width of the feature respectively.
kernel_size (int|list|tuple): The pool kernel size. If pool kernel size
is a tuple or list, it must contain three integers,
(kernel_size_Depth, kernel_size_Height, kernel_size_Width).
Otherwise, the pool kernel size will be the cube of an int.
stride (int|list|tuple): The pool stride size. If pool stride size is a tuple or list,
it must contain three integers, [stride_Depth, stride_Height, stride_Width).
Otherwise, the pool stride size will be a cube of an int.
padding (string|int|list|tuple): The padding size. Padding could be in one of the following forms.
1. A string in ['valid', 'same'].
2. An int, which means the feature map is zero padded by size of `padding` on every sides.
3. A list[int] or tuple(int) whose length is 3, [pad_depth, pad_height, pad_weight] whose value means the padding size of each dimension.
4. A list[int] or tuple(int) whose length is 6. [pad_depth_front, pad_depth_back, pad_height_top, pad_height_bottom, pad_width_left, pad_width_right] whose value means the padding size of each side.
5. A list or tuple of pairs of integers. It has the form [[pad_before, pad_after], [pad_before, pad_after], ...]. Note that, the batch dimension and channel dimension should be [0,0] or (0,0).
The default value is 0.
ceil_mode (bool): ${ceil_mode_comment}
exclusive (bool): Whether to exclude padding points in average pooling
mode, default is True.
divisor_override (int|float) if specified, it will be used as divisor, otherwise kernel_size will be used. Default None.
data_format (string): The data format of the input and output data. An optional string from: `"NCDHW"`, `"NDHWC"`.
The default is `"NCDHW"`. When it is `"NCDHW"`, the data is stored in the order of:
`[batch_size, input_channels, input_depth, input_height, input_width]`.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of pooling result. The data type is same as input tensor.
Raises:
ValueError: If `padding` is a string, but not "SAME" or "VALID".
ValueError: If `padding` is "VALID", but `ceil_mode` is True.
ShapeError: If the output's shape calculated is not greater than 0.
Examples:
.. code-block:: python
import paddle
import numpy as np
x = paddle.to_tensor(np.random.uniform(-1, 1, [1, 3, 32, 32, 32]).astype(np.float32))
# avg pool3d
out = paddle.nn.functional.avg_pool3d(
x,
kernel_size = 2,
stride = 2,
padding=0)
# out.shape: [1, 3, 16, 16, 16]
"""
kernel_size = utils.convert_to_list(kernel_size, 3, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = utils.convert_to_list(stride, 3, 'pool_stride')
channel_last = _channel_last(data_format, 3)
padding, padding_algorithm = _update_padding_nd(
padding, 3, channel_last=channel_last, ceil_mode=ceil_mode)
_check_value_limitation(kernel_size, "kernel_size", min_limit=1e-3)
_check_value_limitation(stride, "stride", min_limit=1e-3)
if in_dygraph_mode() or _in_legacy_dygraph():
if in_dygraph_mode():
output = _C_ops.final_state_pool3d(
x, kernel_size, stride, padding, ceil_mode, exclusive,
data_format, 'avg', False, False, padding_algorithm)
if _in_legacy_dygraph():
output = _C_ops.pool3d(
x, 'pooling_type', 'avg', 'ksize', kernel_size, 'strides',
stride, 'paddings', padding, 'global_pooling', False,
'padding_algorithm', padding_algorithm, 'use_cudnn', True,
'ceil_mode', ceil_mode, 'use_mkldnn', False, 'exclusive',
exclusive, 'data_format', data_format)
if divisor_override is None:
return output
else:
_check_instance(divisor_override, "divisor_override")
return output * (kernel_size[0] * kernel_size[1] *
kernel_size[2]) / divisor_override
op_type = "pool3d"
helper = LayerHelper(op_type, **locals())
check_variable_and_dtype(x, 'x', ['float32', 'float64'], 'max_pool3d')
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
outputs = {"Out": pool_out}
helper.append_op(
type=op_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": 'avg',
"ksize": kernel_size,
"global_pooling": False,
"strides": stride,
"paddings": padding,
"padding_algorithm": padding_algorithm,
"use_cudnn": True,
"ceil_mode": ceil_mode,
"use_mkldnn": False,
"exclusive": exclusive,
"data_format": data_format,
})
if divisor_override is None:
return pool_out
else:
_check_instance(divisor_override, "divisor_override")
return pool_out * (kernel_size[0] * kernel_size[1] *
kernel_size[2]) / divisor_override
def max_pool1d(x,
kernel_size,
stride=None,
padding=0,
return_mask=False,
ceil_mode=False,
name=None):
"""
This API implements max pooling 1d opereation.
See more details in :ref:`api_nn_pooling_MaxPool1d` .
Args:
x (Tensor): The input tensor of pooling operator which is a 3-D tensor with
shape [N, C, L], where `N` is batch size, `C` is the number of channels,
`L` is the length of the feature. The data type if float32 or float64.
kernel_size (int|list|tuple): The pool kernel size. If pool kernel size is a tuple or list,
it must contain an integer.
stride (int|list|tuple): The pool stride size. If pool stride size is a tuple or list,
it must contain an integer.
padding (string|int|list|tuple): The padding size. Padding could be in one of the following forms.
1. A string in ['valid', 'same'].
2. An integer, which means the feature map is zero padded by size of `padding` on every sides.
3. A list[int] or tuple(int) whose length is 1, which means the feature map is zero padded by the size of `padding[0]` on every sides.
4. A list[int] or tuple(int) whose length is 2. It has the form [pad_before, pad_after].
5. A list or tuple of pairs of integers. It has the form [[pad_before, pad_after], [pad_before, pad_after], ...]. Note that, the batch dimension and channel dimension should be [0,0] or (0,0).
The default value is 0.
return_mask (bool): Whether return the max indices along with the outputs. default is `False`.
ceil_mode (bool): Whether to use the ceil function to calculate output height and width. False is the default.
If it is set to False, the floor function will be used. Default False.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of pooling result. The data type is same as input tensor.
Raises:
ValueError: If `padding` is a string, but not "SAME" or "VALID".
ValueError: If `padding` is "VALID", but `ceil_mode` is True.
ShapeError: If the input is not a 3-D tensor.
ShapeError: If the output's shape calculated is not greater than 0.
Examples:
.. code-block:: python
import paddle
import paddle.nn.functional as F
import numpy as np
data = paddle.to_tensor(np.random.uniform(-1, 1, [1, 3, 32]).astype(np.float32))
pool_out = F.max_pool1d(data, kernel_size=2, stride=2, padding=0)
# pool_out shape: [1, 3, 16]
pool_out, indices = F.max_pool1d(data, kernel_size=2, stride=2, padding=0, return_mask=True)
# pool_out shape: [1, 3, 16], indices shape: [1, 3, 16]
"""
"""NCL to NCHW"""
data_format = "NCHW"
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float32', 'float64'], 'max_pool1d')
_check_input(x, 3)
x = unsqueeze(x, [2])
kernel_size = [1] + utils.convert_to_list(kernel_size, 1, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = [1] + utils.convert_to_list(stride, 1, 'pool_stride')
padding, padding_algorithm = _update_padding_nd(
padding, 1, ceil_mode=ceil_mode)
# use 2d to implenment 1d should expand padding in advance.
padding = _expand_low_nd_padding(padding)
if in_dygraph_mode():
if return_mask:
pool_out = _C_ops.final_state_max_pool2d_with_index(
x, kernel_size, stride, padding, False, False)
return (squeeze(pool_out[0], [2]),
squeeze(pool_out[1],
[2])) if return_mask else squeeze(pool_out[0], [2])
else:
pool_out = _C_ops.final_state_pool2d(
x, kernel_size, stride, padding, ceil_mode, True, data_format,
'max', False, False, padding_algorithm)
return squeeze(pool_out, [2])
if _in_legacy_dygraph():
if return_mask:
pool_out = _C_ops.max_pool2d_with_index(
x, 'ksize', kernel_size, 'global_pooling', False, 'strides',
stride, 'paddings', padding, 'padding_algorithm',
padding_algorithm, 'use_cudnn', True, 'ceil_mode', ceil_mode,
'use_mkldnn', False, 'exclusive', True, 'data_format',
data_format)
return (squeeze(pool_out[0], [2]),
squeeze(pool_out[1],
[2])) if return_mask else squeeze(pool_out[0], [2])
else:
pool_out = _C_ops.pool2d(
x, 'pooling_type', 'max', 'ksize', kernel_size,
'global_pooling', False, 'padding_algorithm', padding_algorithm,
'strides', stride, 'paddings', padding, 'use_cudnn', True,
'ceil_mode', ceil_mode, 'use_mkldnn', False, 'exclusive', True,
'data_format', data_format)
return squeeze(pool_out, [2])
op_type = 'max_pool2d_with_index' if return_mask else "pool2d"
helper = LayerHelper(op_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
mask = helper.create_variable_for_type_inference('int32')
outputs = {"Out": pool_out, "Mask": mask}
helper.append_op(
type=op_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": 'max',
"ksize": kernel_size,
"global_pooling": False,
"strides": stride,
"paddings": padding,
"padding_algorithm": padding_algorithm,
"use_cudnn": True,
"ceil_mode": ceil_mode,
"use_mkldnn": False,
"exclusive": True,
"data_format": data_format,
})
return (squeeze(pool_out, [2]),
squeeze(mask, [2])) if return_mask else squeeze(pool_out, [2])
def _unpool_output_size(x, kernel_size, stride, padding, output_size):
input_size = x.shape
default_size = []
for d in range(len(kernel_size)):
default_size.append((input_size[-len(kernel_size) + d] - 1) * stride[d]
+ kernel_size[d] - 2 * padding[d])
if output_size is None:
ret = default_size
else:
if len(output_size) == len(kernel_size) + 2:
output_size = output_size[2:]
if len(output_size) != len(kernel_size):
raise ValueError(
"output_size should be a sequence containing "
"{} or {} elements, but it has a length of '{}'".format(
len(kernel_size), len(kernel_size) + 2, len(output_size)))
for d in range(len(kernel_size)):
min_size = default_size[d] - stride[d]
max_size = default_size[d] + stride[d]
if not (min_size < output_size[d] < max_size):
raise ValueError(
'invalid output_size "{}" (dim {} must be between {} and {})'.
format(output_size, d, min_size, max_size))
ret = output_size
return ret
def max_unpool1d(x,
indices,
kernel_size,
stride=None,
padding=0,
data_format="NCL",
output_size=None,
name=None):
r"""
This API implements max unpooling 1d opereation.
`max_unpool1d` accepts the output of `max_pool1d` as input,
including the indices of the maximum value and calculate the partial inverse.
All non-maximum values are set to zero.
- Input: :math:`(N, C, L_{in})`
- Output: :math:`(N, C, L_{out})`, where
.. math::
L_{out} = (L_{in} - 1) * stride - 2 * padding + kernel\_size
or as given by :attr:`output_size` in the call operator.
Args:
x (Tensor): The input tensor of unpooling operator which is a 3-D tensor with
shape [N, C, L]. The format of input tensor is `"NCL"`,
where `N` is batch size, `C` is the number of channels, `L` is
the length of the feature. The data type is float32 or float64.
indices (Tensor): The indices given out by maxpooling1d which is a 3-D tensor with
shape [N, C, L]. The format of input tensor is `"NCL"` ,
where `N` is batch size, `C` is the number of channels, `L` is
the length of the featuree. The data type is float32 or float64.
kernel_size (int|list|tuple): The unpool kernel size. If unpool kernel size is a tuple or list,
it must contain an integer.
stride (int|list|tuple): The unpool stride size. If unpool stride size is a tuple or list,
it must contain an integer.
padding (int | tuple): Padding that was added to the input.
output_size(list|tuple, optional): The target output size. If output_size is not specified,
the actual output shape will be automatically calculated by (input_shape,
kernel_size, stride, padding).
data_format (string): The data format of the input and output data.
The default is `"NCL"`. When it is `"NCL"`, the data is stored in the order of:
`[batch_size, input_channels, input_length]`.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of unpooling result.
Examples:
.. code-block:: python
import paddle
import paddle.nn.functional as F
data = paddle.rand(shape=[1, 3, 16])
pool_out, indices = F.max_pool1d(data, kernel_size=2, stride=2, padding=0, return_mask=True)
# pool_out shape: [1, 3, 8], indices shape: [1, 3, 8]
unpool_out = F.max_unpool1d(pool_out, indices, kernel_size=2, padding=0)
# unpool_out shape: [1, 3, 16]
"""
"""NCL to NCHW"""
if data_format not in ["NCL"]:
raise ValueError("Attr(data_format) should be 'NCL'. Received "
"Attr(data_format): %s." % str(data_format))
data_format = "NCHW"
x = unsqueeze(x, [2])
indices = unsqueeze(indices, [2])
kernel_size = [1] + utils.convert_to_list(kernel_size, 1, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = [1] + utils.convert_to_list(stride, 1, 'pool_stride')
padding, padding_algorithm = _update_padding_nd(padding, 1)
# use 2d to implenment 1d should expand padding in advance.
padding = _expand_low_nd_padding(padding)
output_size = _unpool_output_size(x, kernel_size, stride, padding,
output_size)
if in_dynamic_mode():
output = _C_ops.unpool(x, indices, 'unpooling_type', 'max', 'ksize',
kernel_size, 'strides', stride, 'paddings',
padding, "output_size", output_size,
"data_format", data_format)
return squeeze(output, [2])
op_type = "unpool"
helper = LayerHelper(op_type, **locals())
dtype = helper.input_dtype(input_param_name="x")
unpool_out = helper.create_variable_for_type_inference(dtype)
helper.append_op(
type=op_type,
inputs={"X": x,
"Indices": indices},
outputs={"Out": unpool_out},
attrs={
"unpooling_type": "max",
"ksize": kernel_size,
"strides": stride,
"paddings": padding,
"output_size": output_size
})
return squeeze(unpool_out, [2])
def max_unpool2d(x,
indices,
kernel_size,
stride=None,
padding=0,
data_format="NCHW",
output_size=None,
name=None):
r"""
This API implements max unpooling 2d opereation.
See more details in :ref:`api_nn_pooling_MaxUnPool2D` .
Args:
x (Tensor): The input tensor of unpooling operator which is a 4-D tensor with
shape [N, C, H, W]. The format of input tensor is `"NCHW"`,
where `N` is batch size, `C` is the number of channels,
`H` is the height of the feature, and `W` is the width of the
feature. The data type if float32 or float64.
indices (Tensor): The indices given out by maxpooling2d which is a 4-D tensor with
shape [N, C, H, W]. The format of input tensor is `"NCHW"` ,
where `N` is batch size, `C` is the number of channels,
`H` is the height of the feature, and `W` is the width of the
feature. The data type if float32 or float64.
kernel_size (int|list|tuple): The unpool kernel size. If unpool kernel size is a tuple or list,
it must contain an integer.
stride (int|list|tuple): The unpool stride size. If unpool stride size is a tuple or list,
it must contain an integer.
kernel_size (int|tuple): Size of the max unpooling window.
padding (int | tuple): Padding that was added to the input.
output_size(list|tuple, optional): The target output size. If output_size is not specified,
the actual output shape will be automatically calculated by (input_shape,
kernel_size, padding).
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
- Input: :math:`(N, C, H_{in}, W_{in})`
- Output: :math:`(N, C, H_{out}, W_{out})`, where
.. math::
H_{out} = (H_{in} - 1) \times \text{stride[0]} - 2 \times \text{padding[0]} + \text{kernel\_size[0]}
.. math::
W_{out} = (W_{in} - 1) \times \text{stride[1]} - 2 \times \text{padding[1]} + \text{kernel\_size[1]}
or as given by :attr:`output_size` in the call operator
Returns:
Tensor: The output tensor of unpooling result.
Raises:
ValueError: If the input is not a 4-D tensor.
ValueError: If indeces shape is not equal input shape.
Examples:
.. code-block:: python
import paddle
import paddle.nn.functional as F
data = paddle.rand(shape=[1,1,6,6])
pool_out, indices = F.max_pool2d(data, kernel_size=2, stride=2, padding=0, return_mask=True)
# pool_out shape: [1, 1, 3, 3], indices shape: [1, 1, 3, 3]
unpool_out = F.max_unpool2d(pool_out, indices, kernel_size=2, padding=0)
# unpool_out shape: [1, 1, 6, 6]
# specify a different output size than input size
unpool_out = F.max_unpool2d(pool_out, indices, kernel_size=2, padding=0, output_size=[7,7])
# unpool_out shape: [1, 1, 7, 7]
"""
kernel_size = utils.convert_to_list(kernel_size, 2, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = utils.convert_to_list(stride, 2, 'pool_stride')
padding = utils.convert_to_list(padding, 2, 'padding')
if data_format not in ["NCHW"]:
raise ValueError("Attr(data_format) should be 'NCHW'. Received "
"Attr(data_format): %s." % str(data_format))
output_size = _unpool_output_size(x, kernel_size, stride, padding,
output_size)
if in_dynamic_mode():
output = _C_ops.unpool(x, indices, 'unpooling_type', 'max', 'ksize',
kernel_size, 'strides', stride, 'paddings',
padding, "output_size", output_size,
"data_format", data_format)
return output
op_type = "unpool"
helper = LayerHelper(op_type, **locals())
dtype = helper.input_dtype(input_param_name="x")
unpool_out = helper.create_variable_for_type_inference(dtype)
helper.append_op(
type=op_type,
inputs={"X": x,
"Indices": indices},
outputs={"Out": unpool_out},
attrs={
"unpooling_type": "max",
"ksize": kernel_size,
"strides": stride,
"paddings": padding,
"output_size": output_size
})
return unpool_out
def max_unpool3d(x,
indices,
kernel_size,
stride=None,
padding=0,
data_format="NCDHW",
output_size=None,
name=None):
r"""
This API implements max unpooling 3d opereation.
`max_unpool3d` accepts the output of `max_pool3d` as input,
including the indices of the maximum value and calculate the partial inverse.
All non-maximum values are set to zero.
- Input: :math:`(N, C, D_{in}, H_{in}, W_{in})`
- Output: :math:`(N, C, D_{out}, H_{out}, W_{out})`, where
.. math::
D_{out} = (D_{in} - 1) * stride[0] - 2 * padding[0] + kernel\_size[0]
.. math::
H_{out} = (H_{in} - 1) * stride[1] - 2 * padding[1] + kernel\_size[1]
.. math::
W_{out} = (W_{in} - 1) * stride[2] - 2 * padding[2] + kernel\_size[2]
or as given by :attr:`output_size` in the call operator
Args:
x (Tensor): The input tensor of unpooling operator which is a 5-D tensor with
shape [N, C, D, H, W]. The format of input tensor is `"NCDHW"`,
where `N` is batch size, `C` is the number of channels, `D` is
the depth of the feature, `H` is the height of the feature,
and `W` is the width of the feature. The data type is float32 or float64.
indices (Tensor): The indices given out by maxpooling3d which is a 5-D tensor with
shape [N, C, D, H, W]. The format of input tensor is `"NCDHW"` ,
where `N` is batch size, `C` is the number of channels, `D` is
the depth of the feature, `H` is the height of the feature,
and `W` is the width of the feature. The data type is float32 or float64.
kernel_size (int|list|tuple): The unpool kernel size. If unpool kernel size is a tuple or list,
it must contain an integer.
stride (int|list|tuple): The unpool stride size. If unpool stride size is a tuple or list,
it must contain an integer.
padding (int | tuple): Padding that was added to the input.
output_size(list|tuple, optional): The target output size. If output_size is not specified,
the actual output shape will be automatically calculated by (input_shape,
kernel_size, stride, padding).
data_format (string): The data format of the input and output data.
The default is `"NCDHW"`. When it is `"NCDHW"`, the data is stored in the order of:
`[batch_size, input_channels, input_depth, input_height, input_width]`.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of unpooling result.
Examples:
.. code-block:: python
import paddle
import paddle.nn.functional as F
data = paddle.rand(shape=[1, 1, 4, 4, 6])
pool_out, indices = F.max_pool3d(data, kernel_size=2, stride=2, padding=0, return_mask=True)
# pool_out shape: [1, 1, 2, 2, 3], indices shape: [1, 1, 2, 2, 3]
unpool_out = F.max_unpool3d(pool_out, indices, kernel_size=2, padding=0)
# unpool_out shape: [1, 1, 4, 4, 6]
"""
kernel_size = utils.convert_to_list(kernel_size, 3, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = utils.convert_to_list(stride, 3, 'pool_stride')
padding = utils.convert_to_list(padding, 3, 'padding')
if data_format not in ["NCDHW"]:
raise ValueError("Attr(data_format) should be 'NCDHW'. Received "
"Attr(data_format): %s." % str(data_format))
output_size = _unpool_output_size(x, kernel_size, stride, padding,
output_size)
if in_dynamic_mode():
output = _C_ops.unpool3d(x, indices, 'unpooling_type', 'max', 'ksize',
kernel_size, 'strides', stride, 'paddings',
padding, "output_size", output_size,
"data_format", data_format)
return output
op_type = "unpool3d"
helper = LayerHelper(op_type, **locals())
dtype = helper.input_dtype(input_param_name="x")
unpool_out = helper.create_variable_for_type_inference(dtype)
helper.append_op(
type=op_type,
inputs={"X": x,
"Indices": indices},
outputs={"Out": unpool_out},
attrs={
"unpooling_type": "max",
"ksize": kernel_size,
"strides": stride,
"paddings": padding,
"output_size": output_size
})
return unpool_out
def max_pool2d(x,
kernel_size,
stride=None,
padding=0,
return_mask=False,
ceil_mode=False,
data_format="NCHW",
name=None):
kernel_size = utils.convert_to_list(kernel_size, 2, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = utils.convert_to_list(stride, 2, 'pool_stride')
if data_format not in ["NCHW", "NHWC"]:
raise ValueError(
"Attr(data_format) should be 'NCHW' or 'NHWC'. Received "
"Attr(data_format): %s." % str(data_format))
channel_last = True if data_format == "NHWC" else False
padding, padding_algorithm = _update_padding_nd(
padding, num_dims=2, channel_last=channel_last, ceil_mode=ceil_mode)
if data_format == "NHWC" and return_mask:
raise ValueError(
"When setting return_mask to true, data_format must be set to NCHW in API:max_pool2d"
)
if in_dygraph_mode():
if return_mask:
output = _C_ops.final_state_max_pool2d_with_index(
x, kernel_size, stride, padding, False, False)
return output if return_mask else output[0]
else:
return _C_ops.final_state_pool2d(
x, kernel_size, stride, padding, ceil_mode, True, data_format,
'max', False, False, padding_algorithm)
if _in_legacy_dygraph():
if return_mask:
output = _C_ops.max_pool2d_with_index(
x, 'ksize', kernel_size, 'global_pooling', False, 'strides',
stride, 'paddings', padding, 'padding_algorithm',
padding_algorithm, 'use_cudnn', True, 'ceil_mode', ceil_mode,
'use_mkldnn', False, 'exclusive', True, 'data_format',
data_format)
return output if return_mask else output[0]
else:
output = _C_ops.pool2d(
x, 'pooling_type', 'max', 'ksize', kernel_size,
'global_pooling', False, 'padding_algorithm', padding_algorithm,
'strides', stride, 'paddings', padding, 'use_cudnn', True,
'ceil_mode', ceil_mode, 'use_mkldnn', False, 'exclusive', True,
'data_format', data_format)
return output
op_type = 'max_pool2d_with_index' if return_mask else "pool2d"
helper = LayerHelper(op_type, **locals())
check_variable_and_dtype(x, 'x', ['float16', 'float32', 'float64'],
'max_pool2d')
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
mask = helper.create_variable_for_type_inference("int32")
outputs = {"Out": pool_out, "Mask": mask}
helper.append_op(
type=op_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": 'max',
"ksize": kernel_size,
"global_pooling": False,
"strides": stride,
"paddings": padding,
"padding_algorithm": padding_algorithm,
"use_cudnn": True,
"ceil_mode": ceil_mode,
"use_mkldnn": False,
"exclusive": True,
"data_format": data_format,
})
return (pool_out, mask) if return_mask else pool_out
def max_pool3d(x,
kernel_size,
stride=None,
padding=0,
return_mask=False,
ceil_mode=False,
data_format="NCDHW",
name=None):
"""
This API implements max pooling 2d operation.
See more details in :ref:`api_nn_pooling_MaxPool3d` .
Args:
x (Tensor): The input tensor of pooling operator, which is a 5-D tensor with
shape [N, C, D, H, W]. The format of input tensor is `"NCDHW"` or `"NDHWC"`, where N represents batch size, C represents the number of channels, D, H and W represent the depth, height and width of the feature respectively.
kernel_size (int|list|tuple): The pool kernel size. If the kernel size
is a tuple or list, it must contain three integers,
(kernel_size_Depth, kernel_size_Height, kernel_size_Width).
Otherwise, the pool kernel size will be the cube of an int.
stride (int|list|tuple): The pool stride size. If pool stride size is a tuple or list,
it must contain three integers, [stride_Depth, stride_Height, stride_Width).
Otherwise, the pool stride size will be a cube of an int.
padding (string|int|list|tuple): The padding size. Padding could be in one of the following forms.
1. A string in ['valid', 'same'].
2. An int, which means the feature map is zero padded by size of `padding` on every sides.
3. A list[int] or tuple(int) whose length is 3, [pad_depth, pad_height, pad_weight] whose value means the padding size of each dimension.
4. A list[int] or tuple(int) whose length is 6. [pad_depth_front, pad_depth_back, pad_height_top, pad_height_bottom, pad_width_left, pad_width_right] whose value means the padding size of each side.
5. A list or tuple of pairs of integers. It has the form [[pad_before, pad_after], [pad_before, pad_after], ...]. Note that, the batch dimension and channel dimension should be [0,0] or (0,0).
The default value is 0.
ceil_mode (bool): ${ceil_mode_comment}
return_mask (bool): Whether to return the max indices along with the outputs. Default False. Only support "NDCHW" data_format.
data_format (string): The data format of the input and output data. An optional string from: `"NCDHW"`, `"NDHWC"`.
The default is `"NCDHW"`. When it is `"NCDHW"`, the data is stored in the order of:
`[batch_size, input_channels, input_depth, input_height, input_width]`.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of pooling result. The data type is same as input tensor.
Raises:
ValueError: If `padding` is a string, but not "SAME" or "VALID".
ValueError: If `padding` is "VALID", but `ceil_mode` is True.
ShapeError: If the output's shape calculated is not greater than 0.
Examples:
.. code-block:: python
import paddle
import paddle.nn.functional as F
# max pool3d
x = paddle.uniform([1, 3, 32, 32, 32])
output = F.max_pool3d(x,
kernel_size=2,
stride=2, padding=0)
# output.shape [1, 3, 16, 16, 16]
# for return_mask=True
x = paddle.uniform([1, 3, 32, 32, 32])
output, max_indices = paddle.nn.functional.max_pool3d(x,
kernel_size = 2,
stride = 2,
padding=0,
return_mask=True)
# output.shape [1, 3, 16, 16, 16], max_indices.shape [1, 3, 16, 16, 16]
"""
kernel_size = utils.convert_to_list(kernel_size, 3, 'pool_size')
if stride is None:
stride = kernel_size
else:
stride = utils.convert_to_list(stride, 3, 'pool_stride')
channel_last = _channel_last(data_format, 3)
padding, padding_algorithm = _update_padding_nd(
padding, 3, channel_last=channel_last, ceil_mode=ceil_mode)
if data_format == "NDHWC" and return_mask:
raise ValueError(
"When setting return_mask to true, data_format must be set to NCDHW in API:max_pool3d"
)
if in_dygraph_mode():
if return_mask:
output = _C_ops.final_state_max_pool3d_with_index(
x, kernel_size, stride, padding, False, False)
return output if return_mask else output[0]
else:
return _C_ops.final_state_pool3d(
x, kernel_size, stride, padding, ceil_mode, True, data_format,
'max', False, False, padding_algorithm)
if _in_legacy_dygraph():
if return_mask:
output = _C_ops.max_pool3d_with_index(
x, 'pooling_type', 'max', 'ksize', kernel_size, 'strides',
stride, 'paddings', padding, 'global_pooling', False,
'padding_algorithm', padding_algorithm, 'use_cudnn', True,
'ceil_mode', ceil_mode, 'use_mkldnn', False, 'exclusive', True,
'data_format', data_format)
return output if return_mask else output[0]
else:
output = _C_ops.pool3d(
x, 'pooling_type', 'max', 'ksize', kernel_size,
'global_pooling', False, 'padding_algorithm', padding_algorithm,
'strides', stride, 'paddings', padding, 'use_cudnn', True,
'ceil_mode', ceil_mode, 'use_mkldnn', False, 'exclusive', True,
'data_format', data_format)
return output
op_type = "max_pool3d_with_index" if return_mask else "pool3d"
helper = LayerHelper(op_type, **locals())
check_variable_and_dtype(x, 'x', ['float32', 'float64'], 'max_pool3d')
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
mask = helper.create_variable_for_type_inference('int32')
outputs = {"Out": pool_out, "Mask": mask}
helper.append_op(
type=op_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": 'max',
"ksize": kernel_size,
"global_pooling": False,
"strides": stride,
"paddings": padding,
"padding_algorithm": padding_algorithm,
"use_cudnn": True,
"ceil_mode": ceil_mode,
"use_mkldnn": False,
"exclusive": False,
"data_format": data_format,
})
return (pool_out, mask) if return_mask else pool_out
def adaptive_avg_pool1d(x, output_size, name=None):
"""
This API implements adaptive average pooling 1d operation.
See more details in :ref:`api_nn_pooling_AdaptiveAvgPool1d` .
Args:
x (Tensor): The input tensor of pooling operator, which is a 3-D tensor
with shape [N, C, L]. The format of input tensor is NCL,
where N is batch size, C is the number of channels, L is the
length of the feature. The data type is float32 or float64.
output_size (int): The target output size. It must be an integer.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of adaptive average pooling result. The data type is same
as input tensor.
Examples:
.. code-block:: python
:name: code-example1
# average adaptive pool1d
# suppose input data in shape of [N, C, L], `output_size` is m or [m],
# output shape is [N, C, m], adaptive pool divide L dimension
# of input data into m grids averagely and performs poolings in each
# grid to get output.
# adaptive max pool performs calculations as follow:
#
# for i in range(m):
# lstart = floor(i * L / m)
# lend = ceil((i + 1) * L / m)
# output[:, :, i] = sum(input[:, :, lstart: lend])/(lstart - lend)
#
import paddle
import paddle.nn.functional as F
data = paddle.uniform([1, 3, 32])
pool_out = F.adaptive_avg_pool1d(data, output_size=16)
# pool_out shape: [1, 3, 16])
"""
pool_type = 'avg'
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float16', 'float32', 'float64'],
'adaptive_pool2d')
check_type(output_size, 'pool_size', (int), 'adaptive_pool1d')
_check_input(x, 3)
pool_size = [1] + utils.convert_to_list(output_size, 1, 'pool_size')
x = unsqueeze(x, [2])
if in_dynamic_mode():
pool_out = _C_ops.pool2d(x, 'pooling_type', pool_type, 'ksize',
pool_size, 'adaptive', True)
return squeeze(pool_out, [2])
l_type = "pool2d"
helper = LayerHelper(l_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
outputs = {"Out": pool_out}
helper.append_op(
type=l_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": pool_type,
"ksize": pool_size,
"adaptive": True,
})
return squeeze(pool_out, [2])
def adaptive_avg_pool2d(x, output_size, data_format='NCHW', name=None):
"""
This API implements adaptive average pooling 2d operation.
See more details in :ref:`api_nn_pooling_AdaptiveAvgPool2d` .
Args:
x (Tensor): The input tensor of adaptive avg pool2d operator, which is a 4-D tensor.
The data type can be float32 or float64.
output_size (int|list|tuple): The pool kernel size. If pool kernel size is a tuple or list,
it must contain two element, (H, W). H and W can be either a int, or None which means
the size will be the same as that of the input.
data_format (str): The data format of the input and output data. An optional string
from: "NCHW", "NHWC". The default is "NCHW". When it is "NCHW", the data is stored in
the order of: [batch_size, input_channels, input_height, input_width].
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of avg adaptive pool2d result. The data type is same as input tensor.
Raises:
ValueError: If `data_format` is not "NCHW" or "NHWC".
Examples:
.. code-block:: python
# adaptive avg pool2d
# suppose input data in shape of [N, C, H, W], `output_size` is [m, n],
# output shape is [N, C, m, n], adaptive pool divide H and W dimensions
# of input data into m * n grids averagely and performs poolings in each
# grid to get output.
# adaptive avg pool performs calculations as follow:
#
# for i in range(m):
# for j in range(n):
# hstart = floor(i * H / m)
# hend = ceil((i + 1) * H / m)
# wstart = floor(i * W / n)
# wend = ceil((i + 1) * W / n)
# output[:, :, i, j] = avg(input[:, :, hstart: hend, wstart: wend])
#
import paddle
import numpy as np
input_data = np.random.rand(2, 3, 32, 32)
x = paddle.to_tensor(input_data)
# x.shape is [2, 3, 32, 32]
out = paddle.nn.functional.adaptive_avg_pool2d(
x = x,
output_size=[3, 3])
# out.shape is [2, 3, 3, 3]
"""
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float16', 'float32', 'float64'],
'adaptive_avg_pool2d')
check_type(data_format, 'data_format', str, 'adaptive_avg_pool2d')
if data_format not in ["NCHW", "NHWC"]:
raise ValueError(
"Attr(data_format) should be 'NCHW' or 'NHWC'. Received "
"Attr(data_format): %s." % str(data_format))
if data_format == "NCHW":
in_h, in_w = x.shape[2:4]
else:
in_h, in_w = x.shape[1:3]
if isinstance(output_size, int):
output_size = utils.convert_to_list(output_size, 2, 'output_size')
else:
output_size = list(output_size)
if output_size[0] == None:
output_size[0] = in_h
if output_size[1] == None:
output_size[1] = in_w
if in_dygraph_mode():
return _C_ops.final_state_pool2d_gpudnn_unused(
x, output_size, [1, 1], [0, 0], False, True, data_format, 'avg',
False, True, "EXPLICIT")
if _in_legacy_dygraph():
return _C_ops.pool2d(x, 'pooling_type', 'avg', 'ksize', output_size,
'global_pooling', False, 'adaptive', True,
'data_format', data_format)
l_type = 'pool2d'
helper = LayerHelper(l_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
outputs = {"Out": pool_out}
helper.append_op(
type=l_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": "avg",
"ksize": output_size,
"adaptive": True,
"data_format": data_format,
})
return pool_out
def adaptive_avg_pool3d(x, output_size, data_format='NCDHW', name=None):
"""
This API implements adaptive average pooling 3d operation.
See more details in :ref:`api_nn_pooling_AdaptiveAvgPool3d` .
Args:
x (Tensor): The input tensor of adaptive avg pool3d operator, which is a 5-D tensor.
The data type can be float32, float64.
output_size (int|list|tuple): The pool kernel size. If pool kernel size is a tuple or list,
it must contain three elements, (D, H, W). D, H and W can be either a int, or None which means
the size will be the same as that of the input.
data_format (str): The data format of the input and output data. An optional string
from: "NCDHW", "NDHWC". The default is "NCDHW". When it is "NCDHW", the data is stored in
the order of: [batch_size, input_channels, input_depth, input_height, input_width].
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of avg adaptive pool3d result. The data type is same as input tensor.
Raises:
ValueError: If `data_format` is not "NCDHW" or "NDHWC".
Examples:
.. code-block:: python
# adaptive avg pool3d
# suppose input data in shape of [N, C, D, H, W], `output_size` is [l, m, n],
# output shape is [N, C, l, m, n], adaptive pool divide D, H and W dimensions
# of input data into l * m * n grids averagely and performs poolings in each
# grid to get output.
# adaptive avg pool performs calculations as follow:
#
# for i in range(l):
# for j in range(m):
# for k in range(n):
# dstart = floor(i * D / l)
# dend = ceil((i + 1) * D / l)
# hstart = floor(j * H / m)
# hend = ceil((j + 1) * H / m)
# wstart = floor(k * W / n)
# wend = ceil((k + 1) * W / n)
# output[:, :, i, j, k] =
# avg(input[:, :, dstart:dend, hstart: hend, wstart: wend])
import paddle
import numpy as np
input_data = np.random.rand(2, 3, 8, 32, 32)
x = paddle.to_tensor(input_data)
# x.shape is [2, 3, 8, 32, 32]
out = paddle.nn.functional.adaptive_avg_pool3d(
x = x,
output_size=[3, 3, 3])
# out.shape is [2, 3, 3, 3, 3]
"""
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float32', 'float64'],
'adaptive_avg_pool3d')
check_type(data_format, 'data_format', str, 'adaptive_avg_pool3d')
if data_format not in ["NCDHW", "NDHWC"]:
raise ValueError(
"Attr(data_format) should be 'NCDHW' or 'NDHWC'. Received "
"Attr(data_format): %s." % str(data_format))
if data_format == "NCDHW":
in_l, in_h, in_w = x.shape[2:5]
else:
in_l, in_h, in_w = x.shape[1:4]
if isinstance(output_size, int):
output_size = utils.convert_to_list(output_size, 3, 'output_size')
else:
output_size = list(output_size)
if output_size[0] == None:
output_size[0] = in_l
if output_size[1] == None:
output_size[1] = in_h
if output_size[2] == None:
output_size[2] = in_w
if in_dynamic_mode():
return _C_ops.pool3d(x, 'pooling_type', 'avg', 'ksize', output_size,
'global_pooling', False, 'adaptive', True,
'data_format', data_format)
l_type = 'pool3d'
helper = LayerHelper(l_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
outputs = {"Out": pool_out}
helper.append_op(
type=l_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": "avg",
"ksize": output_size,
"adaptive": True,
"data_format": data_format,
})
return pool_out
def adaptive_max_pool1d(x, output_size, return_mask=False, name=None):
"""
This API implements adaptive max pooling 1d operation.
See more details in :ref:`api_nn_pooling_AdaptiveMaxPool1d` .
Args:
x (Tensor): The input tensor of pooling operator, which is a 3-D tensor
with shape [N, C, L]. The format of input tensor is NCL,
where N is batch size, C is the number of channels, L is the
length of the feature. The data type is float32 or float64.
output_size (int): The pool kernel size. The value should be an integer.
return_mask (bool): If true, the index of max pooling point will be returned along
with outputs. It cannot be set in average pooling type. Default False.
name(str, optional): For detailed information, please refer
to :ref:`api_guide_Name`. Usually name is no need to set and
None by default.
Returns:
Tensor: The output tensor of adaptive pooling result. The data type is same
as input tensor.
Raises:
ValueError: 'output_size' should be an integer.
Examples:
.. code-block:: python
# max adaptive pool1d
# suppose input data in shape of [N, C, L], `output_size` is m or [m],
# output shape is [N, C, m], adaptive pool divide L dimension
# of input data into m grids averagely and performs poolings in each
# grid to get output.
# adaptive max pool performs calculations as follow:
#
# for i in range(m):
# lstart = floor(i * L / m)
# lend = ceil((i + 1) * L / m)
# output[:, :, i] = max(input[:, :, lstart: lend])
#
import paddle
import paddle.nn.functional as F
import numpy as np
data = paddle.to_tensor(np.random.uniform(-1, 1, [1, 3, 32]).astype(np.float32))
pool_out = F.adaptive_max_pool1d(data, output_size=16)
# pool_out shape: [1, 3, 16])
pool_out, indices = F.adaptive_max_pool1d(data, output_size=16, return_mask=True)
# pool_out shape: [1, 3, 16] indices shape: [1, 3, 16]
"""
pool_type = 'max'
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float32', 'float64'],
'adaptive_max_pool1d')
check_type(output_size, 'pool_size', int, 'adaptive_max_pool1d')
check_type(return_mask, 'return_mask', bool, 'adaptive_max_pool1d')
_check_input(x, 3)
pool_size = [1] + utils.convert_to_list(output_size, 1, 'pool_size')
x = unsqueeze(x, [2])
if in_dynamic_mode():
pool_out = _C_ops.max_pool2d_with_index(
x, 'pooling_type', pool_type, 'ksize', pool_size, 'adaptive', True)
return (squeeze(pool_out[0], [2]), squeeze(
pool_out[1], [2])) if return_mask else squeeze(pool_out[0], [2])
l_type = 'max_pool2d_with_index'
helper = LayerHelper(l_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
mask = helper.create_variable_for_type_inference('int32')
outputs = {"Out": pool_out, "Mask": mask}
helper.append_op(
type=l_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": pool_type,
"ksize": pool_size,
"adaptive": True,
})
return (squeeze(pool_out, [2]),
squeeze(mask, [2])) if return_mask else squeeze(pool_out, [2])
def adaptive_max_pool2d(x, output_size, return_mask=False, name=None):
"""
This operation applies a 2D adaptive max pooling on input tensor.
See more details in :ref:`api_nn_pooling_AdaptiveMaxPool2d` .
Args:
x (Tensor): The input tensor of adaptive max pool2d operator, which is a 4-D tensor. The data type can be float16, float32, float64, int32 or int64.
output_size (int|list|tuple): The pool kernel size. If pool kernel size is a tuple or list, it must contain two elements, (H, W). H and W can be either a int, or None which means the size will be the same as that of the input.
return_mask (bool): If true, the index of max pooling point will be returned along with outputs. Default False.
name(str, optional): For detailed information, please refer to :ref:`api_guide_Name`. Usually name is no need to set and None by default.
Returns:
Tensor: The output tensor of adaptive max pool2d result. The data type is same as input tensor.
Examples:
.. code-block:: python
# max adaptive pool2d
# suppose input data in the shape of [N, C, H, W], `output_size` is [m, n]
# output shape is [N, C, m, n], adaptive pool divide H and W dimensions
# of input data into m*n grids averagely and performs poolings in each
# grid to get output.
# adaptive max pool performs calculations as follow:
#
# for i in range(m):
# for j in range(n):
# hstart = floor(i * H / m)
# hend = ceil((i + 1) * H / m)
# wstart = floor(i * W / n)
# wend = ceil((i + 1) * W / n)
# output[:, :, i, j] = max(input[:, :, hstart: hend, wstart: wend])
#
import paddle
import numpy as np
input_data = np.random.rand(2, 3, 32, 32)
x = paddle.to_tensor(input_data)
# x.shape is [2, 3, 32, 32]
out = paddle.nn.functional.adaptive_max_pool2d(
x = x,
output_size=[3, 3])
# out.shape is [2, 3, 3, 3]
"""
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float32', 'float64'],
'adaptive_max_pool2d')
check_type(return_mask, 'return_mask', bool, 'adaptive_max_pool2d')
#check_type(output_size, 'pool_size', (int), 'adaptive_max_pool2d')
_check_input(x, 4)
in_h, in_w = x.shape[2:4]
if isinstance(output_size, int):
output_size = utils.convert_to_list(output_size, 2, 'output_size')
else:
output_size = list(output_size)
if output_size[0] == None:
output_size[0] = in_h
if output_size[1] == None:
output_size[1] = in_w
if in_dynamic_mode():
pool_out = _C_ops.max_pool2d_with_index(
x, 'pooling_type', 'max', 'ksize', output_size, 'adaptive', True)
return pool_out if return_mask else pool_out[0]
l_type = 'max_pool2d_with_index'
helper = LayerHelper(l_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
mask = helper.create_variable_for_type_inference('int32')
outputs = {"Out": pool_out, "Mask": mask}
helper.append_op(
type=l_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": 'max',
"ksize": output_size,
"adaptive": True,
})
#return (pool_out, mask) if return_mask else pool_out
return pool_out
def adaptive_max_pool3d(x, output_size, return_mask=False, name=None):
"""
This operation applies a 3D adaptive max pooling on input tensor.
See more details in :ref:`api_nn_pooling_AdaptiveMaxPool3d` .
Args:
x (Tensor): The input tensor of adaptive max pool3d operator, which is a 5-D tensor. The data type can be float32, float64.
output_size (int|list|tuple): The pool kernel size. If pool kernel size is a tuple or list, it must contain three elements, (D, H, W). D, H and W can be either a int, or None which means the size will be the same as that of the input.
return_mask (bool): If true, the index of max pooling point will be returned along with outputs. Default False.
name(str, optional): For detailed information, please refer to :ref:`api_guide_Name`. Usually name is no need to set and None by default.
Returns:
Tensor: The output tensor of adaptive max pool3d result. The data type is same as input tensor.
Examples:
.. code-block:: python
# adaptive max pool3d
# suppose input data in the shape of [N, C, D, H, W], `output_size` is [l, m, n]
# output shape is [N, C, l, m, n], adaptive pool divide D, H and W dimensions
# of input data into m*n grids averagely and performs poolings in each
# grid to get output.
# adaptive max pool performs calculations as follow:
#
# for i in range(l):
# for j in range(m):
# for k in range(n):
# dstart = floor(i * D / l)
# dend = ceil((i + 1) * D / l)
# hstart = floor(i * H / m)
# hend = ceil((i + 1) * H / m)
# wstart = floor(i * W / n)
# wend = ceil((i + 1) * W / n)
# output[:, :, i, j, k] = max(input[:, :, dstart: dend, hstart: hend, wstart: wend])
#
import paddle
import numpy as np
input_data = np.random.rand(2, 3, 8, 32, 32)
x = paddle.to_tensor(input_data)
# x.shape is [2, 3, 8, 32, 32]
out = paddle.nn.functional.adaptive_max_pool3d(
x = x,
output_size=[3, 3, 3])
# out.shape is [2, 3, 3, 3, 3]
"""
if not in_dynamic_mode():
check_variable_and_dtype(x, 'x', ['float32', 'float64'],
'adaptive_max_pool3d')
check_type(return_mask, 'return_mask', bool, 'adaptive_max_pool3d')
#check_type(output_size, 'pool_size', (int), 'adaptive_max_pool3d')
_check_input(x, 5)
in_l, in_h, in_w = x.shape[2:5]
if isinstance(output_size, int):
output_size = utils.convert_to_list(output_size, 3, 'output_size')
else:
output_size = list(output_size)
if output_size[0] == None:
output_size[0] = in_l
if output_size[1] == None:
output_size[1] = in_h
if output_size[2] == None:
output_size[2] = in_w
if in_dynamic_mode():
pool_out = _C_ops.max_pool3d_with_index(
x, 'pooling_type', 'max', 'ksize', output_size, 'adaptive', True)
return pool_out if return_mask else pool_out[0]
l_type = 'max_pool3d_with_index'
helper = LayerHelper(l_type, **locals())
dtype = helper.input_dtype(input_param_name='x')
pool_out = helper.create_variable_for_type_inference(dtype)
mask = helper.create_variable_for_type_inference('int32')
outputs = {"Out": pool_out, "Mask": mask}
helper.append_op(
type=l_type,
inputs={"X": x},
outputs=outputs,
attrs={
"pooling_type": 'max',
"ksize": output_size,
"adaptive": True,
})
return (pool_out, mask) if return_mask else pool_out
| 44.185556 | 248 | 0.576734 | 10,345 | 79,534 | 4.252296 | 0.041276 | 0.037509 | 0.009229 | 0.011866 | 0.898068 | 0.885042 | 0.868015 | 0.85799 | 0.831507 | 0.806411 | 0 | 0.017732 | 0.324981 | 79,534 | 1,799 | 249 | 44.210117 | 0.801647 | 0.481643 | 0 | 0.76648 | 0 | 0 | 0.141126 | 0.003298 | 0 | 0 | 0 | 0.000556 | 0 | 1 | 0.02905 | false | 0 | 0.007821 | 0.001117 | 0.097207 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6a71e11a10b093c68afab5c9f4e9c92ed8e5a2d0 | 28,722 | py | Python | dist-packages/samba/tests/dns.py | Jianwei-Wang/python2.7_lib | 911b8e81512e5ac5f13e669ab46f7693ed897378 | [
"PSF-2.0"
] | null | null | null | dist-packages/samba/tests/dns.py | Jianwei-Wang/python2.7_lib | 911b8e81512e5ac5f13e669ab46f7693ed897378 | [
"PSF-2.0"
] | null | null | null | dist-packages/samba/tests/dns.py | Jianwei-Wang/python2.7_lib | 911b8e81512e5ac5f13e669ab46f7693ed897378 | [
"PSF-2.0"
] | null | null | null | # Unix SMB/CIFS implementation.
# Copyright (C) Kai Blin <kai@samba.org> 2011
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
import os
import struct
import random
from samba import socket
import samba.ndr as ndr
import samba.dcerpc.dns as dns
from samba.tests import TestCase
FILTER=''.join([(len(repr(chr(x)))==3) and chr(x) or '.' for x in range(256)])
class DNSTest(TestCase):
def errstr(self, errcode):
"Return a readable error code"
string_codes = [
"OK",
"FORMERR",
"SERVFAIL",
"NXDOMAIN",
"NOTIMP",
"REFUSED",
"YXDOMAIN",
"YXRRSET",
"NXRRSET",
"NOTAUTH",
"NOTZONE",
]
return string_codes[errcode]
def assert_dns_rcode_equals(self, packet, rcode):
"Helper function to check return code"
p_errcode = packet.operation & 0x000F
self.assertEquals(p_errcode, rcode, "Expected RCODE %s, got %s" %
(self.errstr(rcode), self.errstr(p_errcode)))
def assert_dns_opcode_equals(self, packet, opcode):
"Helper function to check opcode"
p_opcode = packet.operation & 0x7800
self.assertEquals(p_opcode, opcode, "Expected OPCODE %s, got %s" %
(opcode, p_opcode))
def make_name_packet(self, opcode, qid=None):
"Helper creating a dns.name_packet"
p = dns.name_packet()
if qid is None:
p.id = random.randint(0x0, 0xffff)
p.operation = opcode
p.questions = []
return p
def finish_name_packet(self, packet, questions):
"Helper to finalize a dns.name_packet"
packet.qdcount = len(questions)
packet.questions = questions
def make_name_question(self, name, qtype, qclass):
"Helper creating a dns.name_question"
q = dns.name_question()
q.name = name
q.question_type = qtype
q.question_class = qclass
return q
def get_dns_domain(self):
"Helper to get dns domain"
return os.getenv('REALM', 'example.com').lower()
def dns_transaction_udp(self, packet, host=os.getenv('SERVER_IP'), dump=False):
"send a DNS query and read the reply"
s = None
try:
send_packet = ndr.ndr_pack(packet)
if dump:
print self.hexdump(send_packet)
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, 0)
s.connect((host, 53))
s.send(send_packet, 0)
recv_packet = s.recv(2048, 0)
if dump:
print self.hexdump(recv_packet)
return ndr.ndr_unpack(dns.name_packet, recv_packet)
finally:
if s is not None:
s.close()
def dns_transaction_tcp(self, packet, host=os.getenv('SERVER_IP'), dump=False):
"send a DNS query and read the reply"
s = None
try:
send_packet = ndr.ndr_pack(packet)
if dump:
print self.hexdump(send_packet)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, 0)
s.connect((host, 53))
tcp_packet = struct.pack('!H', len(send_packet))
tcp_packet += send_packet
s.send(tcp_packet, 0)
recv_packet = s.recv(0xffff + 2, 0)
if dump:
print self.hexdump(recv_packet)
return ndr.ndr_unpack(dns.name_packet, recv_packet[2:])
finally:
if s is not None:
s.close()
def hexdump(self, src, length=8):
N=0; result=''
while src:
s,src = src[:length],src[length:]
hexa = ' '.join(["%02X"%ord(x) for x in s])
s = s.translate(FILTER)
result += "%04X %-*s %s\n" % (N, length*3, hexa, s)
N+=length
return result
class TestSimpleQueries(DNSTest):
def test_one_a_query(self):
"create a query packet containing one query record"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
print "asking for ", q.name
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assert_dns_opcode_equals(response, dns.DNS_OPCODE_QUERY)
self.assertEquals(response.ancount, 1)
self.assertEquals(response.answers[0].rdata,
os.getenv('SERVER_IP'))
def test_one_a_query_tcp(self):
"create a query packet containing one query record via TCP"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
print "asking for ", q.name
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_tcp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assert_dns_opcode_equals(response, dns.DNS_OPCODE_QUERY)
self.assertEquals(response.ancount, 1)
self.assertEquals(response.answers[0].rdata,
os.getenv('SERVER_IP'))
def test_two_queries(self):
"create a query packet containing two query records"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
questions.append(q)
name = "%s.%s" % ('bogusname', self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_FORMERR)
def test_qtype_all_query(self):
"create a QTYPE_ALL query"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_ALL, dns.DNS_QCLASS_IN)
print "asking for ", q.name
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
num_answers = 1
dc_ipv6 = os.getenv('SERVER_IPV6')
if dc_ipv6 is not None:
num_answers += 1
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assert_dns_opcode_equals(response, dns.DNS_OPCODE_QUERY)
self.assertEquals(response.ancount, num_answers)
self.assertEquals(response.answers[0].rdata,
os.getenv('SERVER_IP'))
if dc_ipv6 is not None:
self.assertEquals(response.answers[1].rdata, dc_ipv6)
def test_qclass_none_query(self):
"create a QCLASS_NONE query"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_ALL, dns.DNS_QCLASS_NONE)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_NOTIMP)
# Only returns an authority section entry in BIND and Win DNS
# FIXME: Enable one Samba implements this feature
# def test_soa_hostname_query(self):
# "create a SOA query for a hostname"
# p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
# questions = []
#
# name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
# q = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
# questions.append(q)
#
# self.finish_name_packet(p, questions)
# response = self.dns_transaction_udp(p)
# self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# self.assert_dns_opcode_equals(response, dns.DNS_OPCODE_QUERY)
# # We don't get SOA records for single hosts
# self.assertEquals(response.ancount, 0)
def test_soa_domain_query(self):
"create a SOA query for a domain"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = self.get_dns_domain()
q = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assert_dns_opcode_equals(response, dns.DNS_OPCODE_QUERY)
self.assertEquals(response.ancount, 1)
class TestDNSUpdates(DNSTest):
def test_two_updates(self):
"create two update requests"
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
u = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
updates.append(u)
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_FORMERR)
def test_update_wrong_qclass(self):
"create update with DNS_QCLASS_NONE"
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_NONE)
updates.append(u)
self.finish_name_packet(p, updates)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_NOTIMP)
def test_update_prereq_with_non_null_ttl(self):
"test update with a non-null TTL"
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
prereqs = []
r = dns.res_rec()
r.name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_NONE
r.ttl = 1
r.length = 0
prereqs.append(r)
p.ancount = len(prereqs)
p.answers = prereqs
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_FORMERR)
# I'd love to test this one, but it segfaults. :)
# def test_update_prereq_with_non_null_length(self):
# "test update with a non-null length"
# p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
# updates = []
#
# name = self.get_dns_domain()
#
# u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
# updates.append(u)
# self.finish_name_packet(p, updates)
#
# prereqs = []
# r = dns.res_rec()
# r.name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
# r.rr_type = dns.DNS_QTYPE_TXT
# r.rr_class = dns.DNS_QCLASS_ANY
# r.ttl = 0
# r.length = 1
# prereqs.append(r)
#
# p.ancount = len(prereqs)
# p.answers = prereqs
#
# response = self.dns_transaction_udp(p)
# self.assert_dns_rcode_equals(response, dns.DNS_RCODE_FORMERR)
def test_update_prereq_nonexisting_name(self):
"test update with a nonexisting name"
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
prereqs = []
r = dns.res_rec()
r.name = "idontexist.%s" % self.get_dns_domain()
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_ANY
r.ttl = 0
r.length = 0
prereqs.append(r)
p.ancount = len(prereqs)
p.answers = prereqs
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_NXRRSET)
def test_update_add_txt_record(self):
"test adding records works"
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = "textrec.%s" % self.get_dns_domain()
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_IN
r.ttl = 900
r.length = 0xffff
rdata = dns.txt_record()
rdata.txt = '"This is a test"'
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "textrec.%s" % self.get_dns_domain()
q = self.make_name_question(name, dns.DNS_QTYPE_TXT, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assertEquals(response.ancount, 1)
self.assertEquals(response.answers[0].rdata.txt, '"This is a test"')
def test_update_add_two_txt_records(self):
"test adding two txt records works"
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = "textrec2.%s" % self.get_dns_domain()
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_IN
r.ttl = 900
r.length = 0xffff
rdata = dns.txt_record()
rdata.txt = '"This is a test" "and this is a test, too"'
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "textrec2.%s" % self.get_dns_domain()
q = self.make_name_question(name, dns.DNS_QTYPE_TXT, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assertEquals(response.ancount, 1)
self.assertEquals(response.answers[0].rdata.txt, '"This is a test" "and this is a test, too"')
def test_delete_record(self):
"Test if deleting records works"
NAME = "deleterec.%s" % self.get_dns_domain()
# First, create a record to make sure we have a record to delete.
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = NAME
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_IN
r.ttl = 900
r.length = 0xffff
rdata = dns.txt_record()
rdata.txt = '"This is a test"'
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# Now check the record is around
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
q = self.make_name_question(NAME, dns.DNS_QTYPE_TXT, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# Now delete the record
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = NAME
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_NONE
r.ttl = 0
r.length = 0xffff
rdata = dns.txt_record()
rdata.txt = '"This is a test"'
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# And finally check it's gone
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
q = self.make_name_question(NAME, dns.DNS_QTYPE_TXT, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_NXDOMAIN)
def test_readd_record(self):
"Test if adding, deleting and then readding a records works"
NAME = "readdrec.%s" % self.get_dns_domain()
# Create the record
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = NAME
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_IN
r.ttl = 900
r.length = 0xffff
rdata = dns.txt_record()
rdata.txt = '"This is a test"'
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# Now check the record is around
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
q = self.make_name_question(NAME, dns.DNS_QTYPE_TXT, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# Now delete the record
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = NAME
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_NONE
r.ttl = 0
r.length = 0xffff
rdata = dns.txt_record()
rdata.txt = '"This is a test"'
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# check it's gone
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
q = self.make_name_question(NAME, dns.DNS_QTYPE_TXT, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_NXDOMAIN)
# recreate the record
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = NAME
r.rr_type = dns.DNS_QTYPE_TXT
r.rr_class = dns.DNS_QCLASS_IN
r.ttl = 900
r.length = 0xffff
rdata = dns.txt_record()
rdata.txt = '"This is a test"'
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
# Now check the record is around
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
q = self.make_name_question(NAME, dns.DNS_QTYPE_TXT, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
def test_update_add_mx_record(self):
"test adding MX records works"
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = "%s" % self.get_dns_domain()
r.rr_type = dns.DNS_QTYPE_MX
r.rr_class = dns.DNS_QCLASS_IN
r.ttl = 900
r.length = 0xffff
rdata = dns.mx_record()
rdata.preference = 10
rdata.exchange = 'mail.%s' % self.get_dns_domain()
r.rdata = rdata
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s" % self.get_dns_domain()
q = self.make_name_question(name, dns.DNS_QTYPE_MX, dns.DNS_QCLASS_IN)
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assertEqual(response.ancount, 1)
ans = response.answers[0]
self.assertEqual(ans.rr_type, dns.DNS_QTYPE_MX)
self.assertEqual(ans.rdata.preference, 10)
self.assertEqual(ans.rdata.exchange, 'mail.%s' % self.get_dns_domain())
class TestComplexQueries(DNSTest):
def setUp(self):
super(TestComplexQueries, self).setUp()
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = "cname_test.%s" % self.get_dns_domain()
r.rr_type = dns.DNS_QTYPE_CNAME
r.rr_class = dns.DNS_QCLASS_IN
r.ttl = 900
r.length = 0xffff
r.rdata = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
def tearDown(self):
super(TestComplexQueries, self).tearDown()
p = self.make_name_packet(dns.DNS_OPCODE_UPDATE)
updates = []
name = self.get_dns_domain()
u = self.make_name_question(name, dns.DNS_QTYPE_SOA, dns.DNS_QCLASS_IN)
updates.append(u)
self.finish_name_packet(p, updates)
updates = []
r = dns.res_rec()
r.name = "cname_test.%s" % self.get_dns_domain()
r.rr_type = dns.DNS_QTYPE_CNAME
r.rr_class = dns.DNS_QCLASS_NONE
r.ttl = 0
r.length = 0xffff
r.rdata = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
updates.append(r)
p.nscount = len(updates)
p.nsrecs = updates
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
def test_one_a_query(self):
"create a query packet containing one query record"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "cname_test.%s" % self.get_dns_domain()
q = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
print "asking for ", q.name
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assert_dns_opcode_equals(response, dns.DNS_OPCODE_QUERY)
self.assertEquals(response.ancount, 2)
self.assertEquals(response.answers[0].rr_type, dns.DNS_QTYPE_CNAME)
self.assertEquals(response.answers[0].rdata, "%s.%s" %
(os.getenv('SERVER'), self.get_dns_domain()))
self.assertEquals(response.answers[1].rr_type, dns.DNS_QTYPE_A)
self.assertEquals(response.answers[1].rdata,
os.getenv('SERVER_IP'))
class TestInvalidQueries(DNSTest):
def test_one_a_query(self):
"send 0 bytes follows by create a query packet containing one query record"
s = None
try:
s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, 0)
s.connect((os.getenv('SERVER_IP'), 53))
s.send("", 0)
finally:
if s is not None:
s.close()
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s.%s" % (os.getenv('SERVER'), self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
print "asking for ", q.name
questions.append(q)
self.finish_name_packet(p, questions)
response = self.dns_transaction_udp(p)
self.assert_dns_rcode_equals(response, dns.DNS_RCODE_OK)
self.assert_dns_opcode_equals(response, dns.DNS_OPCODE_QUERY)
self.assertEquals(response.ancount, 1)
self.assertEquals(response.answers[0].rdata,
os.getenv('SERVER_IP'))
def test_one_a_reply(self):
"send a reply instead of a query"
p = self.make_name_packet(dns.DNS_OPCODE_QUERY)
questions = []
name = "%s.%s" % ('fakefakefake', self.get_dns_domain())
q = self.make_name_question(name, dns.DNS_QTYPE_A, dns.DNS_QCLASS_IN)
print "asking for ", q.name
questions.append(q)
self.finish_name_packet(p, questions)
p.operation |= dns.DNS_FLAG_REPLY
s = None
try:
send_packet = ndr.ndr_pack(p)
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM, 0)
host=os.getenv('SERVER_IP')
s.connect((host, 53))
tcp_packet = struct.pack('!H', len(send_packet))
tcp_packet += send_packet
s.send(tcp_packet, 0)
recv_packet = s.recv(0xffff + 2, 0)
self.assertEquals(0, len(recv_packet))
finally:
if s is not None:
s.close()
if __name__ == "__main__":
import unittest
unittest.main()
| 34.274463 | 102 | 0.623355 | 3,941 | 28,722 | 4.279371 | 0.077645 | 0.061192 | 0.048384 | 0.042692 | 0.810377 | 0.795494 | 0.77812 | 0.767685 | 0.755826 | 0.749303 | 0 | 0.006316 | 0.27237 | 28,722 | 837 | 103 | 34.315412 | 0.80066 | 0.089165 | 0 | 0.731707 | 0 | 0 | 0.06903 | 0 | 0 | 0 | 0.003565 | 0.001195 | 0.104065 | 0 | null | null | 0 | 0.013008 | null | null | 0.01626 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
6ab0dcd4299b8f8848a52aec974f8ed55da5068c | 13,036 | py | Python | kayobe/tests/unit/test_kolla_ansible.py | G-Research/kayobe | 80c13e75deb4bff94c5bfe2fd79bb9beecb71873 | [
"Apache-2.0"
] | 48 | 2018-03-08T13:34:34.000Z | 2022-03-14T15:42:20.000Z | kayobe/tests/unit/test_kolla_ansible.py | chazzrobbz/kayobe | 5fb6362e2548afdc2ea824678e565ef81cdbcaa5 | [
"Apache-2.0"
] | null | null | null | kayobe/tests/unit/test_kolla_ansible.py | chazzrobbz/kayobe | 5fb6362e2548afdc2ea824678e565ef81cdbcaa5 | [
"Apache-2.0"
] | 25 | 2018-04-23T07:51:31.000Z | 2022-03-14T15:42:22.000Z | # Copyright (c) 2017 StackHPC Ltd.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.
import argparse
import os
import subprocess
import unittest
from unittest import mock
from kayobe import ansible
from kayobe import kolla_ansible
from kayobe import utils
from kayobe import vault
@mock.patch.object(os, "getcwd", new=lambda: "/path/to/cwd")
@mock.patch.dict(os.environ, clear=True)
class TestCase(unittest.TestCase):
@mock.patch.object(utils, "run_command")
@mock.patch.object(kolla_ansible, "_validate_args")
def test_run(self, mock_validate, mock_run):
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
parsed_args = parser.parse_args([])
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--inventory", "/etc/kolla/inventory/overcloud",
]
expected_cmd = " ".join(expected_cmd)
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env={})
@mock.patch.object(utils, "run_command")
@mock.patch.object(kolla_ansible, "_validate_args")
def test_run_all_the_args(self, mock_validate, mock_run):
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
args = [
"-C",
"-D",
"--kolla-config-path", "/path/to/config",
"-ke", "ev_name1=ev_value1",
"-ki", "/path/to/inventory",
"-kl", "host1:host2",
"-kt", "tag1,tag2",
]
parsed_args = parser.parse_args(args)
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--inventory", "/path/to/inventory",
"--configdir", "/path/to/config",
"--passwords", "/path/to/config/passwords.yml",
"-e", "ev_name1=ev_value1",
"--limit", "'host1:host2'",
"--tags", "tag1,tag2",
]
expected_cmd = " ".join(expected_cmd)
expected_env = {"EXTRA_OPTS": " --check --diff"}
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env=expected_env)
@mock.patch.object(utils, "run_command")
@mock.patch.object(kolla_ansible, "_validate_args")
@mock.patch.object(vault, "_ask_vault_pass")
def test_run_all_the_long_args(self, mock_ask, mock_validate, mock_run):
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
mock_ask.return_value = "test-pass"
args = [
"--ask-vault-pass",
"--check",
"--diff",
"--kolla-config-path", "/path/to/config",
"--kolla-extra-vars", "ev_name1=ev_value1",
"--kolla-inventory", "/path/to/inventory",
"--kolla-limit", "host1:host2",
"--kolla-skip-tags", "tag3,tag4",
"--kolla-tags", "tag1,tag2",
]
parsed_args = parser.parse_args(args)
mock_run.return_value = "/path/to/kayobe-vault-password-helper"
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--key", "/path/to/kayobe-vault-password-helper",
"--inventory", "/path/to/inventory",
"--configdir", "/path/to/config",
"--passwords", "/path/to/config/passwords.yml",
"-e", "ev_name1=ev_value1",
"--limit", "'host1:host2'",
"--skip-tags", "tag3,tag4",
"--tags", "tag1,tag2",
]
expected_cmd = " ".join(expected_cmd)
expected_env = {"EXTRA_OPTS": " --check --diff",
"KAYOBE_VAULT_PASSWORD": "test-pass"}
expected_calls = [
mock.call(["which", "kayobe-vault-password-helper"],
check_output=True, universal_newlines=True),
mock.call(expected_cmd, shell=True, quiet=False, env=expected_env)
]
self.assertEqual(expected_calls, mock_run.mock_calls)
@mock.patch.object(utils, "run_command")
@mock.patch.object(kolla_ansible, "_validate_args")
@mock.patch.object(vault, "update_environment")
def test_run_vault_password_file(self, mock_update, mock_validate,
mock_run):
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
args = [
"--vault-password-file", "/path/to/vault/pw",
]
parsed_args = parser.parse_args(args)
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--key", "/path/to/vault/pw",
"--inventory", "/etc/kolla/inventory/overcloud",
]
expected_cmd = " ".join(expected_cmd)
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env={})
mock_update.assert_called_once_with(mock.ANY, {})
@mock.patch.dict(os.environ, {"KAYOBE_VAULT_PASSWORD": "test-pass"})
@mock.patch.object(utils, "run_command")
@mock.patch.object(kolla_ansible, "_validate_args")
@mock.patch.object(vault, "update_environment")
def test_run_vault_password_helper(self, mock_update, mock_vars, mock_run):
mock_vars.return_value = []
parser = argparse.ArgumentParser()
mock_run.return_value = "/path/to/kayobe-vault-password-helper"
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
mock_run.assert_called_once_with(
["which", "kayobe-vault-password-helper"], check_output=True,
universal_newlines=True)
mock_run.reset_mock()
parsed_args = parser.parse_args([])
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--key", "/path/to/kayobe-vault-password-helper",
"--inventory", "/etc/kolla/inventory/overcloud",
]
expected_cmd = " ".join(expected_cmd)
expected_env = {"KAYOBE_VAULT_PASSWORD": "test-pass"}
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env=expected_env)
mock_update.assert_called_once_with(mock.ANY, expected_env)
@mock.patch.object(utils, "run_command")
@mock.patch.object(kolla_ansible, "_validate_args")
def test_run_func_args(self, mock_validate, mock_run):
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
args = [
"--kolla-extra-vars", "ev_name1=ev_value1",
"--kolla-tags", "tag1,tag2",
]
parsed_args = parser.parse_args(args)
kwargs = {
"extra_vars": {"ev_name2": "ev_value2"},
"tags": "tag3,tag4",
"verbose_level": 1,
"extra_args": ["--arg1", "--arg2"],
}
kolla_ansible.run(parsed_args, "command", "overcloud", **kwargs)
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"-v",
"--inventory", "/etc/kolla/inventory/overcloud",
"-e", "ev_name1=ev_value1",
"-e", "ev_name2='ev_value2'",
"--tags", "tag1,tag2,tag3,tag4",
"--arg1", "--arg2",
]
expected_cmd = " ".join(expected_cmd)
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env={})
@mock.patch.object(utils, "run_command")
@mock.patch.object(utils, "is_readable_file")
@mock.patch.object(kolla_ansible, "_validate_args")
def test_run_custom_ansible_cfg(self, mock_validate, mock_readable,
mock_run):
mock_readable.return_value = {"result": True}
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
parsed_args = parser.parse_args([])
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--inventory", "/etc/kolla/inventory/overcloud",
]
expected_cmd = " ".join(expected_cmd)
expected_env = {"ANSIBLE_CONFIG": "/etc/kayobe/kolla/ansible.cfg"}
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env=expected_env)
mock_readable.assert_called_once_with("/etc/kayobe/kolla/ansible.cfg")
@mock.patch.object(utils, "run_command")
@mock.patch.object(utils, "is_readable_file")
@mock.patch.object(kolla_ansible, "_validate_args")
def test_run_custom_ansible_cfg_2(self, mock_validate, mock_readable,
mock_run):
mock_readable.side_effect = [{"result": False}, {"result": True}]
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
parsed_args = parser.parse_args([])
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--inventory", "/etc/kolla/inventory/overcloud",
]
expected_cmd = " ".join(expected_cmd)
expected_env = {"ANSIBLE_CONFIG": "/etc/kayobe/ansible.cfg"}
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env=expected_env)
expected_calls = [
mock.call("/etc/kayobe/kolla/ansible.cfg"),
mock.call("/etc/kayobe/ansible.cfg"),
]
self.assertEqual(mock_readable.call_args_list, expected_calls)
@mock.patch.object(utils, "run_command")
@mock.patch.object(utils, "is_readable_file")
@mock.patch.object(kolla_ansible, "_validate_args")
def test_run_custom_ansible_cfg_env(self, mock_validate, mock_readable,
mock_run):
mock_readable.return_value = {"result": True}
os.environ["ANSIBLE_CONFIG"] = "/path/to/ansible.cfg"
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
parsed_args = parser.parse_args([])
kolla_ansible.run(parsed_args, "command", "overcloud")
expected_cmd = [
".", "/path/to/cwd/venvs/kolla-ansible/bin/activate", "&&",
"kolla-ansible", "command",
"--inventory", "/etc/kolla/inventory/overcloud",
]
expected_cmd = " ".join(expected_cmd)
expected_env = {"ANSIBLE_CONFIG": "/path/to/ansible.cfg"}
mock_run.assert_called_once_with(expected_cmd, shell=True, quiet=False,
env=expected_env)
mock_readable.assert_called_once_with("/etc/kayobe/kolla/ansible.cfg")
@mock.patch.object(utils, "run_command")
@mock.patch.object(kolla_ansible, "_validate_args")
def test_run_failure(self, mock_validate, mock_run):
parser = argparse.ArgumentParser()
ansible.add_args(parser)
kolla_ansible.add_args(parser)
vault.add_args(parser)
parsed_args = parser.parse_args([])
mock_run.side_effect = subprocess.CalledProcessError(1, "dummy")
self.assertRaises(SystemExit,
kolla_ansible.run, parsed_args, "command",
"overcloud")
| 43.453333 | 79 | 0.597499 | 1,462 | 13,036 | 5.082763 | 0.127223 | 0.085587 | 0.052483 | 0.053829 | 0.810658 | 0.786166 | 0.759386 | 0.748352 | 0.738797 | 0.717669 | 0 | 0.006136 | 0.26235 | 13,036 | 299 | 80 | 43.598662 | 0.766639 | 0.042574 | 0 | 0.658088 | 0 | 0 | 0.240555 | 0.090078 | 0 | 0 | 0 | 0 | 0.058824 | 1 | 0.036765 | false | 0.0625 | 0.033088 | 0 | 0.073529 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 7 |
6ae5c28f3daf9b6512bfa3a18146330d87440d75 | 161 | py | Python | tests/python/runtests.py | karpierz/jtypes.jep | 0e2347c3144022313802183c75fc8944ac6ddef5 | [
"Zlib"
] | 1 | 2021-05-18T03:38:21.000Z | 2021-05-18T03:38:21.000Z | tests/python/runtests.py | karpierz/jtypes.jep | 0e2347c3144022313802183c75fc8944ac6ddef5 | [
"Zlib"
] | null | null | null | tests/python/runtests.py | karpierz/jtypes.jep | 0e2347c3144022313802183c75fc8944ac6ddef5 | [
"Zlib"
] | null | null | null | import sys, os.path as path
sys.path[0] = path.dirname(path.dirname(path.dirname(path.abspath(__file__))))
del sys, path
__import__("runpy").run_module("tests")
| 32.2 | 78 | 0.751553 | 26 | 161 | 4.307692 | 0.538462 | 0.294643 | 0.401786 | 0.392857 | 0.330357 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006711 | 0.074534 | 161 | 4 | 79 | 40.25 | 0.744966 | 0 | 0 | 0 | 0 | 0 | 0.062112 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
0a805c33f9a823d3c69dedb8e4e4f4eea979a4d1 | 2,779 | py | Python | tests/test_word_vector.py | Gorlph/pythainlp | 6135ba5f490e00640de902a0d5c65a4537739d98 | [
"Apache-2.0"
] | 569 | 2017-10-22T16:48:55.000Z | 2022-03-30T09:59:14.000Z | tests/test_word_vector.py | Gorlph/pythainlp | 6135ba5f490e00640de902a0d5c65a4537739d98 | [
"Apache-2.0"
] | 531 | 2017-10-24T04:34:13.000Z | 2022-03-20T16:30:14.000Z | tests/test_word_vector.py | Gorlph/pythainlp | 6135ba5f490e00640de902a0d5c65a4537739d98 | [
"Apache-2.0"
] | 218 | 2017-12-08T01:52:25.000Z | 2022-03-21T06:56:32.000Z | # -*- coding: utf-8 -*-
import unittest
from pythainlp import word_vector
from pythainlp.word_vector import WordVector
class TestWordVectorPackage(unittest.TestCase):
def test_thai2vec(self):
self.assertGreaterEqual(word_vector.similarity("แบคทีเรีย", "คน"), 0)
self.assertIsNotNone(word_vector.sentence_vectorizer(""))
self.assertIsNotNone(word_vector.get_model())
self.assertIsNotNone(
word_vector.sentence_vectorizer("เสรีภาพในการชุมนุม")
)
self.assertIsNotNone(
word_vector.sentence_vectorizer(
"เสรีภาพในการรวมตัว\nสมาคม", use_mean=True
)
)
self.assertIsNotNone(
word_vector.sentence_vectorizer("I คิด therefore I am ผ็ฎ์")
)
self.assertIsNotNone(
word_vector.most_similar_cosmul(
["สหรัฐอเมริกา", "ประธานาธิบดี"], ["ประเทศไทย"]
)[0][0]
)
self.assertEqual(
word_vector.doesnt_match(["ญี่ปุ่น", "พม่า", "ไอติม"]), "ไอติม"
)
_wv = WordVector("thai2fit_wv")
self.assertGreaterEqual(
_wv.similarity("แบคทีเรีย", "คน"), 0
)
self.assertIsNotNone(_wv.sentence_vectorizer(""))
self.assertIsNotNone(_wv.get_model())
self.assertIsNotNone(
_wv.sentence_vectorizer("เสรีภาพในการชุมนุม")
)
self.assertIsNotNone(
_wv.sentence_vectorizer(
"เสรีภาพในการรวมตัว\nสมาคม", use_mean=True
)
)
self.assertIsNotNone(
_wv.sentence_vectorizer("I คิด therefore I am ผ็ฎ์")
)
self.assertIsNotNone(
_wv.most_similar_cosmul(
["สหรัฐอเมริกา", "ประธานาธิบดี"], ["ประเทศไทย"]
)[0][0]
)
self.assertEqual(
_wv.doesnt_match(["ญี่ปุ่น", "พม่า", "ไอติม"]), "ไอติม"
)
def test_ltw2v(self):
_wv = WordVector("ltw2v")
self.assertGreaterEqual(
_wv.similarity("แบคทีเรีย", "คน"), 0
)
self.assertIsNotNone(_wv.sentence_vectorizer(""))
self.assertIsNotNone(_wv.get_model())
self.assertIsNotNone(
_wv.sentence_vectorizer("เสรีภาพในการชุมนุม")
)
self.assertIsNotNone(
_wv.sentence_vectorizer(
"เสรีภาพในการรวมตัว\nสมาคม", use_mean=True
)
)
self.assertIsNotNone(
_wv.sentence_vectorizer("I คิด therefore I am ผ็ฎ์")
)
self.assertIsNotNone(
_wv.most_similar_cosmul(
["สหรัฐอเมริกา", "ประธานาธิบดี"], ["ไทย"]
)[0][0]
)
self.assertEqual(
_wv.doesnt_match(["ญี่ปุ่น", "พม่า", "ไอติม"]), "ไอติม"
)
| 32.694118 | 77 | 0.553796 | 343 | 2,779 | 4.501458 | 0.22449 | 0.221503 | 0.163212 | 0.150259 | 0.805699 | 0.805699 | 0.781088 | 0.781088 | 0.755829 | 0.755829 | 0 | 0.007315 | 0.311263 | 2,779 | 84 | 78 | 33.083333 | 0.766458 | 0.007557 | 0 | 0.525641 | 0 | 0 | 0.148403 | 0.027213 | 0 | 0 | 0 | 0 | 0.307692 | 1 | 0.025641 | false | 0 | 0.038462 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0aba401a0cca5d734e8d9c7058e03e53ab701301 | 16,117 | py | Python | src/stack-hci/azext_stack_hci/generated/_params.py | Caoxuyang/azure-cli-extensions | d2011261f29033cb31a1064256727d87049ab423 | [
"MIT"
] | null | null | null | src/stack-hci/azext_stack_hci/generated/_params.py | Caoxuyang/azure-cli-extensions | d2011261f29033cb31a1064256727d87049ab423 | [
"MIT"
] | 9 | 2022-03-25T19:35:49.000Z | 2022-03-31T06:09:47.000Z | src/stack-hci/azext_stack_hci/generated/_params.py | Caoxuyang/azure-cli-extensions | d2011261f29033cb31a1064256727d87049ab423 | [
"MIT"
] | 1 | 2022-02-14T21:43:29.000Z | 2022-02-14T21:43:29.000Z | # --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=too-many-lines
# pylint: disable=too-many-statements
from azure.cli.core.commands.parameters import (
tags_type,
get_three_state_flag,
get_enum_type,
resource_group_name_type,
get_location_type
)
from azure.cli.core.commands.validators import (
get_default_location_from_resource_group,
validate_file_or_dict
)
from azext_stack_hci.action import AddDesiredProperties
def load_arguments(self, _):
with self.argument_context('stack-hci arc-setting list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.')
with self.argument_context('stack-hci arc-setting show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.', id_part='name')
c.argument('arc_setting_name', options_list=['--name', '-n', '--arc-setting-name'], type=str, help='The name '
'of the proxy resource holding details of HCI ArcSetting information.', id_part='child_name_1')
with self.argument_context('stack-hci arc-setting create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.')
c.argument('arc_setting_name', options_list=['--name', '-n', '--arc-setting-name'], type=str, help='The name '
'of the proxy resource holding details of HCI ArcSetting information.')
c.argument('arc_instance_resource_group', options_list=['--instance-rg'], type=str, help='The resource group '
'that hosts the Arc agents, ie. Hybrid Compute Machine resources.')
c.argument('created_by', type=str, help='The identity that created the resource.', arg_group='System Data')
c.argument('created_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that created the resource.', arg_group='System Data')
c.argument('created_at', help='The timestamp of resource creation (UTC).', arg_group='System Data')
c.argument('last_modified_by', type=str, help='The identity that last modified the resource.',
arg_group='System Data')
c.argument('last_modified_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that last modified the resource.', arg_group='System Data')
c.argument('last_modified_at', help='The timestamp of resource last modification (UTC)', arg_group='System '
'Data')
with self.argument_context('stack-hci arc-setting delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.', id_part='name')
c.argument('arc_setting_name', options_list=['--name', '-n', '--arc-setting-name'], type=str, help='The name '
'of the proxy resource holding details of HCI ArcSetting information.', id_part='child_name_1')
with self.argument_context('stack-hci arc-setting wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.', id_part='name')
c.argument('arc_setting_name', options_list=['--name', '-n', '--arc-setting-name'], type=str, help='The name '
'of the proxy resource holding details of HCI ArcSetting information.', id_part='child_name_1')
with self.argument_context('stack-hci cluster list') as c:
c.argument('resource_group_name', resource_group_name_type)
with self.argument_context('stack-hci cluster show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', options_list=['--name', '-n', '--cluster-name'], type=str, help='The name of the '
'cluster.', id_part='name')
with self.argument_context('stack-hci cluster create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', options_list=['--name', '-n', '--cluster-name'], type=str, help='The name of the '
'cluster.')
c.argument('tags', tags_type)
c.argument('location', arg_type=get_location_type(self.cli_ctx), required=False,
validator=get_default_location_from_resource_group)
c.argument('cloud_management_endpoint', options_list=['--endpoint'], type=str, help='Endpoint configured for '
'management from the Azure portal.')
c.argument('aad_client_id', type=str, help='App id of cluster AAD identity.')
c.argument('aad_tenant_id', type=str, help='Tenant id of cluster AAD identity.')
c.argument('desired_properties', action=AddDesiredProperties, nargs='+', help='Desired properties of the '
'cluster.')
c.argument('created_by', type=str, help='The identity that created the resource.', arg_group='System Data')
c.argument('created_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that created the resource.', arg_group='System Data')
c.argument('created_at', help='The timestamp of resource creation (UTC).', arg_group='System Data')
c.argument('last_modified_by', type=str, help='The identity that last modified the resource.',
arg_group='System Data')
c.argument('last_modified_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that last modified the resource.', arg_group='System Data')
c.argument('last_modified_at', help='The timestamp of resource last modification (UTC)', arg_group='System '
'Data')
with self.argument_context('stack-hci cluster update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', options_list=['--name', '-n', '--cluster-name'], type=str, help='The name of the '
'cluster.', id_part='name')
c.argument('tags', tags_type)
c.argument('cloud_management_endpoint', options_list=['--endpoint'], type=str, help='Endpoint configured for '
'management from the Azure portal')
c.argument('aad_client_id', type=str, help='App id of cluster AAD identity.')
c.argument('aad_tenant_id', type=str, help='Tenant id of cluster AAD identity.')
c.argument('desired_properties', action=AddDesiredProperties, nargs='+', help='Desired properties of the '
'cluster.')
with self.argument_context('stack-hci cluster delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', options_list=['--name', '-n', '--cluster-name'], type=str, help='The name of the '
'cluster.', id_part='name')
with self.argument_context('stack-hci extension list') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.')
c.argument('arc_setting_name', type=str, help='The name of the proxy resource holding details of HCI '
'ArcSetting information.')
with self.argument_context('stack-hci extension show') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.', id_part='name')
c.argument('arc_setting_name', type=str, help='The name of the proxy resource holding details of HCI '
'ArcSetting information.', id_part='child_name_1')
c.argument('extension_name', options_list=['--name', '-n', '--extension-name'], type=str, help='The name of '
'the machine extension.', id_part='child_name_2')
with self.argument_context('stack-hci extension create') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.')
c.argument('arc_setting_name', type=str, help='The name of the proxy resource holding details of HCI '
'ArcSetting information.')
c.argument('extension_name', options_list=['--name', '-n', '--extension-name'], type=str, help='The name of '
'the machine extension.')
c.argument('force_update_tag', type=str, help='How the extension handler should be forced to update even if '
'the extension configuration has not changed.', arg_group='Extension Parameters')
c.argument('publisher', type=str, help='The name of the extension handler publisher.', arg_group='Extension '
'Parameters')
c.argument('type_properties_extension_parameters_type', options_list=['--type'], type=str, help='Specifies the '
'type of the extension; an example is "CustomScriptExtension".', arg_group='Extension Parameters')
c.argument('type_handler_version', type=str, help='Specifies the version of the script handler.',
arg_group='Extension Parameters')
c.argument('auto_upgrade_minor_version', options_list=['--auto-upgrade'], arg_type=get_three_state_flag(),
help='Indicates whether the extension should use a newer minor version if one is available at '
'deployment time. Once deployed, however, the extension will not upgrade minor versions unless '
'redeployed, even with this property set to true.', arg_group='Extension Parameters')
c.argument('settings', type=validate_file_or_dict, help='Json formatted public settings for the extension. '
'Expected value: json-string/json-file/@json-file.', arg_group='Extension Parameters')
c.argument('protected_settings', type=validate_file_or_dict, help='Protected settings (may contain secrets). '
'Expected value: json-string/json-file/@json-file.', arg_group='Extension Parameters')
c.argument('created_by', type=str, help='The identity that created the resource.', arg_group='System Data')
c.argument('created_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that created the resource.', arg_group='System Data')
c.argument('created_at', help='The timestamp of resource creation (UTC).', arg_group='System Data')
c.argument('last_modified_by', type=str, help='The identity that last modified the resource.',
arg_group='System Data')
c.argument('last_modified_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that last modified the resource.', arg_group='System Data')
c.argument('last_modified_at', help='The timestamp of resource last modification (UTC)', arg_group='System '
'Data')
with self.argument_context('stack-hci extension update') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.', id_part='name')
c.argument('arc_setting_name', type=str, help='The name of the proxy resource holding details of HCI '
'ArcSetting information.', id_part='child_name_1')
c.argument('extension_name', options_list=['--name', '-n', '--extension-name'], type=str, help='The name of '
'the machine extension.', id_part='child_name_2')
c.argument('force_update_tag', type=str, help='How the extension handler should be forced to update even if '
'the extension configuration has not changed.', arg_group='Extension Parameters')
c.argument('publisher', type=str, help='The name of the extension handler publisher.', arg_group='Extension '
'Parameters')
c.argument('type_properties_extension_parameters_type', options_list=['--type'], type=str, help='Specifies the '
'type of the extension; an example is "CustomScriptExtension".', arg_group='Extension Parameters')
c.argument('type_handler_version', type=str, help='Specifies the version of the script handler.',
arg_group='Extension Parameters')
c.argument('auto_upgrade_minor_version', options_list=['--auto-upgrade'], arg_type=get_three_state_flag(),
help='Indicates whether the extension should use a newer minor version if one is available at '
'deployment time. Once deployed, however, the extension will not upgrade minor versions unless '
'redeployed, even with this property set to true.', arg_group='Extension Parameters')
c.argument('settings', type=validate_file_or_dict, help='Json formatted public settings for the extension. '
'Expected value: json-string/json-file/@json-file.', arg_group='Extension Parameters')
c.argument('protected_settings', type=validate_file_or_dict, help='Protected settings (may contain secrets). '
'Expected value: json-string/json-file/@json-file.', arg_group='Extension Parameters')
c.argument('created_by', type=str, help='The identity that created the resource.', arg_group='System Data')
c.argument('created_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that created the resource.', arg_group='System Data')
c.argument('created_at', help='The timestamp of resource creation (UTC).', arg_group='System Data')
c.argument('last_modified_by', type=str, help='The identity that last modified the resource.',
arg_group='System Data')
c.argument('last_modified_by_type', arg_type=get_enum_type(['User', 'Application', 'ManagedIdentity', 'Key']),
help='The type of identity that last modified the resource.', arg_group='System Data')
c.argument('last_modified_at', help='The timestamp of resource last modification (UTC)', arg_group='System '
'Data')
with self.argument_context('stack-hci extension delete') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.', id_part='name')
c.argument('arc_setting_name', type=str, help='The name of the proxy resource holding details of HCI '
'ArcSetting information.', id_part='child_name_1')
c.argument('extension_name', options_list=['--name', '-n', '--extension-name'], type=str, help='The name of '
'the machine extension.', id_part='child_name_2')
with self.argument_context('stack-hci extension wait') as c:
c.argument('resource_group_name', resource_group_name_type)
c.argument('cluster_name', type=str, help='The name of the cluster.', id_part='name')
c.argument('arc_setting_name', type=str, help='The name of the proxy resource holding details of HCI '
'ArcSetting information.', id_part='child_name_1')
c.argument('extension_name', options_list=['--name', '-n', '--extension-name'], type=str, help='The name of '
'the machine extension.', id_part='child_name_2')
| 74.962791 | 120 | 0.665695 | 2,106 | 16,117 | 4.904558 | 0.093542 | 0.083648 | 0.056443 | 0.055572 | 0.937361 | 0.930681 | 0.923904 | 0.909575 | 0.903669 | 0.897667 | 0 | 0.000853 | 0.200223 | 16,117 | 214 | 121 | 75.313084 | 0.800465 | 0.031395 | 0 | 0.798913 | 0 | 0 | 0.477979 | 0.030451 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005435 | false | 0 | 0.016304 | 0 | 0.021739 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0ac264049cf4131818196d6a9e438784cbbc5477 | 71,547 | py | Python | airflow/providers/google/cloud/operators/cloud_memorystore.py | gtossou/airflow | 0314a3a218f864f78ec260cc66134e7acae34bc5 | [
"Apache-2.0"
] | 79 | 2021-10-15T07:32:27.000Z | 2022-03-28T04:10:19.000Z | airflow/providers/google/cloud/operators/cloud_memorystore.py | gtossou/airflow | 0314a3a218f864f78ec260cc66134e7acae34bc5 | [
"Apache-2.0"
] | 153 | 2021-10-15T05:23:46.000Z | 2022-02-23T06:07:10.000Z | airflow/providers/google/cloud/operators/cloud_memorystore.py | gtossou/airflow | 0314a3a218f864f78ec260cc66134e7acae34bc5 | [
"Apache-2.0"
] | 23 | 2021-10-15T02:36:37.000Z | 2022-03-17T02:59:27.000Z | #
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
"""Operators for Google Cloud Memorystore service"""
from typing import Dict, Optional, Sequence, Tuple, Union
from google.api_core.retry import Retry
from google.cloud.memcache_v1beta2.types import cloud_memcache
from google.cloud.redis_v1.gapic.enums import FailoverInstanceRequest
from google.cloud.redis_v1.types import FieldMask, InputConfig, Instance, OutputConfig
from google.protobuf.json_format import MessageToDict
from airflow.models import BaseOperator
from airflow.providers.google.cloud.hooks.cloud_memorystore import (
CloudMemorystoreHook,
CloudMemorystoreMemcachedHook,
)
from airflow.utils.decorators import apply_defaults
class CloudMemorystoreCreateInstanceOperator(BaseOperator):
"""
Creates a Redis instance based on the specified tier and memory size.
By default, the instance is accessible from the project's `default network
<https://cloud.google.com/compute/docs/networks-and-firewalls#networks>`__.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreCreateInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: Required. The logical name of the Redis instance in the customer project with the
following restrictions:
- Must contain only lowercase letters, numbers, and hyphens.
- Must start with a letter.
- Must be between 1-40 characters.
- Must end with a number or a letter.
- Must be unique within the customer project / location
:type instance_id: str
:param instance: Required. A Redis [Instance] resource
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.Instance`
:type instance: Union[Dict, google.cloud.redis_v1.types.Instance]
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance_id",
"instance",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance_id: str,
instance: Union[Dict, Instance],
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance_id = instance_id
self.instance = instance
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict):
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
result = hook.create_instance(
location=self.location,
instance_id=self.instance_id,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
return MessageToDict(result)
class CloudMemorystoreDeleteInstanceOperator(BaseOperator):
"""
Deletes a specific Redis instance. Instance stops serving and data is deleted.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreDeleteInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Redis instance in the customer project.
:type instance: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance: str,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance = instance
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.delete_instance(
location=self.location,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreExportInstanceOperator(BaseOperator):
"""
Export Redis instance data into a Redis RDB format file in Cloud Storage.
Redis will continue serving during this operation.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreExportInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Redis instance in the customer project.
:type instance: str
:param output_config: Required. Specify data to be exported.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.OutputConfig`
:type output_config: Union[Dict, google.cloud.redis_v1.types.OutputConfig]
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance",
"output_config",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance: str,
output_config: Union[Dict, OutputConfig],
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance = instance
self.output_config = output_config
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.export_instance(
location=self.location,
instance=self.instance,
output_config=self.output_config,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreFailoverInstanceOperator(BaseOperator):
"""
Initiates a failover of the master node to current replica node for a specific STANDARD tier Cloud
Memorystore for Redis instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreFailoverInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Redis instance in the customer project.
:type instance: str
:param data_protection_mode: Optional. Available data protection modes that the user can choose. If it's
unspecified, data protection mode will be LIMITED_DATA_LOSS by default.
:type data_protection_mode: google.cloud.redis_v1.gapic.enums.FailoverInstanceRequest.DataProtectionMode
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance",
"data_protection_mode",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance: str,
data_protection_mode: FailoverInstanceRequest.DataProtectionMode,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance = instance
self.data_protection_mode = data_protection_mode
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.failover_instance(
location=self.location,
instance=self.instance,
data_protection_mode=self.data_protection_mode,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreGetInstanceOperator(BaseOperator):
"""
Gets the details of a specific Redis instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreGetInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Redis instance in the customer project.
:type instance: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance: str,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance = instance
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict):
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
result = hook.get_instance(
location=self.location,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
return MessageToDict(result)
class CloudMemorystoreImportOperator(BaseOperator):
"""
Import a Redis RDB snapshot file from Cloud Storage into a Redis instance.
Redis may stop serving during this operation. Instance state will be IMPORTING for entire operation. When
complete, the instance will contain only data from the imported file.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreImportOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Redis instance in the customer project.
:type instance: str
:param input_config: Required. Specify data to be imported.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.InputConfig`
:type input_config: Union[Dict, google.cloud.redis_v1.types.InputConfig]
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance",
"input_config",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance: str,
input_config: Union[Dict, InputConfig],
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance = instance
self.input_config = input_config
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.import_instance(
location=self.location,
instance=self.instance,
input_config=self.input_config,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreListInstancesOperator(BaseOperator):
"""
Lists all Redis instances owned by a project in either the specified location (region) or all locations.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreListInstancesOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
If it is specified as ``-`` (wildcard), then all regions available to the project are
queried, and the results are aggregated.
:type location: str
:param page_size: The maximum number of resources contained in the underlying API response. If page
streaming is performed per- resource, this parameter does not affect the return value. If page
streaming is performed per-page, this determines the maximum number of resources in a page.
:type page_size: int
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"page_size",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
page_size: int,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.page_size = page_size
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict):
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
result = hook.list_instances(
location=self.location,
page_size=self.page_size,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
instances = [MessageToDict(a) for a in result]
return instances
class CloudMemorystoreUpdateInstanceOperator(BaseOperator):
"""
Updates the metadata and configuration of a specific Redis instance.
:param update_mask: Required. Mask of fields to update. At least one path must be supplied in this field.
The elements of the repeated paths field may only include these fields from ``Instance``:
- ``displayName``
- ``labels``
- ``memorySizeGb``
- ``redisConfig``
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.FieldMask`
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreUpdateInstanceOperator`
:type update_mask: Union[Dict, google.cloud.redis_v1.types.FieldMask]
:param instance: Required. Update description. Only fields specified in update_mask are updated.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.Instance`
:type instance: Union[Dict, google.cloud.redis_v1.types.Instance]
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: The logical name of the Redis instance in the customer project.
:type instance_id: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"update_mask",
"instance",
"location",
"instance_id",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
update_mask: Union[Dict, FieldMask],
instance: Union[Dict, Instance],
location: Optional[str] = None,
instance_id: Optional[str] = None,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.update_mask = update_mask
self.instance = instance
self.location = location
self.instance_id = instance_id
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.update_instance(
update_mask=self.update_mask,
instance=self.instance,
location=self.location,
instance_id=self.instance_id,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreScaleInstanceOperator(BaseOperator):
"""
Updates the metadata and configuration of a specific Redis instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreScaleInstanceOperator`
:param memory_size_gb: Redis memory size in GiB.
:type memory_size_gb: int
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: The logical name of the Redis instance in the customer project.
:type instance_id: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"memory_size_gb",
"location",
"instance_id",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
memory_size_gb: int,
location: Optional[str] = None,
instance_id: Optional[str] = None,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.memory_size_gb = memory_size_gb
self.location = location
self.instance_id = instance_id
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.update_instance(
update_mask={"paths": ["memory_size_gb"]},
instance={"memory_size_gb": self.memory_size_gb},
location=self.location,
instance_id=self.instance_id,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreCreateInstanceAndImportOperator(BaseOperator):
"""
Creates a Redis instance based on the specified tier and memory size and import a Redis RDB snapshot file
from Cloud Storage into a this instance.
By default, the instance is accessible from the project's `default network
<https://cloud.google.com/compute/docs/networks-and-firewalls#networks>`__.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreCreateInstanceAndImportOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: Required. The logical name of the Redis instance in the customer project with the
following restrictions:
- Must contain only lowercase letters, numbers, and hyphens.
- Must start with a letter.
- Must be between 1-40 characters.
- Must end with a number or a letter.
- Must be unique within the customer project / location
:type instance_id: str
:param instance: Required. A Redis [Instance] resource
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.Instance`
:type instance: Union[Dict, google.cloud.redis_v1.types.Instance]
:param input_config: Required. Specify data to be imported.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.InputConfig`
:type input_config: Union[Dict, google.cloud.redis_v1.types.InputConfig]
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance_id",
"instance",
"input_config",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance_id: str,
instance: Union[Dict, Instance],
input_config: Union[Dict, InputConfig],
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance_id = instance_id
self.instance = instance
self.input_config = input_config
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.create_instance(
location=self.location,
instance_id=self.instance_id,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
hook.import_instance(
location=self.location,
instance=self.instance_id,
input_config=self.input_config,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreExportAndDeleteInstanceOperator(BaseOperator):
"""
Export Redis instance data into a Redis RDB format file in Cloud Storage. In next step, deletes a this
instance.
Redis will continue serving during this operation.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreExportAndDeleteInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Redis instance in the customer project.
:type instance: str
:param output_config: Required. Specify data to be exported.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.redis_v1.types.OutputConfig`
:type output_config: Union[Dict, google.cloud.redis_v1.types.OutputConfig]
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance",
"output_config",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance: str,
output_config: Union[Dict, OutputConfig],
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance = instance
self.output_config = output_config
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: dict) -> None:
hook = CloudMemorystoreHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.export_instance(
location=self.location,
instance=self.instance,
output_config=self.output_config,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
hook.delete_instance(
location=self.location,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreMemcachedApplyParametersOperator(BaseOperator):
"""
Will update current set of Parameters to the set of specified nodes of the Memcached Instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreMemcachedApplyParametersOperator`
:param node_ids: Nodes to which we should apply the instance-level parameter group.
:type node_ids: Sequence[str]
:param apply_all: Whether to apply instance-level parameter group to all nodes. If set to true,
will explicitly restrict users from specifying any nodes, and apply parameter group updates
to all nodes within the instance.
:type apply_all: bool
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: The logical name of the Memcached instance in the customer project.
:type instance_id: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
"""
template_fields = (
"node_ids",
"apply_all",
"location",
"instance_id",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
node_ids: Sequence[str],
apply_all: bool,
location: str,
instance_id: str,
project_id: str,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.node_ids = node_ids
self.apply_all = apply_all
self.location = location
self.instance_id = instance_id
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: Dict):
hook = CloudMemorystoreMemcachedHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.apply_parameters(
node_ids=self.node_ids,
apply_all=self.apply_all,
location=self.location,
instance_id=self.instance_id,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreMemcachedCreateInstanceOperator(BaseOperator):
"""
Creates a Memcached instance based on the specified tier and memory size.
By default, the instance is accessible from the project's `default network
<https://cloud.google.com/compute/docs/networks-and-firewalls#networks>`__.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreMemcachedCreateInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: Required. The logical name of the Memcached instance in the customer project with the
following restrictions:
- Must contain only lowercase letters, numbers, and hyphens.
- Must start with a letter.
- Must be between 1-40 characters.
- Must end with a number or a letter.
- Must be unique within the customer project / location
:type instance_id: str
:param instance: Required. A Memcached [Instance] resource
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.Instance`
:type instance: Union[Dict, google.cloud.memcache_v1beta2.types.cloud_memcache.Instance]
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the GCP connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud Platform.
:type gcp_conn_id: str
"""
template_fields = (
"location",
"instance_id",
"instance",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
)
@apply_defaults
def __init__(
self,
location: str,
instance_id: str,
instance: Union[Dict, cloud_memcache.Instance],
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
*args,
**kwargs,
) -> None:
super().__init__(*args, **kwargs)
self.location = location
self.instance_id = instance_id
self.instance = instance
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
def execute(self, context: Dict):
hook = CloudMemorystoreMemcachedHook(gcp_conn_id=self.gcp_conn_id)
result = hook.create_instance(
location=self.location,
instance_id=self.instance_id,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
return cloud_memcache.Instance.to_dict(result)
class CloudMemorystoreMemcachedDeleteInstanceOperator(BaseOperator):
"""
Deletes a specific Memcached instance. Instance stops serving and data is deleted.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreMemcachedDeleteInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Memcached instance in the customer project.
:type instance: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the GCP connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud Platform.
:type gcp_conn_id: str
"""
template_fields = ("location", "instance", "project_id", "retry", "timeout", "metadata", "gcp_conn_id")
@apply_defaults
def __init__(
self,
location: str,
instance: str,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
*args,
**kwargs,
) -> None:
super().__init__(*args, **kwargs)
self.location = location
self.instance = instance
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
def execute(self, context: Dict):
hook = CloudMemorystoreMemcachedHook(gcp_conn_id=self.gcp_conn_id)
hook.delete_instance(
location=self.location,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreMemcachedGetInstanceOperator(BaseOperator):
"""
Gets the details of a specific Memcached instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreMemcachedGetInstanceOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance: The logical name of the Memcached instance in the customer project.
:type instance: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud Platform.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"instance",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
instance: str,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.instance = instance
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: Dict):
hook = CloudMemorystoreMemcachedHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
result = hook.get_instance(
location=self.location,
instance=self.instance,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
return cloud_memcache.Instance.to_dict(result)
class CloudMemorystoreMemcachedListInstancesOperator(BaseOperator):
"""
Lists all Memcached instances owned by a project in either the specified location (region) or all
locations.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreMemcachedListInstancesOperator`
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
If it is specified as ``-`` (wildcard), then all regions available to the project are
queried, and the results are aggregated.
:type location: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"location",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
location: str,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.location = location
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: Dict):
hook = CloudMemorystoreMemcachedHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
result = hook.list_instances(
location=self.location,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
instances = [cloud_memcache.Instance.to_dict(a) for a in result]
return instances
class CloudMemorystoreMemcachedUpdateInstanceOperator(BaseOperator):
"""
Updates the metadata and configuration of a specific Memcached instance.
:param update_mask: Required. Mask of fields to update. At least one path must be supplied in this field.
The elements of the repeated paths field may only include these fields from ``Instance``:
- ``displayName``
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMas`
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreMemcachedUpdateInstanceOperator`
:type update_mask: Union[Dict, google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMask]
:param instance: Required. Update description. Only fields specified in update_mask are updated.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.Instance`
:type instance: Union[Dict, google.cloud.memcache_v1beta2.types.cloud_memcache.Instance]
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: The logical name of the Memcached instance in the customer project.
:type instance_id: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
:param gcp_conn_id: The connection ID to use connecting to Google Cloud.
:type gcp_conn_id: str
:param impersonation_chain: Optional service account to impersonate using short-term
credentials, or chained list of accounts required to get the access_token
of the last account in the list, which will be impersonated in the request.
If set as a string, the account must grant the originating account
the Service Account Token Creator IAM role.
If set as a sequence, the identities from the list must grant
Service Account Token Creator IAM role to the directly preceding identity, with first
account from the list granting this role to the originating account (templated).
:type impersonation_chain: Union[str, Sequence[str]]
"""
template_fields = (
"update_mask",
"instance",
"location",
"instance_id",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
update_mask: Union[Dict, cloud_memcache.field_mask.FieldMask],
instance: Union[Dict, cloud_memcache.Instance],
location: Optional[str] = None,
instance_id: Optional[str] = None,
project_id: Optional[str] = None,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.update_mask = update_mask
self.instance = instance
self.location = location
self.instance_id = instance_id
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: Dict):
hook = CloudMemorystoreMemcachedHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.update_instance(
update_mask=self.update_mask,
instance=self.instance,
location=self.location,
instance_id=self.instance_id,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
class CloudMemorystoreMemcachedUpdateParametersOperator(BaseOperator):
"""
Updates the defined Memcached Parameters for an existing Instance. This method only stages the
parameters, it must be followed by apply_parameters to apply the parameters to nodes of
the Memcached Instance.
.. seealso::
For more information on how to use this operator, take a look at the guide:
:ref:`howto/operator:CloudMemorystoreMemcachedApplyParametersOperator`
:param update_mask: Required. Mask of fields to update.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMask`
:type update_mask:
Union[Dict, google.cloud.memcache_v1beta2.types.cloud_memcache.field_mask.FieldMask]
:param parameters: The parameters to apply to the instance.
If a dict is provided, it must be of the same form as the protobuf message
:class:`~google.cloud.memcache_v1beta2.types.cloud_memcache.MemcacheParameters`
:type parameters: Union[Dict, google.cloud.memcache_v1beta2.types.cloud_memcache.MemcacheParameters]
:param location: The location of the Cloud Memorystore instance (for example europe-west1)
:type location: str
:param instance_id: The logical name of the Memcached instance in the customer project.
:type instance_id: str
:param project_id: Project ID of the project that contains the instance. If set
to None or missing, the default project_id from the Google Cloud connection is used.
:type project_id: str
:param retry: A retry object used to retry requests. If ``None`` is specified, requests will not be
retried.
:type retry: google.api_core.retry.Retry
:param timeout: The amount of time, in seconds, to wait for the request to complete. Note that if
``retry`` is specified, the timeout applies to each individual attempt.
:type timeout: float
:param metadata: Additional metadata that is provided to the method.
:type metadata: Sequence[Tuple[str, str]]
"""
template_fields = (
"update_mask",
"parameters",
"location",
"instance_id",
"project_id",
"retry",
"timeout",
"metadata",
"gcp_conn_id",
"impersonation_chain",
)
@apply_defaults
def __init__(
self,
*,
update_mask: Union[Dict, cloud_memcache.field_mask.FieldMask],
parameters: Union[Dict, cloud_memcache.MemcacheParameters],
location: str,
instance_id: str,
project_id: str,
retry: Optional[Retry] = None,
timeout: Optional[float] = None,
metadata: Optional[Sequence[Tuple[str, str]]] = None,
gcp_conn_id: str = "google_cloud_default",
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.update_mask = update_mask
self.parameters = parameters
self.location = location
self.instance_id = instance_id
self.project_id = project_id
self.retry = retry
self.timeout = timeout
self.metadata = metadata
self.gcp_conn_id = gcp_conn_id
self.impersonation_chain = impersonation_chain
def execute(self, context: Dict):
hook = CloudMemorystoreMemcachedHook(
gcp_conn_id=self.gcp_conn_id, impersonation_chain=self.impersonation_chain
)
hook.update_parameters(
update_mask=self.update_mask,
parameters=self.parameters,
location=self.location,
instance_id=self.instance_id,
project_id=self.project_id,
retry=self.retry,
timeout=self.timeout,
metadata=self.metadata,
)
| 41.476522 | 109 | 0.671992 | 8,922 | 71,547 | 5.267429 | 0.043488 | 0.033705 | 0.026811 | 0.013788 | 0.904013 | 0.899289 | 0.895757 | 0.892416 | 0.890246 | 0.884799 | 0 | 0.001352 | 0.255776 | 71,547 | 1,724 | 110 | 41.50058 | 0.881252 | 0.537591 | 0 | 0.87157 | 0 | 0 | 0.063165 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.039517 | false | 0 | 0.01427 | 0 | 0.09989 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0ad6810eb2c306a678a7bac55a59ef798da52c4f | 42,762 | py | Python | saleor/payment/gateways/stripe/tests/test_plugin.py | wuchujiecode/saleor | c2ee650e11b1dde6744be7c46e28262318ae4ac9 | [
"CC-BY-4.0"
] | null | null | null | saleor/payment/gateways/stripe/tests/test_plugin.py | wuchujiecode/saleor | c2ee650e11b1dde6744be7c46e28262318ae4ac9 | [
"CC-BY-4.0"
] | null | null | null | saleor/payment/gateways/stripe/tests/test_plugin.py | wuchujiecode/saleor | c2ee650e11b1dde6744be7c46e28262318ae4ac9 | [
"CC-BY-4.0"
] | null | null | null | import warnings
from unittest.mock import Mock, patch
import pytest
from django.core.exceptions import ValidationError
from stripe.error import AuthenticationError, StripeError
from stripe.stripe_object import StripeObject
from .....plugins.models import PluginConfiguration
from .... import TransactionKind
from ....interface import GatewayResponse
from ....utils import (
create_payment_information,
create_transaction,
price_to_minor_unit,
)
from ..consts import (
ACTION_REQUIRED_STATUSES,
AUTHORIZED_STATUS,
AUTOMATIC_CAPTURE_METHOD,
MANUAL_CAPTURE_METHOD,
PROCESSING_STATUS,
STRIPE_API_VERSION,
SUCCESS_STATUS,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint.list")
def test_validate_plugin_configuration_correct_configuration(
mocked_stripe, stripe_plugin
):
plugin = stripe_plugin(
public_api_key="public",
secret_api_key="ABC",
active=True,
)
configuration = PluginConfiguration.objects.get()
plugin.validate_plugin_configuration(configuration)
assert mocked_stripe.called
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint.list")
def test_validate_plugin_configuration_incorrect_configuration(
mocked_stripe, stripe_plugin
):
mocked_stripe.side_effect = AuthenticationError()
plugin = stripe_plugin(
public_api_key="public",
secret_api_key="wrong",
active=True,
)
configuration = PluginConfiguration.objects.get()
with pytest.raises(ValidationError):
plugin.validate_plugin_configuration(configuration)
assert mocked_stripe.called
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint.list")
def test_validate_plugin_configuration_missing_required_fields(
mocked_stripe, stripe_plugin
):
plugin = stripe_plugin(
secret_api_key="wrong",
active=True,
)
configuration = PluginConfiguration.objects.get()
for config_field in configuration.configuration:
if config_field["name"] == "public_api_key":
config_field["value"] = None
break
with pytest.raises(ValidationError):
plugin.validate_plugin_configuration(configuration)
assert not mocked_stripe.called
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint.list")
def test_validate_plugin_configuration_validate_only_when_active(
mocked_stripe, stripe_plugin
):
plugin = stripe_plugin(
secret_api_key="wrong",
active=False,
)
configuration = PluginConfiguration.objects.get()
for config_field in configuration.configuration:
if config_field["name"] == "public_api_key":
config_field["value"] = None
break
plugin.validate_plugin_configuration(configuration)
assert not mocked_stripe.called
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint.delete")
def test_pre_save_plugin_configuration_removes_webhook_when_disabled(
mocked_stripe, stripe_plugin
):
plugin = stripe_plugin(
active=False, webhook_secret_key="secret", webhook_endpoint_id="endpoint"
)
configuration = PluginConfiguration.objects.get()
plugin.pre_save_plugin_configuration(configuration)
assert all(
[
c_field["value"] != "endpoint"
for c_field in configuration.configuration
if c_field["name"] == "webhook_endpoint_id"
]
)
assert all(
[
c_field["name"] != "webhook_secret_key"
for c_field in configuration.configuration
]
)
assert mocked_stripe.called
def get_field_from_plugin_configuration(
plugin_configuration: PluginConfiguration, field_name: str
):
configuration = plugin_configuration.configuration
for config_field in configuration:
if config_field["name"] == field_name:
return config_field
return None
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.WebhookEndpoint.create")
def test_pre_save_plugin_configuration(mocked_stripe, stripe_plugin):
webhook_object = StripeObject(id="stripe_webhook_id", last_response={})
webhook_object.secret = "stripe_webhook_secret"
mocked_stripe.return_value = webhook_object
plugin = stripe_plugin(
active=True, webhook_endpoint_id=None, webhook_secret_key=None
)
configuration = PluginConfiguration.objects.get()
plugin.pre_save_plugin_configuration(configuration)
webhook_id = get_field_from_plugin_configuration(
configuration, "webhook_endpoint_id"
)
webhook_secret = get_field_from_plugin_configuration(
configuration, "webhook_secret_key"
)
assert webhook_id["value"] == "stripe_webhook_id"
assert webhook_secret["value"] == "stripe_webhook_secret"
assert mocked_stripe.called
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment(
mocked_payment_intent,
mocked_customer,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
):
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = "requires_payment_method"
plugin = stripe_plugin(auto_capture=True)
payment_info = create_payment_information(
payment_stripe_for_checkout,
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is True
assert response.kind == TransactionKind.ACTION_TO_CONFIRM
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
assert not mocked_customer.called
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_customer(
mocked_payment_intent,
mocked_customer_create,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
customer_user,
):
customer = StripeObject(id="cus_id")
mocked_customer_create.return_value = customer
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = "requires_payment_method"
plugin = stripe_plugin(auto_capture=True)
payment_stripe_for_checkout.checkout.user = customer_user
payment_stripe_for_checkout.checkout.email = customer_user.email
payment_info = create_payment_information(
payment_stripe_for_checkout,
customer_id=None,
store_source=True,
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is True
assert response.kind == TransactionKind.ACTION_TO_CONFIRM
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
customer=customer,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
mocked_customer_create.assert_called_once_with(
api_key="secret_key",
email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_customer_and_future_usage(
mocked_payment_intent,
mocked_customer_create,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
customer_user,
):
customer = Mock()
mocked_customer_create.return_value = customer
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = SUCCESS_STATUS
plugin = stripe_plugin(auto_capture=True)
payment_stripe_for_checkout.checkout.user = customer_user
payment_stripe_for_checkout.checkout.email = customer_user.email
payment_info = create_payment_information(
payment_stripe_for_checkout,
customer_id=None,
store_source=True,
additional_data={"setup_future_usage": "off_session"},
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.CAPTURE
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
customer=customer,
setup_future_usage="off_session",
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
mocked_customer_create.assert_called_once_with(
api_key="secret_key",
email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_customer_and_payment_method(
mocked_payment_intent,
mocked_customer_create,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
customer_user,
):
customer = Mock()
mocked_customer_create.return_value = customer
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = SUCCESS_STATUS
plugin = stripe_plugin(auto_capture=True)
payment_stripe_for_checkout.checkout.user = customer_user
payment_stripe_for_checkout.checkout.email = customer_user.email
payment_info = create_payment_information(
payment_stripe_for_checkout,
customer_id=None,
store_source=True,
additional_data={"payment_method_id": "pm_ID"},
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.CAPTURE
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
customer=customer,
payment_method="pm_ID",
off_session=False,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
mocked_customer_create.assert_called_once_with(
api_key="secret_key",
email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_payment_method_types(
mocked_payment_intent,
mocked_customer_create,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
customer_user,
):
customer = Mock()
mocked_customer_create.return_value = customer
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = SUCCESS_STATUS
plugin = stripe_plugin(auto_capture=True)
payment_stripe_for_checkout.checkout.user = customer_user
payment_stripe_for_checkout.checkout.email = customer_user.email
payment_stripe_for_checkout.save()
payment_info = create_payment_information(
payment_stripe_for_checkout,
customer_id=None,
store_source=True,
additional_data={"payment_method_types": ["p24", "card"]},
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.CAPTURE
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
customer=customer,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
payment_method_types=["p24", "card"],
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
mocked_customer_create.assert_called_once_with(
api_key="secret_key",
email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_offline(
mocked_payment_intent,
mocked_customer_create,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
customer_user,
):
customer = Mock()
mocked_customer_create.return_value = customer
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = SUCCESS_STATUS
plugin = stripe_plugin(auto_capture=True)
payment_stripe_for_checkout.checkout.user = customer_user
payment_stripe_for_checkout.checkout.email = customer_user.email
payment_info = create_payment_information(
payment_stripe_for_checkout,
customer_id=None,
store_source=True,
additional_data={"payment_method_id": "pm_ID", "off_session": True},
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.CAPTURE
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
customer=customer,
payment_method="pm_ID",
confirm=True,
off_session=True,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
mocked_customer_create.assert_called_once_with(
api_key="secret_key",
email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_customer_and_payment_method_raises_authentication_error(
mocked_payment_intent,
mocked_customer_create,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
customer_user,
):
customer = Mock()
mocked_customer_create.return_value = customer
payment_intent = Mock()
stripe_error_object = StripeError()
stripe_error_object.error = StripeError()
stripe_error_object.code = "authentication_required"
stripe_error_object.error.payment_intent = payment_intent
mocked_payment_intent.side_effect = stripe_error_object
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = SUCCESS_STATUS
plugin = stripe_plugin(auto_capture=True)
payment_stripe_for_checkout.checkout.user = customer_user
payment_stripe_for_checkout.checkout.email = customer_user.email
payment_info = create_payment_information(
payment_stripe_for_checkout,
customer_id=None,
store_source=True,
additional_data={"payment_method_id": "pm_ID", "off_session": True},
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.CAPTURE
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
customer=customer,
payment_method="pm_ID",
confirm=True,
off_session=True,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
mocked_customer_create.assert_called_once_with(
api_key="secret_key",
email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Customer.create")
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_customer_and_payment_method_raises_error(
mocked_payment_intent,
mocked_customer_create,
stripe_plugin,
payment_stripe_for_checkout,
channel_USD,
customer_user,
):
customer = Mock()
mocked_customer_create.return_value = customer
payment_intent = Mock()
stripe_error_object = StripeError(
message="Card declined", json_body={"error": "body"}
)
stripe_error_object.error = StripeError()
stripe_error_object.code = "card_declined"
stripe_error_object.error.payment_intent = payment_intent
mocked_payment_intent.side_effect = stripe_error_object
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
plugin = stripe_plugin(auto_capture=True)
payment_stripe_for_checkout.checkout.user = customer_user
payment_stripe_for_checkout.checkout.email = customer_user.email
payment_info = create_payment_information(
payment_stripe_for_checkout,
customer_id=None,
store_source=True,
additional_data={"payment_method_id": "pm_ID", "off_session": True},
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is False
assert response.action_required is True
assert response.kind == TransactionKind.ACTION_TO_CONFIRM
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert not response.transaction_id
assert response.error == "Card declined"
assert response.raw_response == {"error": "body"}
assert response.action_required_data == {"client_secret": None, "id": None}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
customer=customer,
payment_method="pm_ID",
confirm=True,
off_session=True,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
mocked_customer_create.assert_called_once_with(
api_key="secret_key",
email=customer_user.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_disabled_order_auto_confirmation(
mocked_payment_intent,
stripe_plugin,
payment_stripe_for_checkout,
site_settings,
channel_USD,
):
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
payment_intent.status = "requires_payment_method"
plugin = stripe_plugin(auto_capture=True)
payment_info = create_payment_information(
payment_stripe_for_checkout,
)
site_settings.automatically_confirm_all_new_orders = False
site_settings.save()
response = plugin.process_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is True
assert response.kind == TransactionKind.ACTION_TO_CONFIRM
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == dummy_response
assert response.action_required_data == {
"client_secret": client_secret,
"id": payment_intent_id,
}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=MANUAL_CAPTURE_METHOD,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_manual_capture(
mocked_payment_intent, stripe_plugin, payment_stripe_for_checkout, channel_USD
):
payment_intent = Mock()
mocked_payment_intent.return_value = payment_intent
client_secret = "client-secret"
dummy_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment_intent.id = payment_intent_id
payment_intent.client_secret = client_secret
payment_intent.last_response.data = dummy_response
plugin = stripe_plugin(auto_capture=False)
payment_info = create_payment_information(
payment_stripe_for_checkout,
)
plugin.process_payment(payment_info, None)
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=MANUAL_CAPTURE_METHOD,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.create")
def test_process_payment_with_error(
mocked_payment_intent, stripe_plugin, payment_stripe_for_checkout, channel_USD
):
mocked_payment_intent.side_effect = StripeError(
message="stripe-error", json_body={"error": "body"}
)
plugin = stripe_plugin()
payment_info = create_payment_information(
payment_stripe_for_checkout,
)
response = plugin.process_payment(payment_info, None)
assert response.is_success is False
assert response.action_required is True
assert response.kind == TransactionKind.ACTION_TO_CONFIRM
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == ""
assert response.error == "stripe-error"
assert response.raw_response == {"error": "body"}
assert response.action_required_data == {"client_secret": None, "id": None}
api_key = plugin.config.connection_params["secret_api_key"]
mocked_payment_intent.assert_called_once_with(
api_key=api_key,
amount=price_to_minor_unit(payment_info.amount, payment_info.currency),
currency=payment_info.currency,
capture_method=AUTOMATIC_CAPTURE_METHOD,
metadata={
"channel": channel_USD.slug,
"payment_id": payment_info.graphql_payment_id,
},
receipt_email=payment_stripe_for_checkout.checkout.email,
stripe_version=STRIPE_API_VERSION,
)
@pytest.mark.parametrize("kind", [TransactionKind.AUTH, TransactionKind.CAPTURE])
def test_confirm_payment_for_webhook(kind, stripe_plugin, payment_stripe_for_checkout):
payment_intent_id = "payment-intent-id"
gateway_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment = payment_stripe_for_checkout
payment.transactions.create(
is_success=True,
action_required=False,
kind=kind,
token=payment_intent_id,
gateway_response=gateway_response,
amount=payment.total,
currency=payment.currency,
)
payment_info = create_payment_information(
payment_stripe_for_checkout,
)
plugin = stripe_plugin()
response = plugin.confirm_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == kind
assert response.amount == payment_info.amount
assert response.currency == payment_info.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
assert response.raw_response == gateway_response
assert response.action_required_data is None
assert response.transaction_already_processed is True
@pytest.mark.parametrize(
"kind, status",
[
(TransactionKind.AUTH, AUTHORIZED_STATUS),
(TransactionKind.CAPTURE, SUCCESS_STATUS),
],
)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.retrieve")
def test_confirm_payment(
mocked_intent_retrieve, kind, status, stripe_plugin, payment_stripe_for_checkout
):
gateway_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment = payment_stripe_for_checkout
payment.transactions.create(
is_success=True,
action_required=False,
kind=TransactionKind.ACTION_TO_CONFIRM,
token=payment_intent_id,
gateway_response=gateway_response,
amount=payment.total,
currency=payment.currency,
)
payment_intent = StripeObject(id=payment_intent_id)
payment_intent["amount"] = price_to_minor_unit(payment.total, payment.currency)
payment_intent["status"] = status
payment_intent["currency"] = payment.currency
mocked_intent_retrieve.return_value = payment_intent
payment_info = create_payment_information(
payment_stripe_for_checkout, payment_token=payment_intent_id
)
plugin = stripe_plugin()
response = plugin.confirm_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == kind
assert response.amount == payment.total
assert response.currency == payment.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.retrieve")
def test_confirm_payment_incorrect_payment_intent(
mocked_intent_retrieve, stripe_plugin, payment_stripe_for_checkout
):
gateway_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment = payment_stripe_for_checkout
payment.transactions.create(
is_success=True,
action_required=False,
kind=TransactionKind.ACTION_TO_CONFIRM,
token=payment_intent_id,
gateway_response=gateway_response,
amount=payment.total,
currency=payment.currency,
)
mocked_intent_retrieve.side_effect = StripeError(message="stripe-error")
payment_info = create_payment_information(
payment_stripe_for_checkout, payment_token=payment_intent_id
)
plugin = stripe_plugin()
with warnings.catch_warnings(record=True):
response = plugin.confirm_payment(payment_info, None)
assert response.is_success is False
assert response.action_required is False
assert response.kind == TransactionKind.AUTH
assert response.amount == payment.total
assert response.currency == payment.currency
assert response.transaction_id == ""
assert response.error == "stripe-error"
@pytest.mark.parametrize("status", ACTION_REQUIRED_STATUSES)
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.retrieve")
def test_confirm_payment_action_required_status(
mocked_intent_retrieve, status, stripe_plugin, payment_stripe_for_checkout
):
gateway_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment = payment_stripe_for_checkout
payment.transactions.create(
is_success=True,
action_required=False,
kind=TransactionKind.ACTION_TO_CONFIRM,
token=payment_intent_id,
gateway_response=gateway_response,
amount=payment.total,
currency=payment.currency,
)
payment_intent = StripeObject(id=payment_intent_id)
payment_intent["capture_method"] = "automatic"
payment_intent["amount"] = price_to_minor_unit(payment.total, payment.currency)
payment_intent["status"] = status
payment_intent["currency"] = payment.currency
mocked_intent_retrieve.return_value = payment_intent
payment_info = create_payment_information(
payment_stripe_for_checkout, payment_token=payment_intent_id
)
plugin = stripe_plugin()
response = plugin.confirm_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is True
assert response.kind == TransactionKind.ACTION_TO_CONFIRM
assert response.amount == payment.total
assert response.currency == payment.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.retrieve")
def test_confirm_payment_processing_status(
mocked_intent_retrieve, stripe_plugin, payment_stripe_for_checkout
):
gateway_response = {
"id": "evt_1Ip9ANH1Vac4G4dbE9ch7zGS",
}
payment_intent_id = "payment-intent-id"
payment = payment_stripe_for_checkout
payment.transactions.create(
is_success=True,
action_required=False,
kind=TransactionKind.ACTION_TO_CONFIRM,
token=payment_intent_id,
gateway_response=gateway_response,
amount=payment.total,
currency=payment.currency,
)
payment_intent = StripeObject(id=payment_intent_id)
payment_intent["capture_method"] = "automatic"
payment_intent["amount"] = price_to_minor_unit(payment.total, payment.currency)
payment_intent["status"] = PROCESSING_STATUS
payment_intent["currency"] = payment.currency
mocked_intent_retrieve.return_value = payment_intent
payment_info = create_payment_information(
payment_stripe_for_checkout, payment_token=payment_intent_id
)
plugin = stripe_plugin()
response = plugin.confirm_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.PENDING
assert response.amount == payment.total
assert response.currency == payment.currency
assert response.transaction_id == payment_intent_id
assert response.error is None
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.capture")
def test_capture_payment(
mocked_capture, payment_stripe_for_order, order_with_lines, stripe_plugin
):
payment = payment_stripe_for_order
payment_intent_id = "ABC"
payment_intent = StripeObject(id=payment_intent_id)
payment_intent["amount"] = price_to_minor_unit(payment.total, payment.currency)
payment_intent["status"] = SUCCESS_STATUS
payment_intent["currency"] = payment.currency
payment_intent["last_response"] = StripeObject()
payment_intent["last_response"]["data"] = {"response": "json"}
mocked_capture.return_value = payment_intent
payment_info = create_payment_information(
payment,
payment_token=payment_intent_id,
)
gateway_response = GatewayResponse(
kind=TransactionKind.AUTH,
action_required=False,
transaction_id=payment_intent_id,
is_success=True,
amount=payment_info.amount,
currency=payment_info.currency,
error="",
raw_response={},
)
create_transaction(
payment=payment,
payment_information=payment_info,
kind=TransactionKind.AUTH,
gateway_response=gateway_response,
)
plugin = stripe_plugin()
response = plugin.capture_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.CAPTURE
assert response.amount == payment.total
assert response.currency == order_with_lines.currency
assert response.transaction_id == payment_intent_id
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.Refund.create")
def test_refund_payment(
mocked_refund, payment_stripe_for_order, order_with_lines, stripe_plugin
):
payment = payment_stripe_for_order
payment_intent_id = "ABC"
refund_object = StripeObject(id=payment_intent_id)
refund_object["amount"] = price_to_minor_unit(payment.total, payment.currency)
refund_object["status"] = SUCCESS_STATUS
refund_object["currency"] = payment.currency
refund_object["last_response"] = StripeObject()
refund_object["last_response"]["data"] = {"response": "json"}
mocked_refund.return_value = refund_object
payment_info = create_payment_information(
payment,
payment_token=payment_intent_id,
)
gateway_response = GatewayResponse(
kind=TransactionKind.CAPTURE,
action_required=False,
transaction_id=payment_intent_id,
is_success=True,
amount=payment_info.amount,
currency=payment_info.currency,
error="",
raw_response={},
)
create_transaction(
payment=payment,
payment_information=payment_info,
kind=TransactionKind.CAPTURE,
gateway_response=gateway_response,
)
plugin = stripe_plugin()
response = plugin.refund_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.REFUND
assert response.amount == payment.total
assert response.currency == order_with_lines.currency
assert response.transaction_id == payment_intent_id
@patch("saleor.payment.gateways.stripe.stripe_api.stripe.PaymentIntent.cancel")
def test_void_payment(
mocked_cancel, payment_stripe_for_order, order_with_lines, stripe_plugin
):
payment = payment_stripe_for_order
payment_intent_id = "ABC"
payment_intent = StripeObject(id=payment_intent_id)
payment_intent["amount"] = price_to_minor_unit(payment.total, payment.currency)
payment_intent["status"] = SUCCESS_STATUS
payment_intent["currency"] = payment.currency
payment_intent["last_response"] = StripeObject()
payment_intent["last_response"]["data"] = {"response": "json"}
mocked_cancel.return_value = payment_intent
payment_info = create_payment_information(
payment,
payment_token=payment_intent_id,
)
gateway_response = GatewayResponse(
kind=TransactionKind.AUTH,
action_required=False,
transaction_id=payment_intent_id,
is_success=True,
amount=payment_info.amount,
currency=payment_info.currency,
error="",
raw_response={},
)
create_transaction(
payment=payment,
payment_information=payment_info,
kind=TransactionKind.AUTH,
gateway_response=gateway_response,
)
plugin = stripe_plugin()
response = plugin.void_payment(payment_info, None)
assert response.is_success is True
assert response.action_required is False
assert response.kind == TransactionKind.VOID
assert response.amount == payment.total
assert response.currency == order_with_lines.currency
assert response.transaction_id == payment_intent_id
| 33.992051 | 87 | 0.736612 | 4,911 | 42,762 | 6.030747 | 0.036042 | 0.093055 | 0.049127 | 0.038458 | 0.921025 | 0.908465 | 0.888645 | 0.882973 | 0.882973 | 0.870649 | 0 | 0.003123 | 0.183831 | 42,762 | 1,257 | 88 | 34.019093 | 0.845477 | 0 | 0 | 0.777674 | 0 | 0 | 0.107666 | 0.063795 | 0 | 0 | 0 | 0 | 0.164165 | 1 | 0.02439 | false | 0 | 0.010319 | 0 | 0.036585 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
0aeba1f08586d4aa45179934da01a647f261afd0 | 3,948 | py | Python | pytorch-frontend/test/test_quantization.py | AndreasKaratzas/stonne | 2915fcc46cc94196303d81abbd1d79a56d6dd4a9 | [
"MIT"
] | 40 | 2021-06-01T07:37:59.000Z | 2022-03-25T01:42:09.000Z | pytorch-frontend/test/test_quantization.py | AndreasKaratzas/stonne | 2915fcc46cc94196303d81abbd1d79a56d6dd4a9 | [
"MIT"
] | 14 | 2021-06-01T11:52:46.000Z | 2022-03-25T02:13:08.000Z | pytorch-frontend/test/test_quantization.py | AndreasKaratzas/stonne | 2915fcc46cc94196303d81abbd1d79a56d6dd4a9 | [
"MIT"
] | 7 | 2021-07-20T19:34:26.000Z | 2022-03-13T21:07:36.000Z | # -*- coding: utf-8 -*-
from torch.testing._internal.common_utils import run_tests
# Quantized Tensor
from quantization.test_quantized_tensor import TestQuantizedTensor # noqa: F401
# Quantized Op
# TODO: merge test cases in quantization.test_quantized
from quantization.test_quantized_op import TestQuantizedOps # noqa: F401
from quantization.test_quantized_op import TestQNNPackOps # noqa: F401
from quantization.test_quantized_op import TestQuantizedLinear # noqa: F401
from quantization.test_quantized_op import TestQuantizedConv # noqa: F401
from quantization.test_quantized_op import TestDynamicQuantizedLinear # noqa: F401
from quantization.test_quantized_op import TestComparatorOps # noqa: F401
from quantization.test_quantized_op import TestPadding # noqa: F401
from quantization.test_quantized_op import TestQuantizedEmbeddingBag # noqa: F401
# Quantized Functional
from quantization.test_quantized_functional import TestQuantizedFunctional # noqa: F401
# Quantized Module
from quantization.test_quantized_module import TestStaticQuantizedModule # noqa: F401
from quantization.test_quantized_module import TestDynamicQuantizedModule # noqa: F401
# Quantization Aware Training
from quantization.test_qat_module import TestQATModule # noqa: F401
# Quantization specific fusion passes
from quantization.test_fusion_passes import TestFusionPasses # noqa: F401
# Module
# TODO: merge the fake quant per tensor and per channel test cases
# TODO: some of the tests are actually operator tests, e.g. test_forward_per_tensor, and
# should be moved to test_quantized_op
from quantization.test_workflow_module import TestFakeQuantizePerTensor # noqa: F401
from quantization.test_workflow_module import TestFakeQuantizePerChannel # noqa: F401
from quantization.test_workflow_module import TestObserver # noqa: F401
# TODO: merge with TestObserver
# TODO: some tests belong to test_quantize.py, e.g. test_record_observer
from quantization.test_workflow_module import TestRecordHistogramObserver # noqa: F401
from quantization.test_workflow_module import TestDistributed # noqa: F401
# Workflow
# 1. Eager mode quantization
from quantization.test_quantize import TestPostTrainingStatic # noqa: F401
from quantization.test_quantize import TestPostTrainingDynamic # noqa: F401
from quantization.test_quantize import TestQuantizationAwareTraining # noqa: F401
# TODO: merge with other tests in test_quantize.py?
from quantization.test_quantize import TestFunctionalModule # noqa: F401
from quantization.test_quantize import TestFusion # noqa: F401
from quantization.test_quantize import TestModelNumerics # noqa: F401
from quantization.test_quantize import TestQuantizeONNXExport # noqa: F401
from quantization.test_quantize import TestDeprecatedJitQuantized # noqa: F401
# 2. Graph mode quantization
from quantization.test_quantize_jit import TestQuantizeJit # noqa: F401
from quantization.test_quantize_jit import TestQuantizeJitPasses # noqa: F401
from quantization.test_quantize_jit import TestQuantizeJitOps # noqa: F401
from quantization.test_quantize_jit import TestQuantizeDynamicJitPasses # noqa: F401
from quantization.test_quantize_jit import TestQuantizeDynamicJitOps # noqaa: F401
# 3. GraphModule based graph mode quantization
from quantization.test_quantize_fx import TestQuantizeFx # noqa: F401
from quantization.test_quantize_fx import TestQuantizeFxOps # noqa: F401
from quantization.test_quantize_fx import TestQuantizeFxModels # noqa: F401
# Tooling: numric_suite
from quantization.test_numeric_suite import TestEagerModeNumericSuite # noqa: F401
# Backward Compatibility
from quantization.test_backward_compatibility import TestSerialization # noqa: F401
# Equalization
from quantization.test_equalize import TestEqualizeEager # noqa: F401
# Bias Correction
from quantization.test_bias_correction import TestBiasCorrection # noqa: F401
if __name__ == '__main__':
run_tests()
| 48.740741 | 88 | 0.838399 | 467 | 3,948 | 6.890792 | 0.256959 | 0.198881 | 0.242387 | 0.171535 | 0.443443 | 0.417961 | 0.33468 | 0.225917 | 0 | 0 | 0 | 0.03468 | 0.116261 | 3,948 | 80 | 89 | 49.35 | 0.887647 | 0.295339 | 0 | 0 | 0 | 0 | 0.002942 | 0 | 0 | 0 | 0 | 0.0125 | 0 | 1 | 0 | true | 0.071429 | 0.952381 | 0 | 0.952381 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 7 |
e409bcdb6202b143a3d0825b05a8b2312f659c28 | 127 | py | Python | deepface/models/__init__.py | MatheusAD95/fg2020-faceunderstanding | 95a3d04f68c2c3207137a9f3b9fb3f8e2134fe8e | [
"MIT"
] | null | null | null | deepface/models/__init__.py | MatheusAD95/fg2020-faceunderstanding | 95a3d04f68c2c3207137a9f3b9fb3f8e2134fe8e | [
"MIT"
] | null | null | null | deepface/models/__init__.py | MatheusAD95/fg2020-faceunderstanding | 95a3d04f68c2c3207137a9f3b9fb3f8e2134fe8e | [
"MIT"
] | null | null | null | # import deepface.models.resnet
# import deepface.models.base_model
# import deepface.models.vgg
from .resnet import Resnet50
| 21.166667 | 35 | 0.811024 | 17 | 127 | 6 | 0.529412 | 0.411765 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017699 | 0.110236 | 127 | 5 | 36 | 25.4 | 0.884956 | 0.708661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
7c4db011b107b2793ccf465fcea7879e9d07b2c8 | 7,768 | py | Python | tools/mo/unit_tests/mo/front/HSwish_fusing_test.py | ryanloney/openvino-1 | 4e0a740eb3ee31062ba0df88fcf438564f67edb7 | [
"Apache-2.0"
] | 1,127 | 2018-10-15T14:36:58.000Z | 2020-04-20T09:29:44.000Z | tools/mo/unit_tests/mo/front/HSwish_fusing_test.py | ryanloney/openvino-1 | 4e0a740eb3ee31062ba0df88fcf438564f67edb7 | [
"Apache-2.0"
] | 439 | 2018-10-20T04:40:35.000Z | 2020-04-19T05:56:25.000Z | tools/mo/unit_tests/mo/front/HSwish_fusing_test.py | ryanloney/openvino-1 | 4e0a740eb3ee31062ba0df88fcf438564f67edb7 | [
"Apache-2.0"
] | 414 | 2018-10-17T05:53:46.000Z | 2020-04-16T17:29:53.000Z | # Copyright (C) 2018-2022 Intel Corporation
# SPDX-License-Identifier: Apache-2.0
import unittest
from openvino.tools.mo.front.HSwish_fusion import HSwishWithClamp, HSwishWithMinMax
from openvino.tools.mo.front.common.partial_infer.utils import float_array
from openvino.tools.mo.utils.ir_engine.compare_graphs import compare_graphs
from unit_tests.utils.graph import build_graph, const, regular_op, result, build_graph_with_edge_attrs
ref_nodes = {**regular_op('input', {'type': 'Parameter'}),
**regular_op('hswish', {'type': 'HSwish', 'name': 'final_mul'}),
**result('result')
}
ref_edges = [('input', 'hswish'), ('hswish', 'result')]
class HSwishWithClampTest(unittest.TestCase):
nodes = {
**regular_op('input', {'type': 'Parameter'}),
**regular_op('add', {'op': 'Add'}),
**regular_op('relu6', {'op': 'Clamp'}),
**regular_op('mul', {'op': 'Mul'}),
**regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
**const('const_0', float_array([0.0])),
**const('const_3', float_array([3.0])),
**const('const_6', float_array([6.0])),
**const('const_1_6', float_array([1.0 / 6.0])),
**result('result'),
}
edges = [('input', 'mul', {'in': 0, 'out': 0}),
('input', 'add', {'in': 0, 'out': 0}),
('const_3', 'add', {'in': 1, 'out': 0}),
('add', 'relu6', {'in': 0, 'out': 0}),
('const_0', 'relu6', {'in': 1, 'out': 0}),
('const_6', 'relu6', {'in': 2, 'out': 0}),
('relu6', 'mul', {'in': 1, 'out': 0}),
('mul', 'mul_2', {'in': 0, 'out': 0}),
('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
('mul_2', 'result', {'in': 0, 'out': 0})]
def test_hswish_with_clamp(self):
graph = build_graph_with_edge_attrs(self.nodes, self.edges, {})
graph_ref = build_graph(ref_nodes, ref_edges)
graph.stage = 'front'
HSwishWithClamp().find_and_replace_pattern(graph)
(flag, resp) = compare_graphs(graph, graph_ref, 'result')
self.assertTrue(flag, resp)
self.assertTrue(len(graph.get_op_nodes(name='final_mul')) == 1 and
graph.get_op_nodes(name='final_mul')[0].op == 'HSwish')
def test_hswish_with_clamp_wrong_constant(self):
graph = build_graph_with_edge_attrs(self.nodes, self.edges, {'const_0': {'value': float_array([0.00001])}})
graph_ref = graph.copy()
graph.stage = 'front'
HSwishWithClamp().find_and_replace_pattern(graph)
(flag, resp) = compare_graphs(graph, graph_ref, 'result')
self.assertTrue(flag, resp)
def test_hswish_with_clamp_different_tensors(self):
graph = build_graph_with_edge_attrs({
**regular_op('input', {'type': 'Parameter'}),
**regular_op('input_2', {'type': 'Parameter'}),
**regular_op('add', {'op': 'Add'}),
**regular_op('relu6', {'op': 'Clamp'}),
**regular_op('mul', {'op': 'Mul'}),
**regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
**const('const_0', float_array([0.0])),
**const('const_3', float_array([3.0])),
**const('const_6', float_array([6.0])),
**const('const_1_6', float_array([1.0 / 6.0])),
**result('result'),
}, [('input', 'mul', {'in': 0, 'out': 0}),
('input_2', 'add', {'in': 0, 'out': 0}),
('const_3', 'add', {'in': 1, 'out': 0}),
('add', 'relu6', {'in': 0, 'out': 0}),
('const_0', 'relu6', {'in': 1, 'out': 0}),
('const_6', 'relu6', {'in': 2, 'out': 0}),
('relu6', 'mul', {'in': 1, 'out': 0}),
('mul', 'mul_2', {'in': 0, 'out': 0}),
('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
('mul_2', 'result', {'in': 0, 'out': 0})])
graph_ref = graph.copy()
graph.stage = 'front'
HSwishWithClamp().find_and_replace_pattern(graph)
(flag, resp) = compare_graphs(graph, graph_ref, 'result')
self.assertTrue(flag, resp)
class HSwishWithMinMaxTest(unittest.TestCase):
nodes = {
**regular_op('input', {'type': 'Parameter'}),
**regular_op('add', {'op': 'Add'}),
**regular_op('max', {'op': 'Maximum'}),
**regular_op('min', {'op': 'Minimum'}),
**regular_op('mul', {'op': 'Mul'}),
**regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
**const('const_0', float_array([0.0])),
**const('const_3', float_array([3.0])),
**const('const_6', float_array([6.0])),
**const('const_1_6', float_array([1.0 / 6.0])),
**result('result'),
}
edges = [('input', 'mul', {'in': 1, 'out': 0}),
('input', 'add', {'in': 0, 'out': 0}),
('const_3', 'add', {'in': 1, 'out': 0}),
('add', 'max', {'in': 0, 'out': 0}),
('const_0', 'max', {'in': 1, 'out': 0}),
('max', 'min', {'in': 0, 'out': 0}),
('const_6', 'min', {'in': 1, 'out': 0}),
('min', 'mul', {'in': 0, 'out': 0}),
('mul', 'mul_2', {'in': 0, 'out': 0}),
('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
('mul_2', 'result', {'in': 0, 'out': 0})]
def test_hswish_with_min_max(self):
graph = build_graph_with_edge_attrs(self.nodes, self.edges, {})
graph_ref = build_graph(ref_nodes, ref_edges)
graph.stage = 'front'
HSwishWithMinMax().find_and_replace_pattern(graph)
(flag, resp) = compare_graphs(graph, graph_ref, 'result')
self.assertTrue(flag, resp)
self.assertTrue(len(graph.get_op_nodes(name='final_mul')) == 1 and
graph.get_op_nodes(name='final_mul')[0].op == 'HSwish')
def test_hswish_with_min_max_wrong_constant(self):
graph = build_graph_with_edge_attrs(self.nodes, self.edges, {'const_0': {'value': float_array([0.00001])}})
graph_ref = graph.copy()
graph.stage = 'front'
HSwishWithMinMax().find_and_replace_pattern(graph)
(flag, resp) = compare_graphs(graph, graph_ref, 'result')
self.assertTrue(flag, resp)
def test_hswish_with_min_max_different_tensors(self):
graph = build_graph_with_edge_attrs({
**regular_op('input', {'type': 'Parameter'}),
**regular_op('input_2', {'type': 'Parameter'}),
**regular_op('add', {'op': 'Add'}),
**regular_op('max', {'op': 'Maximum'}),
**regular_op('min', {'op': 'Minimum'}),
**regular_op('mul', {'op': 'Mul'}),
**regular_op('mul_2', {'op': 'Mul', 'name': 'final_mul'}),
**const('const_0', float_array([0.0])),
**const('const_3', float_array([3.0])),
**const('const_6', float_array([6.0])),
**const('const_1_6', float_array([1.0 / 6.0])),
**result('result'),
}, [('input_2', 'mul', {'in': 1, 'out': 0}),
('input', 'add', {'in': 0, 'out': 0}),
('const_3', 'add', {'in': 1, 'out': 0}),
('add', 'max', {'in': 0, 'out': 0}),
('const_0', 'max', {'in': 1, 'out': 0}),
('max', 'min', {'in': 0, 'out': 0}),
('const_6', 'min', {'in': 1, 'out': 0}),
('min', 'mul', {'in': 0, 'out': 0}),
('mul', 'mul_2', {'in': 0, 'out': 0}),
('const_1_6', 'mul_2', {'in': 1, 'out': 0}),
('mul_2', 'result', {'in': 0, 'out': 0})])
graph_ref = graph.copy()
graph.stage = 'front'
HSwishWithMinMax().find_and_replace_pattern(graph)
(flag, resp) = compare_graphs(graph, graph_ref, 'result')
self.assertTrue(flag, resp)
| 42.217391 | 115 | 0.510299 | 971 | 7,768 | 3.84243 | 0.091658 | 0.045028 | 0.035379 | 0.041276 | 0.889306 | 0.866256 | 0.861431 | 0.857143 | 0.846154 | 0.846154 | 0 | 0.038482 | 0.257338 | 7,768 | 183 | 116 | 42.448087 | 0.608251 | 0.009912 | 0 | 0.831081 | 0 | 0 | 0.169615 | 0 | 0 | 0 | 0 | 0 | 0.054054 | 1 | 0.040541 | false | 0 | 0.033784 | 0 | 0.114865 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7c8a829bc7c40d557dc9ae72a2775da9c0144f43 | 37,952 | py | Python | BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py | KaoTuz/edk2-stable202108 | 49d9306e7bf64b2f07d8473be1f2faea49d0a012 | [
"Python-2.0",
"Zlib",
"BSD-2-Clause",
"MIT",
"BSD-2-Clause-Patent",
"BSD-3-Clause"
] | 9 | 2021-07-26T17:02:51.000Z | 2021-12-30T10:49:46.000Z | BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py | ESdove/edk2_exploring | 34ff32b45f43d233d9696e7c8e3de68ea3000a7b | [
"Python-2.0",
"Zlib",
"BSD-2-Clause",
"MIT",
"BSD-2-Clause-Patent",
"BSD-3-Clause"
] | null | null | null | BaseTools/Source/Python/UPT/UnitTest/CommentGeneratingUnitTest.py | ESdove/edk2_exploring | 34ff32b45f43d233d9696e7c8e3de68ea3000a7b | [
"Python-2.0",
"Zlib",
"BSD-2-Clause",
"MIT",
"BSD-2-Clause-Patent",
"BSD-3-Clause"
] | null | null | null | ## @file
# This file contain unit test for CommentParsing
#
# Copyright (c) 2011 - 2018, Intel Corporation. All rights reserved.<BR>
#
# SPDX-License-Identifier: BSD-2-Clause-Patent
import os
import unittest
import Logger.Log as Logger
from GenMetaFile.GenInfFile import GenGuidSections
from GenMetaFile.GenInfFile import GenProtocolPPiSections
from GenMetaFile.GenInfFile import GenPcdSections
from GenMetaFile.GenInfFile import GenSpecialSections
from Library.CommentGenerating import GenGenericCommentF
from Library.CommentGenerating import _GetHelpStr
from Object.POM.CommonObject import TextObject
from Object.POM.CommonObject import GuidObject
from Object.POM.CommonObject import ProtocolObject
from Object.POM.CommonObject import PpiObject
from Object.POM.CommonObject import PcdObject
from Object.POM.ModuleObject import HobObject
from Library.StringUtils import GetSplitValueList
from Library.DataType import TAB_SPACE_SPLIT
from Library.DataType import TAB_LANGUAGE_EN_US
from Library.DataType import TAB_LANGUAGE_ENG
from Library.DataType import ITEM_UNDEFINED
from Library.DataType import TAB_INF_FEATURE_PCD
from Library import GlobalData
from Library.Misc import CreateDirectory
#
# Test _GetHelpStr
#
class _GetHelpStrTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
#
# Normal case1: have one help text object with Lang = 'en-US'
#
def testNormalCase1(self):
HelpStr = 'Hello world'
HelpTextObj = TextObject()
HelpTextObj.SetLang(TAB_LANGUAGE_EN_US)
HelpTextObj.SetString(HelpStr)
HelpTextList = [HelpTextObj]
Result = _GetHelpStr(HelpTextList)
self.assertEqual(Result, HelpStr)
#
# Normal case2: have two help text object with Lang = 'en-US' and other
#
def testNormalCase2(self):
HelpStr = 'Hello world'
HelpTextObj = TextObject()
HelpTextObj.SetLang(TAB_LANGUAGE_ENG)
HelpTextObj.SetString(HelpStr)
HelpTextList = [HelpTextObj]
ExpectedStr = 'Hello world1'
HelpTextObj = TextObject()
HelpTextObj.SetLang(TAB_LANGUAGE_EN_US)
HelpTextObj.SetString(ExpectedStr)
HelpTextList.append(HelpTextObj)
Result = _GetHelpStr(HelpTextList)
self.assertEqual(Result, ExpectedStr)
#
# Normal case3: have two help text object with Lang = '' and 'eng'
#
def testNormalCase3(self):
HelpStr = 'Hello world'
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
HelpTextList = [HelpTextObj]
ExpectedStr = 'Hello world1'
HelpTextObj = TextObject()
HelpTextObj.SetLang(TAB_LANGUAGE_ENG)
HelpTextObj.SetString(ExpectedStr)
HelpTextList.append(HelpTextObj)
Result = _GetHelpStr(HelpTextList)
self.assertEqual(Result, ExpectedStr)
#
# Normal case4: have two help text object with Lang = '' and ''
#
def testNormalCase4(self):
ExpectedStr = 'Hello world1'
HelpTextObj = TextObject()
HelpTextObj.SetLang(TAB_LANGUAGE_ENG)
HelpTextObj.SetString(ExpectedStr)
HelpTextList = [HelpTextObj]
HelpStr = 'Hello world'
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
HelpTextList.append(HelpTextObj)
Result = _GetHelpStr(HelpTextList)
self.assertEqual(Result, ExpectedStr)
#
# Normal case: have three help text object with Lang = '','en', 'en-US'
#
def testNormalCase5(self):
ExpectedStr = 'Hello world1'
HelpTextObj = TextObject()
HelpTextObj.SetLang(TAB_LANGUAGE_EN_US)
HelpTextObj.SetString(ExpectedStr)
HelpTextList = [HelpTextObj]
HelpStr = 'Hello unknown world'
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
HelpTextList.append(HelpTextObj)
HelpStr = 'Hello mysterious world'
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
HelpTextList.append(HelpTextObj)
Result = _GetHelpStr(HelpTextList)
self.assertEqual(Result, ExpectedStr)
HelpTextList.sort()
self.assertEqual(Result, ExpectedStr)
HelpTextList.sort(reverse=True)
self.assertEqual(Result, ExpectedStr)
#
# Test GenGuidSections
#
class GenGuidSectionsTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
#
# This is the API to generate Guid Object to help UnitTest
#
def GuidFactory(self, CName, FFE, Usage, GuidType, VariableName, HelpStr):
Guid = GuidObject()
Guid.SetCName(CName)
Guid.SetFeatureFlag(FFE)
Guid.SetGuidTypeList([GuidType])
Guid.SetUsage(Usage)
Guid.SetVariableName(VariableName)
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
Guid.SetHelpTextList([HelpTextObj])
return Guid
#
# Normal case: have two GuidObject
#
def testNormalCase1(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = 'Usage comment line 1'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'CONSUMES'
GuidType = 'Variable'
VariableName = ''
HelpStr = 'Usage comment line 2'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
## PRODUCES ## Event # Usage comment line 1
## CONSUMES ## Variable: # Usage comment line 2
Guid1|FFE1'''
self.assertEqual(Result.strip(), Expected)
#
# Normal case: have two GuidObject
#
def testNormalCase2(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = 'Usage comment line 1'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
GuidType = 'UNDEFINED'
VariableName = ''
HelpStr = 'Generic comment line 1\n Generic comment line 2'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
## PRODUCES ## Event # Usage comment line 1
# Generic comment line 1
# Generic comment line 2
Guid1|FFE1'''
self.assertEqual(Result.strip(), Expected)
#
# Normal case: have two GuidObject, one help goes to generic help,
# the other go into usage comment
#
def testNormalCase3(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
GuidType = 'UNDEFINED'
VariableName = ''
HelpStr = 'Generic comment'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = 'Usage comment line 1'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
# Generic comment
## PRODUCES ## Event # Usage comment line 1
Guid1|FFE1'''
self.assertEqual(Result.strip(), Expected)
#
# Normal case: have one GuidObject, generic comment multiple lines
#
def testNormalCase5(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
GuidType = 'UNDEFINED'
VariableName = ''
HelpStr = 'Generic comment line1 \n generic comment line 2'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
# Generic comment line1
# generic comment line 2
Guid1|FFE1'''
self.assertEqual(Result.strip(), Expected)
#
# Normal case: have one GuidObject, usage comment multiple lines
#
def testNormalCase6(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = 'Usage comment line 1\n Usage comment line 2'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
Guid1|FFE1 ## PRODUCES ## Event # Usage comment line 1 Usage comment line 2
'''
self.assertEqual(Result.strip(), Expected.strip())
#
# Normal case: have one GuidObject, usage comment one line
#
def testNormalCase7(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
GuidType = 'UNDEFINED'
VariableName = ''
HelpStr = 'Usage comment line 1'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
Guid1|FFE1 # Usage comment line 1
'''
self.assertEqual(Result.strip(), Expected.strip())
#
# Normal case: have two GuidObject
#
def testNormalCase8(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = 'Usage comment line 1\n Usage comment line 2'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = 'Usage comment line 3'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
## PRODUCES ## Event # Usage comment line 1 Usage comment line 2
## PRODUCES ## Event # Usage comment line 3
Guid1|FFE1
'''
self.assertEqual(Result.strip(), Expected.strip())
#
# Normal case: have no GuidObject
#
def testNormalCase9(self):
GuidList = []
Result = GenGuidSections(GuidList)
Expected = ''
self.assertEqual(Result.strip(), Expected.strip())
#
# Normal case: have one GuidObject with no comment generated
#
def testNormalCase10(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
GuidType = 'UNDEFINED'
VariableName = ''
HelpStr = ''
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
Guid1|FFE1
'''
self.assertEqual(Result.strip(), Expected.strip())
#
# Normal case: have three GuidObject
#
def testNormalCase11(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
GuidType = 'UNDEFINED'
VariableName = ''
HelpStr = 'general comment line 1'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = 'Usage comment line 3'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
GuidType = 'UNDEFINED'
VariableName = ''
HelpStr = 'general comment line 2'
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
# general comment line 1
## PRODUCES ## Event # Usage comment line 3
# general comment line 2
Guid1|FFE1
'''
self.assertEqual(Result.strip(), Expected.strip())
#
# Normal case: have three GuidObject, with Usage/Type and no help
#
def testNormalCase12(self):
GuidList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'GUID'
VariableName = ''
HelpStr = ''
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
GuidType = 'Event'
VariableName = ''
HelpStr = ''
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'CONSUMES'
GuidType = 'Event'
VariableName = ''
HelpStr = ''
Guid1 = self.GuidFactory(CName, FFE, Usage, GuidType,
VariableName, HelpStr)
GuidList.append(Guid1)
Result = GenGuidSections(GuidList)
Expected = '''[Guids]
## PRODUCES ## GUID
## PRODUCES ## Event
## CONSUMES ## Event
Guid1|FFE1
'''
self.assertEqual(Result.strip(), Expected.strip())
#
# Test GenProtocolPPiSections
#
class GenProtocolPPiSectionsTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
#
# This is the API to generate Protocol/Ppi Object to help UnitTest
#
def ObjectFactory(self, CName, FFE, Usage, Notify, HelpStr, IsProtocol):
if IsProtocol:
Object = ProtocolObject()
else:
Object = PpiObject()
Object.SetCName(CName)
Object.SetFeatureFlag(FFE)
Object.SetUsage(Usage)
Object.SetNotify(Notify)
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
Object.SetHelpTextList([HelpTextObj])
return Object
# Usage Notify Help INF Comment
#1 UNDEFINED true Present ## UNDEFINED ## NOTIFY # Help
#2 UNDEFINED true Not Present ## UNDEFINED ## NOTIFY
#3 UNDEFINED false Present ## UNDEFINED # Help
#4 UNDEFINED false Not Present ## UNDEFINED
#5 UNDEFINED Not Present Present # Help
#6 UNDEFINED Not Present Not Present <empty>
#7 Other true Present ## Other ## NOTIFY # Help
#8 Other true Not Present ## Other ## NOTIFY
#9 Other false Present ## Other # Help
#A Other false Not Present ## Other
#B Other Not Present Present ## Other # Help
#C Other Not Present Not Present ## Other
def testNormalCase1(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
Notify = True
HelpStr = 'Help'
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## UNDEFINED ## NOTIFY # Help'''
self.assertEqual(Result.strip(), Expected)
IsProtocol = False
ObjectList = []
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Ppis]
Guid1|FFE1 ## UNDEFINED ## NOTIFY # Help'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase2(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
Notify = True
HelpStr = ''
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## UNDEFINED ## NOTIFY'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase3(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
Notify = False
HelpStr = 'Help'
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## UNDEFINED # Help'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase4(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
Notify = False
HelpStr = ''
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## UNDEFINED'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase5(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
Notify = ''
HelpStr = 'Help'
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 # Help'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase6(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'UNDEFINED'
Notify = ''
HelpStr = ''
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase7(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
Notify = True
HelpStr = 'Help'
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## PRODUCES ## NOTIFY # Help'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase8(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
Notify = True
HelpStr = ''
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## PRODUCES ## NOTIFY'''
self.assertEqual(Result.strip(), Expected)
def testNormalCase9(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
Notify = False
HelpStr = 'Help'
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## PRODUCES # Help'''
self.assertEqual(Result.strip(), Expected)
def testNormalCaseA(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
Notify = False
HelpStr = ''
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## PRODUCES'''
self.assertEqual(Result.strip(), Expected)
def testNormalCaseB(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
Notify = ''
HelpStr = 'Help'
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## PRODUCES # Help'''
self.assertEqual(Result.strip(), Expected)
def testNormalCaseC(self):
ObjectList = []
CName = 'Guid1'
FFE = 'FFE1'
Usage = 'PRODUCES'
Notify = ''
HelpStr = ''
IsProtocol = True
Object = self.ObjectFactory(CName, FFE, Usage, Notify,
HelpStr, IsProtocol)
ObjectList.append(Object)
Result = GenProtocolPPiSections(ObjectList, IsProtocol)
Expected = '''[Protocols]
Guid1|FFE1 ## PRODUCES'''
self.assertEqual(Result.strip(), Expected)
#
# Test GenPcdSections
#
class GenPcdSectionsTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
#
# This is the API to generate Pcd Object to help UnitTest
#
def ObjectFactory(self, ItemType, TSCName, CName, DValue, FFE, Usage, Str):
Object = PcdObject()
HelpStr = Str
Object.SetItemType(ItemType)
Object.SetTokenSpaceGuidCName(TSCName)
Object.SetCName(CName)
Object.SetDefaultValue(DValue)
Object.SetFeatureFlag(FFE)
Object.SetValidUsage(Usage)
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
Object.SetHelpTextList([HelpTextObj])
return Object
# Usage Help INF Comment
#1 UNDEFINED Present # Help
#2 UNDEFINED Not Present <empty>
#3 Other Present ## Other # Help
#4 Other Not Present ## Other
def testNormalCase1(self):
ObjectList = []
ItemType = 'Pcd'
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'UNDEFINED'
Str = 'Help'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = \
'[Pcd]\n' + \
'TSCName.CName|DValue|FFE # Help'
self.assertEqual(Result.strip(), Expected)
def testNormalCase2(self):
ObjectList = []
ItemType = 'Pcd'
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'UNDEFINED'
Str = ''
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '[Pcd]\nTSCName.CName|DValue|FFE'
self.assertEqual(Result.strip(), Expected)
def testNormalCase3(self):
ObjectList = []
ItemType = 'Pcd'
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'CONSUMES'
Str = 'Help'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '[Pcd]\nTSCName.CName|DValue|FFE ## CONSUMES # Help'
self.assertEqual(Result.strip(), Expected)
def testNormalCase4(self):
ObjectList = []
ItemType = 'Pcd'
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'CONSUMES'
Str = ''
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '[Pcd]\nTSCName.CName|DValue|FFE ## CONSUMES'
self.assertEqual(Result.strip(), Expected)
#
# multiple lines for normal usage
#
def testNormalCase5(self):
ObjectList = []
ItemType = 'Pcd'
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'CONSUMES'
Str = 'commment line 1\ncomment line 2'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '''[Pcd]
TSCName.CName|DValue|FFE ## CONSUMES # commment line 1 comment line 2'''
self.assertEqual(Result.strip(), Expected)
#
# multiple lines for UNDEFINED usage
#
def testNormalCase6(self):
ObjectList = []
ItemType = 'Pcd'
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'UNDEFINED'
Str = 'commment line 1\ncomment line 2'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Usage = 'UNDEFINED'
Str = 'commment line 3'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '''[Pcd]
# commment line 1
# comment line 2
# commment line 3
TSCName.CName|DValue|FFE'''
self.assertEqual(Result.strip(), Expected)
#
# multiple lines for UNDEFINED and normal usage
#
def testNormalCase7(self):
ObjectList = []
ItemType = 'Pcd'
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'UNDEFINED'
Str = 'commment line 1\ncomment line 2'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Usage = 'CONSUMES'
Str = 'Foo'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Usage = 'UNDEFINED'
Str = 'commment line 3'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '''[Pcd]
# commment line 1
# comment line 2
## CONSUMES # Foo
# commment line 3
TSCName.CName|DValue|FFE'''
self.assertEqual(Result.strip(), Expected)
# Usage Help INF Comment
# CONSUMES Present # Help (keep <EOL> and insert '#' at beginning of each new line)
# CONSUMES Not Present <empty>
#
# TAB_INF_FEATURE_PCD
#
def testNormalCase8(self):
ObjectList = []
ItemType = TAB_INF_FEATURE_PCD
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'CONSUMES'
Str = 'commment line 1\ncomment line 2'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '''[FeaturePcd]
# commment line 1
# comment line 2
TSCName.CName|DValue|FFE'''
self.assertEqual(Result.strip(), Expected)
#
# TAB_INF_FEATURE_PCD
#
def testNormalCase9(self):
ObjectList = []
ItemType = TAB_INF_FEATURE_PCD
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'CONSUMES'
Str = ''
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '''[FeaturePcd]
TSCName.CName|DValue|FFE'''
self.assertEqual(Result.strip(), Expected)
#
# TAB_INF_FEATURE_PCD
#
def testNormalCase10(self):
ObjectList = []
ItemType = TAB_INF_FEATURE_PCD
TSCName = 'TSCName'
CName = 'CName'
DValue = 'DValue'
FFE = 'FFE'
Usage = 'PRODUCES'
Str = 'commment line 1\ncomment line 2'
Object = self.ObjectFactory(ItemType, TSCName, CName, DValue, FFE,
Usage, Str)
ObjectList.append(Object)
Result = GenPcdSections(ObjectList)
Expected = '''
[FeaturePcd]
# commment line 1
# comment line 2
TSCName.CName|DValue|FFE
'''
self.assertEqual(Result, Expected)
#
# Test GenSpecialSections of Hob
#
class GenHobSectionsTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
#
# This is the API to generate Event Object to help UnitTest
#
def ObjectFactory(self, SupArchList, Type, Usage, Str):
Object = HobObject()
HelpStr = Str
Object.SetHobType(Type)
Object.SetUsage(Usage)
Object.SetSupArchList(SupArchList)
HelpTextObj = TextObject()
HelpTextObj.SetLang('')
HelpTextObj.SetString(HelpStr)
Object.SetHelpTextList([HelpTextObj])
return Object
def testNormalCase1(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = 'Help'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# # Help
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase2(self):
ObjectList = []
SupArchList = []
Type = 'Foo'
Usage = 'UNDEFINED'
Str = 'Help'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob]
# ##
# # Help
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase3(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\nComment Line 1\n\n'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# # Comment Line 1
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase4(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\nComment Line 1\n'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# # Comment Line 1
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase5(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = 'Comment Line 1\n\n'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# # Comment Line 1
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase6(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = ''
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase7(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\nNew Stack HoB'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# # New Stack HoB
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase8(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\nNew Stack HoB\n\nTail Comment'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# # New Stack HoB
# #
# # Tail Comment
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase9(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\n\n'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# #
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase10(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\n'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# #
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase11(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\n\n\n'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# #
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
def testNormalCase12(self):
ObjectList = []
SupArchList = ['X64']
Type = 'Foo'
Usage = 'UNDEFINED'
Str = '\n\n\n\n'
Object = self.ObjectFactory(SupArchList, Type, Usage, Str)
ObjectList.append(Object)
Result = GenSpecialSections(ObjectList, 'Hob')
Expected = '''# [Hob.X64]
# ##
# #
# #
# #
# Foo ## UNDEFINED
#
#
'''
self.assertEqual(Result, Expected)
#
# Test GenGenericCommentF
#
class GenGenericCommentFTest(unittest.TestCase):
def setUp(self):
pass
def tearDown(self):
pass
def testNormalCase1(self):
CommentLines = 'Comment Line 1'
Result = GenGenericCommentF(CommentLines)
Expected = '# Comment Line 1\n'
self.assertEqual(Result, Expected)
def testNormalCase2(self):
CommentLines = '\n'
Result = GenGenericCommentF(CommentLines)
Expected = '#\n'
self.assertEqual(Result, Expected)
def testNormalCase3(self):
CommentLines = '\n\n\n'
Result = GenGenericCommentF(CommentLines)
Expected = '#\n#\n#\n'
self.assertEqual(Result, Expected)
def testNormalCase4(self):
CommentLines = 'coment line 1\n'
Result = GenGenericCommentF(CommentLines)
Expected = '# coment line 1\n'
self.assertEqual(Result, Expected)
def testNormalCase5(self):
CommentLines = 'coment line 1\n coment line 2\n'
Result = GenGenericCommentF(CommentLines)
Expected = '# coment line 1\n# coment line 2\n'
self.assertEqual(Result, Expected)
if __name__ == '__main__':
Logger.Initialize()
unittest.main()
| 26.821201 | 96 | 0.573593 | 3,368 | 37,952 | 6.445665 | 0.069181 | 0.040076 | 0.056106 | 0.045143 | 0.840112 | 0.815192 | 0.788567 | 0.748491 | 0.729145 | 0.70427 | 0 | 0.013503 | 0.322882 | 37,952 | 1,414 | 97 | 26.84017 | 0.831271 | 0.069825 | 0 | 0.854127 | 0 | 0 | 0.137844 | 0.007429 | 0 | 0 | 0 | 0 | 0.055662 | 1 | 0.068138 | false | 0.011516 | 0.022073 | 0 | 0.099808 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7ceebac0aeb1477412f7ea2561abc671f32943f1 | 4,598 | py | Python | tests/single_file_test.py | tailhook/swindon | a4b912d678b4159624b53870b1670134fbc32d91 | [
"Apache-2.0",
"MIT"
] | 104 | 2017-03-14T15:48:25.000Z | 2022-02-23T15:12:48.000Z | tests/single_file_test.py | tailhook/swindon | a4b912d678b4159624b53870b1670134fbc32d91 | [
"Apache-2.0",
"MIT"
] | 58 | 2017-03-21T15:28:33.000Z | 2021-03-05T11:43:19.000Z | tests/single_file_test.py | tailhook/swindon | a4b912d678b4159624b53870b1670134fbc32d91 | [
"Apache-2.0",
"MIT"
] | 11 | 2017-04-12T11:21:44.000Z | 2021-11-05T08:31:49.000Z | import os.path
def data_check(data, method, expected):
if method == "HEAD":
assert data == b''
else:
assert data == expected
async def test_ok(swindon, get_request, static_request_method,
debug_routing, TESTS_DIR):
resp, data = await get_request(swindon.url / 'static-file')
assert resp.status == 200
assert resp.headers['Content-Type'] == 'text/plain'
assert resp.headers['Content-Length'] == '17'
data_check(data, static_request_method, b'Static file test\n')
if debug_routing:
assert resp.headers['X-Swindon-Route'] == 'single_file'
assert resp.headers['X-Swindon-File-Path'] == \
'"{}/assets/static_file.txt"'.format(TESTS_DIR)
else:
assert 'X-Swindon-Route' not in resp.headers
assert 'X-Swindon-File-Path' not in resp.headers
async def test_query_args(swindon, get_request, static_request_method,
debug_routing, TESTS_DIR):
url = swindon.url / 'static-file'
url = url.with_query(foo='bar')
resp, data = await get_request(url)
assert resp.status == 200
assert resp.headers['Content-Type'] == 'text/plain'
assert resp.headers['Content-Length'] == '17'
data_check(data, static_request_method, b'Static file test\n')
if debug_routing:
assert resp.headers['X-Swindon-Route'] == 'single_file'
assert resp.headers['X-Swindon-File-Path'] == \
'"{}/assets/static_file.txt"'.format(TESTS_DIR)
else:
assert 'X-Swindon-Route' not in resp.headers
assert 'X-Swindon-File-Path' not in resp.headers
async def test_request_method(swindon, get_request, static_request_method):
resp, data = await get_request(swindon.url / 'static-file')
assert resp.status == 200
assert resp.headers['Content-Type'] == 'text/plain'
assert resp.headers['Content-Length'] == '17'
data_check(data, static_request_method, b'Static file test\n')
async def test_missing_file(swindon, get_request, static_request_method,
debug_routing, TESTS_DIR):
msg = open(os.path.dirname(__file__) + '/404.html', 'rb').read()
resp, data = await get_request(swindon.url / 'missing-file')
assert resp.status == 404
data_check(data, static_request_method, msg)
assert resp.headers['Content-Type'] != 'text/is/missing'
assert resp.headers['Content-Length'] == str(len(msg))
if debug_routing:
assert resp.headers['X-Swindon-File-Path'] == \
'"{}/assets/missing_file.txt"'.format(TESTS_DIR)
async def test_permission(swindon, get_request, static_request_method):
msg = open(os.path.dirname(__file__) + '/403.html', 'rb').read()
resp, data = await get_request(swindon.url / 'no-permission')
assert resp.status == 403
data_check(data, static_request_method, msg)
assert resp.headers['Content-Type'] == 'text/html'
assert resp.headers['Content-Length'] == str(len(msg))
async def test_extra_headers(swindon, get_request, static_request_method):
resp, data = await get_request(swindon.url / 'static-file-headers')
assert resp.status == 200
assert resp.headers.getall('X-Extra-Header') == ['extra value']
assert 'X-Bad-Header' not in resp.headers
async def test_symlink(swindon, get_request, static_request_method,
debug_routing, TESTS_DIR):
resp, data = await get_request(swindon.url / 'symlink')
assert resp.status == 200
assert resp.headers['Content-Type'] == 'text/plain'
assert resp.headers['Content-Length'] == '17'
data_check(data, static_request_method, b'Static file test\n')
if debug_routing:
assert resp.headers['X-Swindon-Route'] == 'single_symlink'
assert resp.headers['X-Swindon-File-Path'] == \
'"{}/assets/link.txt"'.format(TESTS_DIR)
else:
assert 'X-Swindon-Route' not in resp.headers
assert 'X-Swindon-File-Path' not in resp.headers
async def test_non_file(swindon, get_request, static_request_method,
debug_routing):
msg = open(os.path.dirname(__file__) + '/403.html', 'rb').read()
resp, data = await get_request(swindon.url / 'dev-null')
assert resp.status == 403
assert resp.headers['Content-Type'] == 'text/html'
assert resp.headers['Content-Length'] == str(len(msg))
data_check(data, static_request_method, msg)
if debug_routing:
assert resp.headers['X-Swindon-Route'] == 'dev_null'
assert resp.headers['X-Swindon-File-Path'] == \
'"/dev/null"'
else:
assert 'X-Swindon-Route' not in resp.headers
assert 'X-Swindon-File-Path' not in resp.headers
| 40.690265 | 75 | 0.672901 | 627 | 4,598 | 4.767145 | 0.119617 | 0.121445 | 0.136501 | 0.112412 | 0.854801 | 0.84008 | 0.820007 | 0.776848 | 0.747073 | 0.698561 | 0 | 0.010968 | 0.187038 | 4,598 | 112 | 76 | 41.053571 | 0.788657 | 0 | 0 | 0.659574 | 0 | 0 | 0.207047 | 0.017834 | 0 | 0 | 0 | 0 | 0.457447 | 1 | 0.010638 | false | 0 | 0.010638 | 0 | 0.021277 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
7cfa9dda78abcf103e1a131b20a6fb706c8f6261 | 15,687 | py | Python | napari_plot/layers/infline/_tests/test_infline_mouse_bindings.py | lukasz-migas/napari-1d | b0f081a8711ae941b3e4b5c58c3aea56bd0e3277 | [
"BSD-3-Clause"
] | 13 | 2021-08-27T23:01:09.000Z | 2022-03-22T13:51:35.000Z | napari_plot/layers/infline/_tests/test_infline_mouse_bindings.py | lukasz-migas/napari-1d | b0f081a8711ae941b3e4b5c58c3aea56bd0e3277 | [
"BSD-3-Clause"
] | 71 | 2021-08-28T13:29:17.000Z | 2022-03-28T21:22:12.000Z | napari_plot/layers/infline/_tests/test_infline_mouse_bindings.py | lukasz-migas/napari-1d | b0f081a8711ae941b3e4b5c58c3aea56bd0e3277 | [
"BSD-3-Clause"
] | null | null | null | """Test regions mouse bindings."""
import collections
import numpy as np
import pytest
from napari.utils.interactions import (
ReadOnlyWrapper,
mouse_move_callbacks,
mouse_press_callbacks,
mouse_release_callbacks,
)
from napari_plot.layers import InfLine
from napari_plot.layers.infline._infline_constants import Mode, Orientation
@pytest.fixture
def Event():
"""Create a subclass for simulating vispy mouse events.
Returns
-------
Event : Type
A new tuple subclass named Event that can be used to create a
NamedTuple object with fields "type", "is_dragging", and "modifiers".
"""
return collections.namedtuple("Event", field_names=["type", "is_dragging", "modifiers", "position", "pos"])
def _get_position(pos: float):
return 50, pos
@pytest.fixture
def create_known_infline_layer():
"""Create region layer with known coordinates
Returns
-------
data : list
List containing data used to generate regions.
layer : napari_plot.layers.InfLine
Region layer.
n_inflines : int
Number of inflines in the InfLine layer
known_non_infline : list
Data coordinates that are known to contain no region. Useful during
testing when needing to guarantee no region is clicked on.
"""
data = [
(25, "vertical"),
(500, "horizontal"),
(85, "vertical"),
]
known_non_infline = [0]
n_inflines = len(data)
layer = InfLine(data)
assert layer.ndim == 2
assert len(layer.data) == n_inflines
assert len(layer.orientation) == n_inflines
assert len(layer.selected_data) == 0
return data, layer, n_inflines, known_non_infline
def test_add_infline_vertical(create_known_infline_layer, Event):
"""Add new region by clicking in add mode."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "add"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=np.asarray(known_non_infline),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
known_non_infline_end = [0, 100]
# Simulate drag end
event = ReadOnlyWrapper(
Event(
type="mouse_move",
is_dragging=True,
modifiers=[],
pos=np.asarray(known_non_infline_end),
position=known_non_infline_end,
)
)
mouse_move_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline_end,
)
)
mouse_release_callbacks(layer, event)
# Check new shape added at coordinates
assert len(layer.data) == n_inflines + 1
assert layer.orientation[-1] == Orientation.VERTICAL
def test_add_infline_vertical_force_horizontal(create_known_infline_layer, Event):
"""Add new region by clicking in add mode."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "add"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=np.asarray(known_non_infline),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
known_non_infline_end = [0, 100]
# Simulate drag end
event = ReadOnlyWrapper(
Event(
type="mouse_move",
is_dragging=True,
modifiers=["Shift"],
pos=np.asarray(known_non_infline_end),
position=known_non_infline_end,
)
)
mouse_move_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline_end,
)
)
mouse_release_callbacks(layer, event)
# Check new shape added at coordinates
assert len(layer.data) == n_inflines + 1
assert layer.orientation[-1] == Orientation.HORIZONTAL
def test_add_infline_horizontal(create_known_infline_layer, Event):
"""Add new region by clicking in add mode."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "add"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=np.asarray(known_non_infline),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
known_non_infline_end = [100, 0]
# Simulate drag end
event = ReadOnlyWrapper(
Event(
type="mouse_move",
is_dragging=True,
modifiers=[],
pos=np.asarray(known_non_infline_end),
position=known_non_infline_end,
)
)
mouse_move_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline_end,
)
)
mouse_release_callbacks(layer, event)
# Check new shape added at coordinates
assert len(layer.data) == n_inflines + 1
assert layer.orientation[-1] == Orientation.HORIZONTAL
def test_add_infline_horizontal_force_vertical(create_known_infline_layer, Event):
"""Add new region by clicking in add mode."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "add"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=np.asarray(known_non_infline),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
known_non_infline_end = [100, 0]
# Simulate drag end
event = ReadOnlyWrapper(
Event(
type="mouse_move",
is_dragging=True,
modifiers=["Control"],
pos=np.asarray(known_non_infline_end),
position=known_non_infline_end,
)
)
mouse_move_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline_end,
)
)
mouse_release_callbacks(layer, event)
# Check new shape added at coordinates
assert len(layer.data) == n_inflines + 1
assert layer.orientation[-1] == Orientation.VERTICAL
def test_not_adding_or_selecting_infline(create_known_infline_layer, Event):
"""Don't add or select a shape by clicking on one in pan_zoom mode."""
data, layer, n_inflines, _ = create_known_infline_layer
layer.mode = "pan_zoom"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=(0, 0),
)
)
mouse_press_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=(0, 0),
)
)
mouse_release_callbacks(layer, event)
# Check no new shape added and non selected
assert len(layer.data) == n_inflines
assert len(layer.selected_data) == 0
def test_select_infline(create_known_infline_layer, Event):
"""Select a shape by clicking on one in select mode."""
data, layer, n_inflines, _ = create_known_infline_layer
layer.mode = "select"
position = _get_position(data[0][0])
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=position,
)
)
mouse_press_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=position,
)
)
mouse_release_callbacks(layer, event)
# Check clicked shape selected
assert len(layer.selected_data) == 1
assert layer.selected_data == {0}
@pytest.mark.parametrize(
"mode",
[
"select",
"move",
"add",
"select",
],
)
def test_after_in_add_mode_infline(mode, create_known_infline_layer, Event):
"""Don't add or select a shape by clicking on one in pan_zoom mode."""
data, layer, n_inflines, _ = create_known_infline_layer
layer.mode = mode
layer.mode = "pan_zoom"
position = _get_position(data[0][0])
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=position,
)
)
mouse_press_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=position,
)
)
mouse_release_callbacks(layer, event)
# Check no new shape added and non selected
assert len(layer.data) == n_inflines
assert len(layer.selected_data) == 0
def test_unselect_select_infline(create_known_infline_layer, Event):
"""Select a shape by clicking on one in select mode."""
data, layer, n_inflines, _ = create_known_infline_layer
layer.mode = "select"
position = _get_position(data[0][0])
layer.selected_data = {1}
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=position,
)
)
mouse_press_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=position,
)
)
mouse_release_callbacks(layer, event)
# Check clicked shape selected
assert len(layer.selected_data) == 1
assert layer.selected_data == {0}
def test_not_selecting_infline(create_known_infline_layer, Event):
"""Don't select a shape by not clicking on one in select mode."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "select"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_release_callbacks(layer, event)
# Check clicked shape selected
assert len(layer.selected_data) == 0
def test_unselecting_inflines(create_known_infline_layer, Event):
"""Unselect shapes by not clicking on one in select mode."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "select"
layer.selected_data = {0, 1}
assert len(layer.selected_data) == 2
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_release_callbacks(layer, event)
# Check clicked shape selected
assert len(layer.selected_data) == 0
def test_selecting_inflines_with_drag(create_known_infline_layer, Event):
"""Select all shapes when drag box includes all of them."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "select"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
# Simulate drag start
event = ReadOnlyWrapper(
Event(
type="mouse_move",
is_dragging=True,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_move_callbacks(layer, event)
# Simulate drag end
event = ReadOnlyWrapper(Event(type="mouse_move", is_dragging=True, modifiers=[], pos=(), position=(1000, 1000)))
mouse_move_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=True,
modifiers=[],
pos=(),
position=(1000, 1000),
)
)
mouse_release_callbacks(layer, event)
# Check all shapes selected as drag box contains them
assert len(layer.selected_data) == n_inflines
def test_selecting_no_inflines_with_drag(create_known_infline_layer, Event):
"""Select all shapes when drag box includes all of them."""
data, layer, n_inflines, known_non_infline = create_known_infline_layer
layer.mode = "select"
# Simulate click
event = ReadOnlyWrapper(
Event(
type="mouse_press",
is_dragging=False,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_press_callbacks(layer, event)
# Simulate drag start
event = ReadOnlyWrapper(
Event(
type="mouse_move",
is_dragging=True,
modifiers=[],
pos=(),
position=known_non_infline,
)
)
mouse_move_callbacks(layer, event)
# Simulate drag end
event = ReadOnlyWrapper(
Event(
type="mouse_move",
is_dragging=True,
modifiers=[],
pos=(),
position=(200, 20),
)
)
mouse_move_callbacks(layer, event)
# Simulate release
event = ReadOnlyWrapper(
Event(
type="mouse_release",
is_dragging=True,
modifiers=[],
pos=(),
position=(200, 20),
)
)
mouse_release_callbacks(layer, event)
# Check no shapes selected as drag box doesn't contain them
assert len(layer.selected_data) == 0
@pytest.mark.parametrize("attr", ("_move_modes", "_drag_modes", "_cursor_modes"))
def test_all_modes_covered(attr):
"""
Test that all dictionaries modes have all the keys, this simplify the handling logic
As we do not need to test whether a key is in a dict or not.
"""
mode_dict = getattr(InfLine, attr)
assert {k.value for k in mode_dict.keys()} == set(Mode.keys())
| 26.364706 | 116 | 0.602218 | 1,704 | 15,687 | 5.292254 | 0.096831 | 0.048791 | 0.071524 | 0.102905 | 0.83167 | 0.815813 | 0.806831 | 0.792859 | 0.788867 | 0.772788 | 0 | 0.00779 | 0.304456 | 15,687 | 594 | 117 | 26.409091 | 0.818715 | 0.152865 | 0 | 0.713948 | 0 | 0 | 0.043741 | 0 | 0 | 0 | 0 | 0 | 0.061466 | 1 | 0.037825 | false | 0 | 0.014184 | 0.002364 | 0.059102 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
6b0194995a9686671ccb713f9fa92ea60e7a7ed9 | 25,502 | py | Python | tests/oper/test_stl_ct_boolean_and_temporal_online.py | BentleyJOakes/rtamt | 5366349c44afb53cffa5fb29e43fea1eb23b6c52 | [
"BSD-2-Clause-FreeBSD",
"BSD-3-Clause"
] | null | null | null | tests/oper/test_stl_ct_boolean_and_temporal_online.py | BentleyJOakes/rtamt | 5366349c44afb53cffa5fb29e43fea1eb23b6c52 | [
"BSD-2-Clause-FreeBSD",
"BSD-3-Clause"
] | null | null | null | tests/oper/test_stl_ct_boolean_and_temporal_online.py | BentleyJOakes/rtamt | 5366349c44afb53cffa5fb29e43fea1eb23b6c52 | [
"BSD-2-Clause-FreeBSD",
"BSD-3-Clause"
] | null | null | null | import unittest
from rtamt.operation.stl_ct.and_operation import AndOperation
from rtamt.operation.stl_ct.not_operation import NotOperation
from rtamt.operation.stl_ct.or_operation import OrOperation
from rtamt.operation.stl_ct.implies_operation import ImpliesOperation
from rtamt.operation.stl_ct.iff_operation import IffOperation
from rtamt.operation.stl_ct.xor_operation import XorOperation
from rtamt.operation.stl_ct.always_operation import AlwaysOperation
from rtamt.operation.stl_ct.historically_operation import HistoricallyOperation
from rtamt.operation.stl_ct.once_operation import OnceOperation
from rtamt.operation.stl_ct.since_operation import SinceOperation
from rtamt.operation.stl_ct.once_bounded_operation import OnceBoundedOperation
from rtamt.operation.stl_ct.historically_bounded_operation import HistoricallyBoundedOperation
from rtamt.operation.stl_ct.since_bounded_operation import SinceBoundedOperation
class TestSTLBooleanAndTemporalOnline(unittest.TestCase):
def __init__(self, *args, **kwargs):
super(TestSTLBooleanAndTemporalOnline, self).__init__(*args, **kwargs)
def test_and(self):
oper = AndOperation()
in_data_1_1 = [[2, 2], [3.3, 3], [5.7, 4]]
in_data_2_1 = [[2.5, 5], [4.7, 6]]
out_expected_1 = [[2.5, 2], [3.3, 3]]
out_computed_1 = oper.update(in_data_1_1, in_data_2_1)
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
in_data_1_2 = []
in_data_2_2 = [[5.7, 1]]
out_expected_2 = [[4.7, 3]]
out_computed_2 = oper.update(in_data_1_2, in_data_2_2)
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
out_expected_final = [[5.7, 1]]
out_computed_final = oper.update_final([], [])
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
oper = AndOperation()
in_data_1_1 = [[2, 2]]
in_data_2_1 = [[2, 1]]
out_expected_1 = []
out_computed_1 = oper.update(in_data_1_1, in_data_2_1)
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 2nd example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
in_data_1_2 = [[3.3, 3]]
in_data_2_2 = [[3.3, 5]]
out_expected_2 = [[2, 1]]
out_computed_2 = oper.update(in_data_1_2, in_data_2_2)
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 2nd example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
in_data_1_3 = [[4.7, 5]]
in_data_2_3 = [[4.7, 2]]
out_expected_3 = [[3.3, 3]]
out_computed_3 = oper.update(in_data_1_3, in_data_2_3)
self.assertListEqual(out_expected_3, out_computed_3,
"Problem with 2nd example:\nExpected output: %s\nComputed output: %s" % (
out_expected_3, out_computed_3))
out_expected_final = [[4.7, 2]]
out_computed_final = oper.update_final([], [])
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 2nd example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
################################################################################################
oper = AndOperation()
in_data_1_1 = [[1, 2], [4.1, 1], [5, 2], [6.1, 1]]
in_data_2_1 = [[1.2, 1]]
out_expected_1 = []
out_computed_1 = oper.update(in_data_1_1, in_data_2_1)
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
in_data_1_2 = []
in_data_2_2 = [[1.2, 1], [3.7, 3], [7.5, 2]]
out_expected_2 = [[1.2, 1], [3.7, 2], [4.1, 1], [5, 2]]
out_computed_2 = oper.update(in_data_1_2, in_data_2_2)
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
in_data_1_3 = [[6.7, 4], [9.9, 5]]
in_data_2_3 = [[8.1, 6]]
out_expected_3 = [[6.1, 1], [6.7, 3], [7.5, 2]]
out_computed_3 = oper.update(in_data_1_3, in_data_2_3)
self.assertListEqual(out_expected_3, out_computed_3,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_3, out_computed_3))
out_expected_final = [[8.1, 4]]
out_computed_final = oper.update_final([], [])
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
def test_or(self):
oper = OrOperation()
in_data_1_1 = [[1, 2], [4.1, 1], [5, 2], [6.1, 1]]
in_data_2_1 = [[1.2, 1]]
out_expected_1 = []
out_computed_1 = oper.update(in_data_1_1, in_data_2_1)
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
in_data_1_2 = []
in_data_2_2 = [[3.7, 3], [7.5, 2]]
out_expected_2 = [[1.2, 2], [3.7, 3]]
out_computed_2 = oper.update(in_data_1_2, in_data_2_2)
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
in_data_1_3 = [[6.7, 4], [9.9, 5]]
in_data_2_3 = [[8.1, 6]]
out_expected_3 = [[6.1, 3], [6.7, 4]]
out_computed_3 = oper.update(in_data_1_3, in_data_2_3)
self.assertListEqual(out_expected_3, out_computed_3,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_3, out_computed_3))
out_expected_final = [[8.1, 6]]
out_computed_final = oper.update_final([], [])
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
def test_iff(self):
oper = IffOperation()
in_data_1 = [[1, 2], [4.1, 1], [5, 2], [6.1, 1], [6.7, 4], [9.9, 5]]
in_data_2 = [[1.2, 1], [3.7, 3], [7.5, 2], [8.1, 6]]
out_expected = [[1.2, -1], [4.1, -2], [5, -1], [6.1, -2], [6.7, -1], [7.5, -2]]
out_computed = oper.update(in_data_1, in_data_2)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
out_expected = [[8.1, -2]]
out_computed = oper.update_final([], [])
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
def test_xor(self):
oper = XorOperation()
in_data_1 = [[1, 2], [4.1, 1], [5, 2], [6.1, 1], [6.7, 4], [9.9, 5]]
in_data_2 = [[1.2, 1], [3.7, 3], [7.5, 2], [8.1, 6]]
out_expected = [[1.2, 1], [4.1, 2], [5, 1], [6.1, 2], [6.7, 1], [7.5, 2]]
out_computed = oper.update(in_data_1, in_data_2)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
out_expected = [[8.1, 2]]
out_computed = oper.update_final([], [])
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
def test_implies(self):
oper = ImpliesOperation()
in_data_1 = [[1, 2], [4.1, 1], [5, 2], [6.1, 1], [6.7, 4], [9.9, 5]]
in_data_2 = [[1.2, 1], [3.7, 3], [7.5, 2], [8.1, 6]]
out_expected = [[1.2, 1], [3.7, 3], [7.5, 2]]
out_computed = oper.update(in_data_1, in_data_2)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
out_expected = [[8.1, 6]]
out_computed = oper.update_final([], [])
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
def test_always(self):
oper = AlwaysOperation()
in_data = [[5, 3], [5.3, 1], [5.75, 2]]
out_expected = [[5, 3], [5.3, 1], [5.75, 1]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
in_data = [[6.5, 5], [6.75, 6], [9, 5], [9.25, 4], [10, 2]]
out_expected = [[6.5, 1], [6.75, 1], [9, 1], [9.25, 1], [10, 1]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
def test_historically(self):
oper = HistoricallyOperation()
in_data = [[5, 3], [5.3, 1], [5.75, 2]]
out_expected = [[5, 3], [5.3, 1], [5.75, 1]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
in_data = [[6.5, 5], [6.75, 6], [9, 5], [9.25, 4], [10, 2]]
out_expected = [[6.5, 1], [6.75, 1], [9, 1], [9.25, 1], [10, 1]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
def test_once(self):
oper = OnceOperation()
in_data = [[5, 3], [5.3, 1], [5.75, 2]]
out_expected = [[5, 3], [5.3, 3], [5.75, 3]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
in_data = [[6.5, 5], [6.75, 6]]
out_expected = [[6.5, 5], [6.75, 6]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
in_data = [[9, 5], [9.25, 4], [10, 2]]
out_expected = [[9, 6], [9.25, 6], [10, 6]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
def test_once_0_1(self):
oper = OnceBoundedOperation(0, 1)
in_data_1 = [[5, 3], [5.3, 2], [5.75, 1]]
out_expected_1 = [[5, 3]]
out_computed_1 = oper.update(in_data_1)
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
in_data_2 = [[6.5, 5], [6.75, 6], [9, 5]]
out_expected_2 = [[5.75, 3], [6.3, 2], [6.5, 5], [6.75, 6]]
out_computed_2 = oper.update(in_data_2)
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
in_data_3 = [[9.25, 4], [10, 2]]
out_expected_3 = [[9, 6]]
out_computed_3 = oper.update(in_data_3)
self.assertListEqual(out_expected_3, out_computed_3,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_3, out_computed_3))
out_expected_final = [[10, 5], [10.25, 4], [11, 2]]
out_computed_final = oper.update_final([])
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
def test_once_1_3(self):
oper = OnceBoundedOperation(1, 3)
in_data_1 = [[5, 3], [5.3, 2], [5.75, 1]]
out_expected_1 = [[6, 3]]
out_computed_1 = oper.update(in_data_1)
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
in_data_2 = [[6.5, 5], [6.75, 6], [9, 5]]
out_expected_2 = [[6.75, 3], [7.5, 5], [7.75, 6]]
out_computed_2 = oper.update(in_data_2)
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
in_data_3 = [[9.25, 4], [10, 2]]
out_expected_3 = [[10, 6]]
out_computed_3 = oper.update(in_data_3)
self.assertListEqual(out_expected_3, out_computed_3,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_3, out_computed_3))
out_expected_final = [[11, 6], [12, 5], [12.25, 4], [13, 2]]
out_computed_final = oper.update_final([])
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
def test_since(self):
oper = SinceOperation()
in_data_1 = [[1, 2], [4.1, 1], [5, 2], [6.1, 1], [6.7, 4], [9.9, 5]]
in_data_2 = [[1.2, 1], [3.7, 3], [7.5, 2], [8.1, 6]]
out_expected = [[1.2, 1], [3.7, 2], [4.1, 1], [5, 2], [6.1, 1], [6.7, 3], [7.5, 3]]
out_computed = oper.update(in_data_1, in_data_2)
self.assertListEqual(out_expected, out_computed,
"Problem with 3d example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
out_expected = [[8.1, 4]]
out_computed = oper.update_final([], [])
self.assertListEqual(out_expected, out_computed,
"Problem with 3d example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
oper = SinceOperation()
in_data_1 = [[1, 2], [4.1, 1]]
in_data_2 = [[1.2, 1], [3.7, 3], [7.5, 2]]
out_expected = [[1.2, 1], [3.7, 2]]
out_computed = oper.update(in_data_1, in_data_2)
self.assertListEqual(out_expected, out_computed,
"Problem with 3d example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
in_data_1 = [[5, 2], [6.1, 1], [6.7, 4]]
in_data_2 = []
out_expected = [[4.1, 1], [5, 2], [6.1, 1]]
out_computed = oper.update(in_data_1, in_data_2)
self.assertListEqual(out_expected, out_computed,
"Problem with 3d example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
in_data_1 = [[9.9, 5]]
in_data_2 = [[8.1, 6]]
out_expected = [[6.7, 3], [7.5, 3]]
out_computed = oper.update(in_data_1, in_data_2)
self.assertListEqual(out_expected, out_computed,
"Problem with 3d example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
out_expected = [[8.1, 4]]
out_computed = oper.update_final([], [])
self.assertListEqual(out_expected, out_computed,
"Problem with 3d example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
def test_historically_1_2(self):
oper = HistoricallyBoundedOperation(1, 2)
in_data = [[5, 3], [5.3, 1], [5.75, 2], [6.5, 5], [6.75, 6], [9, 5], [9.25, 4], [10, 2]]
out_expected = [[6, 3], [6.3, 1], [7.75, 2], [8.5, 5], [8.75, 6], [10, 5], [10.25, 4]]
out_computed = oper.update(in_data)
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
out_expected = [[11, 2]]
out_computed = oper.update_final([])
self.assertListEqual(out_expected, out_computed,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected, out_computed))
oper = HistoricallyBoundedOperation(1, 2)
in_data_1 = [[5, 3], [5.3, 2], [5.75, 1]]
out_expected_1 = [[6, 3], [6.3, 2]]
out_computed_1 = oper.update(in_data_1)
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
in_data_2 = [[6.5, 5], [6.75, 6], [9, 5]]
out_expected_2 = [[6.75, 1], [8.5, 5], [8.75, 6]]
out_computed_2 = oper.update(in_data_2)
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
in_data_3 = [[9.25, 4], [10, 2]]
out_expected_3 = [[10, 5], [10.25, 4]]
out_computed_3 = oper.update(in_data_3)
self.assertListEqual(out_expected_3, out_computed_3,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_3, out_computed_3))
out_expected_final = [[11, 2]]
out_computed_final = oper.update_final([])
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
# in_data = [[0, 1], [0.5, 2], [1, 3], [1.5, 4], [2, 5]]
# out_expected = [[1, 1], [2.5, 2], [3, 3], [3.5, 4], [4, 5]]
# out_computed = oper.offline(in_data)
#
# self.assertListEqual(out_expected, out_computed,
# "Problem with 2nd example:\nExpected output: %s\nComputed output: %s" % (
# out_expected, out_computed))
#
# in_data = [[0, 5], [0.5, 4], [1, 3], [1.5, 2], [2, 1]]
# out_expected = [[1, 5], [1.5, 4], [2, 3], [2.5, 2], [4, 1]]
# out_computed = oper.offline(in_data)
#
# self.assertListEqual(out_expected, out_computed,
# "Problem with 3rd example:\nExpected output: %s\nComputed output: %s" % (
# out_expected, out_computed))
#
# in_data = [[5, 3], [5.3, 1], [5.75, 2], [6.5, 5], [6.75, 1], [9, 5], [9.25, 4], [10, 2]]
# out_expected = [[6, 3], [6.3, 1], [11, 4], [12, 2]]
# out_computed = oper.offline(in_data)
#
# self.assertListEqual(out_expected, out_computed,
# "Problem with 4th example:\nExpected output: %s\nComputed output: %s" % (
# out_expected, out_computed))
#
# in_data = [[6, 2], [8, 1], [8.1, 2], [10, 3]]
# out_expected = [[7, 2], [9, 1], [10.1, 2], [12, 3]]
# out_computed = oper.offline(in_data)
#
# self.assertListEqual(out_expected, out_computed,
# "Problem with 5th example:\nExpected output: %s\nComputed output: %s" % (
# out_expected, out_computed))
#
# in_data = [[6, 2], [8, 3], [8.1, 2], [10, 3]]
# out_expected = [[7, 2], [12, 3]]
# out_computed = oper.offline(in_data)
#
# self.assertListEqual(out_expected, out_computed,
# "Problem with 6th example:\nExpected output: %s\nComputed output: %s" % (
# out_expected, out_computed))
#
# in_data = []
# out_expected = []
# out_computed = oper.offline(in_data)
#
# self.assertListEqual(out_expected, out_computed,
# "Problem with 7th example:\nExpected output: %s\nComputed output: %s" % (
# out_expected, out_computed))
#
# in_data = [[2, 5]]
# out_expected = []
# out_computed = oper.offline(in_data)
#
# self.assertListEqual(out_expected, out_computed,
# "Problem with 8th example:\nExpected output: %s\nComputed output: %s" % (
# out_expected, out_computed))
#
#
# def test_since_0_1(self):
# oper = SinceBoundedOperation(0, 1)
# in_data_1 = [[0, 3], [2, 4], [4, 6]]
# in_data_2 = [[0, -1], [2, 5], [4, 6]]
# out_expected = [[0, -1], [2, 4]]
# out_computed = oper.offline(in_data_1, in_data_2)
#
# # self.assertListEqual(out_expected, out_computed,
# # "Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
# # out_expected, out_computed))
#
def test_not(self):
oper = NotOperation()
in_data_1 = [[5, 3], [5.3, 1]]
in_data_2 = [[5.75, 2], [6.5, 5], [6.75, 6], [9, 5], [9.25, 4]]
in_data_3 = [[10, 2]]
out_expected_1 = [[5, -3], [5.3, -1]]
out_expected_2 = [[5.75, -2], [6.5, -5], [6.75, -6], [9, -5], [9.25, -4]]
out_expected_3 = [[10, -2]]
out_expected_final = []
out_computed_1 = oper.update(in_data_1)
out_computed_2 = oper.update(in_data_2)
out_computed_3 = oper.update(in_data_3)
out_computed_final = oper.update_final([])
self.assertListEqual(out_expected_1, out_computed_1,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_1, out_computed_1))
self.assertListEqual(out_expected_2, out_computed_2,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_2, out_computed_2))
self.assertListEqual(out_expected_3, out_computed_3,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_3, out_computed_3))
self.assertListEqual(out_expected_final, out_computed_final,
"Problem with 1st example:\nExpected output: %s\nComputed output: %s" % (
out_expected_final, out_computed_final))
if __name__ == '__main__':
unittest.main() | 46.283122 | 107 | 0.535566 | 3,270 | 25,502 | 3.901223 | 0.0263 | 0.155209 | 0.103473 | 0.141099 | 0.904209 | 0.879987 | 0.855295 | 0.852003 | 0.840401 | 0.835228 | 0 | 0.075677 | 0.326916 | 25,502 | 551 | 108 | 46.283122 | 0.667521 | 0.11254 | 0 | 0.70137 | 0 | 0 | 0.155202 | 0 | 0 | 0 | 0 | 0 | 0.142466 | 1 | 0.038356 | false | 0 | 0.038356 | 0 | 0.079452 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
8610afde39944ac1d47bbeffa74393e94e473f32 | 65,321 | py | Python | pynetdicom/tests/test_transport.py | fengqiliu/pynetdicom | 735c72e898005c7bddc4b6714f19469e6bed0602 | [
"MIT"
] | 1 | 2021-12-23T03:07:46.000Z | 2021-12-23T03:07:46.000Z | pynetdicom/tests/test_transport.py | fengqiliu/pynetdicom | 735c72e898005c7bddc4b6714f19469e6bed0602 | [
"MIT"
] | null | null | null | pynetdicom/tests/test_transport.py | fengqiliu/pynetdicom | 735c72e898005c7bddc4b6714f19469e6bed0602 | [
"MIT"
] | null | null | null | """Unit tests for the transport module."""
from datetime import datetime
import logging
import queue
import os
import platform
import socket
import ssl
from struct import pack
import sys
import threading
import time
import pytest
from pydicom import dcmread
import pynetdicom
from pynetdicom import AE, evt, _config, debug_logger
from pynetdicom.association import Association
from pynetdicom.events import Event
from pynetdicom._globals import MODE_REQUESTOR
from pynetdicom import transport
from pynetdicom.transport import (
AssociationSocket,
ThreadedAssociationServer,
)
from pynetdicom.sop_class import Verification, RTImageStorage
from .encoded_pdu_items import p_data_tf_rq
from .hide_modules import hide_modules
from .utils import wait_for_server_socket
# This is the directory that contains test data
TEST_ROOT = os.path.abspath(os.path.dirname(__file__))
CERT_DIR = os.path.join(TEST_ROOT, "cert_files")
DCM_DIR = os.path.join(TEST_ROOT, "dicom_files")
# SSL Testing
SERVER_CERT, SERVER_KEY = (
os.path.join(CERT_DIR, "server.crt"),
os.path.join(CERT_DIR, "server.key"),
)
CLIENT_CERT, CLIENT_KEY = (
os.path.join(CERT_DIR, "client.crt"),
os.path.join(CERT_DIR, "client.key"),
)
DATASET = dcmread(os.path.join(DCM_DIR, "RTImageStorage.dcm"))
# debug_logger()
class TestAssociationSocket:
"""Tests for the transport.AssociationSocket class."""
def setup(self):
ae = AE()
self.assoc = Association(ae, MODE_REQUESTOR)
def get_listen_socket(self):
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_RCVTIMEO, pack("ll", 1, 0))
sock.bind(("", 11112))
sock.listen(5)
return sock
def test_init_new(self):
"""Test creating a new AssociationSocket instance."""
sock = AssociationSocket(self.assoc)
assert sock.tls_args is None
assert sock.select_timeout == 0.5
assert sock._assoc == self.assoc
assert isinstance(sock.socket, socket.socket)
assert sock._is_connected is False
with pytest.raises(queue.Empty):
sock.event_queue.get(block=False)
def test_init_address(self):
"""Test creating a new bound AssociationSocket instance."""
sock = AssociationSocket(self.assoc, address=("", 11112))
assert sock.tls_args is None
assert sock.select_timeout == 0.5
assert sock._assoc == self.assoc
assert isinstance(sock.socket, socket.socket)
assert sock.socket.getsockname()[0] == "0.0.0.0"
assert sock.socket.getsockname()[1] == 11112
assert sock._is_connected is False
with pytest.raises(queue.Empty):
sock.event_queue.get(block=False)
def test_init_existing(self):
"""Test creating a new AssociationSocket around existing socket."""
sock = AssociationSocket(self.assoc, client_socket="abc")
assert sock.tls_args is None
assert sock.select_timeout == 0.5
assert sock._assoc == self.assoc
assert sock.socket == "abc"
assert sock._is_connected is True
assert sock.event_queue.get(block=False) == "Evt5"
def test_init_raises(self, caplog):
"""Test exception is raised if init with client_socket and address."""
msg = (
r"AssociationSocket instantiated with both a 'client_socket' "
r"and bind 'address'. The original socket will not be rebound"
)
with caplog.at_level(logging.WARNING, logger="pynetdicom"):
AssociationSocket(self.assoc, client_socket="abc", address=("", 11112))
assert msg in caplog.text
def test_close_connect(self):
"""Test closing and connecting."""
sock = AssociationSocket(self.assoc)
sock._is_connected = True
assert sock.socket is not None
sock.close()
assert sock.socket is None
# Tries to connect, sets to None if fails
sock.connect(("", 11112))
assert sock.event_queue.get() == "Evt17"
assert sock.socket is None
def test_ready_error(self):
"""Test AssociationSocket.ready."""
sock = AssociationSocket(self.assoc, address=("localhost", 0))
assert sock.ready is False
sock._is_connected = True
if platform.system() in ["Windows", "Darwin"]:
assert sock.ready is False
else:
assert sock.ready is True
sock.socket.close()
assert sock.ready is False
assert sock.event_queue.get() == "Evt17"
def test_print(self):
"""Test str(AssociationSocket)."""
sock = AssociationSocket(self.assoc)
assert sock.__str__() == sock.socket.__str__()
def test_close_socket_none(self):
"""Test trying to close a closed socket."""
def handle_close(event):
event.assoc.dul.socket.socket = None
hh = [(evt.EVT_CONN_CLOSE, handle_close)]
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context(Verification)
scp = ae.start_server(("", 11113), block=False, evt_handlers=hh)
ae.add_requested_context(Verification)
assoc = ae.associate("localhost", 11113)
assert assoc.is_established
assoc.release()
assert assoc.is_released
scp.shutdown()
def test_get_local_addr(self):
"""Test get_local_addr()."""
# Normal use
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_requested_context(Verification)
assoc = ae.associate("localhost", 11113)
assert not assoc.is_established
assert isinstance(assoc.requestor.address, str)
# Exceptional use
assert not assoc.is_established
addr = assoc.dul.socket.get_local_addr(("", 111111))
assert "127.0.0.1" == addr
def test_multiple_pdu_req(self):
"""Test what happens if two PDUs are sent before the select call."""
events = []
def handle_echo(event):
events.append(event)
return 0x0000
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context("1.2.840.10008.1.1")
ae.add_requested_context("1.2.840.10008.1.1")
server = ae.start_server(("", 11112), block=False)
assoc = ae.associate(
"localhost", 11112, evt_handlers=[(evt.EVT_C_ECHO, handle_echo)]
)
assert assoc.is_established
# Send data directly to the requestor
socket = server.active_associations[0].dul.socket
socket.send(2 * p_data_tf_rq)
time.sleep(1)
assoc.release()
assert assoc.is_released
server.shutdown()
assert 2 == len(events)
def test_multiple_pdu_acc(self):
"""Test what happens if two PDUs are sent before the select call."""
events = []
def handle_echo(event):
events.append(event)
return 0x0000
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context("1.2.840.10008.1.1")
ae.add_requested_context("1.2.840.10008.1.1")
server = ae.start_server(
("", 11112), block=False, evt_handlers=[(evt.EVT_C_ECHO, handle_echo)]
)
assoc = ae.associate(
"localhost",
11112,
)
assert assoc.is_established
# Send data directly to the requestor
socket = assoc.dul.socket
socket.send(2 * p_data_tf_rq)
time.sleep(1)
assoc.release()
assert assoc.is_released
server.shutdown()
assert 2 == len(events)
@pytest.fixture
def server_context(request):
"""Return a good server SSLContext."""
# TLS v1.3 is not currently supported :(
# The actual available attributes/protocols depend on OS, OpenSSL version
# and Python version, ugh
if hasattr(ssl, "TLSVersion"):
# This is the current and future, but heavily depends on OpenSSL
# Python 3.7+, w/ OpenSSL 1.1.0g+
context = ssl.create_default_context(ssl.Purpose.CLIENT_AUTH)
context.verify_mode = ssl.CERT_REQUIRED
context.load_cert_chain(certfile=SERVER_CERT, keyfile=SERVER_KEY)
context.load_verify_locations(cafile=CLIENT_CERT)
context.maximum_version = ssl.TLSVersion.TLSv1_2
return context
# Should work with older Python and OpenSSL versions
# Python 3.6
context = ssl.SSLContext(protocol=ssl.PROTOCOL_TLSv1_2)
context.verify_mode = ssl.CERT_REQUIRED
context.load_cert_chain(certfile=SERVER_CERT, keyfile=SERVER_KEY)
context.load_verify_locations(cafile=CLIENT_CERT)
return context
@pytest.fixture
def client_context(request):
"""Return a good client SSLContext."""
context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=SERVER_CERT)
context.verify_mode = ssl.CERT_REQUIRED
context.load_cert_chain(certfile=CLIENT_CERT, keyfile=CLIENT_KEY)
context.check_hostname = False
return context
class TestTLS:
"""Test using TLS to wrap the association."""
def setup(self):
self.ae = None
self.has_ssl = transport._HAS_SSL
def teardown(self):
if self.ae:
self.ae.shutdown()
# Ensure ssl module is available again
import importlib
importlib.reload(pynetdicom.transport)
def test_tls_not_server_not_client(self):
"""Test associating with no TLS on either end."""
self.ae = ae = AE()
ae.add_supported_context("1.2.840.10008.1.1")
server = ae.start_server(("localhost", 11112), block=False)
ae.add_requested_context("1.2.840.10008.1.1")
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
status = assoc.send_c_echo()
assert status.Status == 0x0000
assoc.release()
assert assoc.is_released
server.shutdown()
assert len(server.active_associations) == 0
def test_tls_not_server_yes_client(self, client_context):
"""Test wrapping the requestor socket with TLS (but not server)."""
self.ae = ae = AE()
ae.acse_timeout = 0.5
ae.dimse_timeout = 0.5
ae.network_timeout = 0.5
ae.add_supported_context("1.2.840.10008.1.1")
server = ae.start_server(("localhost", 11112), block=False)
ae.add_requested_context("1.2.840.10008.1.1")
assoc = ae.associate("localhost", 11112, tls_args=(client_context, None))
assert assoc.is_aborted
server.shutdown()
time.sleep(0.5)
assert len(server.active_associations) == 0
def test_tls_yes_server_not_client(self, server_context):
"""Test wrapping the acceptor socket with TLS (and not client)."""
self.ae = ae = AE()
ae.add_supported_context("1.2.840.10008.1.1")
server = ae.start_server(
("localhost", 11112),
block=False,
ssl_context=server_context,
)
ae.add_requested_context("1.2.840.10008.1.1")
assoc = ae.associate("localhost", 11112)
assert assoc.is_aborted
server.shutdown()
assert len(server.active_associations) == 0
def test_tls_yes_server_yes_client(self, server_context, client_context):
"""Test associating with TLS on both ends."""
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context("1.2.840.10008.1.1")
server = ae.start_server(
("localhost", 11112),
block=False,
ssl_context=server_context,
)
wait_for_server_socket(server, 1)
ae.add_requested_context("1.2.840.10008.1.1")
assoc = ae.associate("localhost", 11112, tls_args=(client_context, None))
assert assoc.is_established
assoc.release()
assert assoc.is_released
server.shutdown()
assert len(server.active_associations) == 0
def test_tls_transfer(self, server_context, client_context):
"""Test transferring data after associating with TLS."""
ds = []
def handle_store(event):
ds.append(event.dataset)
return 0x0000
handlers = [(evt.EVT_C_STORE, handle_store)]
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context("1.2.840.10008.1.1")
ae.add_supported_context(RTImageStorage)
server = ae.start_server(
("localhost", 11112),
block=False,
ssl_context=server_context,
evt_handlers=handlers,
)
ae.add_requested_context("1.2.840.10008.1.1")
ae.add_requested_context(RTImageStorage)
assoc = ae.associate("localhost", 11112, tls_args=(client_context, None))
assert assoc.is_established
status = assoc.send_c_store(DATASET)
assert status.Status == 0x0000
assoc.release()
assert assoc.is_released
server.shutdown()
assert len(ds[0].PixelData) == 2097152
@hide_modules(["ssl"])
def test_no_ssl_scp(self):
"""Test exception raised if no SSL available to Python as SCP."""
# Reload pynetdicom package
import importlib
importlib.reload(pynetdicom.transport)
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context("1.2.840.10008.1.1")
msg = r"Your Python installation lacks support for SSL"
with pytest.raises(RuntimeError, match=msg):
ae.start_server(
("localhost", 11112),
block=False,
ssl_context=["random", "object"],
)
@hide_modules(["ssl"])
def test_no_ssl_scu(self):
"""Test exception raised if no SSL available to Python as SCU."""
# Reload pynetdicom package
import importlib
importlib.reload(pynetdicom.transport)
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_requested_context("1.2.840.10008.1.1")
msg = r"Your Python installation lacks support for SSL"
with pytest.raises(RuntimeError, match=msg):
ae.associate("localhost", 11112, tls_args=(["random", "object"], None))
def test_multiple_pdu_req(self, server_context, client_context):
"""Test what happens if two PDUs are sent before the select call."""
events = []
def handle_echo(event):
events.append(event)
return 0x0000
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context("1.2.840.10008.1.1")
ae.add_requested_context("1.2.840.10008.1.1")
server = ae.start_server(
("localhost", 11112),
block=False,
ssl_context=server_context,
)
assoc = ae.associate(
"localhost",
11112,
tls_args=(client_context, None),
evt_handlers=[(evt.EVT_C_ECHO, handle_echo)],
)
assert assoc.is_established
# Send data directly to the requestor
socket = server.active_associations[0].dul.socket
socket.send(2 * p_data_tf_rq)
time.sleep(1)
assoc.release()
timeout = 0
while not assoc.is_released and timeout < 5:
time.sleep(0.05)
timeout += 0.05
assert assoc.is_released
def test_multiple_pdu_acc(self, server_context, client_context):
"""Test what happens if two PDUs are sent before the select call."""
events = []
def handle_echo(event):
events.append(event)
return 0x0000
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context("1.2.840.10008.1.1")
ae.add_requested_context("1.2.840.10008.1.1")
server = ae.start_server(
("localhost", 11112),
block=False,
ssl_context=server_context,
evt_handlers=[(evt.EVT_C_ECHO, handle_echo)],
)
assoc = ae.associate("localhost", 11112, tls_args=(client_context, None))
assert assoc.is_established
# Send data directly to the requestor
socket = assoc.dul.socket
socket.send(2 * p_data_tf_rq)
time.sleep(1)
assoc.release()
timeout = 0
while not assoc.is_released and timeout < 5:
time.sleep(0.05)
timeout += 0.05
assert assoc.is_released
server.shutdown()
assert 2 == len(events)
class TestAssociationServer:
def setup(self):
self.ae = None
def teardown(self):
if self.ae:
self.ae.shutdown()
@pytest.mark.skip()
def test_multi_assoc_block(self):
"""Test that multiple requestors can associate when blocking."""
self.ae = ae = AE()
ae.maximum_associations = 10
ae.add_supported_context("1.2.840.10008.1.1")
ae.start_server(("", 11112))
def test_multi_assoc_non(self):
"""Test that multiple requestors can association when non-blocking."""
self.ae = ae = AE()
ae.maximum_associations = 10
ae.add_supported_context("1.2.840.10008.1.1")
ae.add_requested_context("1.2.840.10008.1.1")
scp = ae.start_server(("", 11112), block=False)
assocs = []
for ii in range(10):
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assocs.append(assoc)
for assoc in assocs:
assoc.release()
scp.shutdown()
def test_init_handlers(self):
"""Test AssociationServer.__init__()."""
def handle(event):
pass
def handle_echo(event):
return 0x0000
def handle_echo_b(event):
return 0x0000
self.ae = ae = AE()
handlers = [
(evt.EVT_DATA_RECV, handle),
(evt.EVT_DATA_RECV, handle),
(evt.EVT_C_ECHO, handle_echo),
(evt.EVT_C_ECHO, handle_echo_b),
(evt.EVT_DATA_SENT, handle_echo_b),
(evt.EVT_DATA_SENT, handle_echo),
(evt.EVT_DATA_SENT, handle),
]
ae.add_supported_context(Verification)
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
assert evt.EVT_DATA_RECV in scp._handlers
assert evt.EVT_C_ECHO in scp._handlers
# Duplicates not added
assert len(scp._handlers[evt.EVT_DATA_RECV]) == 1
# Multiples allowed
assert len(scp._handlers[evt.EVT_DATA_SENT]) == 3
# Only a single handler allowed
assert scp._handlers[evt.EVT_C_ECHO] == (handle_echo_b, None)
def test_get_events(self):
"""Test AssociationServer.get_events()."""
def handle(event):
pass
def handle_echo(event):
return 0x0000
def handle_echo_b(event):
return 0x0000
self.ae = ae = AE()
handlers = [
(evt.EVT_DATA_RECV, handle),
(evt.EVT_DATA_RECV, handle),
(evt.EVT_C_ECHO, handle_echo),
(evt.EVT_C_ECHO, handle_echo_b),
(evt.EVT_DATA_SENT, handle_echo_b),
(evt.EVT_DATA_SENT, handle_echo),
(evt.EVT_DATA_SENT, handle),
]
ae.add_supported_context(Verification)
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
bound_events = scp.get_events()
assert evt.EVT_DATA_RECV in bound_events
assert evt.EVT_DATA_SENT in bound_events
assert evt.EVT_C_ECHO in bound_events
scp.shutdown()
def test_get_handlers(self):
"""Test AssociationServer.get_handlers()."""
_config.LOG_HANDLER_LEVEL = "none"
def handle(event):
pass
def handle_echo(event):
return 0x0000
def handle_echo_b(event):
return 0x0000
self.ae = ae = AE()
handlers = [
(evt.EVT_DATA_RECV, handle),
(evt.EVT_DATA_RECV, handle),
(evt.EVT_C_ECHO, handle_echo),
(evt.EVT_C_ECHO, handle_echo_b),
(evt.EVT_DATA_SENT, handle_echo_b),
(evt.EVT_DATA_SENT, handle_echo),
(evt.EVT_DATA_SENT, handle),
]
ae.add_supported_context(Verification)
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
assert scp.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assert (handle, None) in scp.get_handlers(evt.EVT_DATA_SENT)
assert (handle_echo, None) in scp.get_handlers(evt.EVT_DATA_SENT)
assert (handle_echo_b, None) in scp.get_handlers(evt.EVT_DATA_SENT)
assert scp.get_handlers(evt.EVT_C_ECHO) == (handle_echo_b, None)
assert scp.get_handlers(evt.EVT_PDU_SENT) == []
scp.shutdown()
def test_shutdown(self):
"""test trying to shutdown a socket that's already closed."""
self.ae = ae = AE()
ae.add_supported_context(Verification)
server = ae.start_server(("", 11112), block=False)
server.socket.close()
server.shutdown()
def test_exception_in_handler(self):
"""Test exc raised by the handler doesn't shut down the server."""
class DummyAE:
network_timeout = 5
_servers = []
dummy = DummyAE()
server = ThreadedAssociationServer(dummy, ("", 11112), b"a", [])
dummy._servers.append(server)
thread = threading.Thread(target=server.serve_forever)
thread.daemon = True
thread.start()
ae = AE()
ae.add_requested_context("1.2.840.10008.1.1")
ae.associate("localhost", 11112)
assert server.socket.fileno() != -1
server.shutdown()
if sys.version_info[0] == 2:
with pytest.raises(socket.error):
server.socket.fileno()
else:
assert server.socket.fileno() == -1
def test_blocking_process_request(self):
"""Test AssociationServer.process_request."""
self.ae = ae = AE()
ae.acse_timeout = 5
ae.dimse_timeout = 5
ae.network_timeout = 5
ae.add_supported_context(Verification)
t = threading.Thread(
target=ae.start_server, args=(("localhost", 11112),), kwargs={"block": True}
)
t.start()
ae.add_requested_context(Verification)
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assoc.release()
ae.shutdown()
class TestEventHandlingAcceptor:
"""Test the transport events and handling as acceptor."""
def setup(self):
self.ae = None
def teardown(self):
if self.ae:
self.ae.shutdown()
def test_no_handlers(self):
"""Test with no transport event handlers bound."""
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc.release()
scp.shutdown()
def test_bind_evt_conn_open(self):
"""Test associations as acceptor with EVT_CONN_OPEN bound."""
triggered_events = []
def on_conn_open(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(
("", 11112), block=False, evt_handlers=[(evt.EVT_CONN_OPEN, on_conn_open)]
)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.address[0], str)
assert isinstance(event.address[1], int)
assert event.event.name == "EVT_CONN_OPEN"
assoc.release()
scp.shutdown()
def test_bind_evt_conn_open_running(self):
"""Test binding EVT_CONN_OPEN while running."""
triggered_events = []
def on_conn_open(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(scp.active_associations) == 1
assert len(triggered_events) == 0
# Bind
scp.bind(evt.EVT_CONN_OPEN, on_conn_open)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc2 = ae.associate("localhost", 11112)
assert assoc2.is_established
assert len(scp.active_associations) == 2
assert scp.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert assoc2.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc2.get_handlers(evt.EVT_CONN_CLOSE) == []
child2 = scp.active_associations[1]
assert child2.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert child2.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.address[0], str)
assert isinstance(event.address[1], int)
assoc.release()
assoc2.release()
scp.shutdown()
def test_unbind_evt_conn_open(self):
"""Test unbinding an event while running."""
triggered_events = []
def on_conn_open(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(
("", 11112), block=False, evt_handlers=[(evt.EVT_CONN_OPEN, on_conn_open)]
)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.address[0], str)
assert isinstance(event.address[1], int)
# Unbind
scp.unbind(evt.EVT_CONN_OPEN, on_conn_open)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc2 = ae.associate("localhost", 11112)
assert assoc2.is_established
assert len(scp.active_associations) == 2
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert assoc2.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc2.get_handlers(evt.EVT_CONN_CLOSE) == []
child2 = scp.active_associations[1]
assert child2.get_handlers(evt.EVT_CONN_OPEN) == []
assert child2.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 1
assoc.release()
assoc2.release()
scp.shutdown()
def test_unbind_no_event(self):
"""Test unbinding if no event bound."""
def dummy(event):
pass
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
scp.unbind(evt.EVT_CONN_CLOSE, dummy)
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
scp.shutdown()
def test_unbind_last_handler(self):
"""Test unbinding if no event bound."""
def dummy(event):
pass
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
scp.bind(evt.EVT_CONN_CLOSE, dummy)
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == [(dummy, None)]
scp.unbind(evt.EVT_CONN_CLOSE, dummy)
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert evt.EVT_CONN_CLOSE not in scp._handlers
scp.shutdown()
def test_conn_open_raises(self, caplog):
"""Test the handler for EVT_CONN_OPEN raising exception."""
def handle(event):
raise NotImplementedError("Exception description")
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_CONN_OPEN, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
with caplog.at_level(logging.ERROR, logger="pynetdicom"):
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assoc.release()
while scp.active_associations:
time.sleep(0.05)
scp.shutdown()
msg = (
"Exception raised in user's 'evt.EVT_CONN_OPEN' event handler"
" 'handle'"
)
assert msg in caplog.text
assert "Exception description" in caplog.text
def test_bind_evt_conn_close(self):
"""Test associations as acceptor with EVT_CONN_CLOSE bound."""
triggered_events = []
def on_conn_close(event):
with threading.Lock():
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(
("", 11112), block=False, evt_handlers=[(evt.EVT_CONN_CLOSE, on_conn_close)]
)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.address[0], str)
assert isinstance(event.address[1], int)
assert event.event.name == "EVT_CONN_CLOSE"
scp.shutdown()
def test_bind_evt_conn_close_running(self):
"""Test binding EVT_CONN_CLOSE while running."""
triggered_events = []
def on_conn_close(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(
("", 11112),
block=False,
)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
scp.bind(evt.EVT_CONN_CLOSE, on_conn_close)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assoc.release()
assert assoc.is_released
time.sleep(0.1)
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
scp.shutdown()
def test_unbind_evt_conn_close(self):
"""Test unbinding EVT_CONN_CLOSE."""
triggered_events = []
def on_conn_close(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(
("", 11112), block=False, evt_handlers=[(evt.EVT_CONN_CLOSE, on_conn_close)]
)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
scp.unbind(evt.EVT_CONN_CLOSE, on_conn_close)
assert scp.get_handlers(evt.EVT_CONN_OPEN) == []
assert scp.get_handlers(evt.EVT_CONN_CLOSE) == []
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered_events) == 0
scp.shutdown()
def test_conn_close_raises(self, caplog):
"""Test the handler for EVT_CONN_CLOSE raising exception."""
def handle(event):
raise NotImplementedError("Exception description")
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_CONN_CLOSE, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
with caplog.at_level(logging.ERROR, logger="pynetdicom"):
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assoc.release()
while scp.active_associations:
time.sleep(0.05)
scp.shutdown()
msg = (
"Exception raised in user's 'evt.EVT_CONN_CLOSE' event handler"
" 'handle'"
)
assert msg in caplog.text
assert "Exception description" in caplog.text
def test_data_sent(self):
"""Test binding to EVT_DATA_SENT."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_SENT, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
assert scp.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
assert assoc.get_handlers(evt.EVT_DATA_SENT) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 2
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert event.event.name == "EVT_DATA_SENT"
assert triggered[0].data[0:1] == b"\x02" # A-ASSOCIATE-AC
assert triggered[1].data[0:1] == b"\x06" # A-RELEASE-RP
scp.shutdown()
def test_data_sent_bind(self):
"""Test binding to EVT_DATA_SENT."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_SENT) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
time.sleep(0.5)
scp.bind(evt.EVT_DATA_SENT, handle)
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
assert assoc.get_handlers(evt.EVT_DATA_SENT) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 1
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert event.event.name == "EVT_DATA_SENT"
assert event.data[0:1] == b"\x06" # A-RELEASE-RP
scp.shutdown()
def test_data_sent_unbind(self):
"""Test unbinding EVT_DATA_SENT."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_SENT, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
assert scp.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
time.sleep(0.5)
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
assert assoc.get_handlers(evt.EVT_DATA_SENT) == []
child = scp.active_associations[0]
assert child.dul.state_machine.current_state == "Sta6"
assert child.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
scp.unbind(evt.EVT_DATA_SENT, handle)
assoc.release()
while scp.active_associations:
time.sleep(0.05)
time.sleep(0.1)
assert len(triggered) == 1
assert triggered[0].data[0:1] == b"\x02" # A-ASSOCIATE-AC
scp.shutdown()
def test_data_sent_raises(self, caplog):
"""Test the handler for EVT_DATA_SENT raising exception."""
def handle(event):
raise NotImplementedError("Exception description")
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_SENT, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
with caplog.at_level(logging.ERROR, logger="pynetdicom"):
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assoc.release()
while scp.active_associations:
time.sleep(0.05)
scp.shutdown()
msg = (
"Exception raised in user's 'evt.EVT_DATA_SENT' event handler"
" 'handle'"
)
assert msg in caplog.text
assert "Exception description" in caplog.text
def test_data_recv(self):
"""Test starting bound to EVT_DATA_RECV."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_RECV, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
assert scp.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assert assoc.get_handlers(evt.EVT_DATA_RECV) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 2
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert triggered[0].data[0:1] == b"\x01" # Should be A-ASSOCIATE-RQ PDU
assert triggered[1].data[0:1] == b"\x05" # Should be A-RELEASE-RQ PDU
assert event.event.name == "EVT_DATA_RECV"
scp.shutdown()
def test_data_recv_bind(self):
"""Test binding to EVT_DATA_RECV."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
scp.bind(evt.EVT_DATA_RECV, handle)
assert scp.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assert assoc.get_handlers(evt.EVT_DATA_RECV) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 1
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert event.data[0:1] == b"\x05" # Should be A-RELEASE-RQ PDU
assert event.event.name == "EVT_DATA_RECV"
scp.shutdown()
def test_data_recv_unbind(self):
"""Test unbinding to EVT_DATA_RECV."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_RECV, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
assert scp.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
scp.unbind(evt.EVT_DATA_RECV, handle)
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assert assoc.get_handlers(evt.EVT_DATA_RECV) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_RECV) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 1
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert triggered[0].data[0:1] == b"\x01" # Should be A-ASSOCIATE-RQ PDU
assert event.event.name == "EVT_DATA_RECV"
scp.shutdown()
def test_data_recv_raises(self, caplog):
"""Test the handler for EVT_DATA_RECV raising exception."""
def handle(event):
raise NotImplementedError("Exception description")
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_RECV, handle)]
scp = ae.start_server(("", 11112), block=False, evt_handlers=handlers)
with caplog.at_level(logging.ERROR, logger="pynetdicom"):
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assoc.release()
while scp.active_associations:
time.sleep(0.05)
scp.shutdown()
msg = (
"Exception raised in user's 'evt.EVT_DATA_RECV' event handler"
" 'handle'"
)
assert msg in caplog.text
assert "Exception description" in caplog.text
class TestEventHandlingRequestor:
"""Test the transport events and handling as requestor."""
def setup(self):
self.ae = None
def teardown(self):
if self.ae:
self.ae.shutdown()
def test_no_handlers(self):
"""Test associations as requestor with no handlers bound."""
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc.release()
scp.shutdown()
def test_bind_evt_conn_open(self):
"""Test start with a bound EVT_CONN_OPEN"""
triggered_events = []
def on_conn_open(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assoc = ae.associate(
"localhost", 11112, evt_handlers=[(evt.EVT_CONN_OPEN, on_conn_open)]
)
assert assoc.is_established
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.address[0], str)
assert isinstance(event.address[1], int)
assoc.release()
scp.shutdown()
def test_unbind_evt_conn_open(self):
"""Test unbinding EVT_CONN_OPEN"""
triggered_events = []
def on_conn_open(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assoc = ae.associate(
"localhost", 11112, evt_handlers=[(evt.EVT_CONN_OPEN, on_conn_open)]
)
assert assoc.is_established
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == [(on_conn_open, None)]
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc.unbind(evt.EVT_CONN_OPEN, on_conn_open)
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.address[0], str)
assert isinstance(event.address[1], int)
assoc.release()
scp.shutdown()
def test_bind_evt_conn_close(self):
"""Test start with a bound EVT_CONN_CLOSED"""
triggered_events = []
def on_conn_close(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assoc = ae.associate(
"localhost", 11112, evt_handlers=[(evt.EVT_CONN_CLOSE, on_conn_close)]
)
assert assoc.is_established
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 0
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
scp.shutdown()
def test_bind_evt_conn_close_running(self):
"""Test binding EVT_CONN_CLOSED after assoc running."""
triggered_events = []
def on_conn_close(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assert len(triggered_events) == 0
assoc.bind(evt.EVT_CONN_CLOSE, on_conn_close)
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered_events) == 1
event = triggered_events[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
scp.shutdown()
def test_unbind_evt_conn_close(self):
"""Test unbinding EVT_CONN_CLOSED"""
triggered_events = []
def on_conn_close(event):
triggered_events.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assoc = ae.associate(
"localhost", 11112, evt_handlers=[(evt.EVT_CONN_CLOSE, on_conn_close)]
)
assert assoc.is_established
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == [(on_conn_close, None)]
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_CONN_OPEN) == []
assert child.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc.unbind(evt.EVT_CONN_CLOSE, on_conn_close)
assert assoc.get_handlers(evt.EVT_CONN_OPEN) == []
assert assoc.get_handlers(evt.EVT_CONN_CLOSE) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered_events) == 0
scp.shutdown()
def test_connection_failure_log(self, caplog):
"""Test that a connection failure is logged."""
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
with caplog.at_level(logging.ERROR, logger="pynetdicom"):
assoc = ae.associate("localhost", 11113)
assert assoc.is_aborted
messages = [
"Association request failed: unable to connect to remote",
"TCP Initialisation Error",
]
for msg in messages:
assert msg in caplog.text
scp.shutdown()
def test_data_sent(self):
"""Test binding to EVT_DATA_SENT."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_SENT, handle)]
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_SENT) == []
assoc = ae.associate("localhost", 11112, evt_handlers=handlers)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_SENT) == []
assert assoc.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_SENT) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 2
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert event.event.name == "EVT_DATA_SENT"
assert triggered[0].data[0:1] == b"\x01" # A-ASSOCIATE-RQ
assert triggered[1].data[0:1] == b"\x05" # A-RELEASE-RQ
scp.shutdown()
def test_data_sent_bind(self):
"""Test binding to EVT_DATA_SENT."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_SENT) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assoc.bind(evt.EVT_DATA_SENT, handle)
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_SENT) == []
assert assoc.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_SENT) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 1
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert event.event.name == "EVT_DATA_SENT"
assert event.data[0:1] == b"\x05" # A-RELEASE-RQ
scp.shutdown()
def test_data_sent_unbind(self):
"""Test unbinding EVT_DATA_SENT."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_SENT, handle)]
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_SENT) == []
assoc = ae.associate("localhost", 11112, evt_handlers=handlers)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_SENT) == []
assert assoc.get_handlers(evt.EVT_DATA_SENT) == [(handle, None)]
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_SENT) == []
assoc.unbind(evt.EVT_DATA_SENT, handle)
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 1
assert triggered[0].data[0:1] == b"\x01" # A-ASSOCIATE-RQ
scp.shutdown()
def test_data_recv(self):
"""Test starting bound to EVT_DATA_RECV."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_RECV, handle)]
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assoc = ae.associate("localhost", 11112, evt_handlers=handlers)
assert assoc.is_established
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assert assoc.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_RECV) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 2
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert triggered[0].data[0:1] == b"\x02" # Should be A-ASSOCIATE-AC PDU
assert triggered[1].data[0:1] == b"\x06" # Should be A-RELEASE-RP PDU
assert event.event.name == "EVT_DATA_RECV"
scp.shutdown()
def test_data_recv_bind(self):
"""Test binding to EVT_DATA_RECV."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assoc = ae.associate("localhost", 11112)
assert assoc.is_established
assert len(scp.active_associations) == 1
assoc.bind(evt.EVT_DATA_RECV, handle)
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assert assoc.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_RECV) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 1
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert event.data[0:1] == b"\x06" # Should be A-RELEASE-RP PDU
assert event.event.name == "EVT_DATA_RECV"
scp.shutdown()
def test_data_recv_unbind(self):
"""Test unbinding to EVT_DATA_RECV."""
triggered = []
def handle(event):
triggered.append(event)
self.ae = ae = AE()
ae.add_supported_context(Verification)
ae.add_requested_context(Verification)
handlers = [(evt.EVT_DATA_RECV, handle)]
scp = ae.start_server(("", 11112), block=False)
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assoc = ae.associate("localhost", 11112, evt_handlers=handlers)
assert assoc.is_established
assert assoc.get_handlers(evt.EVT_DATA_RECV) == [(handle, None)]
assoc.unbind(evt.EVT_DATA_RECV, handle)
assert len(scp.active_associations) == 1
assert scp.get_handlers(evt.EVT_DATA_RECV) == []
assert assoc.get_handlers(evt.EVT_DATA_RECV) == []
child = scp.active_associations[0]
assert child.get_handlers(evt.EVT_DATA_RECV) == []
assoc.release()
while scp.active_associations:
time.sleep(0.05)
assert len(triggered) == 1
event = triggered[0]
assert isinstance(event, Event)
assert isinstance(event.assoc, Association)
assert isinstance(event.timestamp, datetime)
assert isinstance(event.data, bytes)
assert triggered[0].data[0:1] == b"\x02" # Should be A-ASSOCIATE-AC PDU
assert event.event.name == "EVT_DATA_RECV"
scp.shutdown()
| 33.208439 | 88 | 0.623551 | 8,008 | 65,321 | 4.863886 | 0.048701 | 0.038357 | 0.072606 | 0.074634 | 0.874942 | 0.858383 | 0.82914 | 0.814608 | 0.808575 | 0.793633 | 0 | 0.026933 | 0.266193 | 65,321 | 1,966 | 89 | 33.225331 | 0.785659 | 0.062032 | 0 | 0.824771 | 0 | 0 | 0.035012 | 0 | 0 | 0 | 0.001281 | 0 | 0.313863 | 1 | 0.081633 | false | 0.003519 | 0.021112 | 0.004222 | 0.117523 | 0.000704 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
86c841e59d16b9a1af5386c00903e317d5495a7d | 130 | py | Python | discord/components.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | discord/components.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | discord/components.py | kuzaku-developers/disnake | 61cc1ad4c2bafd39726a1447c85f7e469e41af10 | [
"MIT"
] | null | null | null | from disnake.components import *
from disnake.components import __dict__ as __original_dict__
locals().update(__original_dict__)
| 26 | 60 | 0.846154 | 16 | 130 | 6 | 0.5625 | 0.229167 | 0.4375 | 0.5625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.092308 | 130 | 4 | 61 | 32.5 | 0.813559 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
86d63423639cb80c8bdbd26ba76fdc9ad0f9789b | 91 | py | Python | tests/test_A000110.py | TimothyDJones/oeis | d9d608bc32ee31c73c139e1b68e4eb6315205e8d | [
"MIT"
] | 21 | 2020-03-21T17:50:13.000Z | 2022-01-18T01:52:47.000Z | tests/test_A000110.py | TimothyDJones/oeis | d9d608bc32ee31c73c139e1b68e4eb6315205e8d | [
"MIT"
] | 296 | 2019-11-18T14:04:36.000Z | 2022-03-27T21:59:24.000Z | tests/test_A000110.py | TimothyDJones/oeis | d9d608bc32ee31c73c139e1b68e4eb6315205e8d | [
"MIT"
] | 29 | 2019-11-18T11:56:22.000Z | 2022-03-26T22:31:57.000Z | from oeis import A000110
def test_bell():
assert A000110[:6] == [1, 1, 2, 5, 15, 52]
| 15.166667 | 46 | 0.604396 | 16 | 91 | 3.375 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.3 | 0.230769 | 91 | 5 | 47 | 18.2 | 0.471429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.333333 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 8 |
86f23fb78462d680f148ac045f39fe60ee060d12 | 345 | py | Python | sampleprj/pysubpckB/src/sampleprj/pysubpckB/pysubsub/__init__.py | astyl/AcrobatomaticBuildSystem | a8a4858d723a0673eeeb6f039af05dc86be638a9 | [
"BSD-2-Clause-FreeBSD"
] | 10 | 2019-01-07T20:17:05.000Z | 2022-03-07T20:46:58.000Z | sampleprj/pysubpckB/src/sampleprj/pysubpckB/pysubsub/__init__.py | astyl/AcrobatomaticBuildSystem | a8a4858d723a0673eeeb6f039af05dc86be638a9 | [
"BSD-2-Clause-FreeBSD"
] | 1 | 2019-09-04T11:27:10.000Z | 2019-09-04T15:02:36.000Z | sampleprj/pysubpckB/src/sampleprj/pysubpckB/pysubsub/__init__.py | astyl/AcrobatomaticBuildSystem | a8a4858d723a0673eeeb6f039af05dc86be638a9 | [
"BSD-2-Clause-FreeBSD"
] | 3 | 2019-08-30T10:01:55.000Z | 2022-01-27T21:06:39.000Z | # -*-coding:Utf-8 -*
# @file __init__.py
#
# Copyright 2016 Airbus Safran Launchers. All rights reserved.
# Use is subject to license terms.
#
# $Id$
# $Date$
#
# module import entry point
# for subpackage sampleprj.pysubpckB.pysubsub
print 'importing sampleprj.pysubpckB.pysubsub ...'
print 'importing sampleprj.pysubpckB.pysubsub ... OK'
| 21.5625 | 62 | 0.724638 | 42 | 345 | 5.857143 | 0.809524 | 0.219512 | 0.317073 | 0.252033 | 0.430894 | 0.430894 | 0.430894 | 0.430894 | 0 | 0 | 0 | 0.017065 | 0.150725 | 345 | 15 | 63 | 23 | 0.822526 | 0.623188 | 0 | 0 | 0 | 0 | 0.74359 | 0.478632 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 1 | null | null | 1 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 8 |
81021148469f7a523eff613be7d0b02abdc59665 | 107 | py | Python | stacker/config/translators/__init__.py | vkhatri/stacker | d4eccc3ad99a1b37f77a82eb909dd06c487ead23 | [
"BSD-2-Clause"
] | 1 | 2021-11-06T17:01:01.000Z | 2021-11-06T17:01:01.000Z | stacker/config/translators/__init__.py | vkhatri/stacker | d4eccc3ad99a1b37f77a82eb909dd06c487ead23 | [
"BSD-2-Clause"
] | null | null | null | stacker/config/translators/__init__.py | vkhatri/stacker | d4eccc3ad99a1b37f77a82eb909dd06c487ead23 | [
"BSD-2-Clause"
] | 1 | 2021-11-06T17:00:53.000Z | 2021-11-06T17:00:53.000Z | import yaml
from .kms import kms_simple_constructor
yaml.add_constructor('!kms', kms_simple_constructor)
| 17.833333 | 52 | 0.831776 | 15 | 107 | 5.6 | 0.466667 | 0.214286 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093458 | 107 | 5 | 53 | 21.4 | 0.865979 | 0 | 0 | 0 | 0 | 0 | 0.037383 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 7 |
be05ee8296609be6fddc1108a66bf38dce6d0d84 | 25,314 | py | Python | neutron/tests/unit/test_iptables_manager.py | plumgrid/plumgrid-quantum | dbd7e472ca28d22d694eeeba47e0738985583961 | [
"Apache-2.0"
] | 1 | 2016-04-23T21:33:31.000Z | 2016-04-23T21:33:31.000Z | neutron/tests/unit/test_iptables_manager.py | plumgrid/plumgrid-quantum | dbd7e472ca28d22d694eeeba47e0738985583961 | [
"Apache-2.0"
] | null | null | null | neutron/tests/unit/test_iptables_manager.py | plumgrid/plumgrid-quantum | dbd7e472ca28d22d694eeeba47e0738985583961 | [
"Apache-2.0"
] | 4 | 2015-04-14T10:06:51.000Z | 2019-10-02T01:28:34.000Z | # vim: tabstop=4 shiftwidth=4 softtabstop=4
# Copyright 2012 Locaweb.
# All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# @author: Juliano Martinez, Locaweb.
import inspect
import os
import mock
from neutron.agent.linux import iptables_manager
from neutron.tests import base
from neutron.tests import tools
IPTABLES_ARG = {'bn': iptables_manager.binary_name}
NAT_DUMP = ('# Generated by iptables_manager\n'
'*nat\n'
':neutron-postrouting-bottom - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
':%(bn)s-snat - [0:0]\n'
':%(bn)s-PREROUTING - [0:0]\n'
':%(bn)s-float-snat - [0:0]\n'
':%(bn)s-POSTROUTING - [0:0]\n'
'[0:0] -A PREROUTING -j %(bn)s-PREROUTING\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A POSTROUTING -j %(bn)s-POSTROUTING\n'
'[0:0] -A POSTROUTING -j neutron-postrouting-bottom\n'
'[0:0] -A neutron-postrouting-bottom -j %(bn)s-snat\n'
'[0:0] -A %(bn)s-snat -j '
'%(bn)s-float-snat\n'
'COMMIT\n'
'# Completed by iptables_manager\n' % IPTABLES_ARG)
FILTER_DUMP = ('# Generated by iptables_manager\n'
'*filter\n'
':neutron-filter-top - [0:0]\n'
':%(bn)s-FORWARD - [0:0]\n'
':%(bn)s-INPUT - [0:0]\n'
':%(bn)s-local - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
'[0:0] -A FORWARD -j neutron-filter-top\n'
'[0:0] -A OUTPUT -j neutron-filter-top\n'
'[0:0] -A neutron-filter-top -j %(bn)s-local\n'
'[0:0] -A INPUT -j %(bn)s-INPUT\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A FORWARD -j %(bn)s-FORWARD\n'
'COMMIT\n'
'# Completed by iptables_manager\n' % IPTABLES_ARG)
class IptablesManagerStateFulTestCase(base.BaseTestCase):
def setUp(self):
super(IptablesManagerStateFulTestCase, self).setUp()
self.root_helper = 'sudo'
self.iptables = (iptables_manager.
IptablesManager(root_helper=self.root_helper))
self.execute = mock.patch.object(self.iptables, "execute").start()
self.addCleanup(mock.patch.stopall)
def test_binary_name(self):
self.assertEqual(iptables_manager.binary_name,
os.path.basename(inspect.stack()[-1][1])[:16])
def test_get_chain_name(self):
name = '0123456789' * 5
# 28 chars is the maximum length of iptables chain name.
self.assertEqual(iptables_manager.get_chain_name(name, wrap=False),
name[:28])
# 11 chars is the maximum length of chain name of iptable_manager
# if binary_name is prepended.
self.assertEqual(iptables_manager.get_chain_name(name, wrap=True),
name[:11])
def test_add_and_remove_chain_custom_binary_name(self):
bn = ("abcdef" * 5)
self.iptables = (iptables_manager.
IptablesManager(root_helper=self.root_helper,
binary_name=bn))
self.execute = mock.patch.object(self.iptables, "execute").start()
iptables_args = {'bn': bn[:16]}
filter_dump = ('# Generated by iptables_manager\n'
'*filter\n'
':neutron-filter-top - [0:0]\n'
':%(bn)s-FORWARD - [0:0]\n'
':%(bn)s-INPUT - [0:0]\n'
':%(bn)s-local - [0:0]\n'
':%(bn)s-filter - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
'[0:0] -A FORWARD -j neutron-filter-top\n'
'[0:0] -A OUTPUT -j neutron-filter-top\n'
'[0:0] -A neutron-filter-top -j %(bn)s-local\n'
'[0:0] -A INPUT -j %(bn)s-INPUT\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A FORWARD -j %(bn)s-FORWARD\n'
'COMMIT\n'
'# Completed by iptables_manager\n' % iptables_args)
filter_dump_mod = ('# Generated by iptables_manager\n'
'*filter\n'
':neutron-filter-top - [0:0]\n'
':%(bn)s-FORWARD - [0:0]\n'
':%(bn)s-INPUT - [0:0]\n'
':%(bn)s-local - [0:0]\n'
':%(bn)s-filter - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
'[0:0] -A FORWARD -j neutron-filter-top\n'
'[0:0] -A OUTPUT -j neutron-filter-top\n'
'[0:0] -A neutron-filter-top -j %(bn)s-local\n'
'[0:0] -A INPUT -j %(bn)s-INPUT\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A FORWARD -j %(bn)s-FORWARD\n'
'COMMIT\n'
'# Completed by iptables_manager\n'
% iptables_args)
nat_dump = ('# Generated by iptables_manager\n'
'*nat\n'
':neutron-postrouting-bottom - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
':%(bn)s-snat - [0:0]\n'
':%(bn)s-PREROUTING - [0:0]\n'
':%(bn)s-float-snat - [0:0]\n'
':%(bn)s-POSTROUTING - [0:0]\n'
'[0:0] -A PREROUTING -j %(bn)s-PREROUTING\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A POSTROUTING -j %(bn)s-POSTROUTING\n'
'[0:0] -A POSTROUTING -j neutron-postrouting-bottom\n'
'[0:0] -A neutron-postrouting-bottom -j %(bn)s-snat\n'
'[0:0] -A %(bn)s-snat -j '
'%(bn)s-float-snat\n'
'COMMIT\n'
'# Completed by iptables_manager\n' % iptables_args)
expected_calls_and_values = [
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=nat_dump + filter_dump_mod,
root_helper=self.root_helper),
None),
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=nat_dump + filter_dump,
root_helper=self.root_helper),
None),
]
tools.setup_mock_calls(self.execute, expected_calls_and_values)
self.iptables.ipv4['filter'].add_chain('filter')
self.iptables.apply()
self.iptables.ipv4['filter'].empty_chain('filter')
self.iptables.apply()
tools.verify_mock_calls(self.execute, expected_calls_and_values)
def test_empty_chain_custom_binary_name(self):
bn = ("abcdef" * 5)[:16]
self.iptables = (iptables_manager.
IptablesManager(root_helper=self.root_helper,
binary_name=bn))
self.execute = mock.patch.object(self.iptables, "execute").start()
iptables_args = {'bn': bn}
filter_dump = ('# Generated by iptables_manager\n'
'*filter\n'
':neutron-filter-top - [0:0]\n'
':%(bn)s-FORWARD - [0:0]\n'
':%(bn)s-INPUT - [0:0]\n'
':%(bn)s-local - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
'[0:0] -A FORWARD -j neutron-filter-top\n'
'[0:0] -A OUTPUT -j neutron-filter-top\n'
'[0:0] -A neutron-filter-top -j %(bn)s-local\n'
'[0:0] -A INPUT -j %(bn)s-INPUT\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A FORWARD -j %(bn)s-FORWARD\n'
'COMMIT\n'
'# Completed by iptables_manager\n' % iptables_args)
filter_dump_mod = ('# Generated by iptables_manager\n'
'*filter\n'
':neutron-filter-top - [0:0]\n'
':%(bn)s-FORWARD - [0:0]\n'
':%(bn)s-INPUT - [0:0]\n'
':%(bn)s-local - [0:0]\n'
':%(bn)s-filter - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
'[0:0] -A FORWARD -j neutron-filter-top\n'
'[0:0] -A OUTPUT -j neutron-filter-top\n'
'[0:0] -A neutron-filter-top -j %(bn)s-local\n'
'[0:0] -A INPUT -j %(bn)s-INPUT\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A FORWARD -j %(bn)s-FORWARD\n'
'[0:0] -A %(bn)s-filter -s 0/0 -d 192.168.0.2\n'
'COMMIT\n'
'# Completed by iptables_manager\n'
% iptables_args)
nat_dump = ('# Generated by iptables_manager\n'
'*nat\n'
':neutron-postrouting-bottom - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
':%(bn)s-snat - [0:0]\n'
':%(bn)s-PREROUTING - [0:0]\n'
':%(bn)s-float-snat - [0:0]\n'
':%(bn)s-POSTROUTING - [0:0]\n'
'[0:0] -A PREROUTING -j %(bn)s-PREROUTING\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A POSTROUTING -j %(bn)s-POSTROUTING\n'
'[0:0] -A POSTROUTING -j neutron-postrouting-bottom\n'
'[0:0] -A neutron-postrouting-bottom -j %(bn)s-snat\n'
'[0:0] -A %(bn)s-snat -j '
'%(bn)s-float-snat\n'
'COMMIT\n'
'# Completed by iptables_manager\n' % iptables_args)
expected_calls_and_values = [
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=nat_dump + filter_dump_mod,
root_helper=self.root_helper),
None),
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=nat_dump + filter_dump,
root_helper=self.root_helper),
None),
]
tools.setup_mock_calls(self.execute, expected_calls_and_values)
self.iptables.ipv4['filter'].add_chain('filter')
self.iptables.ipv4['filter'].add_rule('filter',
'-s 0/0 -d 192.168.0.2')
self.iptables.apply()
self.iptables.ipv4['filter'].remove_chain('filter')
self.iptables.apply()
tools.verify_mock_calls(self.execute, expected_calls_and_values)
def test_add_and_remove_chain(self):
filter_dump_mod = ('# Generated by iptables_manager\n'
'*filter\n'
':neutron-filter-top - [0:0]\n'
':%(bn)s-FORWARD - [0:0]\n'
':%(bn)s-INPUT - [0:0]\n'
':%(bn)s-local - [0:0]\n'
':%(bn)s-filter - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
'[0:0] -A FORWARD -j neutron-filter-top\n'
'[0:0] -A OUTPUT -j neutron-filter-top\n'
'[0:0] -A neutron-filter-top -j %(bn)s-local\n'
'[0:0] -A INPUT -j %(bn)s-INPUT\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A FORWARD -j %(bn)s-FORWARD\n'
'COMMIT\n'
'# Completed by iptables_manager\n'
% IPTABLES_ARG)
expected_calls_and_values = [
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=NAT_DUMP + filter_dump_mod,
root_helper=self.root_helper),
None),
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=NAT_DUMP + FILTER_DUMP,
root_helper=self.root_helper),
None),
]
tools.setup_mock_calls(self.execute, expected_calls_and_values)
self.iptables.ipv4['filter'].add_chain('filter')
self.iptables.apply()
self.iptables.ipv4['filter'].remove_chain('filter')
self.iptables.apply()
tools.verify_mock_calls(self.execute, expected_calls_and_values)
def test_add_filter_rule(self):
filter_dump_mod = ('# Generated by iptables_manager\n'
'*filter\n'
':neutron-filter-top - [0:0]\n'
':%(bn)s-FORWARD - [0:0]\n'
':%(bn)s-INPUT - [0:0]\n'
':%(bn)s-local - [0:0]\n'
':%(bn)s-filter - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
'[0:0] -A FORWARD -j neutron-filter-top\n'
'[0:0] -A OUTPUT -j neutron-filter-top\n'
'[0:0] -A neutron-filter-top -j %(bn)s-local\n'
'[0:0] -A INPUT -j %(bn)s-INPUT\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A FORWARD -j %(bn)s-FORWARD\n'
'[0:0] -A %(bn)s-filter -j DROP\n'
'[0:0] -A %(bn)s-INPUT -s 0/0 -d 192.168.0.2 -j '
'%(bn)s-filter\n'
'COMMIT\n'
'# Completed by iptables_manager\n'
% IPTABLES_ARG)
expected_calls_and_values = [
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=NAT_DUMP + filter_dump_mod,
root_helper=self.root_helper),
None),
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=NAT_DUMP + FILTER_DUMP,
root_helper=self.root_helper
),
None),
]
tools.setup_mock_calls(self.execute, expected_calls_and_values)
self.iptables.ipv4['filter'].add_chain('filter')
self.iptables.ipv4['filter'].add_rule('filter', '-j DROP')
self.iptables.ipv4['filter'].add_rule('INPUT',
'-s 0/0 -d 192.168.0.2 -j'
' %(bn)s-filter' % IPTABLES_ARG)
self.iptables.apply()
self.iptables.ipv4['filter'].remove_rule('filter', '-j DROP')
self.iptables.ipv4['filter'].remove_rule('INPUT',
'-s 0/0 -d 192.168.0.2 -j'
' %(bn)s-filter'
% IPTABLES_ARG)
self.iptables.ipv4['filter'].remove_chain('filter')
self.iptables.apply()
tools.verify_mock_calls(self.execute, expected_calls_and_values)
def test_add_nat_rule(self):
nat_dump = ('# Generated by iptables_manager\n'
'*nat\n'
':neutron-postrouting-bottom - [0:0]\n'
':%(bn)s-float-snat - [0:0]\n'
':%(bn)s-POSTROUTING - [0:0]\n'
':%(bn)s-PREROUTING - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
':%(bn)s-snat - [0:0]\n'
'[0:0] -A PREROUTING -j %(bn)s-PREROUTING\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A POSTROUTING -j %(bn)s-POSTROUTING\n'
'[0:0] -A POSTROUTING -j neutron-postrouting-bottom\n'
'[0:0] -A neutron-postrouting-bottom -j %(bn)s-snat\n'
'[0:0] -A %(bn)s-snat -j %(bn)s-float-snat\n'
'COMMIT\n'
'# Completed by iptables_manager\n'
% IPTABLES_ARG)
nat_dump_mod = ('# Generated by iptables_manager\n'
'*nat\n'
':neutron-postrouting-bottom - [0:0]\n'
':%(bn)s-float-snat - [0:0]\n'
':%(bn)s-POSTROUTING - [0:0]\n'
':%(bn)s-PREROUTING - [0:0]\n'
':%(bn)s-nat - [0:0]\n'
':%(bn)s-OUTPUT - [0:0]\n'
':%(bn)s-snat - [0:0]\n'
'[0:0] -A PREROUTING -j %(bn)s-PREROUTING\n'
'[0:0] -A OUTPUT -j %(bn)s-OUTPUT\n'
'[0:0] -A POSTROUTING -j %(bn)s-POSTROUTING\n'
'[0:0] -A POSTROUTING -j neutron-postrouting-bottom\n'
'[0:0] -A neutron-postrouting-bottom -j %(bn)s-snat\n'
'[0:0] -A %(bn)s-snat -j %(bn)s-float-snat\n'
'[0:0] -A %(bn)s-PREROUTING -d 192.168.0.3 -j '
'%(bn)s-nat\n'
'[0:0] -A %(bn)s-nat -p tcp --dport 8080 -j '
'REDIRECT --to-port 80\n'
'COMMIT\n'
'# Completed by iptables_manager\n'
% IPTABLES_ARG)
expected_calls_and_values = [
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=nat_dump_mod + FILTER_DUMP,
root_helper=self.root_helper),
None),
(mock.call(['iptables-save', '-c'],
root_helper=self.root_helper),
''),
(mock.call(['iptables-restore', '-c'],
process_input=nat_dump + FILTER_DUMP,
root_helper=self.root_helper),
None),
]
tools.setup_mock_calls(self.execute, expected_calls_and_values)
self.iptables.ipv4['nat'].add_chain('nat')
self.iptables.ipv4['nat'].add_rule('PREROUTING',
'-d 192.168.0.3 -j '
'%(bn)s-nat' % IPTABLES_ARG)
self.iptables.ipv4['nat'].add_rule('nat',
'-p tcp --dport 8080' +
' -j REDIRECT --to-port 80')
self.iptables.apply()
self.iptables.ipv4['nat'].remove_rule('nat',
'-p tcp --dport 8080 -j'
' REDIRECT --to-port 80')
self.iptables.ipv4['nat'].remove_rule('PREROUTING',
'-d 192.168.0.3 -j '
'%(bn)s-nat' % IPTABLES_ARG)
self.iptables.ipv4['nat'].remove_chain('nat')
self.iptables.apply()
tools.verify_mock_calls(self.execute, expected_calls_and_values)
def test_add_rule_to_a_nonexistent_chain(self):
self.assertRaises(LookupError, self.iptables.ipv4['filter'].add_rule,
'nonexistent', '-j DROP')
def test_remove_nonexistent_chain(self):
with mock.patch.object(iptables_manager, "LOG") as log:
self.iptables.ipv4['filter'].remove_chain('nonexistent')
log.warn.assert_called_once_with(
'Attempted to remove chain %s which does not exist',
'nonexistent')
def test_remove_nonexistent_rule(self):
with mock.patch.object(iptables_manager, "LOG") as log:
self.iptables.ipv4['filter'].remove_rule('nonexistent', '-j DROP')
log.warn.assert_called_once_with(
'Tried to remove rule that was not there: '
'%(chain)r %(rule)r %(wrap)r %(top)r',
{'wrap': True, 'top': False, 'rule': '-j DROP',
'chain': 'nonexistent'})
def test_get_traffic_counters_chain_notexists(self):
with mock.patch.object(iptables_manager, "LOG") as log:
acc = self.iptables.get_traffic_counters('chain1')
self.assertIsNone(acc)
self.assertEqual(0, self.execute.call_count)
log.warn.assert_called_once_with(
'Attempted to get traffic counters of chain %s which '
'does not exist', 'chain1')
def test_get_traffic_counters(self):
iptables_dump = (
'Chain OUTPUT (policy ACCEPT 400 packets, 65901 bytes)\n'
' pkts bytes target prot opt in out source'
' destination \n'
' 400 65901 chain1 all -- * * 0.0.0.0/0'
' 0.0.0.0/0 \n'
' 400 65901 chain2 all -- * * 0.0.0.0/0'
' 0.0.0.0/0 \n')
expected_calls_and_values = [
(mock.call(['iptables', '-t', 'filter', '-L', 'OUTPUT',
'-n', '-v', '-x'],
root_helper=self.root_helper),
iptables_dump),
(mock.call(['iptables', '-t', 'nat', '-L', 'OUTPUT', '-n',
'-v', '-x'],
root_helper=self.root_helper),
''),
(mock.call(['ip6tables', '-t', 'filter', '-L', 'OUTPUT',
'-n', '-v', '-x'],
root_helper=self.root_helper),
iptables_dump),
]
tools.setup_mock_calls(self.execute, expected_calls_and_values)
acc = self.iptables.get_traffic_counters('OUTPUT')
self.assertEqual(acc['pkts'], 1600)
self.assertEqual(acc['bytes'], 263604)
tools.verify_mock_calls(self.execute, expected_calls_and_values)
def test_get_traffic_counters_with_zero(self):
iptables_dump = (
'Chain OUTPUT (policy ACCEPT 400 packets, 65901 bytes)\n'
' pkts bytes target prot opt in out source'
' destination \n'
' 400 65901 chain1 all -- * * 0.0.0.0/0'
' 0.0.0.0/0 \n'
' 400 65901 chain2 all -- * * 0.0.0.0/0'
' 0.0.0.0/0 \n')
expected_calls_and_values = [
(mock.call(['iptables', '-t', 'filter', '-L', 'OUTPUT',
'-n', '-v', '-x', '-Z'],
root_helper=self.root_helper),
iptables_dump),
(mock.call(['iptables', '-t', 'nat', '-L', 'OUTPUT', '-n',
'-v', '-x', '-Z'],
root_helper=self.root_helper),
''),
(mock.call(['ip6tables', '-t', 'filter', '-L', 'OUTPUT',
'-n', '-v', '-x', '-Z'],
root_helper=self.root_helper),
iptables_dump),
]
tools.setup_mock_calls(self.execute, expected_calls_and_values)
acc = self.iptables.get_traffic_counters('OUTPUT', zero=True)
self.assertEqual(acc['pkts'], 1600)
self.assertEqual(acc['bytes'], 263604)
tools.verify_mock_calls(self.execute, expected_calls_and_values)
class IptablesManagerStateLessTestCase(base.BaseTestCase):
def setUp(self):
super(IptablesManagerStateLessTestCase, self).setUp()
self.iptables = (iptables_manager.IptablesManager(state_less=True))
def test_nat_not_found(self):
self.assertNotIn('nat', self.iptables.ipv4)
| 45.284436 | 78 | 0.45386 | 2,899 | 25,314 | 3.836495 | 0.080028 | 0.033987 | 0.02077 | 0.027693 | 0.854792 | 0.843823 | 0.809297 | 0.808128 | 0.78556 | 0.769915 | 0 | 0.037221 | 0.397172 | 25,314 | 558 | 79 | 45.365591 | 0.691612 | 0.032354 | 0 | 0.782979 | 0 | 0.038298 | 0.329942 | 0.016548 | 0 | 0 | 0 | 0 | 0.029787 | 1 | 0.034043 | false | 0 | 0.012766 | 0 | 0.051064 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
be076e1c67ed9ca27b639778b56dfea40825eb6d | 96 | py | Python | chart_museum_cleaner/__main__.py | heretse/chart-museum-cleaner | 4bb65c2ace548f29672655fd3b1d907f95af0172 | [
"MIT"
] | null | null | null | chart_museum_cleaner/__main__.py | heretse/chart-museum-cleaner | 4bb65c2ace548f29672655fd3b1d907f95af0172 | [
"MIT"
] | null | null | null | chart_museum_cleaner/__main__.py | heretse/chart-museum-cleaner | 4bb65c2ace548f29672655fd3b1d907f95af0172 | [
"MIT"
] | 2 | 2021-06-16T09:01:32.000Z | 2022-02-23T13:36:12.000Z | import chart_museum_cleaner.cli
if __name__ == '__main__':
chart_museum_cleaner.cli.main()
| 19.2 | 35 | 0.770833 | 13 | 96 | 4.769231 | 0.615385 | 0.354839 | 0.580645 | 0.677419 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 96 | 4 | 36 | 24 | 0.738095 | 0 | 0 | 0 | 0 | 0 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 7 |
be0d48f5bade56c771884f5c0e48415aa0d1e5bf | 219,301 | py | Python | msgraph/cli/command_modules/teams/azext_teams/generated/custom.py | microsoftgraph/msgraph-cli-archived | 489f70bf4ede1ce67b84bfb31e66da3e4db76062 | [
"MIT"
] | null | null | null | msgraph/cli/command_modules/teams/azext_teams/generated/custom.py | microsoftgraph/msgraph-cli-archived | 489f70bf4ede1ce67b84bfb31e66da3e4db76062 | [
"MIT"
] | 22 | 2022-03-29T22:54:37.000Z | 2022-03-29T22:55:27.000Z | msgraph/cli/command_modules/teams/azext_teams/generated/custom.py | microsoftgraph/msgraph-cli-archived | 489f70bf4ede1ce67b84bfb31e66da3e4db76062 | [
"MIT"
] | null | null | null | # --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is
# regenerated.
# --------------------------------------------------------------------------
# pylint: disable=line-too-long
# pylint: disable=too-many-lines
def teams_chat_chat_create_chat(client,
id_=None):
body = {}
body['id'] = id_
return client.create_chat(body=body)
def teams_chat_chat_delete_chat(client,
chat_id,
if_match=None):
return client.delete_chat(chat_id=chat_id,
if_match=if_match)
def teams_chat_chat_list_chat(client,
orderby=None,
select=None,
expand=None):
return client.list_chat(orderby=orderby,
select=select,
expand=expand)
def teams_chat_chat_show_chat(client,
chat_id,
select=None,
expand=None):
return client.get_chat(chat_id=chat_id,
select=select,
expand=expand)
def teams_chat_chat_update_chat(client,
chat_id,
id_=None):
body = {}
body['id'] = id_
return client.update_chat(chat_id=chat_id,
body=body)
def teams_chat_show_all_message(client):
return client.get_all_messages()
def teams_group_delete_team(client,
group_id,
if_match=None):
return client.delete_team(group_id=group_id,
if_match=if_match)
def teams_group_show_team(client,
group_id,
select=None,
expand=None):
return client.get_team(group_id=group_id,
select=select,
expand=expand)
def teams_group_update_team(client,
group_id,
id_=None,
classification=None,
description=None,
display_name=None,
fun_settings=None,
guest_settings=None,
internal_id=None,
is_archived=None,
member_settings=None,
messaging_settings=None,
specialization=None,
visibility=None,
web_url=None,
channels=None,
installed_apps=None,
members=None,
operations=None,
primary_channel=None,
microsoft_graph_entity_id=None,
id1=None,
deleted_date_time=None,
assigned_labels=None,
assigned_licenses=None,
microsoft_graph_group_classification=None,
created_date_time=None,
microsoft_graph_group_description=None,
microsoft_graph_group_display_name=None,
expiration_date_time=None,
group_types=None,
has_members_with_license_errors=None,
license_processing_state=None,
mail=None,
mail_enabled=None,
mail_nickname=None,
membership_rule=None,
membership_rule_processing_state=None,
on_premises_domain_name=None,
on_premises_last_sync_date_time=None,
on_premises_net_bios_name=None,
on_premises_provisioning_errors=None,
on_premises_sam_account_name=None,
on_premises_security_identifier=None,
on_premises_sync_enabled=None,
preferred_data_location=None,
preferred_language=None,
proxy_addresses=None,
renewed_date_time=None,
security_enabled=None,
security_identifier=None,
theme=None,
microsoft_graph_group_visibility=None,
allow_external_senders=None,
auto_subscribe_new_members=None,
hide_from_address_lists=None,
hide_from_outlook_clients=None,
is_subscribed_by_mail=None,
unseen_count=None,
group_is_archived=None,
app_role_assignments=None,
created_on_behalf_of=None,
member_of=None,
microsoft_graph_group_members=None,
members_with_license_errors=None,
owners=None,
settings=None,
transitive_member_of=None,
transitive_members=None,
accepted_senders=None,
calendar=None,
calendar_view=None,
conversations=None,
events=None,
photo=None,
photos=None,
rejected_senders=None,
threads=None,
drive=None,
drives=None,
sites=None,
extensions=None,
group_lifecycle_policies=None,
planner=None,
onenote=None,
team=None,
id2=None,
enabled=None,
offer_shift_requests_enabled=None,
open_shifts_enabled=None,
provision_status=None,
provision_status_code=None,
swap_shifts_requests_enabled=None,
time_clock_enabled=None,
time_off_requests_enabled=None,
time_zone=None,
workforce_integration_ids=None,
offer_shift_requests=None,
open_shift_change_requests=None,
open_shifts=None,
scheduling_groups=None,
shifts=None,
swap_shifts_change_requests=None,
time_off_reasons=None,
time_off_requests=None,
times_off=None):
body = {}
body['id'] = id_
body['classification'] = classification
body['description'] = description
body['display_name'] = display_name
body['fun_settings'] = fun_settings
body['guest_settings'] = guest_settings
body['internal_id'] = internal_id
body['is_archived'] = is_archived
body['member_settings'] = member_settings
body['messaging_settings'] = messaging_settings
body['specialization'] = specialization
body['visibility'] = visibility
body['web_url'] = web_url
body['channels'] = channels
body['installed_apps'] = installed_apps
body['members'] = members
body['operations'] = operations
body['primary_channel'] = primary_channel
body['template'] = {}
body['template']['id'] = microsoft_graph_entity_id
body['group'] = {}
body['group']['id'] = id1
body['group']['deleted_date_time'] = deleted_date_time
body['group']['assigned_labels'] = assigned_labels
body['group']['assigned_licenses'] = assigned_licenses
body['group']['classification'] = microsoft_graph_group_classification
body['group']['created_date_time'] = created_date_time
body['group']['description'] = microsoft_graph_group_description
body['group']['display_name'] = microsoft_graph_group_display_name
body['group']['expiration_date_time'] = expiration_date_time
body['group']['group_types'] = group_types
body['group']['has_members_with_license_errors'] = has_members_with_license_errors
body['group']['license_processing_state'] = license_processing_state
body['group']['mail'] = mail
body['group']['mail_enabled'] = mail_enabled
body['group']['mail_nickname'] = mail_nickname
body['group']['membership_rule'] = membership_rule
body['group']['membership_rule_processing_state'] = membership_rule_processing_state
body['group']['on_premises_domain_name'] = on_premises_domain_name
body['group']['on_premises_last_sync_date_time'] = on_premises_last_sync_date_time
body['group']['on_premises_net_bios_name'] = on_premises_net_bios_name
body['group']['on_premises_provisioning_errors'] = on_premises_provisioning_errors
body['group']['on_premises_sam_account_name'] = on_premises_sam_account_name
body['group']['on_premises_security_identifier'] = on_premises_security_identifier
body['group']['on_premises_sync_enabled'] = on_premises_sync_enabled
body['group']['preferred_data_location'] = preferred_data_location
body['group']['preferred_language'] = preferred_language
body['group']['proxy_addresses'] = proxy_addresses
body['group']['renewed_date_time'] = renewed_date_time
body['group']['security_enabled'] = security_enabled
body['group']['security_identifier'] = security_identifier
body['group']['theme'] = theme
body['group']['visibility'] = microsoft_graph_group_visibility
body['group']['allow_external_senders'] = allow_external_senders
body['group']['auto_subscribe_new_members'] = auto_subscribe_new_members
body['group']['hide_from_address_lists'] = hide_from_address_lists
body['group']['hide_from_outlook_clients'] = hide_from_outlook_clients
body['group']['is_subscribed_by_mail'] = is_subscribed_by_mail
body['group']['unseen_count'] = unseen_count
body['group']['is_archived'] = group_is_archived
body['group']['app_role_assignments'] = app_role_assignments
body['group']['created_on_behalf_of'] = created_on_behalf_of
body['group']['member_of'] = member_of
body['group']['members'] = microsoft_graph_group_members
body['group']['members_with_license_errors'] = members_with_license_errors
body['group']['owners'] = owners
body['group']['settings'] = settings
body['group']['transitive_member_of'] = transitive_member_of
body['group']['transitive_members'] = transitive_members
body['group']['accepted_senders'] = accepted_senders
body['group']['calendar'] = calendar
body['group']['calendar_view'] = calendar_view
body['group']['conversations'] = conversations
body['group']['events'] = events
body['group']['photo'] = photo
body['group']['photos'] = photos
body['group']['rejected_senders'] = rejected_senders
body['group']['threads'] = threads
body['group']['drive'] = drive
body['group']['drives'] = drives
body['group']['sites'] = sites
body['group']['extensions'] = extensions
body['group']['group_lifecycle_policies'] = group_lifecycle_policies
body['group']['planner'] = planner
body['group']['onenote'] = onenote
body['group']['team'] = team
body['schedule'] = {}
body['schedule']['id'] = id2
body['schedule']['enabled'] = enabled
body['schedule']['offer_shift_requests_enabled'] = offer_shift_requests_enabled
body['schedule']['open_shifts_enabled'] = open_shifts_enabled
body['schedule']['provision_status'] = provision_status
body['schedule']['provision_status_code'] = provision_status_code
body['schedule']['swap_shifts_requests_enabled'] = swap_shifts_requests_enabled
body['schedule']['time_clock_enabled'] = time_clock_enabled
body['schedule']['time_off_requests_enabled'] = time_off_requests_enabled
body['schedule']['time_zone'] = time_zone
body['schedule']['workforce_integration_ids'] = workforce_integration_ids
body['schedule']['offer_shift_requests'] = offer_shift_requests
body['schedule']['open_shift_change_requests'] = open_shift_change_requests
body['schedule']['open_shifts'] = open_shifts
body['schedule']['scheduling_groups'] = scheduling_groups
body['schedule']['shifts'] = shifts
body['schedule']['swap_shifts_change_requests'] = swap_shifts_change_requests
body['schedule']['time_off_reasons'] = time_off_reasons
body['schedule']['time_off_requests'] = time_off_requests
body['schedule']['times_off'] = times_off
return client.update_team(group_id=group_id,
body=body)
def teams_team_list(client,
orderby=None,
select=None,
expand=None):
return client.list_team(orderby=orderby,
select=select,
expand=expand)
def teams_team_create(client,
team_id=None,
id_=None,
classification=None,
description=None,
display_name=None,
fun_settings=None,
guest_settings=None,
internal_id=None,
is_archived=None,
member_settings=None,
messaging_settings=None,
specialization=None,
visibility=None,
web_url=None,
channels=None,
installed_apps=None,
members=None,
operations=None,
primary_channel=None,
microsoft_graph_entity_id=None,
id1=None,
deleted_date_time=None,
assigned_labels=None,
assigned_licenses=None,
microsoft_graph_group_classification=None,
created_date_time=None,
microsoft_graph_group_description=None,
microsoft_graph_group_display_name=None,
expiration_date_time=None,
group_types=None,
has_members_with_license_errors=None,
license_processing_state=None,
mail=None,
mail_enabled=None,
mail_nickname=None,
membership_rule=None,
membership_rule_processing_state=None,
on_premises_domain_name=None,
on_premises_last_sync_date_time=None,
on_premises_net_bios_name=None,
on_premises_provisioning_errors=None,
on_premises_sam_account_name=None,
on_premises_security_identifier=None,
on_premises_sync_enabled=None,
preferred_data_location=None,
preferred_language=None,
proxy_addresses=None,
renewed_date_time=None,
security_enabled=None,
security_identifier=None,
theme=None,
microsoft_graph_group_visibility=None,
allow_external_senders=None,
auto_subscribe_new_members=None,
hide_from_address_lists=None,
hide_from_outlook_clients=None,
is_subscribed_by_mail=None,
unseen_count=None,
group_is_archived=None,
app_role_assignments=None,
created_on_behalf_of=None,
member_of=None,
microsoft_graph_group_members=None,
members_with_license_errors=None,
owners=None,
settings=None,
transitive_member_of=None,
transitive_members=None,
accepted_senders=None,
calendar=None,
calendar_view=None,
conversations=None,
events=None,
photo=None,
photos=None,
rejected_senders=None,
threads=None,
drive=None,
drives=None,
sites=None,
extensions=None,
group_lifecycle_policies=None,
planner=None,
onenote=None,
team=None,
id2=None,
enabled=None,
offer_shift_requests_enabled=None,
open_shifts_enabled=None,
provision_status=None,
provision_status_code=None,
swap_shifts_requests_enabled=None,
time_clock_enabled=None,
time_off_requests_enabled=None,
time_zone=None,
workforce_integration_ids=None,
offer_shift_requests=None,
open_shift_change_requests=None,
open_shifts=None,
scheduling_groups=None,
shifts=None,
swap_shifts_change_requests=None,
time_off_reasons=None,
time_off_requests=None,
times_off=None):
body = {}
body['id'] = id_
body['classification'] = classification
body['description'] = description
body['display_name'] = display_name
body['fun_settings'] = fun_settings
body['guest_settings'] = guest_settings
body['internal_id'] = internal_id
body['is_archived'] = is_archived
body['member_settings'] = member_settings
body['messaging_settings'] = messaging_settings
body['specialization'] = specialization
body['visibility'] = visibility
body['web_url'] = web_url
body['channels'] = channels
body['installed_apps'] = installed_apps
body['members'] = members
body['operations'] = operations
body['primary_channel'] = primary_channel
body['template'] = {}
body['template']['id'] = microsoft_graph_entity_id
body['group'] = {}
body['group']['id'] = id1
body['group']['deleted_date_time'] = deleted_date_time
body['group']['assigned_labels'] = assigned_labels
body['group']['assigned_licenses'] = assigned_licenses
body['group']['classification'] = microsoft_graph_group_classification
body['group']['created_date_time'] = created_date_time
body['group']['description'] = microsoft_graph_group_description
body['group']['display_name'] = microsoft_graph_group_display_name
body['group']['expiration_date_time'] = expiration_date_time
body['group']['group_types'] = group_types
body['group']['has_members_with_license_errors'] = has_members_with_license_errors
body['group']['license_processing_state'] = license_processing_state
body['group']['mail'] = mail
body['group']['mail_enabled'] = mail_enabled
body['group']['mail_nickname'] = mail_nickname
body['group']['membership_rule'] = membership_rule
body['group']['membership_rule_processing_state'] = membership_rule_processing_state
body['group']['on_premises_domain_name'] = on_premises_domain_name
body['group']['on_premises_last_sync_date_time'] = on_premises_last_sync_date_time
body['group']['on_premises_net_bios_name'] = on_premises_net_bios_name
body['group']['on_premises_provisioning_errors'] = on_premises_provisioning_errors
body['group']['on_premises_sam_account_name'] = on_premises_sam_account_name
body['group']['on_premises_security_identifier'] = on_premises_security_identifier
body['group']['on_premises_sync_enabled'] = on_premises_sync_enabled
body['group']['preferred_data_location'] = preferred_data_location
body['group']['preferred_language'] = preferred_language
body['group']['proxy_addresses'] = proxy_addresses
body['group']['renewed_date_time'] = renewed_date_time
body['group']['security_enabled'] = security_enabled
body['group']['security_identifier'] = security_identifier
body['group']['theme'] = theme
body['group']['visibility'] = microsoft_graph_group_visibility
body['group']['allow_external_senders'] = allow_external_senders
body['group']['auto_subscribe_new_members'] = auto_subscribe_new_members
body['group']['hide_from_address_lists'] = hide_from_address_lists
body['group']['hide_from_outlook_clients'] = hide_from_outlook_clients
body['group']['is_subscribed_by_mail'] = is_subscribed_by_mail
body['group']['unseen_count'] = unseen_count
body['group']['is_archived'] = group_is_archived
body['group']['app_role_assignments'] = app_role_assignments
body['group']['created_on_behalf_of'] = created_on_behalf_of
body['group']['member_of'] = member_of
body['group']['members'] = microsoft_graph_group_members
body['group']['members_with_license_errors'] = members_with_license_errors
body['group']['owners'] = owners
body['group']['settings'] = settings
body['group']['transitive_member_of'] = transitive_member_of
body['group']['transitive_members'] = transitive_members
body['group']['accepted_senders'] = accepted_senders
body['group']['calendar'] = calendar
body['group']['calendar_view'] = calendar_view
body['group']['conversations'] = conversations
body['group']['events'] = events
body['group']['photo'] = photo
body['group']['photos'] = photos
body['group']['rejected_senders'] = rejected_senders
body['group']['threads'] = threads
body['group']['drive'] = drive
body['group']['drives'] = drives
body['group']['sites'] = sites
body['group']['extensions'] = extensions
body['group']['group_lifecycle_policies'] = group_lifecycle_policies
body['group']['planner'] = planner
body['group']['onenote'] = onenote
body['group']['team'] = team
body['schedule'] = {}
body['schedule']['id'] = id2
body['schedule']['enabled'] = enabled
body['schedule']['offer_shift_requests_enabled'] = offer_shift_requests_enabled
body['schedule']['open_shifts_enabled'] = open_shifts_enabled
body['schedule']['provision_status'] = provision_status
body['schedule']['provision_status_code'] = provision_status_code
body['schedule']['swap_shifts_requests_enabled'] = swap_shifts_requests_enabled
body['schedule']['time_clock_enabled'] = time_clock_enabled
body['schedule']['time_off_requests_enabled'] = time_off_requests_enabled
body['schedule']['time_zone'] = time_zone
body['schedule']['workforce_integration_ids'] = workforce_integration_ids
body['schedule']['offer_shift_requests'] = offer_shift_requests
body['schedule']['open_shift_change_requests'] = open_shift_change_requests
body['schedule']['open_shifts'] = open_shifts
body['schedule']['scheduling_groups'] = scheduling_groups
body['schedule']['shifts'] = shifts
body['schedule']['swap_shifts_change_requests'] = swap_shifts_change_requests
body['schedule']['time_off_reasons'] = time_off_reasons
body['schedule']['time_off_requests'] = time_off_requests
body['schedule']['times_off'] = times_off
if team_id is not None:
return client.update_team(team_id=team_id,
body=body)
return client.create_team(body=body)
def teams_team_delete_team(client,
team_id,
if_match=None):
return client.delete_team(team_id=team_id,
if_match=if_match)
def teams_team_show_team(client,
team_id,
select=None,
expand=None):
return client.get_team(team_id=team_id,
select=select,
expand=expand)
def teams_team_archive(client,
team_id,
should_set_spo_site_read_only_for_members=None):
if should_set_spo_site_read_only_for_members is None:
should_set_spo_site_read_only_for_members = False
body = {}
body['should_set_spo_site_read_only_for_members'] = False if should_set_spo_site_read_only_for_members is None else should_set_spo_site_read_only_for_members
return client.archive(team_id=team_id,
body=body)
def teams_team_clone(client,
team_id,
display_name=None,
description=None,
mail_nickname=None,
classification=None,
visibility=None,
parts_to_clone=None):
body = {}
body['display_name'] = display_name
body['description'] = description
body['mail_nickname'] = mail_nickname
body['classification'] = classification
body['visibility'] = visibility
body['parts_to_clone'] = parts_to_clone
return client.clone(team_id=team_id,
body=body)
def teams_team_create_channel(client,
team_id,
id_=None,
description=None,
display_name=None,
email=None,
membership_type=None,
web_url=None,
files_folder=None,
members=None,
messages=None,
tabs=None):
body = {}
body['id'] = id_
body['description'] = description
body['display_name'] = display_name
body['email'] = email
body['membership_type'] = membership_type
body['web_url'] = web_url
body['files_folder'] = files_folder
body['members'] = members
body['messages'] = messages
body['tabs'] = tabs
return client.create_channels(team_id=team_id,
body=body)
def teams_team_create_installed_app(client,
team_id,
id_=None,
teams_app_definition=None,
microsoft_graph_entity_id=None,
display_name=None,
distribution_method=None,
external_id=None,
app_definitions=None):
body = {}
body['id'] = id_
body['teams_app_definition'] = teams_app_definition
body['teams_app'] = {}
body['teams_app']['id'] = microsoft_graph_entity_id
body['teams_app']['display_name'] = display_name
body['teams_app']['distribution_method'] = distribution_method
body['teams_app']['external_id'] = external_id
body['teams_app']['app_definitions'] = app_definitions
return client.create_installed_apps(team_id=team_id,
body=body)
def teams_team_create_member(client,
team_id,
id_=None,
display_name=None,
roles=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['roles'] = roles
return client.create_members(team_id=team_id,
body=body)
def teams_team_create_operation(client,
team_id,
id_=None,
attempts_count=None,
created_date_time=None,
error=None,
last_action_date_time=None,
operation_type=None,
status=None,
target_resource_id=None,
target_resource_location=None):
body = {}
body['id'] = id_
body['attempts_count'] = attempts_count
body['created_date_time'] = created_date_time
body['error'] = error
body['last_action_date_time'] = last_action_date_time
body['operation_type'] = operation_type
body['status'] = status
body['target_resource_id'] = target_resource_id
body['target_resource_location'] = target_resource_location
return client.create_operations(team_id=team_id,
body=body)
def teams_team_delete_channel(client,
team_id,
channel_id,
if_match=None):
return client.delete_channels(team_id=team_id,
channel_id=channel_id,
if_match=if_match)
def teams_team_delete_installed_app(client,
team_id,
teams_app_installation_id,
if_match=None):
return client.delete_installed_apps(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
if_match=if_match)
def teams_team_delete_member(client,
team_id,
conversation_member_id,
if_match=None):
return client.delete_members(team_id=team_id,
conversation_member_id=conversation_member_id,
if_match=if_match)
def teams_team_delete_operation(client,
team_id,
teams_async_operation_id,
if_match=None):
return client.delete_operations(team_id=team_id,
teams_async_operation_id=teams_async_operation_id,
if_match=if_match)
def teams_team_delete_primary_channel(client,
team_id,
if_match=None):
return client.delete_primary_channel(team_id=team_id,
if_match=if_match)
def teams_team_delete_ref_group(client,
team_id,
if_match=None):
return client.delete_ref_group(team_id=team_id,
if_match=if_match)
def teams_team_delete_ref_template(client,
team_id,
if_match=None):
return client.delete_ref_template(team_id=team_id,
if_match=if_match)
def teams_team_delete_schedule(client,
team_id,
if_match=None):
return client.delete_schedule(team_id=team_id,
if_match=if_match)
def teams_team_list_channel(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_channels(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_list_installed_app(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_installed_apps(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_list_member(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_members(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_list_operation(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_operations(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_set_ref_group(client,
team_id,
body):
return client.set_ref_group(team_id=team_id,
body=body)
def teams_team_set_ref_template(client,
team_id,
body):
return client.set_ref_template(team_id=team_id,
body=body)
def teams_team_show_all_message(client):
return client.get_all_messages()
def teams_team_show_channel(client,
team_id,
channel_id,
select=None,
expand=None):
return client.get_channels(team_id=team_id,
channel_id=channel_id,
select=select,
expand=expand)
def teams_team_show_group(client,
team_id,
select=None,
expand=None):
return client.get_group(team_id=team_id,
select=select,
expand=expand)
def teams_team_show_installed_app(client,
team_id,
teams_app_installation_id,
select=None,
expand=None):
return client.get_installed_apps(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
select=select,
expand=expand)
def teams_team_show_member(client,
team_id,
conversation_member_id,
select=None,
expand=None):
return client.get_members(team_id=team_id,
conversation_member_id=conversation_member_id,
select=select,
expand=expand)
def teams_team_show_operation(client,
team_id,
teams_async_operation_id,
select=None,
expand=None):
return client.get_operations(team_id=team_id,
teams_async_operation_id=teams_async_operation_id,
select=select,
expand=expand)
def teams_team_show_primary_channel(client,
team_id,
select=None,
expand=None):
return client.get_primary_channel(team_id=team_id,
select=select,
expand=expand)
def teams_team_show_ref_group(client,
team_id):
return client.get_ref_group(team_id=team_id)
def teams_team_show_ref_template(client,
team_id):
return client.get_ref_template(team_id=team_id)
def teams_team_show_schedule(client,
team_id,
select=None,
expand=None):
return client.get_schedule(team_id=team_id,
select=select,
expand=expand)
def teams_team_show_template(client,
team_id,
select=None,
expand=None):
return client.get_template(team_id=team_id,
select=select,
expand=expand)
def teams_team_unarchive(client,
team_id):
return client.unarchive(team_id=team_id)
def teams_team_update_channel(client,
team_id,
channel_id,
id_=None,
description=None,
display_name=None,
email=None,
membership_type=None,
web_url=None,
files_folder=None,
members=None,
messages=None,
tabs=None):
body = {}
body['id'] = id_
body['description'] = description
body['display_name'] = display_name
body['email'] = email
body['membership_type'] = membership_type
body['web_url'] = web_url
body['files_folder'] = files_folder
body['members'] = members
body['messages'] = messages
body['tabs'] = tabs
return client.update_channels(team_id=team_id,
channel_id=channel_id,
body=body)
def teams_team_update_installed_app(client,
team_id,
teams_app_installation_id,
id_=None,
teams_app_definition=None,
microsoft_graph_entity_id=None,
display_name=None,
distribution_method=None,
external_id=None,
app_definitions=None):
body = {}
body['id'] = id_
body['teams_app_definition'] = teams_app_definition
body['teams_app'] = {}
body['teams_app']['id'] = microsoft_graph_entity_id
body['teams_app']['display_name'] = display_name
body['teams_app']['distribution_method'] = distribution_method
body['teams_app']['external_id'] = external_id
body['teams_app']['app_definitions'] = app_definitions
return client.update_installed_apps(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
body=body)
def teams_team_update_member(client,
team_id,
conversation_member_id,
id_=None,
display_name=None,
roles=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['roles'] = roles
return client.update_members(team_id=team_id,
conversation_member_id=conversation_member_id,
body=body)
def teams_team_update_operation(client,
team_id,
teams_async_operation_id,
id_=None,
attempts_count=None,
created_date_time=None,
error=None,
last_action_date_time=None,
operation_type=None,
status=None,
target_resource_id=None,
target_resource_location=None):
body = {}
body['id'] = id_
body['attempts_count'] = attempts_count
body['created_date_time'] = created_date_time
body['error'] = error
body['last_action_date_time'] = last_action_date_time
body['operation_type'] = operation_type
body['status'] = status
body['target_resource_id'] = target_resource_id
body['target_resource_location'] = target_resource_location
return client.update_operations(team_id=team_id,
teams_async_operation_id=teams_async_operation_id,
body=body)
def teams_team_update_primary_channel(client,
team_id,
id_=None,
description=None,
display_name=None,
email=None,
membership_type=None,
web_url=None,
files_folder=None,
members=None,
messages=None,
tabs=None):
body = {}
body['id'] = id_
body['description'] = description
body['display_name'] = display_name
body['email'] = email
body['membership_type'] = membership_type
body['web_url'] = web_url
body['files_folder'] = files_folder
body['members'] = members
body['messages'] = messages
body['tabs'] = tabs
return client.update_primary_channel(team_id=team_id,
body=body)
def teams_team_update_schedule(client,
team_id,
id_=None,
enabled=None,
offer_shift_requests_enabled=None,
open_shifts_enabled=None,
provision_status=None,
provision_status_code=None,
swap_shifts_requests_enabled=None,
time_clock_enabled=None,
time_off_requests_enabled=None,
time_zone=None,
workforce_integration_ids=None,
offer_shift_requests=None,
open_shift_change_requests=None,
open_shifts=None,
scheduling_groups=None,
shifts=None,
swap_shifts_change_requests=None,
time_off_reasons=None,
time_off_requests=None,
times_off=None):
body = {}
body['id'] = id_
body['enabled'] = enabled
body['offer_shift_requests_enabled'] = offer_shift_requests_enabled
body['open_shifts_enabled'] = open_shifts_enabled
body['provision_status'] = provision_status
body['provision_status_code'] = provision_status_code
body['swap_shifts_requests_enabled'] = swap_shifts_requests_enabled
body['time_clock_enabled'] = time_clock_enabled
body['time_off_requests_enabled'] = time_off_requests_enabled
body['time_zone'] = time_zone
body['workforce_integration_ids'] = workforce_integration_ids
body['offer_shift_requests'] = offer_shift_requests
body['open_shift_change_requests'] = open_shift_change_requests
body['open_shifts'] = open_shifts
body['scheduling_groups'] = scheduling_groups
body['shifts'] = shifts
body['swap_shifts_change_requests'] = swap_shifts_change_requests
body['time_off_reasons'] = time_off_reasons
body['time_off_requests'] = time_off_requests
body['times_off'] = times_off
return client.update_schedule(team_id=team_id,
body=body)
def teams_team_channel_create_member(client,
team_id,
channel_id,
id_=None,
display_name=None,
roles=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['roles'] = roles
return client.create_members(team_id=team_id,
channel_id=channel_id,
body=body)
def teams_team_channel_create_message(client,
team_id,
channel_id,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.create_messages(team_id=team_id,
channel_id=channel_id,
body=body)
def teams_team_channel_create_tab(client,
team_id,
channel_id,
id_=None,
configuration=None,
display_name=None,
web_url=None,
microsoft_graph_entity_id=None,
microsoft_graph_teams_app_display_name=None,
distribution_method=None,
external_id=None,
app_definitions=None):
body = {}
body['id'] = id_
body['configuration'] = configuration
body['display_name'] = display_name
body['web_url'] = web_url
body['teams_app'] = {}
body['teams_app']['id'] = microsoft_graph_entity_id
body['teams_app']['display_name'] = microsoft_graph_teams_app_display_name
body['teams_app']['distribution_method'] = distribution_method
body['teams_app']['external_id'] = external_id
body['teams_app']['app_definitions'] = app_definitions
return client.create_tabs(team_id=team_id,
channel_id=channel_id,
body=body)
def teams_team_channel_delete_file_folder(client,
team_id,
channel_id,
if_match=None):
return client.delete_files_folder(team_id=team_id,
channel_id=channel_id,
if_match=if_match)
def teams_team_channel_delete_member(client,
team_id,
channel_id,
conversation_member_id,
if_match=None):
return client.delete_members(team_id=team_id,
channel_id=channel_id,
conversation_member_id=conversation_member_id,
if_match=if_match)
def teams_team_channel_delete_message(client,
team_id,
channel_id,
chat_message_id,
if_match=None):
return client.delete_messages(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
if_match=if_match)
def teams_team_channel_delete_tab(client,
team_id,
channel_id,
teams_tab_id,
if_match=None):
return client.delete_tabs(team_id=team_id,
channel_id=channel_id,
teams_tab_id=teams_tab_id,
if_match=if_match)
def teams_team_channel_list_member(client,
team_id,
channel_id,
orderby=None,
select=None,
expand=None):
return client.list_members(team_id=team_id,
channel_id=channel_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_channel_list_message(client,
team_id,
channel_id,
orderby=None,
select=None,
expand=None):
return client.list_messages(team_id=team_id,
channel_id=channel_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_channel_list_tab(client,
team_id,
channel_id,
orderby=None,
select=None,
expand=None):
return client.list_tabs(team_id=team_id,
channel_id=channel_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_channel_show_file_folder(client,
team_id,
channel_id,
select=None,
expand=None):
return client.get_files_folder(team_id=team_id,
channel_id=channel_id,
select=select,
expand=expand)
def teams_team_channel_show_member(client,
team_id,
channel_id,
conversation_member_id,
select=None,
expand=None):
return client.get_members(team_id=team_id,
channel_id=channel_id,
conversation_member_id=conversation_member_id,
select=select,
expand=expand)
def teams_team_channel_show_message(client,
team_id,
channel_id,
chat_message_id,
select=None,
expand=None):
return client.get_messages(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
select=select,
expand=expand)
def teams_team_channel_show_tab(client,
team_id,
channel_id,
teams_tab_id,
select=None,
expand=None):
return client.get_tabs(team_id=team_id,
channel_id=channel_id,
teams_tab_id=teams_tab_id,
select=select,
expand=expand)
def teams_team_channel_update_file_folder(client,
team_id,
channel_id,
content_type,
id_=None,
created_date_time=None,
description=None,
e_tag=None,
last_modified_date_time=None,
name=None,
web_url=None,
created_by_user=None,
last_modified_by_user=None,
drive_id=None,
drive_type=None,
microsoft_graph_item_reference_id=None,
microsoft_graph_item_reference_name=None,
path=None,
share_id=None,
sharepoint_ids=None,
site_id=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
audio=None,
content=None,
c_tag=None,
file_system_info=None,
image=None,
location=None,
photo=None,
publication=None,
root=None,
microsoft_graph_sharepoint_ids=None,
size=None,
video=None,
web_dav_url=None,
children=None,
permissions=None,
subscriptions=None,
thumbnails=None,
versions=None,
microsoft_graph_entity_id=None,
microsoft_graph_base_item_created_date_time_created_date_time=None,
microsoft_graph_base_item_description=None,
microsoft_graph_base_item_e_tag=None,
microsoft_graph_base_item_last_modified_date_time_last_modified_date_time=None,
microsoft_graph_base_item_name=None,
microsoft_graph_base_item_web_url=None,
microsoft_graph_user_created_by_user=None,
microsoft_graph_user_last_modified_by_user=None,
microsoft_graph_item_reference_drive_id=None,
microsoft_graph_item_reference_drive_type=None,
id1=None,
name1=None,
microsoft_graph_item_reference_path=None,
microsoft_graph_item_reference_share_id=None,
sharepoint_ids1=None,
microsoft_graph_item_reference_site_id=None,
application1=None,
device1=None,
user1=None,
application2=None,
device2=None,
user2=None,
sharepoint_ids2=None,
analytics=None,
drive_item=None,
fields=None,
microsoft_graph_list_item_versions=None,
id2=None,
all_time=None,
item_activity_stats=None,
last_seven_days=None,
id3=None,
microsoft_graph_workbook_application=None,
comments=None,
functions=None,
names=None,
operations=None,
tables=None,
worksheets=None,
microsoft_graph_special_folder_name=None,
owner=None,
scope=None,
shared_by=None,
shared_date_time=None,
on_click_telemetry_url=None,
created_by=None,
microsoft_graph_remote_item_created_date_time_created_date_time=None,
file=None,
microsoft_graph_file_system_info_file_system_info=None,
folder=None,
microsoft_graph_remote_item_id=None,
microsoft_graph_image=None,
last_modified_by=None,
microsoft_graph_remote_item_last_modified_date_time_last_modified_date_time=None,
microsoft_graph_remote_item_name=None,
package=None,
parent_reference=None,
shared=None,
sharepoint_ids3=None,
integer_size=None,
special_folder=None,
microsoft_graph_video=None,
microsoft_graph_remote_item_web_dav_url_web_dav_url=None,
microsoft_graph_remote_item_web_url=None,
queued_date_time=None,
type_=None,
child_count=None,
view=None,
hashes=None,
mime_type=None,
processing_metadata=None,
state=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['description'] = description
body['e_tag'] = e_tag
body['last_modified_date_time'] = last_modified_date_time
body['name'] = name
body['web_url'] = web_url
body['created_by_user'] = created_by_user
body['last_modified_by_user'] = last_modified_by_user
body['parent_reference'] = {}
body['parent_reference']['drive_id'] = drive_id
body['parent_reference']['drive_type'] = drive_type
body['parent_reference']['id'] = microsoft_graph_item_reference_id
body['parent_reference']['name'] = microsoft_graph_item_reference_name
body['parent_reference']['path'] = path
body['parent_reference']['share_id'] = share_id
body['parent_reference']['sharepoint_ids'] = sharepoint_ids
body['parent_reference']['site_id'] = site_id
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['audio'] = audio
body['content'] = content
body['c_tag'] = c_tag
body['file_system_info'] = file_system_info
body['image'] = image
body['location'] = location
body['photo'] = photo
body['publication'] = publication
body['root'] = root
body['sharepoint_ids'] = microsoft_graph_sharepoint_ids
body['size'] = size
body['video'] = video
body['web_dav_url'] = web_dav_url
body['children'] = children
body['permissions'] = permissions
body['subscriptions'] = subscriptions
body['thumbnails'] = thumbnails
body['versions'] = versions
body['list_item'] = {}
body['list_item']['id'] = microsoft_graph_entity_id
body['list_item']['created_date_time'] = microsoft_graph_base_item_created_date_time_created_date_time
body['list_item']['description'] = microsoft_graph_base_item_description
body['list_item']['e_tag'] = microsoft_graph_base_item_e_tag
body['list_item']['last_modified_date_time'] = microsoft_graph_base_item_last_modified_date_time_last_modified_date_time
body['list_item']['name'] = microsoft_graph_base_item_name
body['list_item']['web_url'] = microsoft_graph_base_item_web_url
body['list_item']['created_by_user'] = microsoft_graph_user_created_by_user
body['list_item']['last_modified_by_user'] = microsoft_graph_user_last_modified_by_user
body['list_item']['parent_reference'] = {}
body['list_item']['parent_reference']['drive_id'] = microsoft_graph_item_reference_drive_id
body['list_item']['parent_reference']['drive_type'] = microsoft_graph_item_reference_drive_type
body['list_item']['parent_reference']['id'] = id1
body['list_item']['parent_reference']['name'] = name1
body['list_item']['parent_reference']['path'] = microsoft_graph_item_reference_path
body['list_item']['parent_reference']['share_id'] = microsoft_graph_item_reference_share_id
body['list_item']['parent_reference']['sharepoint_ids'] = sharepoint_ids1
body['list_item']['parent_reference']['site_id'] = microsoft_graph_item_reference_site_id
body['list_item']['last_modified_by'] = {}
body['list_item']['last_modified_by']['application'] = application1
body['list_item']['last_modified_by']['device'] = device1
body['list_item']['last_modified_by']['user'] = user1
body['list_item']['created_by'] = {}
body['list_item']['created_by']['application'] = application2
body['list_item']['created_by']['device'] = device2
body['list_item']['created_by']['user'] = user2
body['list_item']['content_type'] = content_type
body['list_item']['sharepoint_ids'] = sharepoint_ids2
body['list_item']['analytics'] = analytics
body['list_item']['drive_item'] = drive_item
body['list_item']['fields'] = fields
body['list_item']['versions'] = microsoft_graph_list_item_versions
body['analytics'] = {}
body['analytics']['id'] = id2
body['analytics']['all_time'] = all_time
body['analytics']['item_activity_stats'] = item_activity_stats
body['analytics']['last_seven_days'] = last_seven_days
body['workbook'] = {}
body['workbook']['id'] = id3
body['workbook']['application'] = microsoft_graph_workbook_application
body['workbook']['comments'] = comments
body['workbook']['functions'] = functions
body['workbook']['names'] = names
body['workbook']['operations'] = operations
body['workbook']['tables'] = tables
body['workbook']['worksheets'] = worksheets
body['special_folder'] = {}
body['special_folder']['name'] = microsoft_graph_special_folder_name
body['shared'] = {}
body['shared']['owner'] = owner
body['shared']['scope'] = scope
body['shared']['shared_by'] = shared_by
body['shared']['shared_date_time'] = shared_date_time
body['search_result'] = {}
body['search_result']['on_click_telemetry_url'] = on_click_telemetry_url
body['remote_item'] = {}
body['remote_item']['created_by'] = created_by
body['remote_item']['created_date_time'] = microsoft_graph_remote_item_created_date_time_created_date_time
body['remote_item']['file'] = file
body['remote_item']['file_system_info'] = microsoft_graph_file_system_info_file_system_info
body['remote_item']['folder'] = folder
body['remote_item']['id'] = microsoft_graph_remote_item_id
body['remote_item']['image'] = microsoft_graph_image
body['remote_item']['last_modified_by'] = last_modified_by
body['remote_item']['last_modified_date_time'] = microsoft_graph_remote_item_last_modified_date_time_last_modified_date_time
body['remote_item']['name'] = microsoft_graph_remote_item_name
body['remote_item']['package'] = package
body['remote_item']['parent_reference'] = parent_reference
body['remote_item']['shared'] = shared
body['remote_item']['sharepoint_ids'] = sharepoint_ids3
body['remote_item']['size'] = integer_size
body['remote_item']['special_folder'] = special_folder
body['remote_item']['video'] = microsoft_graph_video
body['remote_item']['web_dav_url'] = microsoft_graph_remote_item_web_dav_url_web_dav_url
body['remote_item']['web_url'] = microsoft_graph_remote_item_web_url
body['pending_content_update'] = {}
body['pending_content_update']['queued_date_time'] = queued_date_time
body['package'] = {}
body['package']['type'] = type_
body['folder'] = {}
body['folder']['child_count'] = child_count
body['folder']['view'] = view
body['file'] = {}
body['file']['hashes'] = hashes
body['file']['mime_type'] = mime_type
body['file']['processing_metadata'] = processing_metadata
body['deleted'] = {}
body['deleted']['state'] = state
return client.update_files_folder(team_id=team_id,
channel_id=channel_id,
body=body)
def teams_team_channel_update_member(client,
team_id,
channel_id,
conversation_member_id,
id_=None,
display_name=None,
roles=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['roles'] = roles
return client.update_members(team_id=team_id,
channel_id=channel_id,
conversation_member_id=conversation_member_id,
body=body)
def teams_team_channel_update_message(client,
team_id,
channel_id,
chat_message_id,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.update_messages(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
body=body)
def teams_team_channel_update_tab(client,
team_id,
channel_id,
teams_tab_id,
id_=None,
configuration=None,
display_name=None,
web_url=None,
microsoft_graph_entity_id=None,
microsoft_graph_teams_app_display_name=None,
distribution_method=None,
external_id=None,
app_definitions=None):
body = {}
body['id'] = id_
body['configuration'] = configuration
body['display_name'] = display_name
body['web_url'] = web_url
body['teams_app'] = {}
body['teams_app']['id'] = microsoft_graph_entity_id
body['teams_app']['display_name'] = microsoft_graph_teams_app_display_name
body['teams_app']['distribution_method'] = distribution_method
body['teams_app']['external_id'] = external_id
body['teams_app']['app_definitions'] = app_definitions
return client.update_tabs(team_id=team_id,
channel_id=channel_id,
teams_tab_id=teams_tab_id,
body=body)
def teams_team_channel_message_create_hosted_content(client,
team_id,
channel_id,
chat_message_id,
id_=None):
body = {}
body['id'] = id_
return client.create_hosted_contents(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
body=body)
def teams_team_channel_message_create_reply(client,
team_id,
channel_id,
chat_message_id,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.create_replies(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
body=body)
def teams_team_channel_message_delete_hosted_content(client,
team_id,
channel_id,
chat_message_id,
chat_message_hosted_content_id,
if_match=None):
return client.delete_hosted_contents(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
chat_message_hosted_content_id=chat_message_hosted_content_id,
if_match=if_match)
def teams_team_channel_message_delete_reply(client,
team_id,
channel_id,
chat_message_id,
chat_message_id1,
if_match=None):
return client.delete_replies(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
chat_message_id1=chat_message_id1,
if_match=if_match)
def teams_team_channel_message_list_hosted_content(client,
team_id,
channel_id,
chat_message_id,
orderby=None,
select=None,
expand=None):
return client.list_hosted_contents(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_channel_message_list_reply(client,
team_id,
channel_id,
chat_message_id,
orderby=None,
select=None,
expand=None):
return client.list_replies(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_channel_message_show_hosted_content(client,
team_id,
channel_id,
chat_message_id,
chat_message_hosted_content_id,
select=None,
expand=None):
return client.get_hosted_contents(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
chat_message_hosted_content_id=chat_message_hosted_content_id,
select=select,
expand=expand)
def teams_team_channel_message_show_reply(client,
team_id,
channel_id,
chat_message_id,
chat_message_id1,
select=None,
expand=None):
return client.get_replies(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
chat_message_id1=chat_message_id1,
select=select,
expand=expand)
def teams_team_channel_message_update_hosted_content(client,
team_id,
channel_id,
chat_message_id,
chat_message_hosted_content_id,
id_=None):
body = {}
body['id'] = id_
return client.update_hosted_contents(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
chat_message_hosted_content_id=chat_message_hosted_content_id,
body=body)
def teams_team_channel_message_update_reply(client,
team_id,
channel_id,
chat_message_id,
chat_message_id1,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.update_replies(team_id=team_id,
channel_id=channel_id,
chat_message_id=chat_message_id,
chat_message_id1=chat_message_id1,
body=body)
def teams_team_channel_tab_delete_ref_team_app(client,
team_id,
channel_id,
teams_tab_id,
if_match=None):
return client.delete_ref_teams_app(team_id=team_id,
channel_id=channel_id,
teams_tab_id=teams_tab_id,
if_match=if_match)
def teams_team_channel_tab_set_ref_team_app(client,
team_id,
channel_id,
teams_tab_id,
body):
return client.set_ref_teams_app(team_id=team_id,
channel_id=channel_id,
teams_tab_id=teams_tab_id,
body=body)
def teams_team_channel_tab_show_ref_team_app(client,
team_id,
channel_id,
teams_tab_id):
return client.get_ref_teams_app(team_id=team_id,
channel_id=channel_id,
teams_tab_id=teams_tab_id)
def teams_team_channel_tab_show_team_app(client,
team_id,
channel_id,
teams_tab_id,
select=None,
expand=None):
return client.get_teams_app(team_id=team_id,
channel_id=channel_id,
teams_tab_id=teams_tab_id,
select=select,
expand=expand)
def teams_team_installed_app_delete_ref_team_app(client,
team_id,
teams_app_installation_id,
if_match=None):
return client.delete_ref_teams_app(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
if_match=if_match)
def teams_team_installed_app_delete_ref_team_app_definition(client,
team_id,
teams_app_installation_id,
if_match=None):
return client.delete_ref_teams_app_definition(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
if_match=if_match)
def teams_team_installed_app_set_ref_team_app(client,
team_id,
teams_app_installation_id,
body):
return client.set_ref_teams_app(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
body=body)
def teams_team_installed_app_set_ref_team_app_definition(client,
team_id,
teams_app_installation_id,
body):
return client.set_ref_teams_app_definition(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
body=body)
def teams_team_installed_app_show_ref_team_app(client,
team_id,
teams_app_installation_id):
return client.get_ref_teams_app(team_id=team_id,
teams_app_installation_id=teams_app_installation_id)
def teams_team_installed_app_show_ref_team_app_definition(client,
team_id,
teams_app_installation_id):
return client.get_ref_teams_app_definition(team_id=team_id,
teams_app_installation_id=teams_app_installation_id)
def teams_team_installed_app_show_team_app(client,
team_id,
teams_app_installation_id,
select=None,
expand=None):
return client.get_teams_app(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
select=select,
expand=expand)
def teams_team_installed_app_show_team_app_definition(client,
team_id,
teams_app_installation_id,
select=None,
expand=None):
return client.get_teams_app_definition(team_id=team_id,
teams_app_installation_id=teams_app_installation_id,
select=select,
expand=expand)
def teams_team_installed_app_upgrade(client,
team_id,
teams_app_installation_id):
return client.upgrade(team_id=team_id,
teams_app_installation_id=teams_app_installation_id)
def teams_team_primary_channel_create_member(client,
team_id,
id_=None,
display_name=None,
roles=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['roles'] = roles
return client.create_members(team_id=team_id,
body=body)
def teams_team_primary_channel_create_message(client,
team_id,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.create_messages(team_id=team_id,
body=body)
def teams_team_primary_channel_create_tab(client,
team_id,
id_=None,
configuration=None,
display_name=None,
web_url=None,
microsoft_graph_entity_id=None,
microsoft_graph_teams_app_display_name=None,
distribution_method=None,
external_id=None,
app_definitions=None):
body = {}
body['id'] = id_
body['configuration'] = configuration
body['display_name'] = display_name
body['web_url'] = web_url
body['teams_app'] = {}
body['teams_app']['id'] = microsoft_graph_entity_id
body['teams_app']['display_name'] = microsoft_graph_teams_app_display_name
body['teams_app']['distribution_method'] = distribution_method
body['teams_app']['external_id'] = external_id
body['teams_app']['app_definitions'] = app_definitions
return client.create_tabs(team_id=team_id,
body=body)
def teams_team_primary_channel_delete_file_folder(client,
team_id,
if_match=None):
return client.delete_files_folder(team_id=team_id,
if_match=if_match)
def teams_team_primary_channel_delete_member(client,
team_id,
conversation_member_id,
if_match=None):
return client.delete_members(team_id=team_id,
conversation_member_id=conversation_member_id,
if_match=if_match)
def teams_team_primary_channel_delete_message(client,
team_id,
chat_message_id,
if_match=None):
return client.delete_messages(team_id=team_id,
chat_message_id=chat_message_id,
if_match=if_match)
def teams_team_primary_channel_delete_tab(client,
team_id,
teams_tab_id,
if_match=None):
return client.delete_tabs(team_id=team_id,
teams_tab_id=teams_tab_id,
if_match=if_match)
def teams_team_primary_channel_list_member(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_members(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_primary_channel_list_message(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_messages(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_primary_channel_list_tab(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_tabs(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_primary_channel_show_file_folder(client,
team_id,
select=None,
expand=None):
return client.get_files_folder(team_id=team_id,
select=select,
expand=expand)
def teams_team_primary_channel_show_member(client,
team_id,
conversation_member_id,
select=None,
expand=None):
return client.get_members(team_id=team_id,
conversation_member_id=conversation_member_id,
select=select,
expand=expand)
def teams_team_primary_channel_show_message(client,
team_id,
chat_message_id,
select=None,
expand=None):
return client.get_messages(team_id=team_id,
chat_message_id=chat_message_id,
select=select,
expand=expand)
def teams_team_primary_channel_show_tab(client,
team_id,
teams_tab_id,
select=None,
expand=None):
return client.get_tabs(team_id=team_id,
teams_tab_id=teams_tab_id,
select=select,
expand=expand)
def teams_team_primary_channel_update_file_folder(client,
team_id,
content_type,
id_=None,
created_date_time=None,
description=None,
e_tag=None,
last_modified_date_time=None,
name=None,
web_url=None,
created_by_user=None,
last_modified_by_user=None,
drive_id=None,
drive_type=None,
microsoft_graph_item_reference_id=None,
microsoft_graph_item_reference_name=None,
path=None,
share_id=None,
sharepoint_ids=None,
site_id=None,
application=None,
device=None,
user=None,
microsoft_graph_identity_application=None,
microsoft_graph_identity_device=None,
microsoft_graph_identity_user=None,
audio=None,
content=None,
c_tag=None,
file_system_info=None,
image=None,
location=None,
photo=None,
publication=None,
root=None,
microsoft_graph_sharepoint_ids=None,
size=None,
video=None,
web_dav_url=None,
children=None,
permissions=None,
subscriptions=None,
thumbnails=None,
versions=None,
microsoft_graph_entity_id=None,
microsoft_graph_base_item_created_date_time_created_date_time=None,
microsoft_graph_base_item_description=None,
microsoft_graph_base_item_e_tag=None,
microsoft_graph_base_item_last_modified_date_time_last_modified_date_time=None,
microsoft_graph_base_item_name=None,
microsoft_graph_base_item_web_url=None,
microsoft_graph_user_created_by_user=None,
microsoft_graph_user_last_modified_by_user=None,
microsoft_graph_item_reference_drive_id=None,
microsoft_graph_item_reference_drive_type=None,
id1=None,
name1=None,
microsoft_graph_item_reference_path=None,
microsoft_graph_item_reference_share_id=None,
sharepoint_ids1=None,
microsoft_graph_item_reference_site_id=None,
application1=None,
device1=None,
user1=None,
application2=None,
device2=None,
user2=None,
sharepoint_ids2=None,
analytics=None,
drive_item=None,
fields=None,
microsoft_graph_list_item_versions=None,
id2=None,
all_time=None,
item_activity_stats=None,
last_seven_days=None,
id3=None,
microsoft_graph_workbook_application=None,
comments=None,
functions=None,
names=None,
operations=None,
tables=None,
worksheets=None,
microsoft_graph_special_folder_name=None,
owner=None,
scope=None,
shared_by=None,
shared_date_time=None,
on_click_telemetry_url=None,
created_by=None,
microsoft_graph_remote_item_created_date_time_created_date_time=None,
file=None,
microsoft_graph_file_system_info_file_system_info=None,
folder=None,
microsoft_graph_remote_item_id=None,
microsoft_graph_image=None,
last_modified_by=None,
microsoft_graph_remote_item_last_modified_date_time_last_modified_date_time=None,
microsoft_graph_remote_item_name=None,
package=None,
parent_reference=None,
shared=None,
sharepoint_ids3=None,
integer_size=None,
special_folder=None,
microsoft_graph_video=None,
microsoft_graph_remote_item_web_dav_url_web_dav_url=None,
microsoft_graph_remote_item_web_url=None,
queued_date_time=None,
type_=None,
child_count=None,
view=None,
hashes=None,
mime_type=None,
processing_metadata=None,
state=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['description'] = description
body['e_tag'] = e_tag
body['last_modified_date_time'] = last_modified_date_time
body['name'] = name
body['web_url'] = web_url
body['created_by_user'] = created_by_user
body['last_modified_by_user'] = last_modified_by_user
body['parent_reference'] = {}
body['parent_reference']['drive_id'] = drive_id
body['parent_reference']['drive_type'] = drive_type
body['parent_reference']['id'] = microsoft_graph_item_reference_id
body['parent_reference']['name'] = microsoft_graph_item_reference_name
body['parent_reference']['path'] = path
body['parent_reference']['share_id'] = share_id
body['parent_reference']['sharepoint_ids'] = sharepoint_ids
body['parent_reference']['site_id'] = site_id
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['created_by'] = {}
body['created_by']['application'] = microsoft_graph_identity_application
body['created_by']['device'] = microsoft_graph_identity_device
body['created_by']['user'] = microsoft_graph_identity_user
body['audio'] = audio
body['content'] = content
body['c_tag'] = c_tag
body['file_system_info'] = file_system_info
body['image'] = image
body['location'] = location
body['photo'] = photo
body['publication'] = publication
body['root'] = root
body['sharepoint_ids'] = microsoft_graph_sharepoint_ids
body['size'] = size
body['video'] = video
body['web_dav_url'] = web_dav_url
body['children'] = children
body['permissions'] = permissions
body['subscriptions'] = subscriptions
body['thumbnails'] = thumbnails
body['versions'] = versions
body['list_item'] = {}
body['list_item']['id'] = microsoft_graph_entity_id
body['list_item']['created_date_time'] = microsoft_graph_base_item_created_date_time_created_date_time
body['list_item']['description'] = microsoft_graph_base_item_description
body['list_item']['e_tag'] = microsoft_graph_base_item_e_tag
body['list_item']['last_modified_date_time'] = microsoft_graph_base_item_last_modified_date_time_last_modified_date_time
body['list_item']['name'] = microsoft_graph_base_item_name
body['list_item']['web_url'] = microsoft_graph_base_item_web_url
body['list_item']['created_by_user'] = microsoft_graph_user_created_by_user
body['list_item']['last_modified_by_user'] = microsoft_graph_user_last_modified_by_user
body['list_item']['parent_reference'] = {}
body['list_item']['parent_reference']['drive_id'] = microsoft_graph_item_reference_drive_id
body['list_item']['parent_reference']['drive_type'] = microsoft_graph_item_reference_drive_type
body['list_item']['parent_reference']['id'] = id1
body['list_item']['parent_reference']['name'] = name1
body['list_item']['parent_reference']['path'] = microsoft_graph_item_reference_path
body['list_item']['parent_reference']['share_id'] = microsoft_graph_item_reference_share_id
body['list_item']['parent_reference']['sharepoint_ids'] = sharepoint_ids1
body['list_item']['parent_reference']['site_id'] = microsoft_graph_item_reference_site_id
body['list_item']['last_modified_by'] = {}
body['list_item']['last_modified_by']['application'] = application1
body['list_item']['last_modified_by']['device'] = device1
body['list_item']['last_modified_by']['user'] = user1
body['list_item']['created_by'] = {}
body['list_item']['created_by']['application'] = application2
body['list_item']['created_by']['device'] = device2
body['list_item']['created_by']['user'] = user2
body['list_item']['content_type'] = content_type
body['list_item']['sharepoint_ids'] = sharepoint_ids2
body['list_item']['analytics'] = analytics
body['list_item']['drive_item'] = drive_item
body['list_item']['fields'] = fields
body['list_item']['versions'] = microsoft_graph_list_item_versions
body['analytics'] = {}
body['analytics']['id'] = id2
body['analytics']['all_time'] = all_time
body['analytics']['item_activity_stats'] = item_activity_stats
body['analytics']['last_seven_days'] = last_seven_days
body['workbook'] = {}
body['workbook']['id'] = id3
body['workbook']['application'] = microsoft_graph_workbook_application
body['workbook']['comments'] = comments
body['workbook']['functions'] = functions
body['workbook']['names'] = names
body['workbook']['operations'] = operations
body['workbook']['tables'] = tables
body['workbook']['worksheets'] = worksheets
body['special_folder'] = {}
body['special_folder']['name'] = microsoft_graph_special_folder_name
body['shared'] = {}
body['shared']['owner'] = owner
body['shared']['scope'] = scope
body['shared']['shared_by'] = shared_by
body['shared']['shared_date_time'] = shared_date_time
body['search_result'] = {}
body['search_result']['on_click_telemetry_url'] = on_click_telemetry_url
body['remote_item'] = {}
body['remote_item']['created_by'] = created_by
body['remote_item']['created_date_time'] = microsoft_graph_remote_item_created_date_time_created_date_time
body['remote_item']['file'] = file
body['remote_item']['file_system_info'] = microsoft_graph_file_system_info_file_system_info
body['remote_item']['folder'] = folder
body['remote_item']['id'] = microsoft_graph_remote_item_id
body['remote_item']['image'] = microsoft_graph_image
body['remote_item']['last_modified_by'] = last_modified_by
body['remote_item']['last_modified_date_time'] = microsoft_graph_remote_item_last_modified_date_time_last_modified_date_time
body['remote_item']['name'] = microsoft_graph_remote_item_name
body['remote_item']['package'] = package
body['remote_item']['parent_reference'] = parent_reference
body['remote_item']['shared'] = shared
body['remote_item']['sharepoint_ids'] = sharepoint_ids3
body['remote_item']['size'] = integer_size
body['remote_item']['special_folder'] = special_folder
body['remote_item']['video'] = microsoft_graph_video
body['remote_item']['web_dav_url'] = microsoft_graph_remote_item_web_dav_url_web_dav_url
body['remote_item']['web_url'] = microsoft_graph_remote_item_web_url
body['pending_content_update'] = {}
body['pending_content_update']['queued_date_time'] = queued_date_time
body['package'] = {}
body['package']['type'] = type_
body['folder'] = {}
body['folder']['child_count'] = child_count
body['folder']['view'] = view
body['file'] = {}
body['file']['hashes'] = hashes
body['file']['mime_type'] = mime_type
body['file']['processing_metadata'] = processing_metadata
body['deleted'] = {}
body['deleted']['state'] = state
return client.update_files_folder(team_id=team_id,
body=body)
def teams_team_primary_channel_update_member(client,
team_id,
conversation_member_id,
id_=None,
display_name=None,
roles=None):
body = {}
body['id'] = id_
body['display_name'] = display_name
body['roles'] = roles
return client.update_members(team_id=team_id,
conversation_member_id=conversation_member_id,
body=body)
def teams_team_primary_channel_update_message(client,
team_id,
chat_message_id,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.update_messages(team_id=team_id,
chat_message_id=chat_message_id,
body=body)
def teams_team_primary_channel_update_tab(client,
team_id,
teams_tab_id,
id_=None,
configuration=None,
display_name=None,
web_url=None,
microsoft_graph_entity_id=None,
microsoft_graph_teams_app_display_name=None,
distribution_method=None,
external_id=None,
app_definitions=None):
body = {}
body['id'] = id_
body['configuration'] = configuration
body['display_name'] = display_name
body['web_url'] = web_url
body['teams_app'] = {}
body['teams_app']['id'] = microsoft_graph_entity_id
body['teams_app']['display_name'] = microsoft_graph_teams_app_display_name
body['teams_app']['distribution_method'] = distribution_method
body['teams_app']['external_id'] = external_id
body['teams_app']['app_definitions'] = app_definitions
return client.update_tabs(team_id=team_id,
teams_tab_id=teams_tab_id,
body=body)
def teams_team_primary_channel_message_create_hosted_content(client,
team_id,
chat_message_id,
id_=None):
body = {}
body['id'] = id_
return client.create_hosted_contents(team_id=team_id,
chat_message_id=chat_message_id,
body=body)
def teams_team_primary_channel_message_create_reply(client,
team_id,
chat_message_id,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.create_replies(team_id=team_id,
chat_message_id=chat_message_id,
body=body)
def teams_team_primary_channel_message_delete_hosted_content(client,
team_id,
chat_message_id,
chat_message_hosted_content_id,
if_match=None):
return client.delete_hosted_contents(team_id=team_id,
chat_message_id=chat_message_id,
chat_message_hosted_content_id=chat_message_hosted_content_id,
if_match=if_match)
def teams_team_primary_channel_message_delete_reply(client,
team_id,
chat_message_id,
chat_message_id1,
if_match=None):
return client.delete_replies(team_id=team_id,
chat_message_id=chat_message_id,
chat_message_id1=chat_message_id1,
if_match=if_match)
def teams_team_primary_channel_message_list_hosted_content(client,
team_id,
chat_message_id,
orderby=None,
select=None,
expand=None):
return client.list_hosted_contents(team_id=team_id,
chat_message_id=chat_message_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_primary_channel_message_list_reply(client,
team_id,
chat_message_id,
orderby=None,
select=None,
expand=None):
return client.list_replies(team_id=team_id,
chat_message_id=chat_message_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_primary_channel_message_show_hosted_content(client,
team_id,
chat_message_id,
chat_message_hosted_content_id,
select=None,
expand=None):
return client.get_hosted_contents(team_id=team_id,
chat_message_id=chat_message_id,
chat_message_hosted_content_id=chat_message_hosted_content_id,
select=select,
expand=expand)
def teams_team_primary_channel_message_show_reply(client,
team_id,
chat_message_id,
chat_message_id1,
select=None,
expand=None):
return client.get_replies(team_id=team_id,
chat_message_id=chat_message_id,
chat_message_id1=chat_message_id1,
select=select,
expand=expand)
def teams_team_primary_channel_message_update_hosted_content(client,
team_id,
chat_message_id,
chat_message_hosted_content_id,
id_=None):
body = {}
body['id'] = id_
return client.update_hosted_contents(team_id=team_id,
chat_message_id=chat_message_id,
chat_message_hosted_content_id=chat_message_hosted_content_id,
body=body)
def teams_team_primary_channel_message_update_reply(client,
team_id,
chat_message_id,
chat_message_id1,
body,
id_=None,
attachments=None,
created_date_time=None,
deleted_date_time=None,
etag=None,
importance=None,
last_edited_date_time=None,
last_modified_date_time=None,
locale=None,
mentions=None,
message_type=None,
reactions=None,
reply_to_id=None,
subject=None,
summary=None,
web_url=None,
hosted_contents=None,
replies=None,
dlp_action=None,
justification_text=None,
policy_tip=None,
user_action=None,
verdict_details=None,
application=None,
device=None,
user=None):
body = {}
body['id'] = id_
body['attachments'] = attachments
body['body'] = body
body['created_date_time'] = created_date_time
body['deleted_date_time'] = deleted_date_time
body['etag'] = etag
body['importance'] = importance
body['last_edited_date_time'] = last_edited_date_time
body['last_modified_date_time'] = last_modified_date_time
body['locale'] = locale
body['mentions'] = mentions
body['message_type'] = message_type
body['reactions'] = reactions
body['reply_to_id'] = reply_to_id
body['subject'] = subject
body['summary'] = summary
body['web_url'] = web_url
body['hosted_contents'] = hosted_contents
body['replies'] = replies
body['policy_violation'] = {}
body['policy_violation']['dlp_action'] = dlp_action
body['policy_violation']['justification_text'] = justification_text
body['policy_violation']['policy_tip'] = policy_tip
body['policy_violation']['user_action'] = user_action
body['policy_violation']['verdict_details'] = verdict_details
body['from_property'] = {}
body['from_property']['application'] = application
body['from_property']['device'] = device
body['from_property']['user'] = user
return client.update_replies(team_id=team_id,
chat_message_id=chat_message_id,
chat_message_id1=chat_message_id1,
body=body)
def teams_team_primary_channel_tab_delete_ref_team_app(client,
team_id,
teams_tab_id,
if_match=None):
return client.delete_ref_teams_app(team_id=team_id,
teams_tab_id=teams_tab_id,
if_match=if_match)
def teams_team_primary_channel_tab_set_ref_team_app(client,
team_id,
teams_tab_id,
body):
return client.set_ref_teams_app(team_id=team_id,
teams_tab_id=teams_tab_id,
body=body)
def teams_team_primary_channel_tab_show_ref_team_app(client,
team_id,
teams_tab_id):
return client.get_ref_teams_app(team_id=team_id,
teams_tab_id=teams_tab_id)
def teams_team_primary_channel_tab_show_team_app(client,
team_id,
teams_tab_id,
select=None,
expand=None):
return client.get_teams_app(team_id=team_id,
teams_tab_id=teams_tab_id,
select=select,
expand=expand)
def teams_team_schedule_create_offer_shift_request(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
recipient_action_date_time=None,
recipient_action_message=None,
recipient_user_id=None,
sender_shift_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['recipient_action_date_time'] = recipient_action_date_time
body['recipient_action_message'] = recipient_action_message
body['recipient_user_id'] = recipient_user_id
body['sender_shift_id'] = sender_shift_id
return client.create_offer_shift_requests(team_id=team_id,
body=body)
def teams_team_schedule_create_open_shift(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
draft_open_shift=None,
scheduling_group_id=None,
shared_open_shift=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['draft_open_shift'] = draft_open_shift
body['scheduling_group_id'] = scheduling_group_id
body['shared_open_shift'] = shared_open_shift
return client.create_open_shifts(team_id=team_id,
body=body)
def teams_team_schedule_create_open_shift_change_request(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
open_shift_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['open_shift_id'] = open_shift_id
return client.create_open_shift_change_requests(team_id=team_id,
body=body)
def teams_team_schedule_create_scheduling_group(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
display_name=None,
is_active=None,
user_ids=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['display_name'] = display_name
body['is_active'] = is_active
body['user_ids'] = user_ids
return client.create_scheduling_groups(team_id=team_id,
body=body)
def teams_team_schedule_create_shift(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
scheduling_group_id=None,
user_id=None,
end_date_time=None,
start_date_time=None,
theme=None,
activities=None,
display_name=None,
notes=None,
microsoft_graph_schedule_entity_end_date_time_end_date_time=None,
microsoft_graph_schedule_entity_start_date_time_start_date_time=None,
microsoft_graph_schedule_entity_theme=None,
microsoft_graph_shift_item_activities=None,
microsoft_graph_shift_item_display_name=None,
microsoft_graph_shift_item_notes=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['scheduling_group_id'] = scheduling_group_id
body['user_id'] = user_id
body['shared_shift'] = {}
body['shared_shift']['end_date_time'] = end_date_time
body['shared_shift']['start_date_time'] = start_date_time
body['shared_shift']['theme'] = theme
body['shared_shift']['activities'] = activities
body['shared_shift']['display_name'] = display_name
body['shared_shift']['notes'] = notes
body['draft_shift'] = {}
body['draft_shift']['end_date_time'] = microsoft_graph_schedule_entity_end_date_time_end_date_time
body['draft_shift']['start_date_time'] = microsoft_graph_schedule_entity_start_date_time_start_date_time
body['draft_shift']['theme'] = microsoft_graph_schedule_entity_theme
body['draft_shift']['activities'] = microsoft_graph_shift_item_activities
body['draft_shift']['display_name'] = microsoft_graph_shift_item_display_name
body['draft_shift']['notes'] = microsoft_graph_shift_item_notes
return client.create_shifts(team_id=team_id,
body=body)
def teams_team_schedule_create_swap_shift_change_request(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
recipient_action_date_time=None,
recipient_action_message=None,
recipient_user_id=None,
sender_shift_id=None,
recipient_shift_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['recipient_action_date_time'] = recipient_action_date_time
body['recipient_action_message'] = recipient_action_message
body['recipient_user_id'] = recipient_user_id
body['sender_shift_id'] = sender_shift_id
body['recipient_shift_id'] = recipient_shift_id
return client.create_swap_shifts_change_requests(team_id=team_id,
body=body)
def teams_team_schedule_create_time_off(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
draft_time_off=None,
shared_time_off=None,
user_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['draft_time_off'] = draft_time_off
body['shared_time_off'] = shared_time_off
body['user_id'] = user_id
return client.create_times_off(team_id=team_id,
body=body)
def teams_team_schedule_create_time_off_reason(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
display_name=None,
icon_type=None,
is_active=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['display_name'] = display_name
body['icon_type'] = icon_type
body['is_active'] = is_active
return client.create_time_off_reasons(team_id=team_id,
body=body)
def teams_team_schedule_create_time_off_request(client,
team_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
end_date_time=None,
start_date_time=None,
time_off_reason_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['end_date_time'] = end_date_time
body['start_date_time'] = start_date_time
body['time_off_reason_id'] = time_off_reason_id
return client.create_time_off_requests(team_id=team_id,
body=body)
def teams_team_schedule_delete_offer_shift_request(client,
team_id,
offer_shift_request_id,
if_match=None):
return client.delete_offer_shift_requests(team_id=team_id,
offer_shift_request_id=offer_shift_request_id,
if_match=if_match)
def teams_team_schedule_delete_open_shift(client,
team_id,
open_shift_id,
if_match=None):
return client.delete_open_shifts(team_id=team_id,
open_shift_id=open_shift_id,
if_match=if_match)
def teams_team_schedule_delete_open_shift_change_request(client,
team_id,
open_shift_change_request_id,
if_match=None):
return client.delete_open_shift_change_requests(team_id=team_id,
open_shift_change_request_id=open_shift_change_request_id,
if_match=if_match)
def teams_team_schedule_delete_scheduling_group(client,
team_id,
scheduling_group_id,
if_match=None):
return client.delete_scheduling_groups(team_id=team_id,
scheduling_group_id=scheduling_group_id,
if_match=if_match)
def teams_team_schedule_delete_shift(client,
team_id,
shift_id,
if_match=None):
return client.delete_shifts(team_id=team_id,
shift_id=shift_id,
if_match=if_match)
def teams_team_schedule_delete_swap_shift_change_request(client,
team_id,
swap_shifts_change_request_id,
if_match=None):
return client.delete_swap_shifts_change_requests(team_id=team_id,
swap_shifts_change_request_id=swap_shifts_change_request_id,
if_match=if_match)
def teams_team_schedule_delete_time_off(client,
team_id,
time_off_id,
if_match=None):
return client.delete_times_off(team_id=team_id,
time_off_id=time_off_id,
if_match=if_match)
def teams_team_schedule_delete_time_off_reason(client,
team_id,
time_off_reason_id,
if_match=None):
return client.delete_time_off_reasons(team_id=team_id,
time_off_reason_id=time_off_reason_id,
if_match=if_match)
def teams_team_schedule_delete_time_off_request(client,
team_id,
time_off_request_id,
if_match=None):
return client.delete_time_off_requests(team_id=team_id,
time_off_request_id=time_off_request_id,
if_match=if_match)
def teams_team_schedule_list_offer_shift_request(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_offer_shift_requests(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_open_shift(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_open_shifts(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_open_shift_change_request(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_open_shift_change_requests(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_scheduling_group(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_scheduling_groups(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_shift(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_shifts(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_swap_shift_change_request(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_swap_shifts_change_requests(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_time_off(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_times_off(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_time_off_reason(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_time_off_reasons(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_list_time_off_request(client,
team_id,
orderby=None,
select=None,
expand=None):
return client.list_time_off_requests(team_id=team_id,
orderby=orderby,
select=select,
expand=expand)
def teams_team_schedule_share(client,
team_id,
notify_team=None,
start_date_time=None,
end_date_time=None):
if notify_team is None:
notify_team = False
body = {}
body['notify_team'] = False if notify_team is None else notify_team
body['start_date_time'] = start_date_time
body['end_date_time'] = end_date_time
return client.share(team_id=team_id,
body=body)
def teams_team_schedule_show_offer_shift_request(client,
team_id,
offer_shift_request_id,
select=None,
expand=None):
return client.get_offer_shift_requests(team_id=team_id,
offer_shift_request_id=offer_shift_request_id,
select=select,
expand=expand)
def teams_team_schedule_show_open_shift(client,
team_id,
open_shift_id,
select=None,
expand=None):
return client.get_open_shifts(team_id=team_id,
open_shift_id=open_shift_id,
select=select,
expand=expand)
def teams_team_schedule_show_open_shift_change_request(client,
team_id,
open_shift_change_request_id,
select=None,
expand=None):
return client.get_open_shift_change_requests(team_id=team_id,
open_shift_change_request_id=open_shift_change_request_id,
select=select,
expand=expand)
def teams_team_schedule_show_scheduling_group(client,
team_id,
scheduling_group_id,
select=None,
expand=None):
return client.get_scheduling_groups(team_id=team_id,
scheduling_group_id=scheduling_group_id,
select=select,
expand=expand)
def teams_team_schedule_show_shift(client,
team_id,
shift_id,
select=None,
expand=None):
return client.get_shifts(team_id=team_id,
shift_id=shift_id,
select=select,
expand=expand)
def teams_team_schedule_show_swap_shift_change_request(client,
team_id,
swap_shifts_change_request_id,
select=None,
expand=None):
return client.get_swap_shifts_change_requests(team_id=team_id,
swap_shifts_change_request_id=swap_shifts_change_request_id,
select=select,
expand=expand)
def teams_team_schedule_show_time_off(client,
team_id,
time_off_id,
select=None,
expand=None):
return client.get_times_off(team_id=team_id,
time_off_id=time_off_id,
select=select,
expand=expand)
def teams_team_schedule_show_time_off_reason(client,
team_id,
time_off_reason_id,
select=None,
expand=None):
return client.get_time_off_reasons(team_id=team_id,
time_off_reason_id=time_off_reason_id,
select=select,
expand=expand)
def teams_team_schedule_show_time_off_request(client,
team_id,
time_off_request_id,
select=None,
expand=None):
return client.get_time_off_requests(team_id=team_id,
time_off_request_id=time_off_request_id,
select=select,
expand=expand)
def teams_team_schedule_update_offer_shift_request(client,
team_id,
offer_shift_request_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
recipient_action_date_time=None,
recipient_action_message=None,
recipient_user_id=None,
sender_shift_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['recipient_action_date_time'] = recipient_action_date_time
body['recipient_action_message'] = recipient_action_message
body['recipient_user_id'] = recipient_user_id
body['sender_shift_id'] = sender_shift_id
return client.update_offer_shift_requests(team_id=team_id,
offer_shift_request_id=offer_shift_request_id,
body=body)
def teams_team_schedule_update_open_shift(client,
team_id,
open_shift_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
draft_open_shift=None,
scheduling_group_id=None,
shared_open_shift=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['draft_open_shift'] = draft_open_shift
body['scheduling_group_id'] = scheduling_group_id
body['shared_open_shift'] = shared_open_shift
return client.update_open_shifts(team_id=team_id,
open_shift_id=open_shift_id,
body=body)
def teams_team_schedule_update_open_shift_change_request(client,
team_id,
open_shift_change_request_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
open_shift_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['open_shift_id'] = open_shift_id
return client.update_open_shift_change_requests(team_id=team_id,
open_shift_change_request_id=open_shift_change_request_id,
body=body)
def teams_team_schedule_update_scheduling_group(client,
team_id,
scheduling_group_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
display_name=None,
is_active=None,
user_ids=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['display_name'] = display_name
body['is_active'] = is_active
body['user_ids'] = user_ids
return client.update_scheduling_groups(team_id=team_id,
scheduling_group_id=scheduling_group_id,
body=body)
def teams_team_schedule_update_shift(client,
team_id,
shift_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
scheduling_group_id=None,
user_id=None,
end_date_time=None,
start_date_time=None,
theme=None,
activities=None,
display_name=None,
notes=None,
microsoft_graph_schedule_entity_end_date_time_end_date_time=None,
microsoft_graph_schedule_entity_start_date_time_start_date_time=None,
microsoft_graph_schedule_entity_theme=None,
microsoft_graph_shift_item_activities=None,
microsoft_graph_shift_item_display_name=None,
microsoft_graph_shift_item_notes=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['scheduling_group_id'] = scheduling_group_id
body['user_id'] = user_id
body['shared_shift'] = {}
body['shared_shift']['end_date_time'] = end_date_time
body['shared_shift']['start_date_time'] = start_date_time
body['shared_shift']['theme'] = theme
body['shared_shift']['activities'] = activities
body['shared_shift']['display_name'] = display_name
body['shared_shift']['notes'] = notes
body['draft_shift'] = {}
body['draft_shift']['end_date_time'] = microsoft_graph_schedule_entity_end_date_time_end_date_time
body['draft_shift']['start_date_time'] = microsoft_graph_schedule_entity_start_date_time_start_date_time
body['draft_shift']['theme'] = microsoft_graph_schedule_entity_theme
body['draft_shift']['activities'] = microsoft_graph_shift_item_activities
body['draft_shift']['display_name'] = microsoft_graph_shift_item_display_name
body['draft_shift']['notes'] = microsoft_graph_shift_item_notes
return client.update_shifts(team_id=team_id,
shift_id=shift_id,
body=body)
def teams_team_schedule_update_swap_shift_change_request(client,
team_id,
swap_shifts_change_request_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
recipient_action_date_time=None,
recipient_action_message=None,
recipient_user_id=None,
sender_shift_id=None,
recipient_shift_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['recipient_action_date_time'] = recipient_action_date_time
body['recipient_action_message'] = recipient_action_message
body['recipient_user_id'] = recipient_user_id
body['sender_shift_id'] = sender_shift_id
body['recipient_shift_id'] = recipient_shift_id
return client.update_swap_shifts_change_requests(team_id=team_id,
swap_shifts_change_request_id=swap_shifts_change_request_id,
body=body)
def teams_team_schedule_update_time_off(client,
team_id,
time_off_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
draft_time_off=None,
shared_time_off=None,
user_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['draft_time_off'] = draft_time_off
body['shared_time_off'] = shared_time_off
body['user_id'] = user_id
return client.update_times_off(team_id=team_id,
time_off_id=time_off_id,
body=body)
def teams_team_schedule_update_time_off_reason(client,
team_id,
time_off_reason_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
display_name=None,
icon_type=None,
is_active=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['display_name'] = display_name
body['icon_type'] = icon_type
body['is_active'] = is_active
return client.update_time_off_reasons(team_id=team_id,
time_off_reason_id=time_off_reason_id,
body=body)
def teams_team_schedule_update_time_off_request(client,
team_id,
time_off_request_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
assigned_to=None,
manager_action_date_time=None,
manager_action_message=None,
manager_user_id=None,
sender_date_time=None,
sender_message=None,
sender_user_id=None,
state=None,
end_date_time=None,
start_date_time=None,
time_off_reason_id=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['assigned_to'] = assigned_to
body['manager_action_date_time'] = manager_action_date_time
body['manager_action_message'] = manager_action_message
body['manager_user_id'] = manager_user_id
body['sender_date_time'] = sender_date_time
body['sender_message'] = sender_message
body['sender_user_id'] = sender_user_id
body['state'] = state
body['end_date_time'] = end_date_time
body['start_date_time'] = start_date_time
body['time_off_reason_id'] = time_off_reason_id
return client.update_time_off_requests(team_id=team_id,
time_off_request_id=time_off_request_id,
body=body)
def teams_teamwork_teamwork_show_teamwork(client,
select=None,
expand=None):
return client.get_teamwork(select=select,
expand=expand)
def teams_teamwork_teamwork_update_teamwork(client,
id_=None,
workforce_integrations=None):
body = {}
body['id'] = id_
body['workforce_integrations'] = workforce_integrations
return client.update_teamwork(body=body)
def teams_teamwork_create_workforce_integration(client,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
api_version=None,
display_name=None,
encryption=None,
is_active=None,
supported_entities=None,
url=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['api_version'] = api_version
body['display_name'] = display_name
body['encryption'] = encryption
body['is_active'] = is_active
body['supported_entities'] = supported_entities
body['url'] = url
return client.create_workforce_integrations(body=body)
def teams_teamwork_delete_workforce_integration(client,
workforce_integration_id,
if_match=None):
return client.delete_workforce_integrations(workforce_integration_id=workforce_integration_id,
if_match=if_match)
def teams_teamwork_list_workforce_integration(client,
orderby=None,
select=None,
expand=None):
return client.list_workforce_integrations(orderby=orderby,
select=select,
expand=expand)
def teams_teamwork_show_workforce_integration(client,
workforce_integration_id,
select=None,
expand=None):
return client.get_workforce_integrations(workforce_integration_id=workforce_integration_id,
select=select,
expand=expand)
def teams_teamwork_update_workforce_integration(client,
workforce_integration_id,
id_=None,
created_date_time=None,
last_modified_date_time=None,
application=None,
device=None,
user=None,
api_version=None,
display_name=None,
encryption=None,
is_active=None,
supported_entities=None,
url=None):
body = {}
body['id'] = id_
body['created_date_time'] = created_date_time
body['last_modified_date_time'] = last_modified_date_time
body['last_modified_by'] = {}
body['last_modified_by']['application'] = application
body['last_modified_by']['device'] = device
body['last_modified_by']['user'] = user
body['api_version'] = api_version
body['display_name'] = display_name
body['encryption'] = encryption
body['is_active'] = is_active
body['supported_entities'] = supported_entities
body['url'] = url
return client.update_workforce_integrations(workforce_integration_id=workforce_integration_id,
body=body)
def teams_user_create_joined_team(client,
user_id,
id_=None,
classification=None,
description=None,
display_name=None,
fun_settings=None,
guest_settings=None,
internal_id=None,
is_archived=None,
member_settings=None,
messaging_settings=None,
specialization=None,
visibility=None,
web_url=None,
channels=None,
installed_apps=None,
members=None,
operations=None,
primary_channel=None,
microsoft_graph_entity_id=None,
id1=None,
deleted_date_time=None,
assigned_labels=None,
assigned_licenses=None,
microsoft_graph_group_classification=None,
created_date_time=None,
microsoft_graph_group_description=None,
microsoft_graph_group_display_name=None,
expiration_date_time=None,
group_types=None,
has_members_with_license_errors=None,
license_processing_state=None,
mail=None,
mail_enabled=None,
mail_nickname=None,
membership_rule=None,
membership_rule_processing_state=None,
on_premises_domain_name=None,
on_premises_last_sync_date_time=None,
on_premises_net_bios_name=None,
on_premises_provisioning_errors=None,
on_premises_sam_account_name=None,
on_premises_security_identifier=None,
on_premises_sync_enabled=None,
preferred_data_location=None,
preferred_language=None,
proxy_addresses=None,
renewed_date_time=None,
security_enabled=None,
security_identifier=None,
theme=None,
microsoft_graph_group_visibility=None,
allow_external_senders=None,
auto_subscribe_new_members=None,
hide_from_address_lists=None,
hide_from_outlook_clients=None,
is_subscribed_by_mail=None,
unseen_count=None,
group_is_archived=None,
app_role_assignments=None,
created_on_behalf_of=None,
member_of=None,
microsoft_graph_group_members=None,
members_with_license_errors=None,
owners=None,
settings=None,
transitive_member_of=None,
transitive_members=None,
accepted_senders=None,
calendar=None,
calendar_view=None,
conversations=None,
events=None,
photo=None,
photos=None,
rejected_senders=None,
threads=None,
drive=None,
drives=None,
sites=None,
extensions=None,
group_lifecycle_policies=None,
planner=None,
onenote=None,
team=None,
id2=None,
enabled=None,
offer_shift_requests_enabled=None,
open_shifts_enabled=None,
provision_status=None,
provision_status_code=None,
swap_shifts_requests_enabled=None,
time_clock_enabled=None,
time_off_requests_enabled=None,
time_zone=None,
workforce_integration_ids=None,
offer_shift_requests=None,
open_shift_change_requests=None,
open_shifts=None,
scheduling_groups=None,
shifts=None,
swap_shifts_change_requests=None,
time_off_reasons=None,
time_off_requests=None,
times_off=None):
body = {}
body['id'] = id_
body['classification'] = classification
body['description'] = description
body['display_name'] = display_name
body['fun_settings'] = fun_settings
body['guest_settings'] = guest_settings
body['internal_id'] = internal_id
body['is_archived'] = is_archived
body['member_settings'] = member_settings
body['messaging_settings'] = messaging_settings
body['specialization'] = specialization
body['visibility'] = visibility
body['web_url'] = web_url
body['channels'] = channels
body['installed_apps'] = installed_apps
body['members'] = members
body['operations'] = operations
body['primary_channel'] = primary_channel
body['template'] = {}
body['template']['id'] = microsoft_graph_entity_id
body['group'] = {}
body['group']['id'] = id1
body['group']['deleted_date_time'] = deleted_date_time
body['group']['assigned_labels'] = assigned_labels
body['group']['assigned_licenses'] = assigned_licenses
body['group']['classification'] = microsoft_graph_group_classification
body['group']['created_date_time'] = created_date_time
body['group']['description'] = microsoft_graph_group_description
body['group']['display_name'] = microsoft_graph_group_display_name
body['group']['expiration_date_time'] = expiration_date_time
body['group']['group_types'] = group_types
body['group']['has_members_with_license_errors'] = has_members_with_license_errors
body['group']['license_processing_state'] = license_processing_state
body['group']['mail'] = mail
body['group']['mail_enabled'] = mail_enabled
body['group']['mail_nickname'] = mail_nickname
body['group']['membership_rule'] = membership_rule
body['group']['membership_rule_processing_state'] = membership_rule_processing_state
body['group']['on_premises_domain_name'] = on_premises_domain_name
body['group']['on_premises_last_sync_date_time'] = on_premises_last_sync_date_time
body['group']['on_premises_net_bios_name'] = on_premises_net_bios_name
body['group']['on_premises_provisioning_errors'] = on_premises_provisioning_errors
body['group']['on_premises_sam_account_name'] = on_premises_sam_account_name
body['group']['on_premises_security_identifier'] = on_premises_security_identifier
body['group']['on_premises_sync_enabled'] = on_premises_sync_enabled
body['group']['preferred_data_location'] = preferred_data_location
body['group']['preferred_language'] = preferred_language
body['group']['proxy_addresses'] = proxy_addresses
body['group']['renewed_date_time'] = renewed_date_time
body['group']['security_enabled'] = security_enabled
body['group']['security_identifier'] = security_identifier
body['group']['theme'] = theme
body['group']['visibility'] = microsoft_graph_group_visibility
body['group']['allow_external_senders'] = allow_external_senders
body['group']['auto_subscribe_new_members'] = auto_subscribe_new_members
body['group']['hide_from_address_lists'] = hide_from_address_lists
body['group']['hide_from_outlook_clients'] = hide_from_outlook_clients
body['group']['is_subscribed_by_mail'] = is_subscribed_by_mail
body['group']['unseen_count'] = unseen_count
body['group']['is_archived'] = group_is_archived
body['group']['app_role_assignments'] = app_role_assignments
body['group']['created_on_behalf_of'] = created_on_behalf_of
body['group']['member_of'] = member_of
body['group']['members'] = microsoft_graph_group_members
body['group']['members_with_license_errors'] = members_with_license_errors
body['group']['owners'] = owners
body['group']['settings'] = settings
body['group']['transitive_member_of'] = transitive_member_of
body['group']['transitive_members'] = transitive_members
body['group']['accepted_senders'] = accepted_senders
body['group']['calendar'] = calendar
body['group']['calendar_view'] = calendar_view
body['group']['conversations'] = conversations
body['group']['events'] = events
body['group']['photo'] = photo
body['group']['photos'] = photos
body['group']['rejected_senders'] = rejected_senders
body['group']['threads'] = threads
body['group']['drive'] = drive
body['group']['drives'] = drives
body['group']['sites'] = sites
body['group']['extensions'] = extensions
body['group']['group_lifecycle_policies'] = group_lifecycle_policies
body['group']['planner'] = planner
body['group']['onenote'] = onenote
body['group']['team'] = team
body['schedule'] = {}
body['schedule']['id'] = id2
body['schedule']['enabled'] = enabled
body['schedule']['offer_shift_requests_enabled'] = offer_shift_requests_enabled
body['schedule']['open_shifts_enabled'] = open_shifts_enabled
body['schedule']['provision_status'] = provision_status
body['schedule']['provision_status_code'] = provision_status_code
body['schedule']['swap_shifts_requests_enabled'] = swap_shifts_requests_enabled
body['schedule']['time_clock_enabled'] = time_clock_enabled
body['schedule']['time_off_requests_enabled'] = time_off_requests_enabled
body['schedule']['time_zone'] = time_zone
body['schedule']['workforce_integration_ids'] = workforce_integration_ids
body['schedule']['offer_shift_requests'] = offer_shift_requests
body['schedule']['open_shift_change_requests'] = open_shift_change_requests
body['schedule']['open_shifts'] = open_shifts
body['schedule']['scheduling_groups'] = scheduling_groups
body['schedule']['shifts'] = shifts
body['schedule']['swap_shifts_change_requests'] = swap_shifts_change_requests
body['schedule']['time_off_reasons'] = time_off_reasons
body['schedule']['time_off_requests'] = time_off_requests
body['schedule']['times_off'] = times_off
return client.create_joined_teams(user_id=user_id,
body=body)
def teams_user_delete_joined_team(client,
user_id,
team_id,
if_match=None):
return client.delete_joined_teams(user_id=user_id,
team_id=team_id,
if_match=if_match)
def teams_user_list_joined_team(client,
user_id,
orderby=None,
select=None,
expand=None):
return client.list_joined_teams(user_id=user_id,
orderby=orderby,
select=select,
expand=expand)
def teams_user_show_joined_team(client,
user_id,
team_id,
select=None,
expand=None):
return client.get_joined_teams(user_id=user_id,
team_id=team_id,
select=select,
expand=expand)
def teams_user_update_joined_team(client,
user_id,
team_id,
id_=None,
classification=None,
description=None,
display_name=None,
fun_settings=None,
guest_settings=None,
internal_id=None,
is_archived=None,
member_settings=None,
messaging_settings=None,
specialization=None,
visibility=None,
web_url=None,
channels=None,
installed_apps=None,
members=None,
operations=None,
primary_channel=None,
microsoft_graph_entity_id=None,
id1=None,
deleted_date_time=None,
assigned_labels=None,
assigned_licenses=None,
microsoft_graph_group_classification=None,
created_date_time=None,
microsoft_graph_group_description=None,
microsoft_graph_group_display_name=None,
expiration_date_time=None,
group_types=None,
has_members_with_license_errors=None,
license_processing_state=None,
mail=None,
mail_enabled=None,
mail_nickname=None,
membership_rule=None,
membership_rule_processing_state=None,
on_premises_domain_name=None,
on_premises_last_sync_date_time=None,
on_premises_net_bios_name=None,
on_premises_provisioning_errors=None,
on_premises_sam_account_name=None,
on_premises_security_identifier=None,
on_premises_sync_enabled=None,
preferred_data_location=None,
preferred_language=None,
proxy_addresses=None,
renewed_date_time=None,
security_enabled=None,
security_identifier=None,
theme=None,
microsoft_graph_group_visibility=None,
allow_external_senders=None,
auto_subscribe_new_members=None,
hide_from_address_lists=None,
hide_from_outlook_clients=None,
is_subscribed_by_mail=None,
unseen_count=None,
group_is_archived=None,
app_role_assignments=None,
created_on_behalf_of=None,
member_of=None,
microsoft_graph_group_members=None,
members_with_license_errors=None,
owners=None,
settings=None,
transitive_member_of=None,
transitive_members=None,
accepted_senders=None,
calendar=None,
calendar_view=None,
conversations=None,
events=None,
photo=None,
photos=None,
rejected_senders=None,
threads=None,
drive=None,
drives=None,
sites=None,
extensions=None,
group_lifecycle_policies=None,
planner=None,
onenote=None,
team=None,
id2=None,
enabled=None,
offer_shift_requests_enabled=None,
open_shifts_enabled=None,
provision_status=None,
provision_status_code=None,
swap_shifts_requests_enabled=None,
time_clock_enabled=None,
time_off_requests_enabled=None,
time_zone=None,
workforce_integration_ids=None,
offer_shift_requests=None,
open_shift_change_requests=None,
open_shifts=None,
scheduling_groups=None,
shifts=None,
swap_shifts_change_requests=None,
time_off_reasons=None,
time_off_requests=None,
times_off=None):
body = {}
body['id'] = id_
body['classification'] = classification
body['description'] = description
body['display_name'] = display_name
body['fun_settings'] = fun_settings
body['guest_settings'] = guest_settings
body['internal_id'] = internal_id
body['is_archived'] = is_archived
body['member_settings'] = member_settings
body['messaging_settings'] = messaging_settings
body['specialization'] = specialization
body['visibility'] = visibility
body['web_url'] = web_url
body['channels'] = channels
body['installed_apps'] = installed_apps
body['members'] = members
body['operations'] = operations
body['primary_channel'] = primary_channel
body['template'] = {}
body['template']['id'] = microsoft_graph_entity_id
body['group'] = {}
body['group']['id'] = id1
body['group']['deleted_date_time'] = deleted_date_time
body['group']['assigned_labels'] = assigned_labels
body['group']['assigned_licenses'] = assigned_licenses
body['group']['classification'] = microsoft_graph_group_classification
body['group']['created_date_time'] = created_date_time
body['group']['description'] = microsoft_graph_group_description
body['group']['display_name'] = microsoft_graph_group_display_name
body['group']['expiration_date_time'] = expiration_date_time
body['group']['group_types'] = group_types
body['group']['has_members_with_license_errors'] = has_members_with_license_errors
body['group']['license_processing_state'] = license_processing_state
body['group']['mail'] = mail
body['group']['mail_enabled'] = mail_enabled
body['group']['mail_nickname'] = mail_nickname
body['group']['membership_rule'] = membership_rule
body['group']['membership_rule_processing_state'] = membership_rule_processing_state
body['group']['on_premises_domain_name'] = on_premises_domain_name
body['group']['on_premises_last_sync_date_time'] = on_premises_last_sync_date_time
body['group']['on_premises_net_bios_name'] = on_premises_net_bios_name
body['group']['on_premises_provisioning_errors'] = on_premises_provisioning_errors
body['group']['on_premises_sam_account_name'] = on_premises_sam_account_name
body['group']['on_premises_security_identifier'] = on_premises_security_identifier
body['group']['on_premises_sync_enabled'] = on_premises_sync_enabled
body['group']['preferred_data_location'] = preferred_data_location
body['group']['preferred_language'] = preferred_language
body['group']['proxy_addresses'] = proxy_addresses
body['group']['renewed_date_time'] = renewed_date_time
body['group']['security_enabled'] = security_enabled
body['group']['security_identifier'] = security_identifier
body['group']['theme'] = theme
body['group']['visibility'] = microsoft_graph_group_visibility
body['group']['allow_external_senders'] = allow_external_senders
body['group']['auto_subscribe_new_members'] = auto_subscribe_new_members
body['group']['hide_from_address_lists'] = hide_from_address_lists
body['group']['hide_from_outlook_clients'] = hide_from_outlook_clients
body['group']['is_subscribed_by_mail'] = is_subscribed_by_mail
body['group']['unseen_count'] = unseen_count
body['group']['is_archived'] = group_is_archived
body['group']['app_role_assignments'] = app_role_assignments
body['group']['created_on_behalf_of'] = created_on_behalf_of
body['group']['member_of'] = member_of
body['group']['members'] = microsoft_graph_group_members
body['group']['members_with_license_errors'] = members_with_license_errors
body['group']['owners'] = owners
body['group']['settings'] = settings
body['group']['transitive_member_of'] = transitive_member_of
body['group']['transitive_members'] = transitive_members
body['group']['accepted_senders'] = accepted_senders
body['group']['calendar'] = calendar
body['group']['calendar_view'] = calendar_view
body['group']['conversations'] = conversations
body['group']['events'] = events
body['group']['photo'] = photo
body['group']['photos'] = photos
body['group']['rejected_senders'] = rejected_senders
body['group']['threads'] = threads
body['group']['drive'] = drive
body['group']['drives'] = drives
body['group']['sites'] = sites
body['group']['extensions'] = extensions
body['group']['group_lifecycle_policies'] = group_lifecycle_policies
body['group']['planner'] = planner
body['group']['onenote'] = onenote
body['group']['team'] = team
body['schedule'] = {}
body['schedule']['id'] = id2
body['schedule']['enabled'] = enabled
body['schedule']['offer_shift_requests_enabled'] = offer_shift_requests_enabled
body['schedule']['open_shifts_enabled'] = open_shifts_enabled
body['schedule']['provision_status'] = provision_status
body['schedule']['provision_status_code'] = provision_status_code
body['schedule']['swap_shifts_requests_enabled'] = swap_shifts_requests_enabled
body['schedule']['time_clock_enabled'] = time_clock_enabled
body['schedule']['time_off_requests_enabled'] = time_off_requests_enabled
body['schedule']['time_zone'] = time_zone
body['schedule']['workforce_integration_ids'] = workforce_integration_ids
body['schedule']['offer_shift_requests'] = offer_shift_requests
body['schedule']['open_shift_change_requests'] = open_shift_change_requests
body['schedule']['open_shifts'] = open_shifts
body['schedule']['scheduling_groups'] = scheduling_groups
body['schedule']['shifts'] = shifts
body['schedule']['swap_shifts_change_requests'] = swap_shifts_change_requests
body['schedule']['time_off_reasons'] = time_off_reasons
body['schedule']['time_off_requests'] = time_off_requests
body['schedule']['times_off'] = times_off
return client.update_joined_teams(user_id=user_id,
team_id=team_id,
body=body)
| 49.863802 | 161 | 0.466049 | 18,490 | 219,301 | 5.087507 | 0.018388 | 0.031063 | 0.014288 | 0.020666 | 0.984107 | 0.975752 | 0.968098 | 0.955479 | 0.938768 | 0.918187 | 0 | 0.000729 | 0.462068 | 219,301 | 4,397 | 162 | 49.875142 | 0.796667 | 0.00228 | 0 | 0.919194 | 0 | 0 | 0.107574 | 0.021349 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045251 | false | 0 | 0.003978 | 0.029836 | 0.094729 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 7 |
078ae598d12e8efa099d958572713797ada90daa | 3,766 | py | Python | filtering-searching-mineral-catalogue/minerals/migrations/0003_auto_20170408_0837.py | squadran2003/filtering-searching-mineral-catalogue | 5eac2de7d97155f2ee4cb43926b4cbc9b2fadf69 | [
"MIT"
] | null | null | null | filtering-searching-mineral-catalogue/minerals/migrations/0003_auto_20170408_0837.py | squadran2003/filtering-searching-mineral-catalogue | 5eac2de7d97155f2ee4cb43926b4cbc9b2fadf69 | [
"MIT"
] | null | null | null | filtering-searching-mineral-catalogue/minerals/migrations/0003_auto_20170408_0837.py | squadran2003/filtering-searching-mineral-catalogue | 5eac2de7d97155f2ee4cb43926b4cbc9b2fadf69 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2017-04-08 08:37
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('minerals', '0002_auto_20170407_2018'),
]
operations = [
migrations.AlterField(
model_name='mineral',
name='category',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='cleavage',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='color',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='crystal_habit',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='crystal_symmetry',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='crystal_system',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='diaphaneity',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='formula',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='group',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='image_caption',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='image_filename',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='luster',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='mohs_scale_hardness',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='name',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='optical_properties',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='refractive_index',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='specific_gravity',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='streak',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='strunz_classification',
field=models.CharField(default='', max_length=1000),
),
migrations.AlterField(
model_name='mineral',
name='unit_cell',
field=models.CharField(default='', max_length=1000),
),
]
| 32.465517 | 64 | 0.549389 | 328 | 3,766 | 6.125 | 0.207317 | 0.199104 | 0.24888 | 0.288701 | 0.811847 | 0.811847 | 0.791936 | 0.772026 | 0.772026 | 0.772026 | 0 | 0.044042 | 0.324748 | 3,766 | 115 | 65 | 32.747826 | 0.745969 | 0.017525 | 0 | 0.740741 | 1 | 0 | 0.108196 | 0.011902 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.018519 | 0 | 0.046296 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 10 |
0796fa60bcf6a0f8ee387078a5f97cbfb4659576 | 22,516 | py | Python | sdk/python/pulumi_alicloud/dns/alidns_instance.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 42 | 2019-03-18T06:34:37.000Z | 2022-03-24T07:08:57.000Z | sdk/python/pulumi_alicloud/dns/alidns_instance.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 152 | 2019-04-15T21:03:44.000Z | 2022-03-29T18:00:57.000Z | sdk/python/pulumi_alicloud/dns/alidns_instance.py | pulumi/pulumi-alicloud | 9c34d84b4588a7c885c6bec1f03b5016e5a41683 | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2020-08-26T17:30:07.000Z | 2021-07-05T01:37:45.000Z | # coding=utf-8
# *** WARNING: this file was generated by the Pulumi Terraform Bridge (tfgen) Tool. ***
# *** Do not edit by hand unless you're certain you know what you are doing! ***
import warnings
import pulumi
import pulumi.runtime
from typing import Any, Mapping, Optional, Sequence, Union, overload
from .. import _utilities
__all__ = ['AlidnsInstanceArgs', 'AlidnsInstance']
@pulumi.input_type
class AlidnsInstanceArgs:
def __init__(__self__, *,
dns_security: pulumi.Input[str],
domain_numbers: pulumi.Input[str],
version_code: pulumi.Input[str],
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
renew_period: Optional[pulumi.Input[int]] = None,
renewal_status: Optional[pulumi.Input[str]] = None):
"""
The set of arguments for constructing a AlidnsInstance resource.
:param pulumi.Input[str] dns_security: Alidns security level. Valid values: `no`, `basic`, `advanced`.
:param pulumi.Input[str] domain_numbers: Number of domain names bound.
:param pulumi.Input[str] version_code: Paid package version. Valid values: `version_personal`, `version_enterprise_basic`, `version_enterprise_advanced`.
:param pulumi.Input[str] payment_type: The billing method of the Alidns instance. Valid values: `Subscription`. Default to `Subscription`.
:param pulumi.Input[int] period: Creating a pre-paid instance, it must be set, the unit is month, please enter an integer multiple of 12 for annually paid products.
:param pulumi.Input[int] renew_period: Automatic renewal period, the unit is month. When setting RenewalStatus to AutoRenewal, it must be set.
:param pulumi.Input[str] renewal_status: Automatic renewal status. Valid values: `AutoRenewal`, `ManualRenewal`, default to `ManualRenewal`.
"""
pulumi.set(__self__, "dns_security", dns_security)
pulumi.set(__self__, "domain_numbers", domain_numbers)
pulumi.set(__self__, "version_code", version_code)
if payment_type is not None:
pulumi.set(__self__, "payment_type", payment_type)
if period is not None:
pulumi.set(__self__, "period", period)
if renew_period is not None:
pulumi.set(__self__, "renew_period", renew_period)
if renewal_status is not None:
pulumi.set(__self__, "renewal_status", renewal_status)
@property
@pulumi.getter(name="dnsSecurity")
def dns_security(self) -> pulumi.Input[str]:
"""
Alidns security level. Valid values: `no`, `basic`, `advanced`.
"""
return pulumi.get(self, "dns_security")
@dns_security.setter
def dns_security(self, value: pulumi.Input[str]):
pulumi.set(self, "dns_security", value)
@property
@pulumi.getter(name="domainNumbers")
def domain_numbers(self) -> pulumi.Input[str]:
"""
Number of domain names bound.
"""
return pulumi.get(self, "domain_numbers")
@domain_numbers.setter
def domain_numbers(self, value: pulumi.Input[str]):
pulumi.set(self, "domain_numbers", value)
@property
@pulumi.getter(name="versionCode")
def version_code(self) -> pulumi.Input[str]:
"""
Paid package version. Valid values: `version_personal`, `version_enterprise_basic`, `version_enterprise_advanced`.
"""
return pulumi.get(self, "version_code")
@version_code.setter
def version_code(self, value: pulumi.Input[str]):
pulumi.set(self, "version_code", value)
@property
@pulumi.getter(name="paymentType")
def payment_type(self) -> Optional[pulumi.Input[str]]:
"""
The billing method of the Alidns instance. Valid values: `Subscription`. Default to `Subscription`.
"""
return pulumi.get(self, "payment_type")
@payment_type.setter
def payment_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "payment_type", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
Creating a pre-paid instance, it must be set, the unit is month, please enter an integer multiple of 12 for annually paid products.
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="renewPeriod")
def renew_period(self) -> Optional[pulumi.Input[int]]:
"""
Automatic renewal period, the unit is month. When setting RenewalStatus to AutoRenewal, it must be set.
"""
return pulumi.get(self, "renew_period")
@renew_period.setter
def renew_period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "renew_period", value)
@property
@pulumi.getter(name="renewalStatus")
def renewal_status(self) -> Optional[pulumi.Input[str]]:
"""
Automatic renewal status. Valid values: `AutoRenewal`, `ManualRenewal`, default to `ManualRenewal`.
"""
return pulumi.get(self, "renewal_status")
@renewal_status.setter
def renewal_status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renewal_status", value)
@pulumi.input_type
class _AlidnsInstanceState:
def __init__(__self__, *,
dns_security: Optional[pulumi.Input[str]] = None,
domain_numbers: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
renew_period: Optional[pulumi.Input[int]] = None,
renewal_status: Optional[pulumi.Input[str]] = None,
version_code: Optional[pulumi.Input[str]] = None,
version_name: Optional[pulumi.Input[str]] = None):
"""
Input properties used for looking up and filtering AlidnsInstance resources.
:param pulumi.Input[str] dns_security: Alidns security level. Valid values: `no`, `basic`, `advanced`.
:param pulumi.Input[str] domain_numbers: Number of domain names bound.
:param pulumi.Input[str] payment_type: The billing method of the Alidns instance. Valid values: `Subscription`. Default to `Subscription`.
:param pulumi.Input[int] period: Creating a pre-paid instance, it must be set, the unit is month, please enter an integer multiple of 12 for annually paid products.
:param pulumi.Input[int] renew_period: Automatic renewal period, the unit is month. When setting RenewalStatus to AutoRenewal, it must be set.
:param pulumi.Input[str] renewal_status: Automatic renewal status. Valid values: `AutoRenewal`, `ManualRenewal`, default to `ManualRenewal`.
:param pulumi.Input[str] version_code: Paid package version. Valid values: `version_personal`, `version_enterprise_basic`, `version_enterprise_advanced`.
:param pulumi.Input[str] version_name: Paid package version name.
"""
if dns_security is not None:
pulumi.set(__self__, "dns_security", dns_security)
if domain_numbers is not None:
pulumi.set(__self__, "domain_numbers", domain_numbers)
if payment_type is not None:
pulumi.set(__self__, "payment_type", payment_type)
if period is not None:
pulumi.set(__self__, "period", period)
if renew_period is not None:
pulumi.set(__self__, "renew_period", renew_period)
if renewal_status is not None:
pulumi.set(__self__, "renewal_status", renewal_status)
if version_code is not None:
pulumi.set(__self__, "version_code", version_code)
if version_name is not None:
pulumi.set(__self__, "version_name", version_name)
@property
@pulumi.getter(name="dnsSecurity")
def dns_security(self) -> Optional[pulumi.Input[str]]:
"""
Alidns security level. Valid values: `no`, `basic`, `advanced`.
"""
return pulumi.get(self, "dns_security")
@dns_security.setter
def dns_security(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "dns_security", value)
@property
@pulumi.getter(name="domainNumbers")
def domain_numbers(self) -> Optional[pulumi.Input[str]]:
"""
Number of domain names bound.
"""
return pulumi.get(self, "domain_numbers")
@domain_numbers.setter
def domain_numbers(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "domain_numbers", value)
@property
@pulumi.getter(name="paymentType")
def payment_type(self) -> Optional[pulumi.Input[str]]:
"""
The billing method of the Alidns instance. Valid values: `Subscription`. Default to `Subscription`.
"""
return pulumi.get(self, "payment_type")
@payment_type.setter
def payment_type(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "payment_type", value)
@property
@pulumi.getter
def period(self) -> Optional[pulumi.Input[int]]:
"""
Creating a pre-paid instance, it must be set, the unit is month, please enter an integer multiple of 12 for annually paid products.
"""
return pulumi.get(self, "period")
@period.setter
def period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "period", value)
@property
@pulumi.getter(name="renewPeriod")
def renew_period(self) -> Optional[pulumi.Input[int]]:
"""
Automatic renewal period, the unit is month. When setting RenewalStatus to AutoRenewal, it must be set.
"""
return pulumi.get(self, "renew_period")
@renew_period.setter
def renew_period(self, value: Optional[pulumi.Input[int]]):
pulumi.set(self, "renew_period", value)
@property
@pulumi.getter(name="renewalStatus")
def renewal_status(self) -> Optional[pulumi.Input[str]]:
"""
Automatic renewal status. Valid values: `AutoRenewal`, `ManualRenewal`, default to `ManualRenewal`.
"""
return pulumi.get(self, "renewal_status")
@renewal_status.setter
def renewal_status(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "renewal_status", value)
@property
@pulumi.getter(name="versionCode")
def version_code(self) -> Optional[pulumi.Input[str]]:
"""
Paid package version. Valid values: `version_personal`, `version_enterprise_basic`, `version_enterprise_advanced`.
"""
return pulumi.get(self, "version_code")
@version_code.setter
def version_code(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "version_code", value)
@property
@pulumi.getter(name="versionName")
def version_name(self) -> Optional[pulumi.Input[str]]:
"""
Paid package version name.
"""
return pulumi.get(self, "version_name")
@version_name.setter
def version_name(self, value: Optional[pulumi.Input[str]]):
pulumi.set(self, "version_name", value)
class AlidnsInstance(pulumi.CustomResource):
@overload
def __init__(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
dns_security: Optional[pulumi.Input[str]] = None,
domain_numbers: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
renew_period: Optional[pulumi.Input[int]] = None,
renewal_status: Optional[pulumi.Input[str]] = None,
version_code: Optional[pulumi.Input[str]] = None,
__props__=None):
"""
Create an Alidns Instance resource.
> **NOTE:** Available in v1.95.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.dns.AlidnsInstance("example",
dns_security="no",
domain_numbers="2",
period=1,
renew_period=1,
renewal_status="ManualRenewal",
version_code="version_personal")
```
## Import
DNS instance be imported using the id, e.g.
```sh
$ pulumi import alicloud:dns/alidnsInstance:AlidnsInstance example dns-cn-v0h1ldjhfff
```
:param str resource_name: The name of the resource.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] dns_security: Alidns security level. Valid values: `no`, `basic`, `advanced`.
:param pulumi.Input[str] domain_numbers: Number of domain names bound.
:param pulumi.Input[str] payment_type: The billing method of the Alidns instance. Valid values: `Subscription`. Default to `Subscription`.
:param pulumi.Input[int] period: Creating a pre-paid instance, it must be set, the unit is month, please enter an integer multiple of 12 for annually paid products.
:param pulumi.Input[int] renew_period: Automatic renewal period, the unit is month. When setting RenewalStatus to AutoRenewal, it must be set.
:param pulumi.Input[str] renewal_status: Automatic renewal status. Valid values: `AutoRenewal`, `ManualRenewal`, default to `ManualRenewal`.
:param pulumi.Input[str] version_code: Paid package version. Valid values: `version_personal`, `version_enterprise_basic`, `version_enterprise_advanced`.
"""
...
@overload
def __init__(__self__,
resource_name: str,
args: AlidnsInstanceArgs,
opts: Optional[pulumi.ResourceOptions] = None):
"""
Create an Alidns Instance resource.
> **NOTE:** Available in v1.95.0+.
## Example Usage
Basic Usage
```python
import pulumi
import pulumi_alicloud as alicloud
example = alicloud.dns.AlidnsInstance("example",
dns_security="no",
domain_numbers="2",
period=1,
renew_period=1,
renewal_status="ManualRenewal",
version_code="version_personal")
```
## Import
DNS instance be imported using the id, e.g.
```sh
$ pulumi import alicloud:dns/alidnsInstance:AlidnsInstance example dns-cn-v0h1ldjhfff
```
:param str resource_name: The name of the resource.
:param AlidnsInstanceArgs args: The arguments to use to populate this resource's properties.
:param pulumi.ResourceOptions opts: Options for the resource.
"""
...
def __init__(__self__, resource_name: str, *args, **kwargs):
resource_args, opts = _utilities.get_resource_args_opts(AlidnsInstanceArgs, pulumi.ResourceOptions, *args, **kwargs)
if resource_args is not None:
__self__._internal_init(resource_name, opts, **resource_args.__dict__)
else:
__self__._internal_init(resource_name, *args, **kwargs)
def _internal_init(__self__,
resource_name: str,
opts: Optional[pulumi.ResourceOptions] = None,
dns_security: Optional[pulumi.Input[str]] = None,
domain_numbers: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
renew_period: Optional[pulumi.Input[int]] = None,
renewal_status: Optional[pulumi.Input[str]] = None,
version_code: Optional[pulumi.Input[str]] = None,
__props__=None):
if opts is None:
opts = pulumi.ResourceOptions()
if not isinstance(opts, pulumi.ResourceOptions):
raise TypeError('Expected resource options to be a ResourceOptions instance')
if opts.version is None:
opts.version = _utilities.get_version()
if opts.id is None:
if __props__ is not None:
raise TypeError('__props__ is only valid when passed in combination with a valid opts.id to get an existing resource')
__props__ = AlidnsInstanceArgs.__new__(AlidnsInstanceArgs)
if dns_security is None and not opts.urn:
raise TypeError("Missing required property 'dns_security'")
__props__.__dict__["dns_security"] = dns_security
if domain_numbers is None and not opts.urn:
raise TypeError("Missing required property 'domain_numbers'")
__props__.__dict__["domain_numbers"] = domain_numbers
__props__.__dict__["payment_type"] = payment_type
__props__.__dict__["period"] = period
__props__.__dict__["renew_period"] = renew_period
__props__.__dict__["renewal_status"] = renewal_status
if version_code is None and not opts.urn:
raise TypeError("Missing required property 'version_code'")
__props__.__dict__["version_code"] = version_code
__props__.__dict__["version_name"] = None
super(AlidnsInstance, __self__).__init__(
'alicloud:dns/alidnsInstance:AlidnsInstance',
resource_name,
__props__,
opts)
@staticmethod
def get(resource_name: str,
id: pulumi.Input[str],
opts: Optional[pulumi.ResourceOptions] = None,
dns_security: Optional[pulumi.Input[str]] = None,
domain_numbers: Optional[pulumi.Input[str]] = None,
payment_type: Optional[pulumi.Input[str]] = None,
period: Optional[pulumi.Input[int]] = None,
renew_period: Optional[pulumi.Input[int]] = None,
renewal_status: Optional[pulumi.Input[str]] = None,
version_code: Optional[pulumi.Input[str]] = None,
version_name: Optional[pulumi.Input[str]] = None) -> 'AlidnsInstance':
"""
Get an existing AlidnsInstance resource's state with the given name, id, and optional extra
properties used to qualify the lookup.
:param str resource_name: The unique name of the resulting resource.
:param pulumi.Input[str] id: The unique provider ID of the resource to lookup.
:param pulumi.ResourceOptions opts: Options for the resource.
:param pulumi.Input[str] dns_security: Alidns security level. Valid values: `no`, `basic`, `advanced`.
:param pulumi.Input[str] domain_numbers: Number of domain names bound.
:param pulumi.Input[str] payment_type: The billing method of the Alidns instance. Valid values: `Subscription`. Default to `Subscription`.
:param pulumi.Input[int] period: Creating a pre-paid instance, it must be set, the unit is month, please enter an integer multiple of 12 for annually paid products.
:param pulumi.Input[int] renew_period: Automatic renewal period, the unit is month. When setting RenewalStatus to AutoRenewal, it must be set.
:param pulumi.Input[str] renewal_status: Automatic renewal status. Valid values: `AutoRenewal`, `ManualRenewal`, default to `ManualRenewal`.
:param pulumi.Input[str] version_code: Paid package version. Valid values: `version_personal`, `version_enterprise_basic`, `version_enterprise_advanced`.
:param pulumi.Input[str] version_name: Paid package version name.
"""
opts = pulumi.ResourceOptions.merge(opts, pulumi.ResourceOptions(id=id))
__props__ = _AlidnsInstanceState.__new__(_AlidnsInstanceState)
__props__.__dict__["dns_security"] = dns_security
__props__.__dict__["domain_numbers"] = domain_numbers
__props__.__dict__["payment_type"] = payment_type
__props__.__dict__["period"] = period
__props__.__dict__["renew_period"] = renew_period
__props__.__dict__["renewal_status"] = renewal_status
__props__.__dict__["version_code"] = version_code
__props__.__dict__["version_name"] = version_name
return AlidnsInstance(resource_name, opts=opts, __props__=__props__)
@property
@pulumi.getter(name="dnsSecurity")
def dns_security(self) -> pulumi.Output[str]:
"""
Alidns security level. Valid values: `no`, `basic`, `advanced`.
"""
return pulumi.get(self, "dns_security")
@property
@pulumi.getter(name="domainNumbers")
def domain_numbers(self) -> pulumi.Output[str]:
"""
Number of domain names bound.
"""
return pulumi.get(self, "domain_numbers")
@property
@pulumi.getter(name="paymentType")
def payment_type(self) -> pulumi.Output[Optional[str]]:
"""
The billing method of the Alidns instance. Valid values: `Subscription`. Default to `Subscription`.
"""
return pulumi.get(self, "payment_type")
@property
@pulumi.getter
def period(self) -> pulumi.Output[Optional[int]]:
"""
Creating a pre-paid instance, it must be set, the unit is month, please enter an integer multiple of 12 for annually paid products.
"""
return pulumi.get(self, "period")
@property
@pulumi.getter(name="renewPeriod")
def renew_period(self) -> pulumi.Output[Optional[int]]:
"""
Automatic renewal period, the unit is month. When setting RenewalStatus to AutoRenewal, it must be set.
"""
return pulumi.get(self, "renew_period")
@property
@pulumi.getter(name="renewalStatus")
def renewal_status(self) -> pulumi.Output[str]:
"""
Automatic renewal status. Valid values: `AutoRenewal`, `ManualRenewal`, default to `ManualRenewal`.
"""
return pulumi.get(self, "renewal_status")
@property
@pulumi.getter(name="versionCode")
def version_code(self) -> pulumi.Output[str]:
"""
Paid package version. Valid values: `version_personal`, `version_enterprise_basic`, `version_enterprise_advanced`.
"""
return pulumi.get(self, "version_code")
@property
@pulumi.getter(name="versionName")
def version_name(self) -> pulumi.Output[str]:
"""
Paid package version name.
"""
return pulumi.get(self, "version_name")
| 43.720388 | 172 | 0.649227 | 2,579 | 22,516 | 5.440093 | 0.075223 | 0.079187 | 0.072844 | 0.062723 | 0.875695 | 0.854669 | 0.845474 | 0.827014 | 0.808909 | 0.786529 | 0 | 0.001939 | 0.244226 | 22,516 | 514 | 173 | 43.805447 | 0.82253 | 0.347309 | 0 | 0.707143 | 1 | 0 | 0.104899 | 0.003127 | 0 | 0 | 0 | 0 | 0 | 1 | 0.160714 | false | 0.003571 | 0.017857 | 0 | 0.275 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
ed20ca137413462c181d31f977f44a4ce284145d | 12,394 | py | Python | tests/fixture.py | ehfeng/ics.py | 987b95ccb8301681ef339865ca997a2535d59a7c | [
"Apache-2.0"
] | null | null | null | tests/fixture.py | ehfeng/ics.py | 987b95ccb8301681ef339865ca997a2535d59a7c | [
"Apache-2.0"
] | null | null | null | tests/fixture.py | ehfeng/ics.py | 987b95ccb8301681ef339865ca997a2535d59a7c | [
"Apache-2.0"
] | null | null | null | from __future__ import unicode_literals
cal1 = """
BEGIN:VCALENDAR
METHOD:PUBLISH
VERSION:2.0
X-WR-CALNAME:plop
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
X-APPLE-CALENDAR-COLOR:#882F00
X-WR-TIMEZONE:Europe/Brussels
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:Europe/Brussels
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU
DTSTART:19810329T020000
TZNAME:UTC+2
TZOFFSETTO:+0200
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU
DTSTART:19961027T030000
TZNAME:UTC+1
TZOFFSETTO:+0100
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
CREATED:20131024T204716Z
UID:ABBF2903-092F-4202-98B6-F757437A5B28
DTEND;TZID=Europe/Brussels:20131029T113000
TRANSP:OPAQUE
SUMMARY:dfqsdfjqkshflqsjdfhqs fqsfhlqs dfkqsldfkqsdfqsfqsfqsfs
DTSTART;TZID=Europe/Brussels:20131029T103000
DTSTAMP:20131024T204741Z
SEQUENCE:3
DESCRIPTION:Lorem ipsum dolor sit amet, consectetur adipiscing elit. Sed
vitae facilisis enim. Morbi blandit et lectus venenatis tristique. Donec
sit amet egestas lacus. Donec ullamcorper, mi vitae congue dictum, quam
dolor luctus augue, id cursus purus justo vel lorem. Ut feugiat enim ips
um, quis porta nibh ultricies congue. Pellentesque nisl mi, molestie id
sem vel, vehicula nullam.
END:VEVENT
BEGIN:VTODO
DTSTAMP:20180218T154700Z
UID:Uid
DESCRIPTION:Lorem ipsum dolor sit amet.
PERCENT-COMPLETE:0
PRIORITY:0
SUMMARY:Name
END:VTODO
END:VCALENDAR
"""
cal2 = """
BEGIN:VCALENDAR
BEGIN:VEVENT
DTEND;TZID=Europe/Berlin:20120608T212500
SUMMARY:Name
DTSTART;TZID=Europe/Berlin:20120608T202500
LOCATION:MUC
SEQUENCE:0
BEGIN:VALARM
TRIGGER:-PT1H
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR
"""
cal3 = """
BEGIN:VCALENDAR
END:VCALENDAR
"""
cal4 = """BEGIN:VCALENDAR"""
cal5 = """
BEGIN:VCALENDAR
VERSION:2.0
END:VCALENDAR
"""
cal6 = """
DESCRIPTION:a
b
"""
cal7 = """
BEGIN:VCALENDAR
END:VCALENDAR
"""
cal8 = """
BEGIN:VCALENDAR
\t
END:VCALENDAR
"""
cal9 = """
BEGIN:VCALENDAR
END:VCALENDAR
"""
cal10 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
DTEND;TZID=Europe/Berlin:20120608T212500
DTSTAMP:20131024T204741Z
SUMMARY:Name
DTSTART;TZID=Europe/Berlin:20120608T202500
LOCATION:MUC
SEQUENCE:0
BEGIN:VALARM
TRIGGER:-PT1H
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR
"""
cal11 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
END:VCAL
"""
cal12 = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.8//EN
BEGIN:VEVENT
SUMMARY:Name
DTSTART;TZID=Europe/Berlin:20120608T202500
DURATION:P1DT1H
LOCATION:MUC
SEQUENCE:0
BEGIN:VALARM
TRIGGER:-PT1H
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
ATTENDEE;RSVP=FALSE;CN=test:mailto:test@test.com
END:VEVENT
END:VCALENDAR
"""
cal13 = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Name
DTSTART;TZID=Europe/Berlin:20120608T202500
DTEND;TZID=Europe/Berlin:20120608T212500
DURATION:P1DT1H
LOCATION:MUC
SEQUENCE:0
BEGIN:VALARM
TRIGGER:-PT1H
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR
"""
cal14 = u"""
BEGIN:VCALENDAR
VERSION:2.0;42
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
END:VCALENDAR
"""
cal15 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Hello, \\n World\\; This is a backslash : \\\\ and another new \\N line
DTSTART;TZID=Europe/Berlin:20120608T202500
DTEND;TZID=Europe/Berlin:20120608T212500
LOCATION:MUC
END:VEVENT
END:VCALENDAR
"""
# Event with URL and STATUS
cal16 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Hello, \\n World\\; This is a backslash : \\\\ and another new \\N line
DTSTART;TZID=Europe/Berlin:20120608T202500
DTEND;TZID=Europe/Berlin:20120608T212500
LOCATION:MUC
URL:http://example.com/pub/calendars/jsmith/mytime.ics
STATUS:CONFIRMED
CATEGORIES:Simple Category,My "Quoted" Category,Category\\, with comma
END:VEVENT
END:VCALENDAR
"""
cal17 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Some special \\; chars
DTSTART;TZID=Europe/Berlin:20130608T202501
DTEND;TZID=Europe/Berlin:20130608T212501
LOCATION:In\\, every text field
DESCRIPTION:Yes\\, all of them\\;
END:VEVENT
END:VCALENDAR
"""
# long event which is not all_day
cal18 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:ownCloud Calendar 0.7.3
X-WR-CALNAME:test
BEGIN:VEVENT
UID:3912dcd3d4
DTSTAMP:20151113T004809Z
CREATED:20151113T004809Z
LAST-MODIFIED:20151113T004809Z
SUMMARY:long event
DTSTART;TZID=Europe/Berlin:20151113T140000
DTEND;TZID=Europe/Berlin:20151124T080000
LOCATION:
DESCRIPTION:
CATEGORIES:
END:VEVENT
END:VCALENDAR
"""
# Event with TRANSP
cal19 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Hello, \\n World\\; This is a backslash : \\\\ and another new \\N line
DTSTART;TZID=Europe/Berlin:20120608T202500
DTEND;TZID=Europe/Berlin:20120608T212500
LOCATION:MUC
TRANSP:OPAQUE
END:VEVENT
END:VCALENDAR
"""
# 3 days all-day event including end date
cal20 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:manually crafted from an ownCloud 8.0 ics
BEGIN:VEVENT
SUMMARY:3 days party
DTSTART;VALUE=DATE:20151114
DTEND;VALUE=DATE:20151116
END:VEVENT
END:VCALENDAR
"""
# Event with Display alarm without repeats
cal21 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Some special \\; chars
DTSTART;TZID=Europe/Berlin:20130608T202501
DTEND;TZID=Europe/Berlin:20130608T212501
LOCATION:In\\, every text field
DESCRIPTION:Yes\\, all of them\\;
BEGIN:VALARM
TRIGGER:-PT1H
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR
"""
# Event with Display alarm WITH repeats
cal22 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Some special \\; chars
DTSTART;TZID=Europe/Berlin:20130608T202501
DTEND;TZID=Europe/Berlin:20130608T212501
LOCATION:In\\, every text field
DESCRIPTION:Yes\\, all of them\\;
BEGIN:VALARM
TRIGGER:-PT1H
REPEAT:2
DURATION:PT10M
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR
"""
# Event with Display alarm without repeats
cal23 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Some special \\; chars
DTSTART;TZID=Europe/Berlin:20130608T202501
DTEND;TZID=Europe/Berlin:20130608T212501
LOCATION:In\\, every text field
DESCRIPTION:Yes\\, all of them\\;
BEGIN:VALARM
TRIGGER;VALUE=DATE-TIME:20160101T000000Z
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR
"""
# Event with AUDIO alarm without attach
cal24 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Some special \\; chars
DTSTART;TZID=Europe/Berlin:20130608T202501
DTEND;TZID=Europe/Berlin:20130608T212501
LOCATION:In\\, every text field
DESCRIPTION:Yes\\, all of them\\;
BEGIN:VALARM
TRIGGER;VALUE=DATE-TIME:20160101T000000Z
ACTION:AUDIO
END:VALARM
END:VEVENT
END:VCALENDAR
"""
# Event with AUDIO alarm WITH attach
cal25 = u"""
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VEVENT
SUMMARY:Some special \\; chars
DTSTART;TZID=Europe/Berlin:20130608T202501
DTEND;TZID=Europe/Berlin:20130608T212501
LOCATION:In\\, every text field
DESCRIPTION:Yes\\, all of them\\;
BEGIN:VALARM
TRIGGER;VALUE=DATE-TIME:20160101T000000Z
ACTION:AUDIO
ATTACH;FMTTYPE=audio/basic:ftp://example.com/pub/sounds/bell-01.aud
END:VALARM
END:VEVENT
END:VCALENDAR
"""
# Event with a tabbed line fold
cal26 = u"""
BEGIN:VCALENDAR
BEGIN:VEVENT
DTEND;TZID=Europe/Berlin:20120608T212500
SUMMARY:Name
DTSTART;TZID=Europe/Berlin:20120608T202500
LOCATION:MUC
SEQUENCE:0
UID:040000008200E00074C5B7101A82E0080000000050B9861DFE30D101000000000000000
010000000DC18788D5CDAF947A99D8A91D04C601C
BEGIN:VALARM
TRIGGER:-PT1H
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VEVENT
END:VCALENDAR
"""
# All VTODO attributes beside duration.
cal27 = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VTODO
DTSTAMP:20180218T154700Z
UID:Uid
COMPLETED:20180418T150000Z
CREATED:20180218T154800Z
DESCRIPTION:Lorem ipsum dolor sit amet.
DTSTART:20180218T164800Z
LOCATION:Earth
PERCENT-COMPLETE:0
PRIORITY:0
SUMMARY:Name
URL:https://www.example.com/cal.php/todo.ics
DURATION:PT10M
SEQUENCE:0
BEGIN:VALARM
TRIGGER:-PT1H
DESCRIPTION:Event reminder
ACTION:DISPLAY
END:VALARM
END:VTODO
END:VCALENDAR
"""
# Test due.
cal28 = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VTODO
DTSTAMP:20180218T154741Z
UID:Uid
DUE:20180218T164800Z
END:VTODO
END:VCALENDAR
"""
# Test error due and duration.
cal29 = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VTODO
DTSTAMP:20180218T154741Z
UID:Uid
DURATION:PT10M
DUE:20180218T164800Z
END:VTODO
END:VCALENDAR
"""
cal30 = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VTODO
DTSTAMP:20180218T154741Z
UID:Uid
DUE:20180218T164800Z
DURATION:PT10M
END:VTODO
END:VCALENDAR
"""
cal31 = """
BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Apple Inc.//Mac OS X 10.9//EN
BEGIN:VTODO
DTSTAMP:20180218T154741Z
UID:Uid
SUMMARY:Hello, \\n World\\; This is a backslash : \\\\ and another new \\N line
LOCATION:In\\, every text field
DESCRIPTION:Yes\\, all of them\\;
END:VTODO
END:VCALENDAR
"""
unfolded_cal2 = [
'BEGIN:VCALENDAR',
'BEGIN:VEVENT',
'DTEND;TZID=Europe/Berlin:20120608T212500',
'SUMMARY:Name',
'DTSTART;TZID=Europe/Berlin:20120608T202500',
'LOCATION:MUC',
'SEQUENCE:0',
'BEGIN:VALARM',
'TRIGGER:-PT1H',
'DESCRIPTION:Event reminder',
'ACTION:DISPLAY',
'END:VALARM',
'END:VEVENT',
'END:VCALENDAR',
]
unfolded_cal1 = [
'BEGIN:VCALENDAR',
'METHOD:PUBLISH',
'VERSION:2.0',
'X-WR-CALNAME:plop',
'PRODID:-//Apple Inc.//Mac OS X 10.9//EN',
'X-APPLE-CALENDAR-COLOR:#882F00',
'X-WR-TIMEZONE:Europe/Brussels',
'CALSCALE:GREGORIAN',
'BEGIN:VTIMEZONE',
'TZID:Europe/Brussels',
'BEGIN:DAYLIGHT',
'TZOFFSETFROM:+0100',
'RRULE:FREQ=YEARLY;BYMONTH=3;BYDAY=-1SU',
'DTSTART:19810329T020000',
'TZNAME:UTC+2',
'TZOFFSETTO:+0200',
'END:DAYLIGHT',
'BEGIN:STANDARD',
'TZOFFSETFROM:+0200',
'RRULE:FREQ=YEARLY;BYMONTH=10;BYDAY=-1SU',
'DTSTART:19961027T030000',
'TZNAME:UTC+1',
'TZOFFSETTO:+0100',
'END:STANDARD',
'END:VTIMEZONE',
'BEGIN:VEVENT',
'CREATED:20131024T204716Z',
'UID:ABBF2903-092F-4202-98B6-F757437A5B28',
'DTEND;TZID=Europe/Brussels:20131029T113000',
'TRANSP:OPAQUE',
'SUMMARY:dfqsdfjqkshflqsjdfhqs fqsfhlqs dfkqsldfkqsdfqsfqsfqsfs',
'DTSTART;TZID=Europe/Brussels:20131029T103000',
'DTSTAMP:20131024T204741Z',
'SEQUENCE:3',
'DESCRIPTION:Lorem ipsum dolor sit amet, consectetur adipiscing elit. \
Sedvitae facilisis enim. Morbi blandit et lectus venenatis tristique. \
Donecsit amet egestas lacus. Donec ullamcorper, mi vitae congue dictum, \
quamdolor luctus augue, id cursus purus justo vel lorem. \
Ut feugiat enim ipsum, quis porta nibh ultricies congue. \
Pellentesque nisl mi, molestie idsem vel, vehicula nullam.',
'END:VEVENT',
'BEGIN:VTODO',
'DTSTAMP:20180218T154700Z',
'UID:Uid',
'DESCRIPTION:Lorem ipsum dolor sit amet.',
'PERCENT-COMPLETE:0',
'PRIORITY:0',
'SUMMARY:Name',
'END:VTODO',
'END:VCALENDAR',
]
unfolded_cal6 = ['DESCRIPTION:ab']
unfolded_cal21 = [
'BEGIN:VCALENDAR',
'BEGIN:VEVENT',
'DTEND;TZID=Europe/Berlin:20120608T212500',
'SUMMARY:Name',
'DTSTART;TZID=Europe/Berlin:20120608T202500',
'LOCATION:MUC',
'SEQUENCE:0',
'BEGIN:VALARM',
'TRIGGER:-PT1H',
'REPEAT:2',
'DURATION:PT10M',
'DESCRIPTION:Event reminder',
'ACTION:DISPLAY',
'END:VALARM',
'END:VEVENT',
'END:VCALENDAR',
]
unfolded_cal26 = [
'BEGIN:VCALENDAR',
'BEGIN:VEVENT',
'DTEND;TZID=Europe/Berlin:20120608T212500',
'SUMMARY:Name',
'DTSTART;TZID=Europe/Berlin:20120608T202500',
'LOCATION:MUC',
'SEQUENCE:0',
'UID:040000008200E00074C5B7101A82E0080000000050B9861DFE30D101000000000000000010000000DC18788D5CDAF947A99D8A91D04C601C',
'BEGIN:VALARM',
'TRIGGER:-PT1H',
'DESCRIPTION:Event reminder',
'ACTION:DISPLAY',
'END:VALARM',
'END:VEVENT',
'END:VCALENDAR',
]
| 20.622296 | 123 | 0.752219 | 1,683 | 12,394 | 5.532977 | 0.1735 | 0.044029 | 0.060137 | 0.051976 | 0.817655 | 0.808956 | 0.787693 | 0.773518 | 0.758376 | 0.735825 | 0 | 0.134016 | 0.117396 | 12,394 | 600 | 124 | 20.656667 | 0.71725 | 0.033484 | 0 | 0.704673 | 0 | 0.001869 | 0.851316 | 0.260343 | 0 | 0 | 0 | 0.001667 | 0 | 1 | 0 | false | 0 | 0.001869 | 0 | 0.001869 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 8 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.