hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
a671987b4aec698f200fd1e8f711ebabd9e8fdf6 | 359 | py | Python | afwf_fts_anything/__init__.py | MacHu-GWU/afwf_fts_anything-project | 7050f12f6df9688fd553a5673ab21e10fa571cf2 | [
"MIT"
] | 20 | 2019-01-03T22:31:41.000Z | 2021-10-14T11:32:29.000Z | afwf_fts_anything/__init__.py | MacHu-GWU/afwf_fts_anything-project | 7050f12f6df9688fd553a5673ab21e10fa571cf2 | [
"MIT"
] | 2 | 2019-01-02T21:36:40.000Z | 2020-08-23T18:03:54.000Z | afwf_fts_anything/__init__.py | MacHu-GWU/afwf_fts_anything-project | 7050f12f6df9688fd553a5673ab21e10fa571cf2 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Full text search workflow for Alfred.
"""
from ._version import __version__
__short_description__ = "Full text search workflow for Alfred."
__license__ = "MIT"
__author__ = "Sanhe Hu"
__author_email__ = "husanhe@gmail.com"
__maintainer__ = "Sanhe Hu"
__maintainer_email__ = "husanhe@gmail.com"
__github_username__ = "MacHu-GWU"
| 22.4375 | 63 | 0.743733 | 43 | 359 | 5.348837 | 0.651163 | 0.069565 | 0.121739 | 0.191304 | 0.269565 | 0.269565 | 0 | 0 | 0 | 0 | 0 | 0.003215 | 0.133705 | 359 | 15 | 64 | 23.933333 | 0.736334 | 0.167131 | 0 | 0 | 0 | 0 | 0.340206 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6736259f691ee42e01b77cfe927dab8c82e2223 | 609 | py | Python | users/home_work.py | annadokuchaeva2002/python-home-bot | 662acd3e3998d5c58a034004e0eef9e782d57447 | [
"MIT"
] | null | null | null | users/home_work.py | annadokuchaeva2002/python-home-bot | 662acd3e3998d5c58a034004e0eef9e782d57447 | [
"MIT"
] | null | null | null | users/home_work.py | annadokuchaeva2002/python-home-bot | 662acd3e3998d5c58a034004e0eef9e782d57447 | [
"MIT"
] | null | null | null | from main import dp
from aiogram import types
from aiogram.dispatcher.filters.builtin import Text
@dp.message_handler(Text(equals="Все задания 🤩"))
async def vse_zadaniya(msg: types.Message):
await msg.answer(text="<b>Ваши задания:</b>\n\nскоро наполню")
@dp.message_handler(Text(equals="Добавить 📝"))
async def dobavit(msg: types.Message):
await msg.answer(text="прикрепите ваше задание")
@dp.message_handler(Text(equals="Скрыть клавиутуру 😤"))
async def skryt_klaviaturu(msg: types.Message):
await msg.answer(text="Клавиатура скрыта\nДля вызова /start", reply_markup=types.ReplyKeyboardRemove())
| 29 | 107 | 0.758621 | 88 | 609 | 5.227273 | 0.511364 | 0.058696 | 0.104348 | 0.130435 | 0.45 | 0.280435 | 0.280435 | 0.208696 | 0.208696 | 0.208696 | 0 | 0 | 0.108374 | 609 | 20 | 108 | 30.45 | 0.841621 | 0 | 0 | 0.166667 | 0 | 0 | 0.226974 | 0.034539 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a674f43bddee54cbea1107c11705b1d7ee339b2f | 583 | py | Python | vcorelib/paths/context.py | vkottler/vcorelib | 97c3b92932d5b2f8c6d9cdca55f34bf167980a21 | [
"MIT"
] | 1 | 2022-03-31T09:26:04.000Z | 2022-03-31T09:26:04.000Z | vcorelib/paths/context.py | vkottler/vcorelib | 97c3b92932d5b2f8c6d9cdca55f34bf167980a21 | [
"MIT"
] | 2 | 2022-03-31T09:35:06.000Z | 2022-03-31T09:38:07.000Z | vcorelib/paths/context.py | vkottler/vcorelib | 97c3b92932d5b2f8c6d9cdca55f34bf167980a21 | [
"MIT"
] | null | null | null | """
A module for context managers related to file-system paths.
"""
# built-in
from contextlib import contextmanager
from os import chdir as _chdir
from pathlib import Path as _Path
from typing import Iterator as _Iterator
# internal
from vcorelib.paths import Pathlike as _Pathlike
from vcorelib.paths import normalize as _normalize
@contextmanager
def in_dir(path: _Pathlike) -> _Iterator[None]:
"""Change the current working directory as a context manager."""
cwd = _Path.cwd()
try:
_chdir(_normalize(path))
yield
finally:
_chdir(cwd)
| 22.423077 | 68 | 0.728988 | 77 | 583 | 5.363636 | 0.519481 | 0.058111 | 0.082324 | 0.11138 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.200686 | 583 | 25 | 69 | 23.32 | 0.886266 | 0.234991 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.428571 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
a675cde92791cd0863858fcb8d5b4afd657a5c32 | 467 | py | Python | tf.py | thuyduongtt/region_based_active_learning | b3653c31a44135b5680949790549799c83a5a18b | [
"MIT"
] | null | null | null | tf.py | thuyduongtt/region_based_active_learning | b3653c31a44135b5680949790549799c83a5a18b | [
"MIT"
] | null | null | null | tf.py | thuyduongtt/region_based_active_learning | b3653c31a44135b5680949790549799c83a5a18b | [
"MIT"
] | null | null | null | def test_tf():
import tensorflow as tf
from utils import list_devices
list_devices()
gpu_available = tf.test.is_gpu_available()
print('GPU available:', gpu_available)
with tf.Session(config=tf.ConfigProto(log_device_placement=True)).as_default() as sess:
print('Session has started!')
def test_pt():
import torch
print('GPU available:', torch.cuda.is_available())
if __name__ == '__main__':
test_tf()
# test_pt()
| 22.238095 | 91 | 0.683084 | 63 | 467 | 4.714286 | 0.507937 | 0.20202 | 0.114478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.197002 | 467 | 20 | 92 | 23.35 | 0.792 | 0.019272 | 0 | 0 | 0 | 0 | 0.122807 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.153846 | false | 0 | 0.230769 | 0 | 0.384615 | 0.230769 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a68504f4c7d13cc8c0e0489a51095d8397a4bb6d | 1,370 | gyp | Python | binding.gyp | Ciccio99/electron-overlay-window | f25e29ec1ffe8f4ccb942051a986d217fb36a2b8 | [
"MIT"
] | null | null | null | binding.gyp | Ciccio99/electron-overlay-window | f25e29ec1ffe8f4ccb942051a986d217fb36a2b8 | [
"MIT"
] | null | null | null | binding.gyp | Ciccio99/electron-overlay-window | f25e29ec1ffe8f4ccb942051a986d217fb36a2b8 | [
"MIT"
] | null | null | null | {
'targets': [
{
'target_name': 'overlay_window',
'sources': [
'src/lib/addon.c',
'src/lib/napi_helpers.c'
],
'include_dirs': [
'src/lib'
],
'conditions': [
['OS=="win"', {
'defines': [
'WIN32_LEAN_AND_MEAN'
],
'link_settings': {
'libraries': [
'oleacc.lib'
]
},
'sources': [
'src/lib/windows.c',
]
}],
['OS=="linux"', {
'defines': [
'_GNU_SOURCE'
],
'link_settings': {
'libraries': [
'-lxcb', '-lpthread'
]
},
'cflags': ['-std=c99', '-pedantic', '-Wall', '-pthread'],
'sources': [
'src/lib/x11.c',
]
}],
['OS=="mac"', {
'link_settings': {
'libraries': [
'-lpthread', '-framework AppKit', '-framework ApplicationServices'
]
},
'xcode_settings': {
'OTHER_CFLAGS': [
'-fobjc-arc'
]
},
'cflags': ['-std=c99', '-pedantic', '-Wall', '-pthread'],
'sources': [
'src/lib/mac.mm',
'src/lib/mac/OWFullscreenObserver.mm'
]
}]
]
}
]
}
| 22.459016 | 80 | 0.350365 | 90 | 1,370 | 5.177778 | 0.544444 | 0.090129 | 0.111588 | 0.085837 | 0.188841 | 0.188841 | 0.188841 | 0.188841 | 0.188841 | 0 | 0 | 0.010974 | 0.467883 | 1,370 | 60 | 81 | 22.833333 | 0.628258 | 0 | 0 | 0.333333 | 0 | 0 | 0.388321 | 0.041606 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a689a5d6a9c1f320fa1c6a898b3f880278371c7b | 769 | py | Python | directory/models/directory_access_group.py | darkismus/kompassi | 35dea2c7af2857a69cae5c5982b48f01ba56da1f | [
"CC-BY-3.0"
] | 13 | 2015-11-29T12:19:12.000Z | 2021-02-21T15:42:11.000Z | directory/models/directory_access_group.py | darkismus/kompassi | 35dea2c7af2857a69cae5c5982b48f01ba56da1f | [
"CC-BY-3.0"
] | 23 | 2015-04-29T19:43:34.000Z | 2021-02-10T05:50:17.000Z | directory/models/directory_access_group.py | darkismus/kompassi | 35dea2c7af2857a69cae5c5982b48f01ba56da1f | [
"CC-BY-3.0"
] | 11 | 2015-09-20T18:59:00.000Z | 2020-02-07T08:47:34.000Z | from django.db import models
from django.utils.translation import ugettext_lazy as _
class DirectoryAccessGroup(models.Model):
"""
Grants expiring group access to the personnel directory.
"""
organization = models.ForeignKey('core.Organization', on_delete=models.CASCADE)
group = models.ForeignKey('auth.Group', on_delete=models.CASCADE)
active_from = models.DateTimeField(blank=True, null=True)
active_until = models.DateTimeField(blank=True, null=True)
created_at = models.DateTimeField(auto_now_add=True)
updated_at = models.DateTimeField(auto_now=True)
class Meta:
verbose_name = _('directory access group')
verbose_name_plural = _('directory access groups')
ordering = ('organization', 'group')
| 34.954545 | 83 | 0.73212 | 90 | 769 | 6.077778 | 0.511111 | 0.13894 | 0.051188 | 0.076782 | 0.234004 | 0.131627 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16775 | 769 | 21 | 84 | 36.619048 | 0.854688 | 0.072822 | 0 | 0 | 0 | 0 | 0.12769 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.153846 | 0 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
a695aa6f1ab93783e6c03d418b1565d4a3c0d64d | 2,696 | py | Python | cli.py | hiway/adventofcode | c4b3e94ad2bc967bc684a10ddc90541d6e1657fc | [
"MIT"
] | null | null | null | cli.py | hiway/adventofcode | c4b3e94ad2bc967bc684a10ddc90541d6e1657fc | [
"MIT"
] | null | null | null | cli.py | hiway/adventofcode | c4b3e94ad2bc967bc684a10ddc90541d6e1657fc | [
"MIT"
] | null | null | null | import click
import collections
@click.group()
def cli():
pass
@cli.group()
def day1():
pass
@day1.command()
@click.argument('input_file', type=click.File())
def part1(input_file):
from day1 import part1_stanta_floor_positioning_system
for directions in input_file:
floor = part1_stanta_floor_positioning_system(directions)
print('Final Floor: {0}'.format(floor))
@day1.command()
@click.argument('input_file', type=click.File())
@click.option('--halt', type=int)
def part2(input_file, halt):
from day1 import part2_santa_fps_halt
for directions in input_file:
position = part2_santa_fps_halt(directions, halt)
if position is not None:
print('Position: {0}'.format(position))
else:
print('Never reached floor: {0}'.format(halt))
@cli.group()
def day2():
pass
@day2.command()
@click.argument('input_file', type=click.File())
def part1(input_file):
from day2 import part1_wrapping_paper_estimate
total = 0
for dimensions in input_file:
total += part1_wrapping_paper_estimate(dimensions)
print('Total wrapping paper required: {0} sq ft'.format(total))
@day2.command()
@click.argument('input_file', type=click.File())
def part2(input_file):
from day2 import part2_ribbon_estimate
total = 0
for dimensions in input_file:
total += part2_ribbon_estimate(dimensions)
print('Total ribbon required: {0} ft'.format(total))
@cli.group()
def day3():
pass
@day3.command()
@click.argument('input_file', type=click.File())
def part1(input_file):
from day3 import part1_santa_gps
for directions in input_file:
print(part1_santa_gps(directions.strip()))
@day3.command()
@click.argument('input_file', type=click.File())
def part2(input_file):
from day3 import part2_santa_and_robo_gps
for directions in input_file:
print(part2_santa_and_robo_gps(directions.strip()))
@cli.group()
def day4():
pass
@day4.command()
@click.argument('secret_key')
def part1(secret_key):
from day4 import part1_adventcoin_miner
print(part1_adventcoin_miner(secret_key.strip()))
@day4.command()
@click.argument('secret_key')
@click.argument('match', default='000000')
def part2(secret_key, match):
from day4 import part2_adventcoin_miner
print(part2_adventcoin_miner(secret_key.strip(), match))
@cli.group()
def day5():
pass
@day5.command()
@click.argument('input_file', type=click.File())
def part1(input_file):
from day5 import part1_is_nice_string
results = collections.Counter()
results.update([part1_is_nice_string(directions.strip()) for directions in input_file])
print(results[True])
| 23.241379 | 91 | 0.709941 | 367 | 2,696 | 5.00545 | 0.190736 | 0.102885 | 0.097986 | 0.095264 | 0.510615 | 0.375068 | 0.323353 | 0.288514 | 0.288514 | 0.21448 | 0 | 0.027975 | 0.164688 | 2,696 | 115 | 92 | 23.443478 | 0.787744 | 0 | 0 | 0.5 | 0 | 0 | 0.084972 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.178571 | false | 0.071429 | 0.130952 | 0 | 0.309524 | 0.119048 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
a698ed363c72fb8db096b455d881f891dde49eb2 | 6,282 | py | Python | src/src/graph_environments.py | aka-cs/ia-sim-cmp | fa26f3d961a992698ca08f4213d6eae39f3ec039 | [
"MIT"
] | null | null | null | src/src/graph_environments.py | aka-cs/ia-sim-cmp | fa26f3d961a992698ca08f4213d6eae39f3ec039 | [
"MIT"
] | null | null | null | src/src/graph_environments.py | aka-cs/ia-sim-cmp | fa26f3d961a992698ca08f4213d6eae39f3ec039 | [
"MIT"
] | null | null | null | from __future__ import annotations
from .base_classes import Event, SetEvent, DeleteEvent, GenerateEvent, MapObject, Agent, Position, Generator, \
Environment
class GraphEnvironment(Environment):
"""
    Concrete environment implementation with a notion of locations and adjacencies
    between them. It is represented as a graph.
"""
graph: {str: {str: float}}
objects: {str: {int: MapObject}}
generators: {str: Generator}
def __init__(self, graph: {str: {str: float}}, objects: {str: {int: MapObject}}, generators: {str: Generator}):
        # Store the graph and the environment objects.
self.graph = graph
self.objects = objects
self.generators = generators
self.counter = 0
        # Make sure the object list has the correct format.
        # For each location in the graph.
        for place in graph:
            # If this location does not exist in the object listing, add it with no objects.
if place not in objects:
self.objects[place] = {}
        # If the object list contains at least one location that does not
        # exist in the environment, raise an exception.
for place in objects:
for object_id in self.objects[place]:
self.counter = max(self.counter, object_id)
if place not in graph:
raise Exception("Invalid objects list.")
def next(self):
self.counter += 1
return self.counter
def get_places(self) -> [str]:
"""
        Return the locations of the simulated environment.
        """
        # Build a list of locations and return it.
return [place for place in self.graph]
def get_objects(self):
"""
        Return the objects in the environment.
        """
        # List to store the environment objects.
        map_objects = []
        # For each location in the environment.
        for place in self.objects:
            # Add the objects located at this place.
            map_objects.extend(self.get_all_objects(place))
        # Return the object listing.
return map_objects
def update_state(self, event: Event) -> [Event]:
"""
        Given an event, update the simulated environment.
        """
        events = []
        # If it is a delete event, remove the corresponding element at the given position.
        if isinstance(event, DeleteEvent):
            self.remove_object(event.position, event.object_id)
        # If it is a set event, add the corresponding element.
elif isinstance(event, SetEvent):
event.object.identifier = self.next()
self.set_object(event.object)
elif isinstance(event, GenerateEvent) and event.generator_name in self.generators:
map_object = self.generators[event.generator_name].generate(self.get_places())
map_object.identifier = self.next()
self.set_object(map_object)
next_genesis = self.generators[event.generator_name].next(event.time)
if next_genesis > event.time:
events.append(GenerateEvent(next_genesis, event.issuer_id, event.generator_name))
        # Update every object in the environment.
        for map_object in self.get_objects():
            # If it is an agent, update its state.
            if isinstance(map_object, Agent):
                events.extend(map_object.update_state(event, self))
        # Return the resulting events.
return events
def get_all_objects(self, position: str) -> [MapObject]:
"""
        Return the list of objects located at the given position of the simulated environment.
        """
        # Build a list of the objects at the given position and return it.
return [element for element in self.objects.get(position, {}).values()]
def get_object(self, position: str, identifier: int) -> MapObject:
"""
        Return the element of the simulated environment with the specified id.
        """
        # If an object with the specified id exists at the given position, return it.
        # Otherwise return None.
if position in self.objects and identifier in self.objects[position]:
return self.objects[position][identifier]
def set_object(self, element: MapObject) -> None:
"""
        Place the given element at its position in the simulated environment.
        """
        # If the specified position exists.
        if element.position in self.graph:
            # Store the given object at the specified position.
self.objects[element.position][element.identifier] = element
def remove_object(self, position: str, identifier: int) -> None:
"""
        Remove the element with the given id from the specified position of the simulated environment.
        """
        # If an object with the specified id exists at the given position, delete it.
if position in self.objects and identifier in self.objects[position]:
del self.objects[position][identifier]
class MapEnvironment(GraphEnvironment):
positions: {str: Position}
def __init__(self, graph: {str: {str: float}}, objects: {str: {int: MapObject}}, positions: {str: Position},
generators: {str: Generator}):
        # Store the graph and the environment objects.
super().__init__(graph, objects, generators)
        # Store the positions.
self.positions = positions
        # If the position list contains at least one location that does not
        # exist in the environment, raise an exception.
for place in positions:
if place not in graph:
raise Exception("Invalid positions list.")
        # If the environment contains at least one location that is missing
        # from the position list, raise an exception.
for place in graph:
if place not in positions:
raise Exception("Invalid positions list.")
def get_position(self, name: str) -> Position:
"""
        Return the position associated with the given location name.
"""
return self.positions.get(name, None)
| 40.269231 | 115 | 0.639128 | 745 | 6,282 | 5.316779 | 0.216107 | 0.033325 | 0.024236 | 0.012118 | 0.34158 | 0.299419 | 0.26357 | 0.244888 | 0.225701 | 0.225701 | 0 | 0.000444 | 0.282553 | 6,282 | 155 | 116 | 40.529032 | 0.878411 | 0.334925 | 0 | 0.109589 | 1 | 0 | 0.016984 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.150685 | false | 0 | 0.027397 | 0 | 0.356164 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a699cfaf6878aec2d41bc0a3750f8c53d818736b | 638 | py | Python | award/migrations/0002_auto_20190701_0754.py | maurinesinami/awards | 0f8e390a41a0c462cdb2104797daa4b59c986656 | [
"MIT"
] | null | null | null | award/migrations/0002_auto_20190701_0754.py | maurinesinami/awards | 0f8e390a41a0c462cdb2104797daa4b59c986656 | [
"MIT"
] | null | null | null | award/migrations/0002_auto_20190701_0754.py | maurinesinami/awards | 0f8e390a41a0c462cdb2104797daa4b59c986656 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by Django 1.11 on 2019-07-01 04:54
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('award', '0001_initial'),
]
operations = [
migrations.RenameField(
model_name='image',
old_name='image_caption',
new_name='image_description',
),
migrations.AddField(
model_name='image',
name='live_link',
field=models.CharField(default=2, max_length=30),
preserve_default=False,
),
]
| 23.62963 | 61 | 0.587774 | 66 | 638 | 5.454545 | 0.742424 | 0.1 | 0.077778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051454 | 0.299373 | 638 | 26 | 62 | 24.538462 | 0.753915 | 0.103448 | 0 | 0.210526 | 1 | 0 | 0.115993 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.105263 | 0 | 0.263158 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a69c988cc5ea3fab5e018e4dee803ea9a79a1f3a | 938 | py | Python | backend/gsr_booking/management/commands/load_gsrs.py | pennlabs/penn-mobile | fb3b514a55afbf6f29dd8bd589b4e76bf52e3e90 | [
"MIT"
] | 2 | 2021-11-23T18:06:40.000Z | 2022-01-05T19:13:33.000Z | backend/gsr_booking/management/commands/load_gsrs.py | pennlabs/penn-mobile | fb3b514a55afbf6f29dd8bd589b4e76bf52e3e90 | [
"MIT"
] | 30 | 2021-10-17T23:29:44.000Z | 2022-03-31T02:03:13.000Z | backend/gsr_booking/management/commands/load_gsrs.py | pennlabs/penn-mobile | fb3b514a55afbf6f29dd8bd589b4e76bf52e3e90 | [
"MIT"
] | null | null | null | import csv
from django.core.management.base import BaseCommand
from gsr_booking.models import GSR
class Command(BaseCommand):
def handle(self, *args, **kwargs):
with open("gsr_booking/data/gsr_data.csv") as data:
reader = csv.reader(data)
for i, row in enumerate(reader):
if i == 0:
continue
# collects room information from csv
lid, gid, name, service = row
# gets image from s3 given the lid and gid
# TODO: fix image url!
image_url = (
f"https://s3.us-east-2.amazonaws.com/labs.api/gsr/lid-{lid}-gid-{gid}.jpg"
)
kind = GSR.KIND_WHARTON if service == "wharton" else GSR.KIND_LIBCAL
GSR.objects.create(lid=lid, gid=gid, name=name, kind=kind, image_url=image_url)
self.stdout.write("Uploaded GSRs!")
| 31.266667 | 95 | 0.559701 | 119 | 938 | 4.344538 | 0.563025 | 0.061896 | 0.05029 | 0.061896 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006452 | 0.339019 | 938 | 29 | 96 | 32.344828 | 0.827419 | 0.102345 | 0 | 0 | 0 | 0.058824 | 0.144391 | 0.034606 | 0 | 0 | 0 | 0.034483 | 0 | 1 | 0.058824 | false | 0 | 0.176471 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6a173dfa16946d1b343a80b4e42d5cc67ea6e07 | 397 | py | Python | lolexport/log.py | dleiferives/lolexport | 894c97240893da829e96f46e2c4cdebf85846412 | [
"MIT"
] | 2 | 2021-02-23T09:21:07.000Z | 2022-03-25T15:02:50.000Z | lolexport/log.py | dleiferives/lolexport | 894c97240893da829e96f46e2c4cdebf85846412 | [
"MIT"
] | 5 | 2021-02-24T01:26:36.000Z | 2022-02-27T13:05:27.000Z | lolexport/log.py | dleiferives/lolexport | 894c97240893da829e96f46e2c4cdebf85846412 | [
"MIT"
] | 1 | 2022-02-27T02:17:17.000Z | 2022-02-27T02:17:17.000Z | from os import environ
import logging
from logzero import setup_logger # type: ignore[import]
# https://docs.python.org/3/library/logging.html#logging-levels
loglevel: int = logging.DEBUG # (10)
if "LOLEXPORT" in environ:
loglevel = int(environ["LOLEXPORT"])
# logzero handles this fine, can be imported/configured
# multiple times
logger = setup_logger(name="lolexport", level=loglevel)
| 28.357143 | 63 | 0.758186 | 54 | 397 | 5.537037 | 0.666667 | 0.073579 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008696 | 0.130982 | 397 | 13 | 64 | 30.538462 | 0.857971 | 0.390428 | 0 | 0 | 0 | 0 | 0.114894 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
a6b27c6859301d1606a8d5f073f133b9b73bc5e5 | 188 | py | Python | Code/circle.py | notha99y/Satellite-Scheduling | 6231eccf353f37ba643a7e37aa60525355f5d005 | [
"MIT"
] | 14 | 2018-04-06T22:36:30.000Z | 2022-02-15T02:36:58.000Z | Code/circle.py | notha99y/Satellite-Scheduling | 6231eccf353f37ba643a7e37aa60525355f5d005 | [
"MIT"
] | null | null | null | Code/circle.py | notha99y/Satellite-Scheduling | 6231eccf353f37ba643a7e37aa60525355f5d005 | [
"MIT"
] | 4 | 2018-04-06T22:36:57.000Z | 2022-02-15T02:37:00.000Z | import matplotlib.pyplot as plt
circle = plt.Circle((0,0),5, fill=False)
fig, ax = plt.subplots()
ax.add_artist(circle)
ax.set_xlim((-10, 10))
ax.set_ylim((-10, 10))
plt.show()
| 18.8 | 41 | 0.654255 | 33 | 188 | 3.636364 | 0.606061 | 0.15 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.069182 | 0.154255 | 188 | 9 | 42 | 20.888889 | 0.685535 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6b564871deacaebf5076f647494e02a77ffcc72 | 2,129 | py | Python | keystoneworkout/benchmark.py | dstanek/keystone-exercises | 5023fe87896ffefb462936ca9e6a982b9d099d6c | [
"Apache-2.0"
] | null | null | null | keystoneworkout/benchmark.py | dstanek/keystone-exercises | 5023fe87896ffefb462936ca9e6a982b9d099d6c | [
"Apache-2.0"
] | null | null | null | keystoneworkout/benchmark.py | dstanek/keystone-exercises | 5023fe87896ffefb462936ca9e6a982b9d099d6c | [
"Apache-2.0"
] | null | null | null | import shelve
import sys
import threading
import time
class Benchmark(object):
def __init__(self, concurrency=10, iterations=10):
self.concurrency = concurrency
self.iterations = iterations
self.shelf = Shelf()
def __call__(self, f):
def wrapped(*args, **kwargs):
print 'Benchmarking %s...' % f.__name__,
sys.stdout.flush()
# build threads
threads = [threading.Thread(target=f, args=args, kwargs=kwargs)
for _ in range(self.concurrency)]
start = time.time()
for thread in threads:
thread.start()
            # wait for all threads to finish (join avoids a busy-wait)
            for thread in threads:
                thread.join()
end = time.time()
total_time = end - start
mean_time = total_time / (self.concurrency * self.iterations)
task_per_sec = (self.concurrency * self.iterations) / total_time
previous = self.shelf.get(f.__name__)
self.shelf.set(f.__name__, total_time)
if previous is not None:
percent_diff = 100.0 * (total_time - previous) / previous
print ('%2.3f seconds total (%+2.3f%%), %2.3f seconds per task, %2.3f tasks per second'
% (total_time, percent_diff, mean_time, task_per_sec))
else:
print ('%2.3f seconds total, %2.3f seconds per task, %2.3f tasks per second'
% (total_time, mean_time, task_per_sec))
return wrapped
class Shelf(object):
def __init__(self):
self.filename = '.keystoneworkout-benchmark-shelf'
def get(self, key):
shelf = shelve.open(self.filename)
try:
return shelf.get(key)
finally:
shelf.close()
def set(self, key, value):
shelf = shelve.open(self.filename)
try:
shelf[key] = value
finally:
shelf.close()
def delete(self, key):
shelf = shelve.open(self.filename)
try:
del shelf[key]
finally:
shelf.close()
| 30.855072 | 103 | 0.550493 | 240 | 2,129 | 4.7 | 0.3 | 0.055851 | 0.035461 | 0.050532 | 0.238475 | 0.20656 | 0.179965 | 0.141844 | 0.076241 | 0.076241 | 0 | 0.015827 | 0.347111 | 2,129 | 68 | 104 | 31.308824 | 0.795683 | 0.006106 | 0 | 0.218182 | 0 | 0.036364 | 0.092242 | 0.015137 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.018182 | 0.072727 | null | null | 0.054545 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6b8534dfb59965e01c0a204829dc917ef20d463 | 5,938 | py | Python | src/dipus/search_js_t.py | shirou/dipus | 1c8a9cc89fb95a5c6ae99e692488496bd3fbec34 | [
"BSD-2-Clause"
] | null | null | null | src/dipus/search_js_t.py | shirou/dipus | 1c8a9cc89fb95a5c6ae99e692488496bd3fbec34 | [
"BSD-2-Clause"
] | null | null | null | src/dipus/search_js_t.py | shirou/dipus | 1c8a9cc89fb95a5c6ae99e692488496bd3fbec34 | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
template = """/*
 * * search_dipus
 * * ~~~~~~~~~~~~~~
 * *
 * * Dipus JavaScript utilities for the full-text search.
 * * This file is based on searchtools.js of Sphinx.
 * *
 * * :copyright: Copyright 2007-2012 by the Sphinx team.
 * * :license: BSD, see LICENSE for details.
 * *
 * */

/**
 * * helper function to return a node containing the
 * * search summary for a given text. keywords is a list
 * * of stemmed words, hlwords is the list of normal, unstemmed
 * * words. the first one is used to find the occurrence, the
 * * latter for highlighting it.
 * */
jQuery.makeSearchSummary = function(text, keywords, hlwords) {{
  var textLower = text.toLowerCase();
  var start = 0;
  $.each(keywords, function() {{
    var i = textLower.indexOf(this.toLowerCase());
    if (i > -1)
      start = i;
  }});
  start = Math.max(start - 120, 0);
  var excerpt = ((start > 0) ? '...' : '') +
    $.trim(text.substr(start, 240)) +
    ((start + 240 - text.length) ? '...' : '');
  var rv = $('<div class="context"></div>').text(excerpt);
  $.each(hlwords, function() {{
    rv = rv.highlightText(this, 'highlighted');
  }});
  return rv;
}};

/**
 * Search Module
 */
var Search = {{

  _dipus_url: "{dipus_url}",
  _index: null,
  _pulse_status: -1,

  init: function() {{
    var params = $.getQueryParameters();
    if (params.q) {{
      var query = params.q[0];
      $('input[name="q"]')[0].value = query;
      this.performSearch(query);
    }}
  }},

  stopPulse: function() {{
    this._pulse_status = 0;
  }},

  startPulse: function() {{
    if (this._pulse_status >= 0)
      return;
    function pulse() {{
      Search._pulse_status = (Search._pulse_status + 1) % 4;
      var dotString = '';
      for (var i = 0; i < Search._pulse_status; i++)
        dotString += '.';
      Search.dots.text(dotString);
      if (Search._pulse_status > -1)
        window.setTimeout(pulse, 500);
    }};
    pulse();
  }},

  /**
   * perform a search for something
   */
  performSearch: function(query) {{
    // create the required interface elements
    this.out = $('#search-results');
    this.title = $('<h2>' + _('Searching') + '</h2>').appendTo(this.out);
    this.dots = $('<span></span>').appendTo(this.title);
    this.status = $('<p style="display: none"></p>').appendTo(this.out);
    this.output = $('<ul class="search"/>').appendTo(this.out);

    $('#search-progress').text(_('Preparing search...'));
    this.startPulse();
    this.query(query);
  }},

  query: function(query) {{
    var hlterms = [];
    var highlightstring = '?highlight=' + $.urlencode(hlterms.join(" "));
    $('#search-progress').empty();
    var url = this._dipus_url + "?q=" + $.urlencode(query);
    $.ajax({{
      url: url,
      dataType: 'jsonp',
      success: function(json) {{
        for (var i = 0; i < json.hits.length; i++) {{
          var hit = json.hits[i];
          var listItem = $('<li style="display:none"></li>');
          var msgbody = hit._source.message;
          if (DOCUMENTATION_OPTIONS.FILE_SUFFIX == '') {{
            // dirhtml builder
            var dirname = hit._source.path;
            if (dirname.match(/\/index\/$/)) {{
              dirname = dirname.substring(0, dirname.length-6);
            }} else if (dirname == 'index/') {{
              dirname = '';
            }}
            listItem.append($('<a/>').attr('href',
              DOCUMENTATION_OPTIONS.URL_ROOT + dirname +
              highlightstring + query).html(hit._source.title));
          }} else {{
            // normal html builders
            listItem.append($('<a/>').attr('href',
              hit._source.path + DOCUMENTATION_OPTIONS.FILE_SUFFIX +
              highlightstring + query).html(hit._source.title));
          }}
          if (msgbody) {{
            listItem.append($.makeSearchSummary(msgbody, Array(query), Array(query)));
            Search.output.append(listItem);
            listItem.slideDown(5);
          }} else if (DOCUMENTATION_OPTIONS.HAS_SOURCE) {{
            $.get(DOCUMENTATION_OPTIONS.URL_ROOT + '_sources/' +
                  hit._source.path + '.txt', function(data) {{
              if (data != '') {{
                listItem.append($.makeSearchSummary(data, Array(query), hlterms));
                Search.output.append(listItem);
              }}
              listItem.slideDown(5);
            }});
          }} else {{
            // no source available, just display title
            Search.output.append(listItem);
            listItem.slideDown(5);
          }}
        }};
        Search.stopPulse();
        Search.title.text(_('Search Results'));
        if (json.hits.length === 0) {{
          Search.status.text(_('Your search did not match any documents. Please make sure that all words are spelled correctly and that you\\'ve selected enough categories.'));
        }} else {{
          Search.status.text(_('Search finished, found %s page(s) matching the search query.').replace('%s', json.hits.length));
        }}
        Search.status.fadeIn(500);
      }},
      error: function(XMLHttpRequest, textStatus, errorThrown) {{
        console.log(textStatus, errorThrown);
      }}
    }});
  }}
}};

$(document).ready(function() {{
  Search.init();
}});
"""
| 35.556886 | 184 | 0.492253 | 555 | 5,938 | 5.189189 | 0.376577 | 0.026736 | 0.023611 | 0.027083 | 0.097222 | 0.075 | 0.048611 | 0.033333 | 0 | 0 | 0 | 0.011902 | 0.349107 | 5,938 | 166 | 185 | 35.771084 | 0.733247 | 0.007073 | 0 | 0.3 | 0 | 0.02 | 0.996437 | 0.238887 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0.006667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6ba1034d83cf267912fbc83efe67828fa37bf25 | 2,393 | py | Python | samples/sample_file_handling.py | Wacom-Developer/universal-ink-library | 689ed90e09e912b8fc9ac249984df43a7b59aa59 | [
"Apache-2.0"
] | 5 | 2021-09-06T11:45:37.000Z | 2022-03-24T15:56:06.000Z | samples/sample_file_handling.py | Wacom-Developer/universal-ink-library | 689ed90e09e912b8fc9ac249984df43a7b59aa59 | [
"Apache-2.0"
] | null | null | null | samples/sample_file_handling.py | Wacom-Developer/universal-ink-library | 689ed90e09e912b8fc9ac249984df43a7b59aa59 | [
"Apache-2.0"
] | 2 | 2021-09-03T09:08:45.000Z | 2021-12-15T14:03:16.000Z | # -*- coding: utf-8 -*-
# Copyright © 2021 Wacom Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import io
from uim.codec.parser.uim import UIMParser
from uim.codec.parser.will import WILL2Parser
from uim.codec.writer.encoder.encoder_3_1_0 import UIMEncoder310
from uim.model.ink import InkModel
if __name__ == '__main__':
    parser: UIMParser = UIMParser()
    # Parse UIM v3.0.0
    ink_model: InkModel = parser.parse('../ink/uim_3.0.0/1) Value of Ink 1.uim')
    # Save the model, this will overwrite an existing file
    with io.open('1) Value of Ink 1_3_0_0_to_3_1_0.uim', 'wb') as uim:
        # Encode as UIM v3.1.0
        uim.write(UIMEncoder310().encode(ink_model))
    # ------------------------------------------------------------------------------------------------------------------
    # Parse UIM v3.1.0
    # ------------------------------------------------------------------------------------------------------------------
    ink_model: InkModel = parser.parse('../ink/uim_3.1.0/1) Value of Ink 1 (3.1 delta).uim')
    # Save the model, this will overwrite an existing file
    with io.open('1) Value of Ink 1_3_1_0.uim', 'wb') as uim:
        # Encode as UIM v3.1.0
        uim.write(UIMEncoder310().encode(ink_model))
    # ------------------------------------------------------------------------------------------------------------------
    # Parse WILL 2 file from Inkspace (https://inkspace.wacom.com/)
    # ------------------------------------------------------------------------------------------------------------------
    parser: WILL2Parser = WILL2Parser()
    ink_model_2: InkModel = parser.parse('../ink/will/elephant.will')
    # Save the model, this will overwrite an existing file
    with io.open('elephant.uim', 'wb') as uim:
        # Encode as UIM v3.1.0
        uim.write(UIMEncoder310().encode(ink_model_2))
| 50.914894 | 120 | 0.558713 | 310 | 2,393 | 4.216129 | 0.33871 | 0.012242 | 0.019128 | 0.033665 | 0.370314 | 0.370314 | 0.359602 | 0.348891 | 0.348891 | 0.295333 | 0 | 0.034913 | 0.16214 | 2,393 | 46 | 121 | 52.021739 | 0.616459 | 0.578771 | 0 | 0.117647 | 0 | 0.058824 | 0.206122 | 0.02551 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.294118 | 0 | 0.294118 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6c04b1be112a409e5c402b61de90de419055381 | 389 | py | Python | autharch_sharc/editor/migrations/0042_sharciiif_order.py | kingsdigitallab/autharch_sharc | 92de5fbec8cc72ce48a9e25eb634d40ac2cc83ca | [
"MIT"
] | null | null | null | autharch_sharc/editor/migrations/0042_sharciiif_order.py | kingsdigitallab/autharch_sharc | 92de5fbec8cc72ce48a9e25eb634d40ac2cc83ca | [
"MIT"
] | null | null | null | autharch_sharc/editor/migrations/0042_sharciiif_order.py | kingsdigitallab/autharch_sharc | 92de5fbec8cc72ce48a9e25eb634d40ac2cc83ca | [
"MIT"
] | null | null | null | # Generated by Django 3.0.10 on 2021-07-09 09:38
from django.db import migrations, models
class Migration(migrations.Migration):
    dependencies = [
        ('editor', '0041_themeobject_sort_order'),
    ]

    operations = [
        migrations.AddField(
            model_name='sharciiif',
            name='order',
            field=models.IntegerField(default=1),
        ),
    ]
| 20.473684 | 50 | 0.601542 | 41 | 389 | 5.609756 | 0.804878 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07554 | 0.285347 | 389 | 18 | 51 | 21.611111 | 0.751799 | 0.118252 | 0 | 0 | 1 | 0 | 0.13783 | 0.079179 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6c079133086435312474069fd2c024714d94107 | 13,761 | py | Python | turtlebot3_dqn/src/turtlebot3_dqn/simulation_environment_real.py | 2529342549/turtlebot3_m_learning | 19fc961de8a993eafcd421186ad1c38473d04818 | [
"Apache-2.0"
] | 3 | 2020-01-27T09:23:50.000Z | 2022-03-24T09:58:48.000Z | turtlebot3_dqn/src/turtlebot3_dqn/simulation_environment_real.py | 2529342549/turtlebot3_machine_learning | bdb8cc0fa0110269cd3573d3f78011c3e0201e09 | [
"Apache-2.0"
] | null | null | null | turtlebot3_dqn/src/turtlebot3_dqn/simulation_environment_real.py | 2529342549/turtlebot3_machine_learning | bdb8cc0fa0110269cd3573d3f78011c3e0201e09 | [
"Apache-2.0"
] | 2 | 2020-01-27T09:23:54.000Z | 2021-09-20T04:07:13.000Z | #!/usr/bin/env python
#################################################################################
# Copyright 2018 ROBOTIS CO., LTD.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#################################################################################
# Authors: Gilbert #
import rospy
import numpy as np
import math
import time
from math import pi
from geometry_msgs.msg import Twist, Point, Pose, PoseWithCovarianceStamped
from sensor_msgs.msg import LaserScan
from nav_msgs.msg import Odometry
from std_srvs.srv import Empty
from std_msgs.msg import String
from tf.transformations import euler_from_quaternion, quaternion_from_euler
from simulation_respawn_real import Respawn
# from nodes.turtlebot3_real_transmission_2 import Sender
# from gazebo_msgs.msg import ModelStates, ModelState
class Env():
    def __init__(self, action_size):
        self.goal_x = 0
        self.goal_y = 0
        self.start_x = 0
        self.start_y = 0
        self.start_orientation = PoseWithCovarianceStamped()
        self.heading = 0
        self.count = 0
        self.action_size = action_size
        self.initGoal = True
        self.get_goalbox = False
        self.position = Pose()
        self.position_x, self.position_y = 0, 0
        self.pub_cmd_vel = rospy.Publisher('cmd_vel', Twist, queue_size=1, latch=True)
        self.sub_odom = rospy.Subscriber('odom', Odometry, self.getOdometry)
        self.respawn_goal = Respawn()
        self.action_memory = []
        self.time_start = time.time()
        self.orientation, self.yaw_init = 0, 0
        self.goal_x_map, self.goal_y_map = 0, 0
    def getGoalDistace(self):
        goal_distance = round(math.hypot(self.goal_x - self.position.x, self.goal_y - self.position.y), 2)
        return goal_distance
    def getOdometry(self, odom):
        self.position = odom.pose.pose.position
        self.position_x, self.position_y = self.position.x, self.position.y
        orientation = odom.pose.pose.orientation
        self.orientation = orientation
        orientation_list = [orientation.x, orientation.y, orientation.z, orientation.w]
        _, _, yaw = euler_from_quaternion(orientation_list)
        # print "odom yaw: ", yaw
        goal_angle = math.atan2(self.goal_y - self.position.y, self.goal_x - self.position.x)

        heading = goal_angle - yaw
        if heading > pi:
            heading -= 2 * pi
        elif heading < -pi:
            heading += 2 * pi

        self.heading = round(heading, 2)
    def getState(self, scan):
        scan_range = []
        scan_range2 = []
        # print scan.ranges
        heading = self.heading
        min_range = 0.3
        done = False

        # no filter
        # for i in range(len(scan.ranges)):
        #     if scan.ranges[i] == float('Inf'):
        #         scan_range.append(3.5)
        #     # zero Problem
        #     # elif np.isnan(scan.ranges[i]):
        #     #     scan_range.append(0)
        #     elif scan.ranges[i] <= 0.07:
        #         scan_range.append(3.5)
        #     else:
        #         scan_range.append(scan.ranges[i])

        # Filter
        i = 0
        while i <= len(scan.ranges) - 1:
            # print "length", len(scan_range)
            if scan.ranges[i] == float('Inf'):
                scan_range.append(3.5)
                i += 1
            elif scan.ranges[i] == 0:
                k = 1
                t = 0
                if i == 0:
                    while scan.ranges[k] == 0:
                        k += 1
                    while t <= k:
                        scan_range.append(scan.ranges[k])
                        t += 1
                    i = k + 1
                else:
                    k = i
                    m = i
                    a = scan.ranges[i - 1]
                    while scan.ranges[k] == 0:
                        if k == 359:
                            # run of zeros reaches the last beam: pad with the
                            # previous value, build the state and return early
                            while m <= k:
                                scan_range.append(a)
                                m += 1
                            for i in range(len(scan_range)):
                                if scan_range[i] < 0.12:
                                    scan_range2.append(0.12)
                                else:
                                    scan_range2.append(scan_range[i])
                            current_distance = round(math.hypot(self.goal_x - self.position.x, self.goal_y - self.position.y), 2)
                            # if current_distance < 0.2:
                            if current_distance < 0.15:
                                vel_cmd = Twist()
                                self.get_goalbox = True
                            obstacle_min_range = round(min(scan_range), 2)
                            obstacle_angle = np.argmin(scan_range)
                            if min_range > min(scan_range) > 0:
                                done = True
                            return scan_range2 + [heading, current_distance, obstacle_min_range, obstacle_angle], done
                        k += 1
                    b = scan.ranges[k]
                    while m < k:
                        scan_range.append(max(a, b))
                        m += 1
                    i = k
            else:
                scan_range.append(scan.ranges[i])
                i += 1

        i = 0
        for i in range(len(scan_range)):
            if scan_range[i] < 0.12:
                scan_range2.append(0.12)
            else:
                scan_range2.append(scan_range[i])

        obstacle_min_range = round(min(scan_range), 2)
        obstacle_angle = np.argmin(scan_range)
        if min_range > min(scan_range) > 0:
            done = True

        current_distance = round(math.hypot(self.goal_x - self.position.x, self.goal_y - self.position.y), 2)
        # if current_distance < 0.2:
        if current_distance < 0.15:
            vel_cmd = Twist()
            self.get_goalbox = True

        return scan_range2 + [heading, current_distance, obstacle_min_range, obstacle_angle], done
    def setReward(self, state, done, action):
        yaw_reward = []
        obstacle_min_range = state[-2]
        current_distance = state[-3]
        heading = state[-4]

        for i in range(5):
            angle = -pi / 4 + heading + (pi / 8 * i) + pi / 2
            tr = 1 - 4 * math.fabs(0.5 - math.modf(0.25 + 0.5 * angle % (2 * math.pi) / math.pi)[0])
            yaw_reward.append(tr)

        distance_rate = 2 ** (current_distance / self.goal_distance)

        if obstacle_min_range < 0.5:
            ob_reward = -5
        else:
            ob_reward = 0

        reward = ((round(yaw_reward[action] * 5, 2)) * distance_rate) + ob_reward

        if done:
            rospy.loginfo("Near Collision!!")
            reward = -200
            # driving backwards last 25 actions ~5 seconds
            t = 0
            l = len(self.action_memory)
            vel_cmd = Twist()
            # while t <= 10:
            #     if len(self.action_memory) > 20:
            #         max_angular_vel = -1.5
            #         action = self.action_memory[l-t-1]
            #         ang_vel = ((-self.action_size + 1)/2 - action) * max_angular_vel * 0.5
            #         vel_cmd.linear.x = -0.15
            #         # vel_cmd.angular.z = ang_vel
            #         vel_cmd.angular.z = 0
            #         time_start = time.time()
            #         a = 0
            #         self.pub_cmd_vel.publish(vel_cmd)
            #         t += 1
            #     else:
            #         t = 10

            # stand still after collision
            vel_cmd.linear.x = 0
            vel_cmd.angular.z = 0
            time_start = time.time()
            a = 0
            while a < 1:
                self.pub_cmd_vel.publish(vel_cmd)
                a = time.time() - time_start

        if self.get_goalbox:
            rospy.loginfo("Goal!!")
            print "start_position: ", self.start_x, "/ ", self.start_y
            print "odom_position:", self.position.x, "/ ", self.position.y
            print "goal_position: ", self.goal_x, "/ ", self.goal_y
            print "action: ", action
            print "_______________________________________________________________"
            reward = 500
            self.get_goalbox = False
            done = True
            vel_cmd = Twist()
            vel_cmd.linear.x = 0
            vel_cmd.angular.z = 0
            start = 0
            start_1 = time.time()
            while start - 5 < 0:
                self.pub_cmd_vel.publish(vel_cmd)
                start = time.time() - start_1
            # self.pub_cmd_vel.publish(vel_cmd)
            # self.goal_x, self.goal_y = self.respawn_goal.getPosition()
            # self.goal_distance = self.getGoalDistace()

        return reward, done
    def speed(self, state):
        # Calculate the data new with a filter
        scan_range = []
        speed = 0.15
        speed_goal = 0
        for i in range(len(state)):
            if state[i] < 0.30:
                scan_range.append(3.5)
            else:
                scan_range.append(state[i])
        scan_range = state
        obstacle_min_range = round(min(scan_range), 2)
        goal_distance = scan_range[361]
        # print obstacle_min_range

        if obstacle_min_range >= 1:
            speed = 0.15
        elif obstacle_min_range < 1 and obstacle_min_range >= 0.3:
            speed = 0.15 + ((obstacle_min_range-1)/7)
        speed_goal = speed
        if goal_distance < 0.5:
            speed_goal = 0.15 + (goal_distance - 0.)/8
        speed = min([speed, speed_goal])
        return speed
    def step(self, action):
        time1 = time.time()
        data = None
        while data is None:
            try:
                data = rospy.wait_for_message('scan', LaserScan, timeout=5)
            except:
                pass

        vel_cmd = Twist()
        vel_cmd.linear.x = 0
        state, done = self.getState(data)
        reward, done = self.setReward(state, done, action)

        if not done:
            max_angular_vel = 1.5
            # max_angular_vel = 0.15
            ang_vel = ((self.action_size - 1)/2 - action) * max_angular_vel * 0.5
            vel_cmd = Twist()
            vel_cmd.linear.x = self.speed(state)
            # vel_cmd.linear.x = 0.15
            vel_cmd.angular.z = ang_vel
            self.action_memory.append(-1*action)
            time_start = time.time()
            self.pub_cmd_vel.publish(vel_cmd)

        if self.count % 2 == 0:
            print "start_position: ", self.start_x, "/ ", self.start_y
            print "odom_position:", self.position.x, "/ ", self.position.y
            print "goal_position: ", self.goal_x, "/ ", self.goal_y
            print "goal_distance: ", state[-3], "/ obstacle_distance: ", state[-2]
            print "Vel_linear: ", vel_cmd.linear.x, "action: ", action
            print done
            print "_____________________________________________________________"
        self.count += 1

        return np.asarray(state), reward, done
    def reset(self):
        # coordinate receive, transformation
        yaw_neu = 0
        if self.initGoal:
            self.start_x_map, self.start_y_map, start_orientation_2 = self.respawn_goal.getstartPosition()
            self.goal_x_map, self.goal_y_map = self.respawn_goal.getPosition()
            start_orientation_list = [start_orientation_2.x, start_orientation_2.y, start_orientation_2.z, start_orientation_2.w]
            _, _, self.yaw_init = euler_from_quaternion(start_orientation_list)
            self.initGoal = False
            # self.goal_x, self.goal_y = self.goal_x_map, self.goal_y_map
        else:
            self.start_x_map, self.start_y_map = self.goal_x_map, self.goal_y_map
            self.goal_x_map, self.goal_y_map = self.respawn_goal.getPosition()
            orientation = self.orientation
            orientation_list = [orientation.x, orientation.y, orientation.z, orientation.w]
            _, _, yaw_neu = euler_from_quaternion(orientation_list)
            print "yaw_neu:", yaw_neu
            # self.goal_x_map, self.goal_y_map = self.goal_x, self.goal_y

        print "Wait 3 sec"
        time.sleep(3)

        # in map coordinates
        # diff_x = self.goal_x - self.start_x + self.position
        # diff_y = self.goal_y - self.start_y + self.position
        diff_x = self.goal_x_map - self.start_x_map
        diff_y = self.goal_y_map - self.start_y_map
        print "diff_x: ", diff_x
        print "diff_y: ", diff_y
        print "yaw_neu: ", yaw_neu
        # yaw = yaw_neu + self.yaw_init
        # print "yaw: ", yaw

        # Transformation
        yaw = self.yaw_init
        self.goal_x = math.cos(yaw)*diff_x + math.sin(yaw)*diff_y + self.position_x
        self.goal_y = -1*math.sin(yaw)*diff_x + math.cos(yaw)*diff_y + self.position_y

        self.goal_distance = self.getGoalDistace()
        data = None
        while data is None:
            try:
                data = rospy.wait_for_message('scan', LaserScan, timeout=5)
            except:
                pass

        self.goal_distance = self.getGoalDistace()
        state, done = self.getState(data)

        return np.asarray(state)
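The angle bookkeeping in getOdometry — the bearing from robot to goal minus the robot yaw, folded back into [-pi, pi] — can be isolated as a pure function. This is a standalone sketch for illustration; the function name and test coordinates are mine, only the folding logic comes from the node:

```python
from math import atan2, pi

def heading_to_goal(goal_x, goal_y, pos_x, pos_y, yaw):
    # Bearing from the robot to the goal, in the odom frame.
    goal_angle = atan2(goal_y - pos_y, goal_x - pos_x)
    heading = goal_angle - yaw
    # Fold into [-pi, pi], mirroring the if/elif in getOdometry.
    if heading > pi:
        heading -= 2 * pi
    elif heading < -pi:
        heading += 2 * pi
    return round(heading, 2)

# Robot at the origin facing +x, goal straight ahead: heading is 0.
print(heading_to_goal(1.0, 0.0, 0.0, 0.0, 0.0))  # 0.0
# Goal behind the robot with yaw pi/2: result stays inside [-pi, pi].
assert -pi <= heading_to_goal(-1.0, 0.0, 0.0, 0.0, pi / 2) <= pi
```

Keeping the heading in this range is what lets the reward shaping in setReward treat left and right deviations symmetrically.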
| 37.70137 | 129 | 0.534263 | 1,680 | 13,761 | 4.098214 | 0.147619 | 0.049964 | 0.024837 | 0.018882 | 0.472331 | 0.397821 | 0.354829 | 0.329702 | 0.290051 | 0.279593 | 0 | 0.025331 | 0.357387 | 13,761 | 364 | 130 | 37.804945 | 0.753251 | 0.160235 | 0 | 0.384921 | 0 | 0 | 0.033295 | 0.010951 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.007937 | 0.047619 | null | null | 0.06746 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6c390b45f2052455e5898ffbc22af9be8ea36fa | 878 | py | Python | run.py | sulavmhrzn/facebook-autoreply-bot | 2196f392c03305a9d9eca9bd70e2e6dafc38c995 | [
"MIT"
] | null | null | null | run.py | sulavmhrzn/facebook-autoreply-bot | 2196f392c03305a9d9eca9bd70e2e6dafc38c995 | [
"MIT"
] | null | null | null | run.py | sulavmhrzn/facebook-autoreply-bot | 2196f392c03305a9d9eca9bd70e2e6dafc38c995 | [
"MIT"
] | null | null | null | from utils.app import SendBot
try:
    from dotenv import load_dotenv
    import os
except ModuleNotFoundError:
    print('Required modules not found.')
    exit()

load_dotenv()

env = input('Load environment variables? (y/n): ').lower()
options = ['y', 'n']

if env in options:
    if env == 'n':
        email = input('Email: ')
        password = input('Password: ')
        if email and password:
            client = SendBot(email, password, max_tries=100)
            # Sets active status
            client.setActiveStatus(markAlive=False)
            client.listen()
        else:
            print('Enter credentials.')
    else:
        client = SendBot(os.getenv('EMAIL'), os.getenv(
            'PASSWORD'), max_tries=100)
        # Sets active status
        client.setActiveStatus(markAlive=False)
        client.listen()
else:
    print('Please type y or n')
| 24.388889 | 60 | 0.595672 | 99 | 878 | 5.242424 | 0.484848 | 0.046243 | 0.061657 | 0.073218 | 0.350674 | 0.350674 | 0.350674 | 0.350674 | 0.350674 | 0.350674 | 0 | 0.0096 | 0.288155 | 878 | 35 | 61 | 25.085714 | 0.8208 | 0.042141 | 0 | 0.259259 | 0 | 0 | 0.156325 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.148148 | 0.111111 | 0 | 0.111111 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
a6c7dfeaf23ecebc079761c27c16e7671b2bf8e6 | 334 | py | Python | data/ratings/tezxt.py | SLAB-NLP/Akk | baa07b0fdf8c7d8623fbd78508867c30a8a7ff6d | [
"MIT"
] | 5 | 2021-09-14T07:09:07.000Z | 2021-11-15T19:43:13.000Z | data/ratings/tezxt.py | SLAB-NLP/Akk | baa07b0fdf8c7d8623fbd78508867c30a8a7ff6d | [
"MIT"
] | null | null | null | data/ratings/tezxt.py | SLAB-NLP/Akk | baa07b0fdf8c7d8623fbd78508867c30a8a7ff6d | [
"MIT"
] | 1 | 2021-11-15T19:43:19.000Z | 2021-11-15T19:43:19.000Z | with open(r"D:\Drive\לימודים\מאגרי מידע\זמני\ancient-text-processing\jsons_unzipped\saao\saa01\catalogue.json","r",encoding="utf_8") as file:
    catalog = eval(file.read())["members"]

rulers = []
for c in catalog:
    cat = catalog[c]
    if cat["period"] == "Neo-Assyrian" and cat.get("ruler"):
        rulers += cat["ruler"]
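The catalogue file is JSON, so `json.load` does the same job as `eval` without executing arbitrary code. A sketch on a tiny in-memory document; the entry IDs and ruler values are made up, and I assume `"ruler"` holds a list (the field names follow the script above):

```python
import io
import json

# Hypothetical miniature of catalogue.json with the fields the script reads.
doc = io.StringIO(
    '{"members": {'
    '"P001": {"period": "Neo-Assyrian", "ruler": ["Sargon II"]},'
    '"P002": {"period": "Old-Babylonian"}}}'
)
catalog = json.load(doc)["members"]

rulers = []
for c in catalog:
    cat = catalog[c]
    if cat["period"] == "Neo-Assyrian" and cat.get("ruler"):
        rulers += cat["ruler"]
```

Note that if `"ruler"` were a string rather than a list, `rulers += cat["ruler"]` would extend the list character by character, which is worth checking against the real data.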
| 33.4 | 141 | 0.652695 | 49 | 334 | 4.408163 | 0.795918 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010676 | 0.158683 | 334 | 9 | 142 | 37.111111 | 0.758007 | 0 | 0 | 0 | 0 | 0.142857 | 0.413174 | 0.287425 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6cd531ba3259edc8b54ccf233a89ab0a561de13 | 2,179 | py | Python | tests/events/test_api_gateway_authorizer.py | chuckwondo/aws-lambda-typing | 8417ab67f2492be1508fe38b2c34bc106619a56d | [
"MIT"
] | 29 | 2021-01-07T13:35:16.000Z | 2022-03-25T07:20:54.000Z | tests/events/test_api_gateway_authorizer.py | chuckwondo/aws-lambda-typing | 8417ab67f2492be1508fe38b2c34bc106619a56d | [
"MIT"
] | 13 | 2021-02-28T00:31:00.000Z | 2022-03-29T15:24:01.000Z | tests/events/test_api_gateway_authorizer.py | chuckwondo/aws-lambda-typing | 8417ab67f2492be1508fe38b2c34bc106619a56d | [
"MIT"
] | 5 | 2021-02-27T13:50:42.000Z | 2022-01-13T15:05:44.000Z | from aws_lambda_typing.events import (
    APIGatewayRequestAuthorizerEvent,
    APIGatewayTokenAuthorizerEvent,
)


def test_api_gateway_token_authorizer_event() -> None:
    event: APIGatewayTokenAuthorizerEvent = {
        "type": "TOKEN",
        "authorizationToken": "allow",
        "methodArn": "arn:aws:execute-api:us-west-2:123456789012:ymy8tbxw7b/*/GET/",  # noqa: E501
    }


def test_api_gateway_request_authorizer_event() -> None:
    event: APIGatewayRequestAuthorizerEvent = {
        "type": "REQUEST",
        "methodArn": "arn:aws:execute-api:us-east-1:123456789012:abcdef123/test/GET/request",  # noqa: E501
        "resource": "/request",
        "path": "/request",
        "httpMethod": "GET",
        "headers": {
            "X-AMZ-Date": "20170718T062915Z",
            "Accept": "*/*",
            "HeaderAuth1": "headerValue1",
            "CloudFront-Viewer-Country": "US",
            "CloudFront-Forwarded-Proto": "https",
            "CloudFront-Is-Tablet-Viewer": "false",
            "CloudFront-Is-Mobile-Viewer": "false",
            "User-Agent": "...",
        },
        "queryStringParameters": {"QueryString1": "queryValue1"},
        "pathParameters": {},
        "stageVariables": {"StageVar1": "stageValue1"},
        "requestContext": {
            "path": "/request",
            "accountId": "123456789012",
            "resourceId": "05c7jb",
            "stage": "test",
            "requestId": "...",
            "identity": {
                "apiKey": "...",
                "sourceIp": "...",
                "clientCert": {
                    "clientCertPem": "CERT_CONTENT",
                    "subjectDN": "www.example.com",
                    "issuerDN": "Example issuer",
                    "serialNumber": "a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1:a1",  # noqa: E501
                    "validity": {
                        "notBefore": "May 28 12:30:02 2019 GMT",
                        "notAfter": "Aug 5 09:36:04 2021 GMT",
                    },
                },
            },
            "resourcePath": "/request",
            "httpMethod": "GET",
            "apiId": "abcdef123",
        },
    }
| 36.316667 | 107 | 0.502065 | 169 | 2,179 | 6.39645 | 0.609467 | 0.055504 | 0.077706 | 0.096207 | 0.079556 | 0.079556 | 0.029602 | 0.029602 | 0.029602 | 0.029602 | 0 | 0.080745 | 0.335016 | 2,179 | 59 | 108 | 36.932203 | 0.665286 | 0.014686 | 0 | 0.072727 | 0 | 0.036364 | 0.417639 | 0.140924 | 0 | 0 | 0 | 0 | 0 | 1 | 0.036364 | true | 0 | 0.018182 | 0 | 0.054545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6cea9ab25c6ee3d7b3a6630ab209a88876c39c1 | 713 | py | Python | airflow/pyspark/weekday/avg_temperature.py | juliocnsouzadev/gcp-data-engineer | c32a516440c8989f28a33234a05a02873c7fc5b8 | [
"MIT"
] | null | null | null | airflow/pyspark/weekday/avg_temperature.py | juliocnsouzadev/gcp-data-engineer | c32a516440c8989f28a33234a05a02873c7fc5b8 | [
"MIT"
] | null | null | null | airflow/pyspark/weekday/avg_temperature.py | juliocnsouzadev/gcp-data-engineer | c32a516440c8989f28a33234a05a02873c7fc5b8 | [
"MIT"
] | null | null | null | #!/usr/bin/python
from pyspark.sql import SparkSession
spark = (
    SparkSession.builder.master("yarn")
    .appName("bigquery-analytics-avg-temperature")
    .getOrCreate()
)

bucket = "01-logistics-backup"
spark.conf.set("temporaryGcsBucket", bucket)

history = (
    spark.read.format("bigquery").option("table", "vehicle_analytics.history").load()
)
history.createOrReplaceTempView("history")

avg_temperature = spark.sql(
    "SELECT vehicle_id, date, AVG(temperature) AS avg_temperature FROM history GROUP BY vehicle_id, date"
)

avg_temperature.show()
avg_temperature.printSchema()

avg_temperature.write.format("bigquery").option(
    "table", "vehicle_analytics.avg_temperature"
).mode("append").save()
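The aggregation the job pushes to Spark SQL is a plain AVG over `(vehicle_id, date)` groups. An equivalent computed by hand on made-up rows, to show what `avg_temperature` ends up holding per group:

```python
from collections import defaultdict

# Hypothetical history rows: (vehicle_id, date, temperature).
rows = [
    ("v1", "2021-01-01", 70.0),
    ("v1", "2021-01-01", 74.0),
    ("v2", "2021-01-01", 65.0),
]

# Accumulate sum and count per (vehicle_id, date) group.
sums = defaultdict(lambda: [0.0, 0])
for vehicle_id, date, temperature in rows:
    acc = sums[(vehicle_id, date)]
    acc[0] += temperature
    acc[1] += 1

avg_temperature = {key: total / count for key, (total, count) in sums.items()}
print(avg_temperature[("v1", "2021-01-01")])  # 72.0
```

Each output row of the Spark query corresponds to one key of this dictionary.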
| 26.407407 | 105 | 0.748948 | 82 | 713 | 6.390244 | 0.54878 | 0.21374 | 0.087786 | 0.09542 | 0.259542 | 0.156489 | 0 | 0 | 0 | 0 | 0 | 0.003135 | 0.105189 | 713 | 26 | 106 | 27.423077 | 0.818182 | 0.02244 | 0 | 0 | 0 | 0 | 0.389368 | 0.132184 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05 | 0 | 0.05 | 0.05 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6d31d9e8aa8ac0ef8afc1f1dce26c8c8fb6d88e | 422 | py | Python | py/tests/test_cli.py | sthagen/odata-url-parser | b05397c5fb9f33bcd2b883f82bda0a5a388eadae | [
"MIT"
] | 2 | 2020-09-11T20:01:08.000Z | 2020-09-12T11:40:43.000Z | py/tests/test_cli.py | sthagen/python-odata_url_parser | b05397c5fb9f33bcd2b883f82bda0a5a388eadae | [
"MIT"
] | 11 | 2020-09-10T20:55:45.000Z | 2020-09-12T12:51:02.000Z | py/tests/test_cli.py | sthagen/python-odata_url_parser | b05397c5fb9f33bcd2b883f82bda0a5a388eadae | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# pylint: disable=missing-docstring,unused-import,reimported
import io
import pytest # type: ignore
import odata_url_parser.cli as cli
import odata_url_parser.odata_url_parser as oup
def test_main_ok_minimal(capsys):
    job = ['does not matter']
    report_expected = job[0]
    assert cli.main(job) is None
    out, err = capsys.readouterr()
    assert out.strip() == report_expected.strip()
| 26.375 | 60 | 0.725118 | 62 | 422 | 4.758065 | 0.645161 | 0.081356 | 0.142373 | 0.135593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005666 | 0.163507 | 422 | 15 | 61 | 28.133333 | 0.830028 | 0.220379 | 0 | 0 | 0 | 0 | 0.046154 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.1 | false | 0 | 0.4 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
a6d47efb044c92d4dfec30a904a3f3088fdb915c | 1,005 | py | Python | openapi_documentor/openapi/views.py | codeasashu/openapi-documentor | dde825edaac85bb117d06adf0a4eabf1f5da44f5 | [
"MIT"
] | null | null | null | openapi_documentor/openapi/views.py | codeasashu/openapi-documentor | dde825edaac85bb117d06adf0a4eabf1f5da44f5 | [
"MIT"
] | 5 | 2021-04-06T07:46:47.000Z | 2022-03-02T13:12:20.000Z | openapi_documentor/openapi/views.py | codeasashu/openapi-documentor | dde825edaac85bb117d06adf0a4eabf1f5da44f5 | [
"MIT"
] | null | null | null | from django.contrib.auth.mixins import LoginRequiredMixin
from django.shortcuts import get_object_or_404
from django.views.generic import DetailView, ListView
from taggit.models import Tag
from .models import Document
class OpenapiListView(LoginRequiredMixin, ListView):
    model = Document
    context_object_name = "apis"
    paginate_by = 10


api_list_view = OpenapiListView.as_view()


class OpenapiDetailView(LoginRequiredMixin, DetailView):
    model = Document
    context_object_name = "api"


api_detail_view = OpenapiDetailView.as_view()


class OpenapiTaggedView(LoginRequiredMixin, ListView):
    context_object_name = "apis"
    paginate_by = 10
    template_name = "document_list.html"

    def get_queryset(self):
        slug = self.kwargs.get("tag", None)
        if slug:
            tag = get_object_or_404(Tag, slug=slug)
            return Document.objects.filter(tags=tag)
        else:
            return Document.objects.none()


api_tagged_view = OpenapiTaggedView.as_view()
| 24.512195 | 57 | 0.734328 | 119 | 1,005 | 5.983193 | 0.428571 | 0.042135 | 0.071629 | 0.039326 | 0.15309 | 0.092697 | 0.092697 | 0 | 0 | 0 | 0 | 0.01227 | 0.189055 | 1,005 | 40 | 58 | 25.125 | 0.86135 | 0 | 0 | 0.230769 | 0 | 0 | 0.031841 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.038462 | false | 0 | 0.192308 | 0 | 0.730769 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
a6da3998b2da3e2208b50eecda9469f494a116aa | 630 | py | Python | app/config.py | midnights-straychild/weatherman | 50354f0639fbcdde01e1ac6290bf71379581868b | [
"MIT"
] | null | null | null | app/config.py | midnights-straychild/weatherman | 50354f0639fbcdde01e1ac6290bf71379581868b | [
"MIT"
] | null | null | null | app/config.py | midnights-straychild/weatherman | 50354f0639fbcdde01e1ac6290bf71379581868b | [
"MIT"
] | null | null | null | class Config:
conf = {
"labels": {
"pageTitle": "Weatherman V0.0.1"
},
"db.database": "weatherman",
"db.username": "postgres",
"db.password": "postgres",
"navigation": [
{
"url": "/",
"name": "Home"
},
{
"url": "/cakes",
"name": "Cakes"
},
{
"url": "/mqtt",
"name": "MQTT"
}
]
}
def get_config(self):
return self.conf
def get(self, key):
return self.conf[key]
| 21 | 44 | 0.339683 | 45 | 630 | 4.733333 | 0.555556 | 0.056338 | 0.131455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009615 | 0.504762 | 630 | 29 | 45 | 21.724138 | 0.673077 | 0 | 0 | 0 | 0 | 0 | 0.233333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0.037037 | 0 | 0.074074 | 0.222222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
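A short usage sketch for the `Config` class in `app/config.py` above (the class is re-declared in trimmed form so the snippet is self-contained; note that `conf` is a class attribute, so it is shared by every instance):

```python
class Config:
    # trimmed copy of the original class-level configuration dict
    conf = {
        "labels": {"pageTitle": "Weatherman V0.0.1"},
        "db.database": "weatherman",
    }

    def get_config(self):
        return self.conf

    def get(self, key):
        return self.conf[key]


cfg = Config()
print(cfg.get("db.database"))          # -> weatherman
print(cfg.get("labels")["pageTitle"])  # -> Weatherman V0.0.1
```

Because `conf` lives on the class rather than on instances, mutating it through one `Config` object changes what every other object sees — fine for read-only settings, surprising otherwise.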
a6e3c3ffa2830e6dc6e8d6bc0393272aef1d0fd6 | 349 | py | Python | tab2comma.py | Guerillero/GeolocationFun | f61be4f2b3e0a6a2c4641f83ae29ff161eb861fe | [
"MIT"
] | 1 | 2016-03-11T10:26:08.000Z | 2016-03-11T10:26:08.000Z | tab2comma.py | Guerillero/GeolocationFun | f61be4f2b3e0a6a2c4641f83ae29ff161eb861fe | [
"MIT"
] | null | null | null | tab2comma.py | Guerillero/GeolocationFun | f61be4f2b3e0a6a2c4641f83ae29ff161eb861fe | [
"MIT"
] | null | null | null | # Converts the geo_data .tsv into a more ArcMap-friendly .csv
import csv
import sys

csv.field_size_limit(sys.maxsize)

with open('geo_data.tsv', 'r') as fin, open('geo_data.csv', 'w') as fout:
    tabfile = csv.reader(fin, dialect=csv.excel_tab)
    commafile = csv.writer(fout, dialect=csv.excel)
    for row in tabfile:
        commafile.writerow(row)
print("done")
| 20.529412 | 58 | 0.724928 | 57 | 349 | 4.333333 | 0.631579 | 0.08502 | 0.089069 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.151862 | 349 | 16 | 59 | 21.8125 | 0.834459 | 0.163324 | 0 | 0 | 0 | 0 | 0.103093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.2 | null | null | 0.1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
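The same tab-to-comma conversion in `tab2comma.py` can be exercised in memory with `io.StringIO`, handy for a quick check without creating files (a sketch; `tabs_to_commas` is a hypothetical helper, not part of the original script):

```python
import csv
import io


def tabs_to_commas(tsv_text):
    """Convert tab-separated text to comma-separated text (hypothetical helper)."""
    fin = io.StringIO(tsv_text)
    fout = io.StringIO()
    reader = csv.reader(fin, dialect=csv.excel_tab)
    writer = csv.writer(fout, dialect=csv.excel)
    for row in reader:
        writer.writerow(row)
    return fout.getvalue()


print(repr(tabs_to_commas("a\tb\nc\td\n")))  # -> 'a,b\r\nc,d\r\n'
```

Note the `\r\n` line endings: the `csv.excel` dialect terminates rows with CRLF by default, which is also what the file-based script produces.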
a6e636c3407d17a05d2d215c473b50da1c8bf471 | 464 | py | Python | systemfixtures/tests/test_users.py | alejdg/systemfixtures | d1c42d83c3dca2a36b52e8fc214639ebcb1cd8a1 | [
"MIT"
] | 13 | 2017-01-24T15:25:47.000Z | 2022-01-06T23:56:06.000Z | systemfixtures/tests/test_users.py | cjwatson/systemfixtures | 6ff52e224585d8fab2908dc08a22fe36dcaf93d4 | [
"MIT"
] | 10 | 2017-03-08T09:36:01.000Z | 2022-02-09T11:08:00.000Z | systemfixtures/tests/test_users.py | cjwatson/systemfixtures | 6ff52e224585d8fab2908dc08a22fe36dcaf93d4 | [
"MIT"
] | 5 | 2017-03-08T09:30:51.000Z | 2022-02-05T23:22:25.000Z | import pwd
from testtools import TestCase
from ..users import FakeUsers
class FakeUsersTest(TestCase):
def setUp(self):
super(FakeUsersTest, self).setUp()
self.users = self.useFixture(FakeUsers())
def test_real(self):
info = pwd.getpwnam("root")
self.assertEqual(0, info.pw_uid)
def test_fake(self):
self.users.add("foo", 123)
info = pwd.getpwnam("foo")
self.assertEqual(123, info.pw_uid)
| 21.090909 | 49 | 0.642241 | 58 | 464 | 5.068966 | 0.465517 | 0.061224 | 0.102041 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019774 | 0.237069 | 464 | 21 | 50 | 22.095238 | 0.810734 | 0 | 0 | 0 | 0 | 0 | 0.021552 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.214286 | false | 0 | 0.214286 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6e82e8401f083b412aeb15f384d0aa8ee6b7b91 | 5,666 | py | Python | StatisticalModelling.py | bdolenc/Zemanta-challenge | 5ece77c48bf6da4e96de6bceb910ac77496f54e2 | [
"MIT"
] | null | null | null | StatisticalModelling.py | bdolenc/Zemanta-challenge | 5ece77c48bf6da4e96de6bceb910ac77496f54e2 | [
"MIT"
] | null | null | null | StatisticalModelling.py | bdolenc/Zemanta-challenge | 5ece77c48bf6da4e96de6bceb910ac77496f54e2 | [
"MIT"
] | null | null | null | #The code is published under MIT license.
from sklearn.ensemble import RandomForestClassifier
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.ensemble import GradientBoostingClassifier
from sklearn import cross_validation
from sklearn.cross_validation import StratifiedKFold
from sklearn.metrics import roc_curve, auc
from sklearn.linear_model import LogisticRegression
import pandas as pd
import csv
import numpy as np
def prepare_data(learn_file, labels_file):
"""
Open learning set, cluster labels
and change ZIP codes with corresponding
cluster label. Return X and y for learning.
"""
    print("---preparing data...")
l_set = pd.read_csv(learn_file, sep='\t')
# t_set = pd.read_csv(test_file, sep='\t', header=None, names=['click', 'creative_id', 'zip', 'domain', 'page'])
# t_set = pd.read_csv(test_file, sep='\t', header=None, names=['creative_id', 'zip', 'domain', 'page'])
l_set = l_set.iloc[::5, :]
# t_set = t_set.iloc[::5, :]
#replace NaN values with zero.
l_set = l_set.fillna(0)
# t_set = t_set.fillna(0)
with open(labels_file, mode='r') as file_in:
reader = csv.reader(file_in)
c_labels = {float(rows[0]): rows[1] for rows in reader}
#change ZIP with label
    l_set['zip'] = pd.to_numeric(l_set['zip'], errors='coerce')
l_set['zip'] = l_set['zip'].map(c_labels.get)
# Change ZIP with label
# t_set['zip'] = t_set['zip'].convert_objects(convert_numeric=True).dropna()
# t_set['zip'] = t_set['zip'].map(c_labels.get)
l_set = l_set.reindex(np.random.permutation(l_set.index))
    print("done---")
#remove where ZIP None - for testing on part data
# l_set = l_set[l_set.zip.notnull()]
# t_set = t_set[t_set.zip.notnull()]
#X for learning features, y for click
X = l_set[['creative_id', 'zip', 'domain']]
y = l_set['click']
# X_sub = t_set[['creative_id', 'zip', 'domain']]
# y_sub = t_set['click']
#Replace domain with numeric
unique_d = set(X['domain'])
# print len(unique_d)
# unique_d |= set(X_sub['domain'])
dict_d = {}
for c, d in enumerate(unique_d):
dict_d[d] = c
X['domain'] = X['domain'].map(dict_d.get)
X = X.fillna(0)
# X_sub['domain'] = X_sub['domain'].map(dict_d.get)
# X_sub = X_sub.fillna(0)
return X, y, # X_sub, y_sub
def random_forest(X, y, n_estimators):
"""
Scikit Random Forest implementation
with 100 trees, testing on 0.4 part
of the data, and train on 0.6.
"""
#Scale data
#X = StandardScaler().fit_transform(X)
#split data to train and test
X_train, X_test, y_train, y_test = cross_validation.train_test_split(X, y, test_size=0.4)
# print X_train
# print y_train
# create rfc object
forest = RandomForestClassifier(n_estimators=n_estimators)
#fit training data
prob = forest.fit(X_train, y_train, ).predict_proba(X_test)
#compute ROC
fpr, tpr, thresholds = roc_curve(y_test, prob[:, 1])
roc_auc = auc(fpr, tpr)
#print fpr, tpr, thresholds
    print("AUC Random Forest: " + str(roc_auc))
def stacking_scikit(X, y, n_estimators):
"""
Stacking with classifiers from scikit-learn
library. Based on example
https://github.com/log0/vertebral/blob/master/stacked_generalization.py
"""
X = X.as_matrix()
y = y.as_matrix()
base_classifiers = [RandomForestClassifier(n_estimators=n_estimators),
ExtraTreesClassifier(n_estimators=n_estimators),
GradientBoostingClassifier(n_estimators=n_estimators)]
clf_names = ["Random Forest", "Extra Trees Classifier", "Gradient Boosting Classifier"]
# Divide data on training and test set
X_train, X_test, y_train, y_test = cross_validation.train_test_split(X, y, test_size=0.2)
# Arrays for classifier results
out_train = np.zeros((X_train.shape[0], len(base_classifiers)))
out_test = np.zeros((X_test.shape[0], len(base_classifiers)))
t_cv = list(StratifiedKFold(y_train, n_folds=5))
for i, clf in enumerate(base_classifiers):
        print("Training classifier " + clf_names[i])
cv_probabilities = np.zeros((X_test.shape[0], len(t_cv)))
# cross validation train
for j, (train_i, test_i) in enumerate(t_cv):
X_train_0 = X_train[train_i]
y_train_0 = y_train[train_i]
X_test_0 = X_train[test_i]
# train each classifier
clf.fit(X_train_0, y_train_0)
# Get probabilities for click on internal test data
proba = clf.predict_proba(X_test_0)
out_train[test_i, i] = proba[:, 1]
# Probabilities for test data
proba_test = clf.predict_proba(X_test)
cv_probabilities[:, j] = proba_test[:, 1]
# Average of predictions
out_test[:, i] = cv_probabilities.mean(1)
    print("Stacking with Logistic regression")
stack_clf = LogisticRegression(C=10)
stack_clf.fit(out_train, y_train)
stack_prediction = stack_clf.predict_proba(out_test)
#compute ROC
fpr, tpr, thresholds = roc_curve(y_test, stack_prediction[:, 1])
roc_auc = auc(fpr, tpr)
    print("AUC Stacking: " + str(roc_auc))
#write to file
np.savetxt(fname="results.txt", X=stack_prediction[:, 1], fmt="%0.6f")
learning_set = "C:\BigData\Zemanta_challenge_1_data/training_set.tsv"
learning_part = "C:\BigData\Zemanta_challenge_1_data/training_part.tsv"
test_set = "C:\BigData\Zemanta_challenge_1_data/test_set.tsv"
labels = "hc_results.csv"
X, y = prepare_data(learning_set, labels)
random_forest(X, y, 10)
stacking_scikit(X, y, 10)
| 35.192547 | 116 | 0.667137 | 844 | 5,666 | 4.242891 | 0.228673 | 0.018989 | 0.006981 | 0.01117 | 0.26138 | 0.201899 | 0.159453 | 0.105557 | 0.080983 | 0.080983 | 0 | 0.010486 | 0.208966 | 5,666 | 160 | 117 | 35.4125 | 0.788487 | 0.229615 | 0 | 0.025974 | 0 | 0 | 0.107629 | 0.039301 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.12987 | null | null | 0.077922 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
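The AUC values printed by `random_forest` and `stacking_scikit` above can be sanity-checked against the rank-based (Mann-Whitney) definition of ROC AUC: the fraction of (positive, negative) score pairs ranked in the right order. A pure-Python sketch, with no scikit-learn dependency (`rank_auc` is a hypothetical helper, not part of the original script):

```python
def rank_auc(labels, scores):
    """ROC AUC via the Mann-Whitney U statistic (hypothetical helper):
    the fraction of (positive, negative) pairs where the positive
    example receives the higher score; ties count as half a win."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


print(rank_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # -> 0.75
```

This O(n_pos * n_neg) formulation is too slow for the full click dataset but agrees with `sklearn.metrics.auc` over the ROC curve, which makes it useful for spot-checking small samples.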
a6efb681feeb49e4829de2d74d70c18a039c51a6 | 719 | py | Python | PythonEdition/03_lengthOfLongestSubstring.py | cxiaolong/Algorithm-Practice | 6f3d3f4b14a3fc170a3dc47b2ab24f8e37cb941c | [
"MIT"
] | null | null | null | PythonEdition/03_lengthOfLongestSubstring.py | cxiaolong/Algorithm-Practice | 6f3d3f4b14a3fc170a3dc47b2ab24f8e37cb941c | [
"MIT"
] | null | null | null | PythonEdition/03_lengthOfLongestSubstring.py | cxiaolong/Algorithm-Practice | 6f3d3f4b14a3fc170a3dc47b2ab24f8e37cb941c | [
"MIT"
] | null | null | null | class Solution:
def lengthOfLongestSubstring(self, s: str) -> int:
occ = set()
n = len(s)
max_length = 0
cur = 0
for i in range(n):
if i != 0:
                # move the left pointer one step to the right, removing one character from the window
occ.remove(s[i-1])
while cur < n and s[cur] not in occ:
occ.add(s[cur])
cur += 1
max_length = max(max_length, cur-i)
return max_length
if __name__ == '__main__':
s = Solution()
s1 = "abcabcbb"
s2 = "bbbbb"
s3 = "pwwkew"
print(s.lengthOfLongestSubstring(s1))
print(s.lengthOfLongestSubstring(s2))
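An equivalent sliding-window variant of `lengthOfLongestSubstring` keeps the last-seen index of each character, so the left edge can jump past a repeat in one step instead of removing characters one at a time (a sketch with the same results as the class above):

```python
def length_of_longest_substring(s: str) -> int:
    last_seen = {}       # char -> index of its most recent occurrence
    best = left = 0
    for i, ch in enumerate(s):
        # if ch already appears inside the current window, jump past it
        if ch in last_seen and last_seen[ch] >= left:
            left = last_seen[ch] + 1
        last_seen[ch] = i
        best = max(best, i - left + 1)
    return best


print(length_of_longest_substring("abcabcbb"))  # -> 3
```

Both versions are O(n), but this one touches each character exactly once, whereas the set-based window above may re-scan characters when shrinking from the left.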
print(s.lengthOfLongestSubstring(s3))
print(s.lengthOfLongestSubstring("")) | 27.653846 | 54 | 0.527121 | 84 | 719 | 4.369048 | 0.464286 | 0.098093 | 0.326975 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023707 | 0.354659 | 719 | 26 | 55 | 27.653846 | 0.767241 | 0.022253 | 0 | 0 | 0 | 0 | 0.038462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.043478 | false | 0 | 0 | 0 | 0.130435 | 0.173913 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6f04c096ac02af94095dc0b90b868e0e2b87e2f | 2,750 | py | Python | gefen-hdsdi2dvi.py | timvideos/panacontrol | 3fbaec8d9491255735b8f685fc05bd1abc078a96 | [
"Apache-2.0"
] | null | null | null | gefen-hdsdi2dvi.py | timvideos/panacontrol | 3fbaec8d9491255735b8f685fc05bd1abc078a96 | [
"Apache-2.0"
] | 2 | 2015-01-06T02:36:27.000Z | 2015-01-20T00:06:51.000Z | gefen-hdsdi2dvi.py | timvideos/panacontrol | 3fbaec8d9491255735b8f685fc05bd1abc078a96 | [
"Apache-2.0"
] | 2 | 2015-01-05T21:20:54.000Z | 2022-01-13T00:20:48.000Z | #!/usr/bin/python
import fcntl
import struct
import sys
import termios
import time
import math
import os
class SerialPort(object):
def __init__(self, tty_name):
self.tty_name = tty_name
self.tty = None
self.old_termios = None
self.InitTTY()
def __del__(self):
if self.tty and self.old_termios:
fd = self.tty.fileno()
termios.tcsetattr(fd, termios.TCSAFLUSH, self.old_termios)
def InitTTY(self):
#self.tty = open(self.tty_name, 'rb+', 0)
#fd = open("/dev/ttyUSB0", O_RDWR | O_NOCTTY | O_NONBLOCK);
#fcntl(fd, F_SETFL, 0);
ttyfd = os.open(self.tty_name, os.O_RDWR | os.O_NOCTTY | os.O_NONBLOCK)
fcntl.fcntl(ttyfd, fcntl.F_SETFL, 0)
self.tty = os.fdopen(ttyfd, 'rb+', 0)
fd = self.tty.fileno()
self.old_termios = termios.tcgetattr(fd)
new_termios = [termios.IGNPAR, # iflag
0, # oflag
termios.B115200 | termios.CS8 |
termios.CLOCAL | termios.CREAD, # cflag
0, # lflag
termios.B115200, # ispeed
termios.B115200, # ospeed
self.old_termios[6] # special characters
]
termios.tcsetattr(fd, termios.TCSANOW, new_termios)
#fcntl.ioctl(self.fd, termios.TIOCMBIS, TIOCM_RTS_str)
#control = fcntl.ioctl(fd, termios.TIOCMGET, struct.pack('I', 0))
#print '%04X' % struct.unpack('I',control)[0]
#fcntl.ioctl(fd, termios.TIOCMBIC, struct.pack('I', termios.TIOCM_RTS))
#fcntl.ioctl(fd, termios.TIOCMBIC, struct.pack('I', termios.TIOCM_DTR))
#control = fcntl.ioctl(fd, termios.TIOCMGET, struct.pack('I', 0))
#print '%04X' % struct.unpack('I',control)[0]
def ReadByte(self):
return self.tty.read(1)
def WriteByte(self, byte):
return self.tty.write(byte)
pass
def main():
input_buffer = []
try:
tty_name = sys.argv[1]
except IndexError:
tty_name = '/dev/ttyS0'
port = SerialPort(tty_name)
for i in "\r\r\r":
port.WriteByte(i)
for s in ["#FRAME 8\r","#OUTPUT 8\r"]: #LIST\r",]: #"#DEVTYPE\r","#DEVERSION\r",'#LIST\r',"#OUTPUT_8\r",]:
for i in s:
port.WriteByte(i)
        print("Wrote %r\nWaiting for response!" % (s,))
response = False
while True:
r = ['']
while r[-1] != '\r':
r.append(port.ReadByte())
#sys.stdout.write(repr(r[-1]))
#sys.stdout.flush()
if "".join(r).strip() != "":
                print("Response %r" % ("".join(r),))
response = True
break
else:
                print("Empty")
if response:
break
if __name__ == '__main__':
main()
| 26.960784 | 108 | 0.560727 | 353 | 2,750 | 4.240793 | 0.325779 | 0.056112 | 0.04676 | 0.050768 | 0.167001 | 0.167001 | 0.167001 | 0.167001 | 0.167001 | 0.167001 | 0 | 0.022085 | 0.292 | 2,750 | 101 | 109 | 27.227723 | 0.74679 | 0.258182 | 0 | 0.151515 | 0 | 0 | 0.048187 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.015152 | 0.106061 | null | null | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6f141e9d9f97e34bca576e7230af21be66d021b | 18,603 | py | Python | main.py | 3ntr0phy/Binance_New_Coins_Scraper | 8d5dadf937f818f079aa64b3bc56381d7caff56b | [
"MIT"
] | null | null | null | main.py | 3ntr0phy/Binance_New_Coins_Scraper | 8d5dadf937f818f079aa64b3bc56381d7caff56b | [
"MIT"
] | null | null | null | main.py | 3ntr0phy/Binance_New_Coins_Scraper | 8d5dadf937f818f079aa64b3bc56381d7caff56b | [
"MIT"
] | null | null | null | import os
import re
import time
import json
import requests
import threading
import traceback
from json_manage import *
from binance_key import *
from config import *
from datetime import datetime, timedelta
import dateutil.parser as dparser
ARTICLES_URL = 'https://www.binance.com/bapi/composite/v1/public/cms/article/catalog/list/query?catalogId=48&pageNo=1&pageSize=30'
ARTICLE = 'https://www.binance.com/bapi/composite/v1/public/cms/article/detail/query?articleCode='
existing_assets = ["BTC","LTC","ETH","NEO","BNB","QTUM","EOS","SNT","BNT","GAS","BCC","USDT","HSR","OAX","DNT","MCO","ICN","ZRX","OMG","WTC","YOYO","LRC","TRX","SNGLS","STRAT","BQX","FUN","KNC","CDT","XVG","IOTA","SNM","LINK","CVC","TNT","REP","MDA","MTL","SALT","NULS","SUB","STX","MTH","ADX","ETC","ENG","ZEC","AST","GNT","DGD","BAT","DASH","POWR","BTG","REQ","XMR","EVX","VIB","ENJ","VEN","ARK","XRP","MOD","STORJ","KMD","RCN","EDO","DATA","DLT","MANA","PPT","RDN","GXS","AMB","ARN","BCPT","CND","GVT","POE","BTS","FUEL","XZC","QSP","LSK","BCD","TNB","ADA","LEND","XLM","CMT","WAVES","WABI","GTO","ICX","OST","ELF","AION","WINGS","BRD","NEBL","NAV","VIBE","LUN","TRIG","APPC","CHAT","RLC","INS","PIVX","IOST","STEEM","NANO","AE","VIA","BLZ","SYS","RPX","NCASH","POA","ONT","ZIL","STORM","XEM","WAN","WPR","QLC","GRS","CLOAK","LOOM","BCN","TUSD","ZEN","SKY","THETA","IOTX","QKC","AGI","NXS","SC","NPXS","KEY","NAS","MFT","DENT","IQ","ARDR","HOT","VET","DOCK","POLY","VTHO","ONG","PHX","HC","GO","PAX","RVN","DCR","USDC","MITH","BCHABC","BCHSV","REN","BTT","USDS","FET","TFUEL","CELR","MATIC","ATOM","PHB","ONE","FTM","BTCB","USDSB","CHZ","COS","ALGO","ERD","DOGE","BGBP","DUSK","ANKR","WIN","TUSDB","COCOS","PERL","TOMO","BUSD","BAND","BEAM","HBAR","XTZ","NGN","DGB","NKN","GBP","EUR","KAVA","RUB","UAH","ARPA","TRY","CTXC","AERGO","BCH","TROY","BRL","VITE","FTT","AUD","OGN","DREP","BULL","BEAR","ETHBULL","ETHBEAR","XRPBULL","XRPBEAR","EOSBULL","EOSBEAR","TCT","WRX","LTO","ZAR","MBL","COTI","BKRW","BNBBULL","BNBBEAR","HIVE","STPT","SOL","IDRT","CTSI","CHR","BTCUP","BTCDOWN","HNT","JST","FIO","BIDR","STMX","MDT","PNT","COMP","IRIS","MKR","SXP","SNX","DAI","ETHUP","ETHDOWN","ADAUP","ADADOWN","LINKUP","LINKDOWN","DOT","RUNE","BNBUP","BNBDOWN","XTZUP","XTZDOWN","AVA","BAL","YFI","SRM","ANT","CRV","SAND","OCEAN","NMR","LUNA","IDEX","RSR","PAXG","WNXM","TRB","EGLD","BZRX","WBTC","KSM","SUSHI","YFII","DIA","BEL","UMA","EOSUP","TRXUP","EOSDOWN","TRXDOWN","XRPUP","XRPDOWN","DOTUP","DOTDOWN","NBS","WING","SWRV","LTCUP","LTCDOWN","CREAM","UNI","OXT","SUN","AVAX","BURGER","BAKE","FLM","SCRT","XVS","CAKE","SPARTA","UNIUP","UNIDOWN","ALPHA","ORN","UTK","NEAR","VIDT","AAVE","FIL","SXPUP","SXPDOWN","INJ","FILDOWN","FILUP","YFIUP","YFIDOWN","CTK","EASY","AUDIO","BCHUP","BCHDOWN","BOT","AXS","AKRO","HARD","KP3R","RENBTC","SLP","STRAX","UNFI","CVP","BCHA","FOR","FRONT","ROSE","HEGIC","AAVEUP","AAVEDOWN","PROM","BETH","SKL","GLM","SUSD","COVER","GHST","SUSHIUP","SUSHIDOWN","XLMUP","XLMDOWN","DF","JUV","PSG","BVND","GRT","CELO","TWT","REEF","OG","ATM","ASR","1INCH","RIF","BTCST","TRU","DEXE","CKB","FIRO","LIT","PROS","VAI","SFP","FXS","DODO","AUCTION","UFT","ACM","PHA","TVK","BADGER","FIS","OM","POND","ALICE","DEGO","BIFI","LINA"]
key_words = ['Futures', 'Isolated', 'Margin', 'Launchpool', 'Launchpad', 'Cross', 'Perpetual']
filter_List = ['body', 'type', 'catalogId', 'catalogName', 'publishDate']
file = 'announcements.json'
schedules_file = 'scheduled_order.json'
executed_trades_file = 'executed_trades.json'
executed_sells_file = 'executed_sells_trades.json'
executed_queque = []
pair_Dict = {}
cnf = load_config('config.yml')
client = load_binance_creds(r'auth.yml')
telegram_status = True
telegram_keys=[]
if os.path.exists('telegram.yml'):
telegram_keys = load_config('telegram.yml')
else: telegram_status = False
def telegram_bot_sendtext(bot_message):
send_text = 'https://api.telegram.org/bot' + str(telegram_keys['telegram_key']) + '/sendMessage?chat_id=' + str(telegram_keys['chat_id']) + '&parse_mode=Markdown&text=' + bot_message
response = requests.get(send_text)
return response.json()['result']['message_id']
def telegram_delete_message(message_id):
send_text = 'https://api.telegram.org/bot' + str(telegram_keys['telegram_key']) + '/deleteMessage?chat_id=' + str(telegram_keys['chat_id']) + '&message_id=' + str(message_id)
requests.get(send_text)
class Send_Without_Spamming():
def __init__(self):
        self.id = 0
self.first = True
def send(self, message):
if telegram_status:
if self.first:
self.first = False
self.id = telegram_bot_sendtext(message)
else:
telegram_delete_message(self.id)
self.id = telegram_bot_sendtext(message)
else:
print(message)
def kill(self, pair):
if telegram_status:
telegram_delete_message(self.id)
del pair_Dict[pair]
def killSpam(pair):
try:
pair_Dict[pair].kill(pair)
except Exception:
pass
def sendSpam(pair, message):
try:
pair_Dict[pair].send(message)
except Exception:
pair_Dict[pair] = Send_Without_Spamming()
pair_Dict[pair].send(message)
tp = cnf['TRADE_OPTIONS']['TP']
sl = cnf['TRADE_OPTIONS']['SL']
tsl_mode = cnf['TRADE_OPTIONS']['ENABLE_TSL']
tsl = cnf['TRADE_OPTIONS']['TSL']
ttp = cnf['TRADE_OPTIONS']['TTP']
pairing = cnf['TRADE_OPTIONS']['PAIRING']
ammount = cnf['TRADE_OPTIONS']['QUANTITY']
frequency = cnf['TRADE_OPTIONS']['RUN_EVERY']
test_mode = cnf['TRADE_OPTIONS']['TEST']
delay_mode = cnf['TRADE_OPTIONS']['CONSIDER_DELAY']
percentage = cnf['TRADE_OPTIONS']['PERCENTAGE']
existing_assets.remove(pairing)
regex = '\S{2,6}?/'+ pairing
def sendmsg(message):
print(message)
if telegram_status:
threading.Thread(target=telegram_bot_sendtext, args=(message,)).start()
else:
print(message)
def ping_binance():
sum = 0
for i in range(3):
time_before = datetime.timestamp(datetime.now())
client.ping()
time_after = datetime.timestamp(datetime.now())
sum += (time_after - time_before)
return (sum / 3)
####announcements
def get_Announcements():
unfiltered_Articles = requests.get(ARTICLES_URL).json()['data']['articles']
articles = []
for article in unfiltered_Articles:
flag = True
for word in key_words:
if word in article['title']:
flag = False
if flag:
articles.append(article)
for article in articles:
for undesired_Data in filter_List:
if undesired_Data in article:
del article[undesired_Data]
return articles
def get_Pair_and_DateTime(ARTICLE_CODE):
new_Coin = requests.get(ARTICLE+ARTICLE_CODE).json()['data']['seoDesc']
try:
datetime = dparser.parse(new_Coin, fuzzy=True, ignoretz=True)
raw_pairs = re.findall(regex, new_Coin)
pairs = []
for pair in raw_pairs:
present= False
for j in existing_assets:
if j in pair:
present = True
break
if present == False:
pairs.append(pair.replace('/', ''))
return [datetime, pairs]
except Exception as e:
print(e)
return None
####orders
def get_price(coin):
return client.get_ticker(symbol=coin)['lastPrice']
def create_order(pair, usdt_to_spend, action):
try:
order = client.create_order(
symbol = pair,
side = action,
type = 'MARKET',
quoteOrderQty = usdt_to_spend,
recvWindow = "10000"
)
    except Exception as exception:
        wrong = traceback.format_exc(limit=None, chain=True)
        sendmsg(wrong)
        return None
    return order
def executed_orders():
global executed_queque
while True:
if len(executed_queque) > 0:
if os.path.exists(executed_trades_file):
existing_file = load_json(executed_trades_file)
existing_file += executed_queque
else:
existing_file = executed_queque
save_json(executed_trades_file, existing_file)
executed_queque = []
time.sleep(0.1)
def schedule_Order(time_And_Pair, announcement):
try:
scheduled_order = {'time':time_And_Pair[0].strftime("%Y-%m-%d %H:%M:%S"), 'pairs':time_And_Pair[1]}
sendmsg(f'Scheduled an order for: {time_And_Pair[1]} at: {time_And_Pair[0]}')
update_json(schedules_file, scheduled_order)
update_json(file, announcement)
except Exception as exception:
wrong = traceback.format_exc(limit=None, chain=True)
sendmsg(wrong)
def place_Order_On_Time(time_till_live, pair, threads):
delay = 0
global executed_queque
try:
if delay_mode:
delay = (ping_binance() * percentage)
time_till_live = (time_till_live - timedelta(seconds = delay))
time_to_wait = ((time_till_live - datetime.utcnow()).total_seconds() - 10)
time.sleep(time_to_wait)
order = {}
if test_mode:
price = get_price(pair)
while True:
if (datetime.utcnow() - timedelta(seconds = 1) <= time_till_live <= datetime.utcnow() - timedelta(seconds = delay * 0.9)):
order = {
"symbol": pair,
"transactTime": datetime.timestamp(datetime.now()),
"price": price,
"origQty": ammount/float(price),
"executedQty": ammount/float(price),
"cummulativeQuoteQty": ammount,
"status": "FILLED",
"type": "MARKET",
"side": "BUY"
}
break
else:
while True:
if (datetime.utcnow() - timedelta(seconds = 1) <= time_till_live <= datetime.utcnow() - timedelta(seconds = delay * 0.9)):
order = create_order(pair, ammount, 'BUY')
break
order['tp'] = tp
order['sl'] = sl
amount = order['executedQty']
        price = order['price']
if price <= 0.00001:
price = get_price(pair)
        sendmsg(f'Bought {amount} of {pair} at {price}')
executed_queque.append(order)
except Exception as exception:
wrong = traceback.format_exc(limit=None, chain=True)
sendmsg(wrong)
######
def check_Schedules():
try:
if os.path.exists(schedules_file):
unfiltered_schedules = load_json(schedules_file)
schedules = []
for schedule in unfiltered_schedules:
flag = True
                parsed_time = dparser.parse(schedule['time'], fuzzy=True, ignoretz=True)
                if parsed_time < datetime.utcnow():
                    flag = False
                if flag:
                    schedules.append(schedule)
                    for pair in schedule['pairs']:
                        threading.Thread(target=place_Order_On_Time, args=(parsed_time, pair, threading.active_count() + 1)).start()
                        sendmsg(f'Found scheduled order for: {pair}, adding it to a new thread')
save_json(schedules_file, schedules)
except Exception as exception:
wrong = traceback.format_exc(limit=None, chain=True)
sendmsg(wrong)
def sell():
while True:
try:
flag_update = False
not_sold_orders = []
order = []
if os.path.exists(executed_trades_file):
order = load_json(executed_trades_file)
if len(order) > 0:
for coin in list(order):
                    # store some necessary trade info for a sell
stored_price = float(coin['fills'][0]['price'])
coin_tp = coin['tp']
coin_sl = coin['sl']
volume = round(float(coin['executedQty']) - float(coin['fills'][0]['commission']),2)
symbol = coin['symbol']
last_price = get_price(symbol)
# update stop loss and take profit values if threshold is reached
if float(last_price) > stored_price + (stored_price * float(coin_tp) /100) and tsl_mode:
# increase as absolute value for TP
new_tp = float(last_price) + (float(last_price)*ttp /100)
# convert back into % difference from when the coin was bought
new_tp = float( (new_tp - stored_price) / stored_price*100)
# same deal as above, only applied to trailing SL
new_sl = float(last_price) - (float(last_price)*tsl /100)
new_sl = float((new_sl - stored_price) / stored_price*100)
# new values to be added to the json file
coin['tp'] = new_tp
coin['sl'] = new_sl
not_sold_orders.append(coin)
flag_update = True
threading.Thread(target=sendSpam, args=(symbol, f'Updated tp: {round(new_tp, 3)} and sl: {round(new_sl, 3)} for: {symbol}')).start()
# close trade if tsl is reached or trail option is not enabled
elif float(last_price) < stored_price - (stored_price*sl /100) or float(last_price) > stored_price + (stored_price*tp /100) and not tsl_mode:
try:
# sell for real if test mode is set to false
if not test_mode:
sell = client.create_order(symbol = symbol, side = 'SELL', type = 'MARKET', quantity = volume, recvWindow = "10000")
sendmsg(f"Sold {symbol} at {(float(last_price) - stored_price) / float(stored_price)*100}")
killSpam(symbol)
flag_update = True
# remove order from json file by not adding it
except Exception as exception:
wrong = traceback.format_exc(limit=None, chain=True)
sendmsg(wrong)
# store sold trades data
else:
if os.path.exists(executed_sells_file):
sold_coins = load_json(executed_sells_file)
else:
sold_coins = []
if not test_mode:
sold_coins.append(sell)
else:
sell = {
'symbol':symbol,
'price':last_price,
'volume':volume,
'time':datetime.timestamp(datetime.now()),
'profit': float(last_price) - stored_price,
'relative_profit': round((float(last_price) - stored_price) / stored_price*100, 3)
}
sold_coins.append(sell)
save_json(executed_sells_file, sold_coins)
else:
not_sold_orders.append(coin)
if flag_update: save_json(executed_trades_file, not_sold_orders)
time.sleep(0.2)
except Exception as exception:
wrong = traceback.format_exc(limit=None, chain=True)
sendmsg(wrong)
def main():
if os.path.exists(file):
existing_Anouncements = load_json(file)
else:
existing_Anouncements = get_Announcements()
for announcement in existing_Anouncements:
time_And_Pair = get_Pair_and_DateTime(announcement['code'])
if time_And_Pair is not None:
if time_And_Pair[0] >= datetime.utcnow() and len(time_And_Pair[1]) > 0:
schedule_Order(time_And_Pair, announcement)
sendmsg(f'Found new announcement preparing schedule for: {time_And_Pair[1]}')
save_json(file, existing_Anouncements)
threading.Thread(target=check_Schedules, args=()).start()
threading.Thread(target=sell, args=()).start()
threading.Thread(target=executed_orders, args=()).start()
while True:
new_Anouncements = get_Announcements()
for announcement in new_Anouncements:
if not announcement in existing_Anouncements:
time_And_Pair = get_Pair_and_DateTime(announcement['code'])
if time_And_Pair is not None:
if time_And_Pair[0] >= datetime.utcnow() and len(time_And_Pair[1]) > 0 :
schedule_Order(time_And_Pair, announcement)
for pair in time_And_Pair[1]:
threading.Thread(target=place_Order_On_Time, args=(time_And_Pair[0], pair, threading.active_count() + 1)).start()
sendmsg(f'Found new announcement preparing schedule for {pair}')
existing_Anouncements = load_json(file)
threading.Thread(target=sendSpam, args=("sleep", f'Done checking announcements going to sleep for: {frequency} seconds&disable_notification=true')).start()
threading.Thread(target=sendSpam, args=("ping", f'Current Average delay: {ping_binance()}&disable_notification=true')).start()
time.sleep(frequency)
#TODO:
# possible integration with AWS Lambda: ping it some time before the coin is listed so it can place a limit order a little bit above the opening price
if __name__ == '__main__':
try:
if not test_mode:
            sendmsg('Warning: running in live mode')
sendmsg('starting')
        sendmsg(f'Approximate delay: {ping_binance()}')
main()
except Exception as exception:
wrong = traceback.format_exc(limit=None, chain=True)
sendmsg(wrong)
#debugging order
#{
# "time": "2021-09-24 10:00:00",
# "pairs": [
# "DFUSDT",
# "SYSUSDT"
# ]
#} | 40.975771 | 2,739 | 0.563619 | 2,142 | 18,603 | 4.729225 | 0.339402 | 0.012438 | 0.019546 | 0.017966 | 0.287167 | 0.215597 | 0.199013 | 0.166041 | 0.140276 | 0.131787 | 0 | 0.008344 | 0.2849 | 18,603 | 454 | 2,740 | 40.975771 | 0.753138 | 0.039295 | 0 | 0.302469 | 0 | 0.009259 | 0.187398 | 0.010931 | 0 | 0 | 0 | 0.002203 | 0 | 1 | 0.058642 | false | 0.003086 | 0.037037 | 0.003086 | 0.12037 | 0.012346 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
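The trailing take-profit/stop-loss arithmetic inside `sell()` above can be isolated into a pure function for easier reasoning and testing (a sketch; `trail_levels` is a hypothetical helper mirroring the percentage conversions in the bot, not part of it):

```python
def trail_levels(stored_price, last_price, ttp, tsl):
    """Hypothetical helper: recompute TP/SL as percentage offsets from the
    original buy price, trailing the latest market price, the same way the
    bot's sell() loop does when the TP threshold is crossed."""
    new_tp_abs = last_price + last_price * ttp / 100   # absolute trailing TP
    new_sl_abs = last_price - last_price * tsl / 100   # absolute trailing SL
    to_pct = lambda level: (level - stored_price) / stored_price * 100
    return to_pct(new_tp_abs), to_pct(new_sl_abs)


# price rose from 100 to 110 with 5% trailing bands
print(trail_levels(100.0, 110.0, 5, 5))  # ≈ (15.5, 4.5)
```

Keeping the update as a pure function also makes the edge cases easy to probe, e.g. that the trailed SL can end up above the entry price, locking in profit.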
a6fed8cafb5d0edd2dc00b848834bd0e891d7ca5 | 379 | py | Python | Drawing Book/code.py | swy20190/HackerRankChallenge | c7f73e72daa5a9f892e07ab8fc1bc4d71f240c2a | [
"MIT"
] | null | null | null | Drawing Book/code.py | swy20190/HackerRankChallenge | c7f73e72daa5a9f892e07ab8fc1bc4d71f240c2a | [
"MIT"
] | null | null | null | Drawing Book/code.py | swy20190/HackerRankChallenge | c7f73e72daa5a9f892e07ab8fc1bc4d71f240c2a | [
"MIT"
] | null | null | null | import os
def pageCount(n, p):
# Write your code here
front_flip = int(p/2)
end_flip = int(n/2)-front_flip
return min(front_flip, end_flip)
if __name__ == '__main__':
fptr = open(os.environ['OUTPUT_PATH'], 'w')
n = int(input().strip())
p = int(input().strip())
result = pageCount(n, p)
fptr.write(str(result) + '\n')
fptr.close()
| 16.478261 | 47 | 0.591029 | 57 | 379 | 3.684211 | 0.54386 | 0.128571 | 0.104762 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00692 | 0.237467 | 379 | 22 | 48 | 17.227273 | 0.719723 | 0.05277 | 0 | 0 | 0 | 0 | 0.061625 | 0 | 0 | 0 | 0 | 0.045455 | 0 | 1 | 0.083333 | false | 0 | 0.083333 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
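The logic of `pageCount` — flip from whichever cover is closer to page `p` — can be checked against the classic sample cases (the function is re-declared here in snake_case so the sketch runs without the stdin/stdout harness):

```python
def page_count(n, p):
    # flips needed from the front vs. from the back; take the cheaper side
    from_front = p // 2
    from_back = n // 2 - from_front
    return min(from_front, from_back)


print(page_count(6, 2))  # -> 1 (one flip from the front reaches pages 2-3)
print(page_count(5, 4))  # -> 0 (opening from the back already shows page 4)
```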
a6ff14ad05f8ebe1c83e39372f5bf76d98c7dc1d | 4,944 | py | Python | pylearn2/sampling/replace_samples.py | CKehl/pylearn2 | 086a198b9f437cf03c35d606e6b3b56b4634ebd8 | [
"BSD-3-Clause"
] | null | null | null | pylearn2/sampling/replace_samples.py | CKehl/pylearn2 | 086a198b9f437cf03c35d606e6b3b56b4634ebd8 | [
"BSD-3-Clause"
] | null | null | null | pylearn2/sampling/replace_samples.py | CKehl/pylearn2 | 086a198b9f437cf03c35d606e6b3b56b4634ebd8 | [
"BSD-3-Clause"
] | null | null | null | '''
Created on Apr 16, 2015
@author: christian
'''
from optparse import OptionParser
import numpy
import os
from os import listdir
from os.path import isfile, join
import cPickle
import glob
def unpickle(file):
fo = open(file, 'rb')
dictionary = cPickle.load(fo)
fo.close()
return dictionary
if __name__ == '__main__':
optionParser = OptionParser("usage: %prog -i INPUT_FILE -m META_FILE -t SOURCE_TAG -r DESTINATION_TAG")
optionParser.add_option("-i", "--input", action="store", dest="input", type="string", metavar="FILE", help="pickled python dataset file")
optionParser.add_option("-m", "--meta", action="store", dest="meta", type="string", metavar="FILE", help="pickled python metadata file")
optionParser.add_option("-t", "--tag", action="store", dest="tag", type="int", help="selected tag to add the image", default = 0)
optionParser.add_option("-r", "--replace_tag", action="store", dest="rtag", type="int", help="replacement tag number for <tag>", default = 0)
(options, args) = optionParser.parse_args()
my_obj = dict()
meta_obj = dict()
label_str = ''
meta_label_str = ''
test_img_array = None
test_array = []
test_classes = []
test_obj = dict()
if options.input != None:
my_obj = unpickle(options.input)
in_dir = os.path.dirname(options.input)
test_obj = unpickle(os.path.join(in_dir, "test"))
else:
exit()
ds = 0
if("fine_labels" in my_obj.keys()):
ds = 1 #CIFAR-100
label_str = 'fine_labels'
meta_label_str = 'fine_label_names'
else:
ds = 0 #CIFAR-10 and combined
label_str = 'labels'
meta_label_str = 'label_names'
meta_inputs = []
if(options.meta == None) or (options.meta == ""):
meta_inputs = glob.glob(os.path.dirname(options.input)+os.path.sep+"*meta*")
else:
meta_inputs.append(options.meta)
meta_obj = unpickle(meta_inputs[0])
num_img_base_array = [0]*(len(my_obj[label_str]))
img_base_array = [[0]*3072]*(my_obj['data'].shape[0])
img_array = numpy.array(img_base_array, dtype=numpy.uint8)
class_array = [0]*(my_obj['data'].shape[0])
for i in range(0, my_obj['data'].shape[0]):
data_entry = my_obj['data'][i]
tag_no = my_obj[label_str][i]
img_array[i] = data_entry
class_array[i] = tag_no
num_img_base_array[tag_no]+=1
# Test array generation
tcursor_point = 0
#print "Test Data fieldsize: "+str(test_obj['data'].shape[0])
for i in range(0, test_obj['data'].shape[0]):
data_entry = test_obj['data'][i]
tag_no = test_obj['labels'][i]
#print "Test Image: "+str(i)+" => Tag: "+str(tag_no)
test_array.append(data_entry.tolist())
test_classes.append(tag_no)
tcursor_point+=1
tag_img_number = num_img_base_array[options.tag]
img_of_tag = []
for i in range(0, len(class_array)):
if(class_array[i] == options.tag):
img_of_tag.append(i)
print "Data with selected tag: "+str(img_of_tag)+" ("+str(tag_img_number)+")"
print "Dataset size before replacement: "+str(len(class_array))+" | "+str(img_array.shape[0])
for i in range(0, len(img_of_tag)):
class_array[img_of_tag[i]]=options.rtag
del num_img_base_array[options.tag]
print "Dataset size after replacement: "+str(len(class_array))+" | "+str(img_array.shape[0])
print "Label dictionary before replacement: "+str(len(meta_obj[meta_label_str]))
del meta_obj[meta_label_str][options.tag]
print "Label dictionary after replacement: "+str(len(meta_obj[meta_label_str]))
# re-adapt mapping
for i in range(0, len(class_array)):
if(class_array[i] > options.tag):
class_array[i]-=1
################
# TESTING DATA #
################
del img_of_tag[:]
for i in range(0, len(test_classes)):
if(test_classes[i] == options.tag):
img_of_tag.append(i)
for i in range(0, len(img_of_tag)):
test_classes[img_of_tag[i]]=options.rtag
# re-adapt mapping
for i in range(0, len(test_classes)):
if(test_classes[i] > options.tag):
test_classes[i]-=1
out_obj = dict()
out_obj['data']=img_array
out_obj['labels']=class_array
out_dir = os.path.dirname(options.input)
#fo = open(os.path.join(options.output,"experiment"), 'wb')
cPickle.dump(out_obj, open(os.path.join(out_dir,"experiment_rp"), "wb"), protocol=2)
test_n_obj = dict()
test_img_array = numpy.array(test_array, dtype=numpy.uint8)
test_n_obj['data']=test_img_array
test_n_obj['labels']=test_classes
cPickle.dump(test_n_obj, open(os.path.join(out_dir,"test_rp"), "wb"), protocol=2)
meta_obj_out = dict()
meta_obj_out['label_names'] = meta_obj[meta_label_str]
cPickle.dump(meta_obj_out, open(os.path.join(out_dir,"meta_rp"), "wb"), protocol=2)
| 34.573427 | 145 | 0.63835 | 733 | 4,944 | 4.061392 | 0.188267 | 0.032247 | 0.024185 | 0.02956 | 0.327511 | 0.289889 | 0.217669 | 0.176688 | 0.142425 | 0.095398 | 0 | 0.012456 | 0.204288 | 4,944 | 142 | 146 | 34.816901 | 0.744281 | 0.054409 | 0 | 0.125 | 0 | 0 | 0.137336 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.067308 | null | null | 0.048077 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6ff7fff18ec58c6987302477424ca3924375eef | 4,332 | py | Python | GAN/CGAN/cgan_mnist.py | fengjiran/scholar_project | 35e86b7a8d0226ad0fee3b2983821a3f331f68aa | [
"Apache-2.0"
] | 3 | 2017-08-20T08:47:18.000Z | 2019-06-21T06:09:27.000Z | GAN/CGAN/cgan_mnist.py | fengjiran/scholar_project | 35e86b7a8d0226ad0fee3b2983821a3f331f68aa | [
"Apache-2.0"
] | null | null | null | GAN/CGAN/cgan_mnist.py | fengjiran/scholar_project | 35e86b7a8d0226ad0fee3b2983821a3f331f68aa | [
"Apache-2.0"
] | null | null | null | from __future__ import division
import numpy as np
from keras.models import Model, Sequential
from keras.optimizers import SGD, Adam
from keras.layers import Input, Dense, Dropout, LeakyReLU, concatenate
from keras.layers.normalization import BatchNormalization
from keras.regularizers import l1, l1_l2
from keras.datasets import mnist
from keras.utils.np_utils import to_categorical
from PIL import Image
import h5py
class CGAN(object):
"""Simple MLP CGAN."""
def __init__(self,
latent_dim=100,
image_shape=(28, 28),
batch_size=100,
epochs=100):
self.latent_dim = latent_dim
self.image_shape = image_shape
self.batch_size = batch_size
self.epochs = epochs
# Construct the generator
p_z = Input(shape=(100,))
x = Dense(units=200,
kernel_regularizer=l1(1e-5))(p_z)
x = LeakyReLU(0.2)(x)
condition_y = Input(shape=(10,))
y = Dense(units=1000,
kernel_regularizer=l1(1e-5))(condition_y)
y = LeakyReLU(0.2)(y)
merge_xy = concatenate([x, y], axis=1)
g_outputs = Dense(units=784,
activation='tanh',
kernel_regularizer=l1(1e-5))(merge_xy)
self.generator = Model(inputs=[p_z, condition_y],
outputs=g_outputs)
# Construct the discriminator
d_x = Input(shape=(784,))
d_condition_y = Input(shape=(10,))
d_input = concatenate([d_x, d_condition_y], axis=1)
d_input = Dense(units=128,
kernel_regularizer=l1(1e-5))(d_input)
d_input = LeakyReLU(0.2)(d_input)
d_output = Dense(units=1,
activation='sigmoid',
kernel_regularizer=l1(1e-5))(d_input)
self.discriminator = Model(inputs=[d_x, d_condition_y],
outputs=d_output)
print self.generator.summary()
print self.discriminator.summary()
def train(self):
d_optim = Adam(lr=2e-4, beta_1=0.5)
g_optim = Adam(lr=2e-4, beta_1=0.5)
self.discriminator.compile(optimizer=d_optim,
loss='binary_crossentropy')
self.generator.compile(optimizer=g_optim,
loss='binary_crossentropy')
latent = Input(shape=(self.latent_dim,))
g_condition = Input(shape=(10,))
d_condition = Input(shape=(10,))
# Get the fake image
fake = self.generator([latent, g_condition])
        # we only want to train the generator through the combined model
self.discriminator.trainable = False
d_output = self.discriminator([fake, d_condition])
combined_model = Model(inputs=[latent, g_condition, d_condition],
outputs=d_output)
combined_model.compile(optimizer=g_optim,
loss='binary_crossentropy')
(X_train, y_train), (X_test, y_test) = mnist.load_data('/home/richard/datasets/mnist.npz')
X_train = (X_train.astype(np.float32) - 127.5) / 127.5
X_train = X_train.reshape((X_train.shape[0], X_train.shape[1] * X_train.shape[1]))
condition = []
for i in range(10):
condition.extend([i] * 10)
condition = np.asarray(condition)
# one-hot encode
condition = to_categorical(condition, 10)
for epoch in range(self.epochs):
print 'Epoch {} of {}'.format(epoch + 1, self.epochs)
num_batches = int(X_train.shape[0] / self.batch_size)
for index in range(num_batches):
noise = np.random.normal(loc=0.0,
scale=1.0,
size=(self.batch_size, self.latent_dim))
image_batch = X_train[index * self.batch_size:(index + 1) * self.batch_size]
generated_images = self.generator.predict([noise, condition], verbose=0)
X = np.concatenate((image_batch, generated_images))
if __name__ == '__main__':
model = CGAN()
| 35.801653 | 99 | 0.560018 | 513 | 4,332 | 4.520468 | 0.280702 | 0.025873 | 0.028029 | 0.045278 | 0.138853 | 0.080207 | 0.080207 | 0.018111 | 0.018111 | 0 | 0 | 0.036547 | 0.336796 | 4,332 | 120 | 100 | 36.1 | 0.770623 | 0.035088 | 0 | 0.082353 | 0 | 0 | 0.030258 | 0.007937 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.129412 | null | null | 0.035294 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
a6ffb2cce71e31e979d8246a2d1f4abb5ecfebb0 | 1,516 | py | Python | user/migrations/0002_auto_20200816_0510.py | moewahed/trade_cycle | 8ace51f08781a568ef087234b65a7864236dfcaf | [
"MIT"
] | null | null | null | user/migrations/0002_auto_20200816_0510.py | moewahed/trade_cycle | 8ace51f08781a568ef087234b65a7864236dfcaf | [
"MIT"
] | null | null | null | user/migrations/0002_auto_20200816_0510.py | moewahed/trade_cycle | 8ace51f08781a568ef087234b65a7864236dfcaf | [
"MIT"
] | null | null | null | # Generated by Django 2.2 on 2020-08-16 02:10
import django.core.validators
from django.db import migrations, models
import user.model_addon
class Migration(migrations.Migration):
dependencies = [
('user', '0001_initial'),
]
operations = [
migrations.AlterModelOptions(
name='user',
options={'verbose_name': 'Account', 'verbose_name_plural': 'Accounts'},
),
migrations.AlterField(
model_name='user',
name='cover_pic',
field=models.ImageField(default='default/img/cover.png', help_text='Limits:<ul><li>Size 4MB</li><li>Dimensions Range: Width & height (400-2600)</li></ul>', upload_to=user.model_addon.UploadToPathAndRename('upload/img/cover'), validators=[django.core.validators.FileExtensionValidator(['png', 'jpg', 'jpeg', 'PNG', 'JPG'])], verbose_name='Cover Image'),
),
migrations.AlterField(
model_name='user',
name='is_active',
field=models.BooleanField(default=False),
),
migrations.AlterField(
model_name='user',
name='profile_pic',
field=models.ImageField(default='default/img/profile.png', help_text='Limits:<ul><li>Size 2MB</li><li>Dimensions Range: Width & height (200-1600)</li></ul>', upload_to=user.model_addon.UploadToPathAndRename('upload/img/profile'), validators=[django.core.validators.FileExtensionValidator(['png', 'jpg', 'jpeg'])], verbose_name='Profile Image'),
),
]
| 43.314286 | 364 | 0.64314 | 169 | 1,516 | 5.656805 | 0.408284 | 0.033473 | 0.062762 | 0.091004 | 0.563808 | 0.563808 | 0.384937 | 0.246862 | 0.117155 | 0.117155 | 0 | 0.028053 | 0.200528 | 1,516 | 34 | 365 | 44.588235 | 0.760726 | 0.028364 | 0 | 0.357143 | 1 | 0.071429 | 0.275323 | 0.059823 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.107143 | 0 | 0.214286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4701b69ece1ffb86a010433c4dab85c899b9d3ee | 951 | py | Python | App/migrations/0001_initial.py | GnomGad/BackendDjangoTodoServer | 0aae8e5dc751b8914ee81e280a248a4154d8d5c0 | [
"MIT"
] | null | null | null | App/migrations/0001_initial.py | GnomGad/BackendDjangoTodoServer | 0aae8e5dc751b8914ee81e280a248a4154d8d5c0 | [
"MIT"
] | null | null | null | App/migrations/0001_initial.py | GnomGad/BackendDjangoTodoServer | 0aae8e5dc751b8914ee81e280a248a4154d8d5c0 | [
"MIT"
] | null | null | null | # Generated by Django 3.0.6 on 2020-07-10 22:53
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Person',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('idPerson', models.IntegerField()),
],
),
migrations.CreateModel(
name='Todo',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('text', models.CharField(max_length=350)),
('ittodo', models.IntegerField()),
('person', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='App.Person')),
],
),
]
| 29.71875 | 114 | 0.566772 | 94 | 951 | 5.648936 | 0.542553 | 0.045198 | 0.052731 | 0.082863 | 0.297552 | 0.297552 | 0.297552 | 0.297552 | 0.297552 | 0.297552 | 0 | 0.026946 | 0.297581 | 951 | 31 | 115 | 30.677419 | 0.767964 | 0.047319 | 0 | 0.416667 | 1 | 0 | 0.057522 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
4708f606f6729a7f4233abfc1e9dcd8087cb2f67 | 196 | py | Python | python-objects_statements_and_data_structures-practice-master/even_in_range.py | valcal/python_practice | bbc9b7075a5af7b7461afa322acb8c946bad1024 | [
"MIT"
] | null | null | null | python-objects_statements_and_data_structures-practice-master/even_in_range.py | valcal/python_practice | bbc9b7075a5af7b7461afa322acb8c946bad1024 | [
"MIT"
] | null | null | null | python-objects_statements_and_data_structures-practice-master/even_in_range.py | valcal/python_practice | bbc9b7075a5af7b7461afa322acb8c946bad1024 | [
"MIT"
] | null | null | null | """
EVEN IN RANGE: Use range() to print all the even numbers from 0 to 10.
"""
for number in range(11):
if number % 2 == 0:
print(number)
| 19.6 | 70 | 0.55102 | 30 | 196 | 3.6 | 0.6 | 0.12963 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.061538 | 0.336735 | 196 | 9 | 71 | 21.777778 | 0.769231 | 0.357143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
470a71e16d544812e51ec3558f13c543292f57e3 | 935 | py | Python | DummyTile.py | godesab/minesweep | 06396f5be237438411a4df6370b739d0d38e89f3 | [
"MIT"
] | null | null | null | DummyTile.py | godesab/minesweep | 06396f5be237438411a4df6370b739d0d38e89f3 | [
"MIT"
] | null | null | null | DummyTile.py | godesab/minesweep | 06396f5be237438411a4df6370b739d0d38e89f3 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
class DummyTile:
def __init__(self, pos, label=None):
self.pos = pos
        # Label is the adjacent-mine count for this tile: an int in 0..8, or None if unknown
self.label = label
self.checked = False
self.marked = False
self.adj_mines = None
self.adj_tiles = 0
self.adj_checked = 0
self.adj_unchecked = None
def set_label(self, label):
self.label = label
self.checked = True
def set_adj_mines(self, count):
self.adj_mines = count
self.checked = True
def set_adj_tiles(self, count):
self.adj_tiles = count
def set_adj_checked(self, count):
self.adj_checked = count
def add_adj_checked(self):
self.adj_checked = self.adj_checked + 1
def set_adj_unchecked(self, count):
self.adj_unchecked = count
def mark(self):
self.marked = True
| 23.974359 | 93 | 0.605348 | 127 | 935 | 4.259843 | 0.283465 | 0.12939 | 0.103512 | 0.118299 | 0.160813 | 0.088725 | 0 | 0 | 0 | 0 | 0 | 0.009245 | 0.305882 | 935 | 38 | 94 | 24.605263 | 0.824345 | 0.112299 | 0 | 0.153846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.307692 | false | 0 | 0 | 0 | 0.346154 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
470c68bee171c558b2fa27895830521e28ccc585 | 7,758 | py | Python | tests/acceptance/features/component/environment.py | AlvaroVega/fiware-facts | 6224dd8d87c361bde5eb87f1d9d61a386c5632a8 | [
"Apache-2.0"
] | 1 | 2017-11-28T14:52:49.000Z | 2017-11-28T14:52:49.000Z | tests/acceptance/features/component/environment.py | AlvaroVega/fiware-facts | 6224dd8d87c361bde5eb87f1d9d61a386c5632a8 | [
"Apache-2.0"
] | 70 | 2015-01-26T17:32:39.000Z | 2018-08-15T14:28:19.000Z | tests/acceptance/features/component/environment.py | AlvaroVega/fiware-facts | 6224dd8d87c361bde5eb87f1d9d61a386c5632a8 | [
"Apache-2.0"
] | 9 | 2015-01-22T09:29:19.000Z | 2020-03-05T19:27:56.000Z | # -*- coding: utf-8 -*-
#
# Copyright 2015 Telefónica Investigación y Desarrollo, S.A.U
#
# This file is part of FI-WARE project.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
#
# You may obtain a copy of the License at:
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#
# See the License for the specific language governing permissions and
# limitations under the License.
#
# For those usages not covered by the Apache version 2.0 License please
# contact with opensource@tid.es
#
__author__ = "@jframos"
from qautils.logger.logger_utils import get_logger
from qautils.configuration.configuration_utils import set_up_project
from fiwarefacts_client.client import FactsClient
from fiwarecloto_client.client import ClotoClient
from commons.rabbit_utils import RabbitMQConsumer, RabbitMQPublisher
import qautils.configuration.configuration_utils as configuration_utils
from fiwarefacts_client.window_size_model_utils import get_window_size_rabbitmq_message
from qautils.configuration.configuration_properties import PROPERTIES_CONFIG_SERVICE_PROTOCOL, \
PROPERTIES_CONFIG_SERVICE_RESOURCE, PROPERTIES_CONFIG_SERVICE_PORT, PROPERTIES_CONFIG_SERVICE_HOST, \
PROPERTIES_CONFIG_SERVICE_OS_USERNAME, PROPERTIES_CONFIG_SERVICE_OS_PASSWORD, \
PROPERTIES_CONFIG_SERVICE_OS_TENANT_ID, PROPERTIES_CONFIG_SERVICE_OS_AUTH_URL, PROPERTIES_CONFIG_SERVICE_USER, \
PROPERTIES_CONFIG_SERVICE_PASSWORD
from commons.constants import * # All custom constants are used in this file.
import time
__logger__ = get_logger(__name__)
def before_all(context):
__logger__.info("START ...")
__logger__.info("Setting UP acceptance test project ")
set_up_project() # Load setting using 'qautils.configuration.configuration_utils'
# Save tenantId
context.tenant_id = \
configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_OS_TENANT_ID]
# Create REST Clients
context.facts_client = FactsClient(
protocol=configuration_utils.config[PROPERTIES_CONFIG_FACTS_SERVICE][PROPERTIES_CONFIG_SERVICE_PROTOCOL],
host=configuration_utils.config[PROPERTIES_CONFIG_FACTS_SERVICE][PROPERTIES_CONFIG_SERVICE_HOST],
port=configuration_utils.config[PROPERTIES_CONFIG_FACTS_SERVICE][PROPERTIES_CONFIG_SERVICE_PORT],
resource=configuration_utils.config[PROPERTIES_CONFIG_FACTS_SERVICE][PROPERTIES_CONFIG_SERVICE_RESOURCE])
context.cloto_client = ClotoClient(
username=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_OS_USERNAME],
password=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_OS_PASSWORD],
tenant_id=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_OS_TENANT_ID],
auth_url=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_OS_AUTH_URL],
api_protocol=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_PROTOCOL],
api_host=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_HOST],
api_port=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_PORT],
api_resource=configuration_utils.config[PROPERTIES_CONFIG_CLOTO_SERVICE][PROPERTIES_CONFIG_SERVICE_RESOURCE])
def before_feature(context, feature):
__logger__.info("=========== START FEATURE =========== ")
__logger__.info("Feature name: %s", feature.name)
def before_scenario(context, scenario):
__logger__.info("********** START SCENARIO **********")
__logger__.info("Scenario name: %s", scenario.name)
# Clean scenario variables
context.context_elements = dict()
context.response = None
    # List of RabbitMQ Consumers for testing purposes. This list is needed for Multi-Tenancy test cases.
    # By default, this list will only have information for the main tenant used in test cases. Additional RabbitMQ
    # consumers should be added by each test case if they are needed.
context.rabbitmq_consumer_list = list()
# Init RabbitMQ consumer and append it on the list - Main tenantId
context.rabbitmq_consumer = RabbitMQConsumer(
amqp_host=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_HOST],
amqp_port=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_PORT],
amqp_user=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_USER],
amqp_password=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_PASSWORD])
facts_message_config = \
configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_RABBITMQ_SERVICE_FACTS_MESSAGES]
context.rabbitmq_consumer.exchange = \
facts_message_config[PROPERTIES_CONFIG_RABBITMQ_SERVICE_EXCHANGE_NAME]
context.rabbitmq_consumer.exchange_type = \
facts_message_config[PROPERTIES_CONFIG_RABBITMQ_SERVICE_EXCHANGE_TYPE]
context.rabbitmq_consumer_list.append(context.rabbitmq_consumer)
# Init RabbitMQ publisher
context.rabbitmq_publisher = RabbitMQPublisher(
amqp_host=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_HOST],
amqp_port=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_PORT],
amqp_user=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_USER],
amqp_password=configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_SERVICE_PASSWORD])
facts_window_size_config = \
configuration_utils.config[PROPERTIES_CONFIG_RABBITMQ_SERVICE][PROPERTIES_CONFIG_RABBITMQ_SERVICE_WINDOW_SIZE]
context.rabbitmq_publisher.exchange = \
facts_window_size_config[PROPERTIES_CONFIG_RABBITMQ_SERVICE_EXCHANGE_NAME]
context.rabbitmq_publisher.routing_key = \
facts_window_size_config[PROPERTIES_CONFIG_RABBITMQ_SERVICE_ROUTING_KEY]
# Set default window size to 2 (FACTS), for the main tenantId configured
message = get_window_size_rabbitmq_message(context.tenant_id, FACTS_DEFAULT_WINDOW_SIZE)
context.rabbitmq_publisher.send_message(message)
def after_scenario(context, scenario):
__logger__.info("********** END SCENARIO **********")
# Close all RabbitMQ consumers (if initiated)
for consumer in context.rabbitmq_consumer_list:
try:
consumer.stop()
except Exception:
__logger__.warn("Rabbitmq consumer was already stopped")
try:
consumer.close_connection()
except Exception:
__logger__.warn("Rabbitmq consumer was already closed connection")
# Close RabbitMQ publisher (if initiated)
context.rabbitmq_publisher.close()
# Wait for grace period defined (FACTS component) to delete all stored facts.
grace_period = \
configuration_utils.config[PROPERTIES_CONFIG_FACTS_SERVICE][PROPERTIES_CONFIG_FACTS_SERVICE_GRACE_PERIOD]
__logger__.info("Explicit wait for FACTS grace period: %d seconds", grace_period)
time.sleep(grace_period)
def after_feature(context, feature):
__logger__.info("=========== END FEATURE =========== ")
def after_all(context):
__logger__.info("... END :)")
| 45.905325 | 121 | 0.792472 | 928 | 7,758 | 6.204741 | 0.231681 | 0.172282 | 0.123828 | 0.141716 | 0.502084 | 0.435742 | 0.413512 | 0.413512 | 0.372178 | 0.338138 | 0 | 0.001786 | 0.134055 | 7,758 | 168 | 122 | 46.178571 | 0.855314 | 0.195669 | 0 | 0.133333 | 0 | 0 | 0.059981 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0.055556 | 0.111111 | 0 | 0.177778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
470f7eeda75283f4e7198eb5c53e3a28834ff646 | 1,227 | py | Python | config.py | paris3200/flask-inventory | fe858e5c12a8e193590e978e573a3891dfde37de | [
"MIT"
] | 6 | 2016-05-29T05:24:39.000Z | 2021-12-02T14:23:36.000Z | config.py | paris3200/flask-inventory | fe858e5c12a8e193590e978e573a3891dfde37de | [
"MIT"
] | 20 | 2016-11-04T23:22:25.000Z | 2019-10-21T13:39:49.000Z | config.py | paris3200/flask-inventory | fe858e5c12a8e193590e978e573a3891dfde37de | [
"MIT"
] | 7 | 2016-03-27T12:19:33.000Z | 2019-11-18T23:11:23.000Z | # config.jinja2
import os
basedir = os.path.abspath(os.path.dirname(__file__))
class BaseConfig(object):
"""Base configuration."""
SECRET_KEY = '8abcd123352a91a90aecd86d3d0dc5a5844894c24338ad13639bc593fdb20330d67a'
DEBUG = False
BCRYPT_LOG_ROUNDS = 13
WTF_CSRF_ENABLED = True
DEBUG_TB_ENABLED = False
DEBUG_TB_INTERCEPT_REDIRECTS = False
class DevConfig(BaseConfig):
"""Development configuration."""
DEBUG = True
BCRYPT_LOG_ROUNDS = 1
WTF_CSRF_ENABLED = False
SQLALCHEMY_DATABASE_URI = 'sqlite:///' + os.path.join(basedir, 'dev.sqlite')
PICTURES_FOLDER = os.path.join(basedir,'app','static','pictures')
SQLALCHEMY_TRACK_MODIFICATIONS = False
DEBUG_TB_ENABLED = False
class TestingConfig(BaseConfig):
"""Testing configuration."""
DEBUG = True
TESTING = True
BCRYPT_LOG_ROUNDS = 1
WTF_CSRF_ENABLED = False
SQLALCHEMY_DATABASE_URI = 'sqlite:///'
DEBUG_TB_ENABLED = False
class ProductionConfig(BaseConfig):
"""Production configuration."""
SECRET_KEY = 'ljasdfUgreKnyvZRm8R77FkkoS9ab5duf0ypmABtkWPk3ESf8DnbVVL5K84ssyUEeSA'
DEBUG = False
SQLALCHEMY_DATABASE_URI = 'postgresql:///inventory'
DEBUG_TB_ENABLED = False
| 27.886364 | 87 | 0.731051 | 125 | 1,227 | 6.872 | 0.416 | 0.083818 | 0.065192 | 0.088475 | 0.209546 | 0.153667 | 0.153667 | 0.153667 | 0.153667 | 0.153667 | 0 | 0.058128 | 0.172779 | 1,227 | 43 | 88 | 28.534884 | 0.788177 | 0.08965 | 0 | 0.413793 | 0 | 0 | 0.187044 | 0.144161 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.034483 | 0 | 0.965517 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
4715857bcca4a207ecfd692fa8d2496336300ae1 | 449,136 | py | Python | py-fedex/pyfedex/location_service_v11.py | QwadwoNyamekye/purplship-carriers | ce34e3054de246e3d85ddf6928b607193d061ae2 | [
"MIT"
] | 2 | 2021-04-12T22:40:28.000Z | 2021-04-21T18:28:31.000Z | py-fedex/pyfedex/location_service_v11.py | QwadwoNyamekye/purplship-carriers | ce34e3054de246e3d85ddf6928b607193d061ae2 | [
"MIT"
] | 2 | 2021-01-29T07:14:31.000Z | 2021-02-18T18:29:23.000Z | py-fedex/pyfedex/location_service_v11.py | QwadwoNyamekye/purplship-carriers | ce34e3054de246e3d85ddf6928b607193d061ae2 | [
"MIT"
] | 3 | 2020-09-09T17:04:46.000Z | 2021-03-05T00:32:32.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Generated Fri Mar 6 15:54:36 2020 by generateDS.py version 2.35.15.
# Python 3.8.1 (v3.8.1:1b293b6006, Dec 18 2019, 14:08:53) [Clang 6.0 (clang-600.0.57)]
#
# Command line options:
# ('--no-namespace-defs', '')
# ('-o', './python/location_service_v11.py')
#
# Command line arguments:
# ./schemas/LocationService_v11.xsd
#
# Command line:
# /Users/danielkobina/Documents/Open/.sandbox/bin/generateDS --no-namespace-defs -o "./python/location_service_v11.py" ./schemas/LocationService_v11.xsd
#
# Current working directory (os.getcwd()):
# 2020-02
#
from six.moves import zip_longest
import os
import sys
import re as re_
import base64
import datetime as datetime_
import decimal as decimal_
try:
from lxml import etree as etree_
except ImportError:
from xml.etree import ElementTree as etree_
Validate_simpletypes_ = True
SaveElementTreeNode = True
if sys.version_info.major == 2:
BaseStrType_ = basestring
else:
BaseStrType_ = str
def parsexml_(infile, parser=None, **kwargs):
if parser is None:
# Use the lxml ElementTree compatible parser so that, e.g.,
# we ignore comments.
try:
parser = etree_.ETCompatXMLParser()
except AttributeError:
# fallback to xml.etree
parser = etree_.XMLParser()
try:
if isinstance(infile, os.PathLike):
infile = os.path.join(infile)
except AttributeError:
pass
doc = etree_.parse(infile, parser=parser, **kwargs)
return doc
def parsexmlstring_(instring, parser=None, **kwargs):
if parser is None:
# Use the lxml ElementTree compatible parser so that, e.g.,
# we ignore comments.
try:
parser = etree_.ETCompatXMLParser()
except AttributeError:
# fallback to xml.etree
parser = etree_.XMLParser()
element = etree_.fromstring(instring, parser=parser, **kwargs)
return element
#
# Namespace prefix definition table (and other attributes, too)
#
# The module generatedsnamespaces, if it is importable, must contain
# a dictionary named GeneratedsNamespaceDefs. This Python dictionary
# should map element type names (strings) to XML schema namespace prefix
# definitions. The export method for any class for which there is
# a namespace prefix definition, will export that definition in the
# XML representation of that element. See the export method of
# any generated element type class for an example of the use of this
# table.
# A sample table is:
#
# # File: generatedsnamespaces.py
#
# GenerateDSNamespaceDefs = {
# "ElementtypeA": "http://www.xxx.com/namespaceA",
# "ElementtypeB": "http://www.xxx.com/namespaceB",
# }
#
# Additionally, the generatedsnamespaces module can contain a python
# dictionary named GenerateDSNamespaceTypePrefixes that associates element
# types with the namespace prefixes that are to be added to the
# "xsi:type" attribute value. See the exportAttributes method of
# any generated element type and the generation of "xsi:type" for an
# example of the use of this table.
# An example table:
#
# # File: generatedsnamespaces.py
#
# GenerateDSNamespaceTypePrefixes = {
# "ElementtypeC": "aaa:",
# "ElementtypeD": "bbb:",
# }
#
try:
from generatedsnamespaces import GenerateDSNamespaceDefs as GenerateDSNamespaceDefs_
except ImportError:
GenerateDSNamespaceDefs_ = {}
try:
from generatedsnamespaces import GenerateDSNamespaceTypePrefixes as GenerateDSNamespaceTypePrefixes_
except ImportError:
GenerateDSNamespaceTypePrefixes_ = {}
#
# You can replace the following class definition by defining an
# importable module named "generatedscollector" containing a class
# named "GdsCollector". See the default class definition below for
# clues about the possible content of that class.
#
try:
from generatedscollector import GdsCollector as GdsCollector_
except ImportError:
class GdsCollector_(object):
def __init__(self, messages=None):
if messages is None:
self.messages = []
else:
self.messages = messages
def add_message(self, msg):
self.messages.append(msg)
def get_messages(self):
return self.messages
def clear_messages(self):
self.messages = []
def print_messages(self):
for msg in self.messages:
print("Warning: {}".format(msg))
def write_messages(self, outstream):
for msg in self.messages:
outstream.write("Warning: {}\n".format(msg))
#
# The super-class for enum types
#
try:
from enum import Enum
except ImportError:
Enum = object
#
# The root super-class for element type classes
#
# Calls to the methods in these classes are generated by generateDS.py.
# You can replace these methods by re-implementing the following class
# in a module named generatedssuper.py.
try:
from generatedssuper import GeneratedsSuper
except ImportError as exp:
class GeneratedsSuper(object):
__hash__ = object.__hash__
tzoff_pattern = re_.compile(r'(\+|-)((0\d|1[0-3]):[0-5]\d|14:00)$')
class _FixedOffsetTZ(datetime_.tzinfo):
def __init__(self, offset, name):
self.__offset = datetime_.timedelta(minutes=offset)
self.__name = name
def utcoffset(self, dt):
return self.__offset
def tzname(self, dt):
return self.__name
def dst(self, dt):
return None
def gds_format_string(self, input_data, input_name=''):
return input_data
def gds_parse_string(self, input_data, node=None, input_name=''):
return input_data
def gds_validate_string(self, input_data, node=None, input_name=''):
if not input_data:
return ''
else:
return input_data
def gds_format_base64(self, input_data, input_name=''):
return base64.b64encode(input_data)
def gds_validate_base64(self, input_data, node=None, input_name=''):
return input_data
def gds_format_integer(self, input_data, input_name=''):
return '%d' % input_data
def gds_parse_integer(self, input_data, node=None, input_name=''):
try:
ival = int(input_data)
except (TypeError, ValueError) as exp:
raise_parse_error(node, 'Requires integer value: %s' % exp)
return ival
def gds_validate_integer(self, input_data, node=None, input_name=''):
try:
value = int(input_data)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires integer value')
return value
def gds_format_integer_list(self, input_data, input_name=''):
return '%s' % ' '.join(input_data)
def gds_validate_integer_list(
self, input_data, node=None, input_name=''):
values = input_data.split()
for value in values:
try:
int(value)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires sequence of integer values')
return values
def gds_format_float(self, input_data, input_name=''):
return ('%.15f' % input_data).rstrip('0')
def gds_parse_float(self, input_data, node=None, input_name=''):
try:
fval_ = float(input_data)
except (TypeError, ValueError) as exp:
raise_parse_error(node, 'Requires float or double value: %s' % exp)
return fval_
def gds_validate_float(self, input_data, node=None, input_name=''):
try:
value = float(input_data)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires float value')
return value
def gds_format_float_list(self, input_data, input_name=''):
return '%s' % ' '.join(input_data)
def gds_validate_float_list(
self, input_data, node=None, input_name=''):
values = input_data.split()
for value in values:
try:
float(value)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires sequence of float values')
return values
def gds_format_decimal(self, input_data, input_name=''):
return ('%s' % input_data).rstrip('0')
def gds_parse_decimal(self, input_data, node=None, input_name=''):
try:
decimal_value = decimal_.Decimal(input_data)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires decimal value')
return decimal_value
def gds_validate_decimal(self, input_data, node=None, input_name=''):
try:
value = decimal_.Decimal(input_data)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires decimal value')
return value
def gds_format_decimal_list(self, input_data, input_name=''):
return '%s' % ' '.join(input_data)
def gds_validate_decimal_list(
self, input_data, node=None, input_name=''):
values = input_data.split()
for value in values:
try:
decimal_.Decimal(value)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires sequence of decimal values')
return values
def gds_format_double(self, input_data, input_name=''):
return '%e' % input_data
def gds_parse_double(self, input_data, node=None, input_name=''):
try:
fval_ = float(input_data)
except (TypeError, ValueError) as exp:
raise_parse_error(node, 'Requires double or float value: %s' % exp)
return fval_
def gds_validate_double(self, input_data, node=None, input_name=''):
try:
value = float(input_data)
except (TypeError, ValueError):
raise_parse_error(node, 'Requires double or float value')
return value
def gds_format_double_list(self, input_data, input_name=''):
return '%s' % ' '.join(input_data)
def gds_validate_double_list(
self, input_data, node=None, input_name=''):
values = input_data.split()
for value in values:
try:
float(value)
except (TypeError, ValueError):
raise_parse_error(
node, 'Requires sequence of double or float values')
return values
def gds_format_boolean(self, input_data, input_name=''):
return ('%s' % input_data).lower()
def gds_parse_boolean(self, input_data, node=None, input_name=''):
if input_data in ('true', '1'):
bval = True
elif input_data in ('false', '0'):
bval = False
else:
raise_parse_error(node, 'Requires boolean value')
return bval
def gds_validate_boolean(self, input_data, node=None, input_name=''):
if input_data not in (True, 1, False, 0, ):
raise_parse_error(
node,
'Requires boolean value '
'(one of True, 1, False, 0)')
return input_data
def gds_format_boolean_list(self, input_data, input_name=''):
return '%s' % ' '.join(input_data)
def gds_validate_boolean_list(
self, input_data, node=None, input_name=''):
values = input_data.split()
for value in values:
if value not in (True, 1, False, 0, ):
raise_parse_error(
node,
'Requires sequence of boolean values '
'(one of True, 1, False, 0)')
return values
def gds_validate_datetime(self, input_data, node=None, input_name=''):
return input_data
def gds_format_datetime(self, input_data, input_name=''):
if input_data.microsecond == 0:
_svalue = '%04d-%02d-%02dT%02d:%02d:%02d' % (
input_data.year,
input_data.month,
input_data.day,
input_data.hour,
input_data.minute,
input_data.second,
)
else:
_svalue = '%04d-%02d-%02dT%02d:%02d:%02d.%s' % (
input_data.year,
input_data.month,
input_data.day,
input_data.hour,
input_data.minute,
input_data.second,
('%f' % (float(input_data.microsecond) / 1000000))[2:],
)
if input_data.tzinfo is not None:
tzoff = input_data.tzinfo.utcoffset(input_data)
if tzoff is not None:
total_seconds = tzoff.seconds + (86400 * tzoff.days)
if total_seconds == 0:
_svalue += 'Z'
else:
if total_seconds < 0:
_svalue += '-'
total_seconds *= -1
else:
_svalue += '+'
hours = total_seconds // 3600
minutes = (total_seconds - (hours * 3600)) // 60
_svalue += '{0:02d}:{1:02d}'.format(hours, minutes)
return _svalue
@classmethod
def gds_parse_datetime(cls, input_data):
tz = None
if input_data[-1] == 'Z':
tz = GeneratedsSuper._FixedOffsetTZ(0, 'UTC')
input_data = input_data[:-1]
else:
results = GeneratedsSuper.tzoff_pattern.search(input_data)
if results is not None:
tzoff_parts = results.group(2).split(':')
tzoff = int(tzoff_parts[0]) * 60 + int(tzoff_parts[1])
if results.group(1) == '-':
tzoff *= -1
tz = GeneratedsSuper._FixedOffsetTZ(
tzoff, results.group(0))
input_data = input_data[:-6]
time_parts = input_data.split('.')
if len(time_parts) > 1:
micro_seconds = int(float('0.' + time_parts[1]) * 1000000)
input_data = '%s.%s' % (
time_parts[0], "{}".format(micro_seconds).rjust(6, "0"), )
dt = datetime_.datetime.strptime(
input_data, '%Y-%m-%dT%H:%M:%S.%f')
else:
dt = datetime_.datetime.strptime(
input_data, '%Y-%m-%dT%H:%M:%S')
dt = dt.replace(tzinfo=tz)
return dt
def gds_validate_date(self, input_data, node=None, input_name=''):
return input_data
def gds_format_date(self, input_data, input_name=''):
_svalue = '%04d-%02d-%02d' % (
input_data.year,
input_data.month,
input_data.day,
)
try:
if input_data.tzinfo is not None:
tzoff = input_data.tzinfo.utcoffset(input_data)
if tzoff is not None:
total_seconds = tzoff.seconds + (86400 * tzoff.days)
if total_seconds == 0:
_svalue += 'Z'
else:
if total_seconds < 0:
_svalue += '-'
total_seconds *= -1
else:
_svalue += '+'
hours = total_seconds // 3600
minutes = (total_seconds - (hours * 3600)) // 60
_svalue += '{0:02d}:{1:02d}'.format(
hours, minutes)
except AttributeError:
pass
return _svalue
@classmethod
def gds_parse_date(cls, input_data):
tz = None
if input_data[-1] == 'Z':
tz = GeneratedsSuper._FixedOffsetTZ(0, 'UTC')
input_data = input_data[:-1]
else:
results = GeneratedsSuper.tzoff_pattern.search(input_data)
if results is not None:
tzoff_parts = results.group(2).split(':')
tzoff = int(tzoff_parts[0]) * 60 + int(tzoff_parts[1])
if results.group(1) == '-':
tzoff *= -1
tz = GeneratedsSuper._FixedOffsetTZ(
tzoff, results.group(0))
input_data = input_data[:-6]
dt = datetime_.datetime.strptime(input_data, '%Y-%m-%d')
dt = dt.replace(tzinfo=tz)
return dt.date()
def gds_validate_time(self, input_data, node=None, input_name=''):
return input_data
def gds_format_time(self, input_data, input_name=''):
if input_data.microsecond == 0:
_svalue = '%02d:%02d:%02d' % (
input_data.hour,
input_data.minute,
input_data.second,
)
else:
_svalue = '%02d:%02d:%02d.%s' % (
input_data.hour,
input_data.minute,
input_data.second,
('%f' % (float(input_data.microsecond) / 1000000))[2:],
)
if input_data.tzinfo is not None:
tzoff = input_data.tzinfo.utcoffset(input_data)
if tzoff is not None:
total_seconds = tzoff.seconds + (86400 * tzoff.days)
if total_seconds == 0:
_svalue += 'Z'
else:
if total_seconds < 0:
_svalue += '-'
total_seconds *= -1
else:
_svalue += '+'
hours = total_seconds // 3600
minutes = (total_seconds - (hours * 3600)) // 60
_svalue += '{0:02d}:{1:02d}'.format(hours, minutes)
return _svalue
def gds_validate_simple_patterns(self, patterns, target):
# patterns is a list of lists of strings/regex patterns.
# The target value must fully match at least one pattern in
# each sub-list in order for the test to succeed.
found1 = True
for patterns1 in patterns:
found2 = False
for patterns2 in patterns1:
mo = re_.search(patterns2, target)
if mo is not None and len(mo.group(0)) == len(target):
found2 = True
break
if not found2:
found1 = False
break
return found1
@classmethod
def gds_parse_time(cls, input_data):
tz = None
if input_data[-1] == 'Z':
tz = GeneratedsSuper._FixedOffsetTZ(0, 'UTC')
input_data = input_data[:-1]
else:
results = GeneratedsSuper.tzoff_pattern.search(input_data)
if results is not None:
tzoff_parts = results.group(2).split(':')
tzoff = int(tzoff_parts[0]) * 60 + int(tzoff_parts[1])
if results.group(1) == '-':
tzoff *= -1
tz = GeneratedsSuper._FixedOffsetTZ(
tzoff, results.group(0))
input_data = input_data[:-6]
if len(input_data.split('.')) > 1:
dt = datetime_.datetime.strptime(input_data, '%H:%M:%S.%f')
else:
dt = datetime_.datetime.strptime(input_data, '%H:%M:%S')
dt = dt.replace(tzinfo=tz)
return dt.time()
def gds_check_cardinality_(
self, value, input_name,
min_occurs=0, max_occurs=1, required=None):
if value is None:
length = 0
elif isinstance(value, list):
length = len(value)
else:
length = 1
if required is not None:
if required and length < 1:
self.gds_collector_.add_message(
"Required value {}{} is missing".format(
input_name, self.gds_get_node_lineno_()))
if length < min_occurs:
self.gds_collector_.add_message(
"Number of values for {}{} is below "
"the minimum allowed, "
"expected at least {}, found {}".format(
input_name, self.gds_get_node_lineno_(),
min_occurs, length))
elif length > max_occurs:
self.gds_collector_.add_message(
"Number of values for {}{} is above "
"the maximum allowed, "
"expected at most {}, found {}".format(
input_name, self.gds_get_node_lineno_(),
max_occurs, length))
def gds_validate_builtin_ST_(
self, validator, value, input_name,
min_occurs=None, max_occurs=None, required=None):
if value is not None:
try:
validator(value, input_name=input_name)
except GDSParseError as parse_error:
self.gds_collector_.add_message(str(parse_error))
def gds_validate_defined_ST_(
self, validator, value, input_name,
min_occurs=None, max_occurs=None, required=None):
if value is not None:
try:
validator(value)
except GDSParseError as parse_error:
self.gds_collector_.add_message(str(parse_error))
def gds_str_lower(self, instring):
return instring.lower()
def get_path_(self, node):
path_list = []
self.get_path_list_(node, path_list)
path_list.reverse()
path = '/'.join(path_list)
return path
Tag_strip_pattern_ = re_.compile(r'\{.*\}')
def get_path_list_(self, node, path_list):
if node is None:
return
tag = GeneratedsSuper.Tag_strip_pattern_.sub('', node.tag)
if tag:
path_list.append(tag)
self.get_path_list_(node.getparent(), path_list)
def get_class_obj_(self, node, default_class=None):
class_obj1 = default_class
if 'xsi' in node.nsmap:
classname = node.get('{%s}type' % node.nsmap['xsi'])
if classname is not None:
names = classname.split(':')
if len(names) == 2:
classname = names[1]
class_obj2 = globals().get(classname)
if class_obj2 is not None:
class_obj1 = class_obj2
return class_obj1
def gds_build_any(self, node, type_name=None):
# provide default value in case option --disable-xml is used.
content = ""
content = etree_.tostring(node, encoding="unicode")
return content
@classmethod
def gds_reverse_node_mapping(cls, mapping):
return dict(((v, k) for k, v in mapping.items()))
@staticmethod
def gds_encode(instring):
if sys.version_info.major == 2:
if ExternalEncoding:
encoding = ExternalEncoding
else:
encoding = 'utf-8'
return instring.encode(encoding)
else:
return instring
@staticmethod
def convert_unicode(instring):
if isinstance(instring, str):
result = quote_xml(instring)
elif sys.version_info.major == 2 and isinstance(instring, unicode):
result = quote_xml(instring).encode('utf8')
else:
result = GeneratedsSuper.gds_encode(str(instring))
return result
def __eq__(self, other):
def excl_select_objs_(obj):
return (obj[0] != 'parent_object_' and
obj[0] != 'gds_collector_')
if type(self) != type(other):
return False
return all(x == y for x, y in zip_longest(
filter(excl_select_objs_, self.__dict__.items()),
filter(excl_select_objs_, other.__dict__.items())))
def __ne__(self, other):
return not self.__eq__(other)
# Django ETL transform hooks.
def gds_djo_etl_transform(self):
pass
def gds_djo_etl_transform_db_obj(self, dbobj):
pass
# SQLAlchemy ETL transform hooks.
def gds_sqa_etl_transform(self):
return 0, None
def gds_sqa_etl_transform_db_obj(self, dbobj):
pass
def gds_get_node_lineno_(self):
if (hasattr(self, "gds_elementtree_node_") and
self.gds_elementtree_node_ is not None):
return ' near line {}'.format(
self.gds_elementtree_node_.sourceline)
else:
return ""
def getSubclassFromModule_(module, class_):
'''Get the subclass of a class from a specific module.'''
name = class_.__name__ + 'Sub'
if hasattr(module, name):
return getattr(module, name)
else:
return None
#
# If you have installed IPython you can uncomment and use the following.
# IPython is available from http://ipython.scipy.org/.
#
## from IPython.Shell import IPShellEmbed
## args = ''
## ipshell = IPShellEmbed(args,
## banner = 'Dropping into IPython',
## exit_msg = 'Leaving Interpreter, back to program.')
# Then use the following line where and when you want to drop into the
# IPython shell:
# ipshell('<some message> -- Entering ipshell.\nHit Ctrl-D to exit')
#
# Globals
#
ExternalEncoding = ''
# Set this to False to deactivate, during export, the use of
# namespace prefixes captured from the input document.
UseCapturedNS_ = True
CapturedNsmap_ = {}
Tag_pattern_ = re_.compile(r'({.*})?(.*)')
String_cleanup_pat_ = re_.compile(r"[\n\r\s]+")
Namespace_extract_pat_ = re_.compile(r'{(.*)}(.*)')
CDATA_pattern_ = re_.compile(r"<!\[CDATA\[.*?\]\]>", re_.DOTALL)
# Change this to redirect the generated superclass module to use a
# specific subclass module.
CurrentSubclassModule_ = None
#
# Support/utility functions.
#
def showIndent(outfile, level, pretty_print=True):
if pretty_print:
for idx in range(level):
outfile.write(' ')
def quote_xml(inStr):
"Escape markup chars, but do not modify CDATA sections."
if not inStr:
return ''
s1 = (isinstance(inStr, BaseStrType_) and inStr or '%s' % inStr)
s2 = ''
pos = 0
matchobjects = CDATA_pattern_.finditer(s1)
for mo in matchobjects:
s3 = s1[pos:mo.start()]
s2 += quote_xml_aux(s3)
s2 += s1[mo.start():mo.end()]
pos = mo.end()
s3 = s1[pos:]
s2 += quote_xml_aux(s3)
return s2
def quote_xml_aux(inStr):
s1 = inStr.replace('&', '&amp;')
s1 = s1.replace('<', '&lt;')
s1 = s1.replace('>', '&gt;')
return s1
def quote_attrib(inStr):
s1 = (isinstance(inStr, BaseStrType_) and inStr or '%s' % inStr)
s1 = s1.replace('&', '&amp;')
s1 = s1.replace('<', '&lt;')
s1 = s1.replace('>', '&gt;')
if '"' in s1:
if "'" in s1:
s1 = '"%s"' % s1.replace('"', """)
else:
s1 = "'%s'" % s1
else:
s1 = '"%s"' % s1
return s1
def quote_python(inStr):
s1 = inStr
if s1.find("'") == -1:
if s1.find('\n') == -1:
return "'%s'" % s1
else:
return "'''%s'''" % s1
else:
if s1.find('"') != -1:
s1 = s1.replace('"', '\\"')
if s1.find('\n') == -1:
return '"%s"' % s1
else:
return '"""%s"""' % s1
def get_all_text_(node):
if node.text is not None:
text = node.text
else:
text = ''
for child in node:
if child.tail is not None:
text += child.tail
return text
def find_attr_value_(attr_name, node):
attrs = node.attrib
attr_parts = attr_name.split(':')
value = None
if len(attr_parts) == 1:
value = attrs.get(attr_name)
elif len(attr_parts) == 2:
prefix, name = attr_parts
namespace = node.nsmap.get(prefix)
if namespace is not None:
value = attrs.get('{%s}%s' % (namespace, name, ))
return value
def encode_str_2_3(instr):
return instr
class GDSParseError(Exception):
pass
def raise_parse_error(node, msg):
if node is not None:
msg = '%s (element %s/line %d)' % (msg, node.tag, node.sourceline, )
raise GDSParseError(msg)
class MixedContainer:
# Constants for category:
CategoryNone = 0
CategoryText = 1
CategorySimple = 2
CategoryComplex = 3
# Constants for content_type:
TypeNone = 0
TypeText = 1
TypeString = 2
TypeInteger = 3
TypeFloat = 4
TypeDecimal = 5
TypeDouble = 6
TypeBoolean = 7
TypeBase64 = 8
def __init__(self, category, content_type, name, value):
self.category = category
self.content_type = content_type
self.name = name
self.value = value
def getCategory(self):
return self.category
def getContenttype(self, content_type):
return self.content_type
def getValue(self):
return self.value
def getName(self):
return self.name
def export(self, outfile, level, name, namespace,
pretty_print=True):
if self.category == MixedContainer.CategoryText:
# Prevent exporting empty content as empty lines.
if self.value.strip():
outfile.write(self.value)
elif self.category == MixedContainer.CategorySimple:
self.exportSimple(outfile, level, name)
else: # category == MixedContainer.CategoryComplex
self.value.export(
outfile, level, namespace, name_=name,
pretty_print=pretty_print)
def exportSimple(self, outfile, level, name):
if self.content_type == MixedContainer.TypeString:
outfile.write('<%s>%s</%s>' % (
self.name, self.value, self.name))
elif self.content_type == MixedContainer.TypeInteger or \
self.content_type == MixedContainer.TypeBoolean:
outfile.write('<%s>%d</%s>' % (
self.name, self.value, self.name))
elif self.content_type == MixedContainer.TypeFloat or \
self.content_type == MixedContainer.TypeDecimal:
outfile.write('<%s>%f</%s>' % (
self.name, self.value, self.name))
elif self.content_type == MixedContainer.TypeDouble:
outfile.write('<%s>%g</%s>' % (
self.name, self.value, self.name))
elif self.content_type == MixedContainer.TypeBase64:
outfile.write('<%s>%s</%s>' % (
self.name,
base64.b64encode(self.value),
self.name))
def to_etree(self, element):
if self.category == MixedContainer.CategoryText:
# Prevent exporting empty content as empty lines.
if self.value.strip():
if len(element) > 0:
if element[-1].tail is None:
element[-1].tail = self.value
else:
element[-1].tail += self.value
else:
if element.text is None:
element.text = self.value
else:
element.text += self.value
elif self.category == MixedContainer.CategorySimple:
subelement = etree_.SubElement(
element, '%s' % self.name)
subelement.text = self.to_etree_simple()
else: # category == MixedContainer.CategoryComplex
self.value.to_etree(element)
def to_etree_simple(self):
if self.content_type == MixedContainer.TypeString:
text = self.value
elif (self.content_type == MixedContainer.TypeInteger or
self.content_type == MixedContainer.TypeBoolean):
text = '%d' % self.value
elif (self.content_type == MixedContainer.TypeFloat or
self.content_type == MixedContainer.TypeDecimal):
text = '%f' % self.value
elif self.content_type == MixedContainer.TypeDouble:
text = '%g' % self.value
elif self.content_type == MixedContainer.TypeBase64:
text = '%s' % base64.b64encode(self.value)
return text
def exportLiteral(self, outfile, level, name):
if self.category == MixedContainer.CategoryText:
showIndent(outfile, level)
outfile.write(
'model_.MixedContainer(%d, %d, "%s", "%s"),\n' % (
self.category, self.content_type,
self.name, self.value))
elif self.category == MixedContainer.CategorySimple:
showIndent(outfile, level)
outfile.write(
'model_.MixedContainer(%d, %d, "%s", "%s"),\n' % (
self.category, self.content_type,
self.name, self.value))
else: # category == MixedContainer.CategoryComplex
showIndent(outfile, level)
outfile.write(
'model_.MixedContainer(%d, %d, "%s",\n' % (
self.category, self.content_type, self.name,))
self.value.exportLiteral(outfile, level + 1)
showIndent(outfile, level)
outfile.write(')\n')
class MemberSpec_(object):
def __init__(self, name='', data_type='', container=0,
optional=0, child_attrs=None, choice=None):
self.name = name
self.data_type = data_type
self.container = container
self.child_attrs = child_attrs
self.choice = choice
self.optional = optional
def set_name(self, name): self.name = name
def get_name(self): return self.name
def set_data_type(self, data_type): self.data_type = data_type
def get_data_type_chain(self): return self.data_type
def get_data_type(self):
if isinstance(self.data_type, list):
if len(self.data_type) > 0:
return self.data_type[-1]
else:
return 'xs:string'
else:
return self.data_type
def set_container(self, container): self.container = container
def get_container(self): return self.container
def set_child_attrs(self, child_attrs): self.child_attrs = child_attrs
def get_child_attrs(self): return self.child_attrs
def set_choice(self, choice): self.choice = choice
def get_choice(self): return self.choice
def set_optional(self, optional): self.optional = optional
def get_optional(self): return self.optional
def _cast(typ, value):
if typ is None or value is None:
return value
return typ(value)
#
# Data representation classes.
#
class CarrierCodeType(Enum):
"""Identification of a FedEx operating company (transportation)."""
FDXC='FDXC'
FDXE='FDXE'
FDXG='FDXG'
FXCC='FXCC'
FXFR='FXFR'
FXSP='FXSP'
class ConsolidationType(Enum):
INTERNATIONAL_DISTRIBUTION_FREIGHT='INTERNATIONAL_DISTRIBUTION_FREIGHT'
INTERNATIONAL_ECONOMY_DISTRIBUTION='INTERNATIONAL_ECONOMY_DISTRIBUTION'
INTERNATIONAL_GROUND_DISTRIBUTION='INTERNATIONAL_GROUND_DISTRIBUTION'
INTERNATIONAL_PRIORITY_DISTRIBUTION='INTERNATIONAL_PRIORITY_DISTRIBUTION'
TRANSBORDER_DISTRIBUTION='TRANSBORDER_DISTRIBUTION'
class CountryRelationshipType(Enum):
"""Describes relationship between origin and destination countries."""
DOMESTIC='DOMESTIC'
INTERNATIONAL='INTERNATIONAL'
class DayOfWeekType(Enum):
FRI='FRI'
MON='MON'
SAT='SAT'
SUN='SUN'
THU='THU'
TUE='TUE'
WED='WED'
class DistanceUnits(Enum):
KM='KM'
MI='MI'
class DistributionClearanceType(Enum):
DESTINATION_COUNTRY_CLEARANCE='DESTINATION_COUNTRY_CLEARANCE'
SINGLE_POINT_OF_CLEARANCE='SINGLE_POINT_OF_CLEARANCE'
class EnterprisePermissionType(Enum):
ALLOWED='ALLOWED'
ALLOWED_BY_EXCEPTION='ALLOWED_BY_EXCEPTION'
DISALLOWED='DISALLOWED'
class ExpressRegionCode(Enum):
"""Indicates a FedEx Express operating region."""
APAC='APAC'
CA='CA'
EMEA='EMEA'
LAC='LAC'
US='US'
class FedExLocationType(Enum):
"""Identifies a kind of FedEx facility."""
FEDEX_AUTHORIZED_SHIP_CENTER='FEDEX_AUTHORIZED_SHIP_CENTER'
FEDEX_EXPRESS_STATION='FEDEX_EXPRESS_STATION'
FEDEX_FACILITY='FEDEX_FACILITY'
FEDEX_FREIGHT_SERVICE_CENTER='FEDEX_FREIGHT_SERVICE_CENTER'
FEDEX_GROUND_TERMINAL='FEDEX_GROUND_TERMINAL'
FEDEX_HOME_DELIVERY_STATION='FEDEX_HOME_DELIVERY_STATION'
FEDEX_OFFICE='FEDEX_OFFICE'
FEDEX_ONSITE='FEDEX_ONSITE'
FEDEX_SELF_SERVICE_LOCATION='FEDEX_SELF_SERVICE_LOCATION'
FEDEX_SHIPSITE='FEDEX_SHIPSITE'
FEDEX_SHIP_AND_GET='FEDEX_SHIP_AND_GET'
FEDEX_SMART_POST_HUB='FEDEX_SMART_POST_HUB'
class LatestDropOffOverlayType(Enum):
"""Specifies the reason for the overlay of the daily last drop off time for
a carrier."""
US_WEST_COAST='US_WEST_COAST'
class LinearUnits(Enum):
CM='CM'
IN='IN'
class LocationAccessibilityType(Enum):
"""Indicates how this can be accessed."""
INSIDE='INSIDE'
OUTSIDE='OUTSIDE'
class LocationAttributesType(Enum):
ACCEPTS_CASH='ACCEPTS_CASH'
ALREADY_OPEN='ALREADY_OPEN'
CLEARANCE_SERVICES='CLEARANCE_SERVICES'
COPY_AND_PRINT_SERVICES='COPY_AND_PRINT_SERVICES'
DANGEROUS_GOODS_SERVICES='DANGEROUS_GOODS_SERVICES'
DIRECT_MAIL_SERVICES='DIRECT_MAIL_SERVICES'
DOMESTIC_SHIPPING_SERVICES='DOMESTIC_SHIPPING_SERVICES'
DROP_BOX='DROP_BOX'
INTERNATIONAL_SHIPPING_SERVICES='INTERNATIONAL_SHIPPING_SERVICES'
LOCATION_IS_IN_AIRPORT='LOCATION_IS_IN_AIRPORT'
NOTARY_SERVICES='NOTARY_SERVICES'
OBSERVES_DAY_LIGHT_SAVING_TIMES='OBSERVES_DAY_LIGHT_SAVING_TIMES'
OPEN_TWENTY_FOUR_HOURS='OPEN_TWENTY_FOUR_HOURS'
PACKAGING_SUPPLIES='PACKAGING_SUPPLIES'
PACK_AND_SHIP='PACK_AND_SHIP'
PASSPORT_PHOTO_SERVICES='PASSPORT_PHOTO_SERVICES'
RETURNS_SERVICES='RETURNS_SERVICES'
SIGNS_AND_BANNERS_SERVICE='SIGNS_AND_BANNERS_SERVICE'
SONY_PICTURE_STATION='SONY_PICTURE_STATION'
class LocationContentOptionType(Enum):
HOLIDAYS='HOLIDAYS'
LOCATION_DROPOFF_TIMES='LOCATION_DROPOFF_TIMES'
MAP_URL='MAP_URL'
TIMEZONE_OFFSET='TIMEZONE_OFFSET'
class LocationSearchFilterType(Enum):
"""Specifies the crieteria used to filter the location search results."""
EXCLUDE_LOCATIONS_OUTSIDE_COUNTRY='EXCLUDE_LOCATIONS_OUTSIDE_COUNTRY'
EXCLUDE_LOCATIONS_OUTSIDE_STATE_OR_PROVINCE='EXCLUDE_LOCATIONS_OUTSIDE_STATE_OR_PROVINCE'
EXCLUDE_UNAVAILABLE_LOCATIONS='EXCLUDE_UNAVAILABLE_LOCATIONS'
class LocationSortCriteriaType(Enum):
"""Specifies the criterion to be used to sort the location details."""
DISTANCE='DISTANCE'
LATEST_EXPRESS_DROPOFF_TIME='LATEST_EXPRESS_DROPOFF_TIME'
LATEST_GROUND_DROPOFF_TIME='LATEST_GROUND_DROPOFF_TIME'
LOCATION_TYPE='LOCATION_TYPE'
class LocationSortOrderType(Enum):
"""Specifies sort order of the location details."""
HIGHEST_TO_LOWEST='HIGHEST_TO_LOWEST'
LOWEST_TO_HIGHEST='LOWEST_TO_HIGHEST'
class LocationTransferOfPossessionType(Enum):
DROPOFF='DROPOFF'
HOLD_AT_LOCATION='HOLD_AT_LOCATION'
REDIRECT_TO_HOLD_AT_LOCATION='REDIRECT_TO_HOLD_AT_LOCATION'
class LocationsSearchCriteriaType(Enum):
"""Specifies the criteria types that may be used to search for FedEx
locations."""
ADDRESS='ADDRESS'
GEOGRAPHIC_COORDINATES='GEOGRAPHIC_COORDINATES'
PHONE_NUMBER='PHONE_NUMBER'
class MultipleMatchesActionType(Enum):
RETURN_ALL='RETURN_ALL'
RETURN_ERROR='RETURN_ERROR'
RETURN_FIRST='RETURN_FIRST'
class NotificationSeverityType(Enum):
ERROR='ERROR'
FAILURE='FAILURE'
NOTE='NOTE'
SUCCESS='SUCCESS'
WARNING='WARNING'
class OperationalHoursType(Enum):
CLOSED_ALL_DAY='CLOSED_ALL_DAY'
OPEN_ALL_DAY='OPEN_ALL_DAY'
OPEN_BY_HOURS='OPEN_BY_HOURS'
class ReservationAttributesType(Enum):
"""Attributes about a reservation at a FedEx location."""
RESERVATION_AVAILABLE='RESERVATION_AVAILABLE'
class ServiceCategoryType(Enum):
EXPRESS_FREIGHT='EXPRESS_FREIGHT'
EXPRESS_PARCEL='EXPRESS_PARCEL'
GROUND_HOME_DELIVERY='GROUND_HOME_DELIVERY'
class ShippingActionType(Enum):
DELIVERIES='DELIVERIES'
PICKUPS='PICKUPS'
class SupportedRedirectToHoldServiceType(Enum):
"""DEPRECATED as of July 2017."""
FEDEX_EXPRESS='FEDEX_EXPRESS'
FEDEX_GROUND='FEDEX_GROUND'
FEDEX_GROUND_HOME_DELIVERY='FEDEX_GROUND_HOME_DELIVERY'
class WeightUnits(Enum):
KG='KG'
LB='LB'
class Address(GeneratedsSuper):
"""Descriptive data for a physical location. May be used as an actual
physical address (place to which one could go), or as a container of
"address parts" which should be handled as a unit (such as a city-
state-ZIP combination within the US)."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, StreetLines=None, City=None, StateOrProvinceCode=None, PostalCode=None, UrbanizationCode=None, CountryCode=None, CountryName=None, Residential=None, GeographicCoordinates=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
if StreetLines is None:
self.StreetLines = []
else:
self.StreetLines = StreetLines
self.StreetLines_nsprefix_ = None
self.City = City
self.City_nsprefix_ = None
self.StateOrProvinceCode = StateOrProvinceCode
self.StateOrProvinceCode_nsprefix_ = None
self.PostalCode = PostalCode
self.PostalCode_nsprefix_ = None
self.UrbanizationCode = UrbanizationCode
self.UrbanizationCode_nsprefix_ = None
self.CountryCode = CountryCode
self.CountryCode_nsprefix_ = None
self.CountryName = CountryName
self.CountryName_nsprefix_ = None
self.Residential = Residential
self.Residential_nsprefix_ = None
self.GeographicCoordinates = GeographicCoordinates
self.GeographicCoordinates_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Address)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Address.subclass:
return Address.subclass(*args_, **kwargs_)
else:
return Address(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_StreetLines(self):
return self.StreetLines
def set_StreetLines(self, StreetLines):
self.StreetLines = StreetLines
def add_StreetLines(self, value):
self.StreetLines.append(value)
def insert_StreetLines_at(self, index, value):
self.StreetLines.insert(index, value)
def replace_StreetLines_at(self, index, value):
self.StreetLines[index] = value
def get_City(self):
return self.City
def set_City(self, City):
self.City = City
def get_StateOrProvinceCode(self):
return self.StateOrProvinceCode
def set_StateOrProvinceCode(self, StateOrProvinceCode):
self.StateOrProvinceCode = StateOrProvinceCode
def get_PostalCode(self):
return self.PostalCode
def set_PostalCode(self, PostalCode):
self.PostalCode = PostalCode
def get_UrbanizationCode(self):
return self.UrbanizationCode
def set_UrbanizationCode(self, UrbanizationCode):
self.UrbanizationCode = UrbanizationCode
def get_CountryCode(self):
return self.CountryCode
def set_CountryCode(self, CountryCode):
self.CountryCode = CountryCode
def get_CountryName(self):
return self.CountryName
    def set_CountryName(self, CountryName):
        self.CountryName = CountryName
    def get_Residential(self):
        return self.Residential
    def set_Residential(self, Residential):
        self.Residential = Residential
    def get_GeographicCoordinates(self):
        return self.GeographicCoordinates
    def set_GeographicCoordinates(self, GeographicCoordinates):
        self.GeographicCoordinates = GeographicCoordinates
    def hasContent_(self):
        if (
            self.StreetLines or
            self.City is not None or
            self.StateOrProvinceCode is not None or
            self.PostalCode is not None or
            self.UrbanizationCode is not None or
            self.CountryCode is not None or
            self.CountryName is not None or
            self.Residential is not None or
            self.GeographicCoordinates is not None
        ):
            return True
        else:
            return False
    def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Address', pretty_print=True):
        imported_ns_def_ = GenerateDSNamespaceDefs_.get('Address')
        if imported_ns_def_ is not None:
            namespacedef_ = imported_ns_def_
        if pretty_print:
            eol_ = '\n'
        else:
            eol_ = ''
        if self.original_tagname_ is not None and name_ == 'Address':
            name_ = self.original_tagname_
        if UseCapturedNS_ and self.ns_prefix_:
            namespaceprefix_ = self.ns_prefix_ + ':'
        showIndent(outfile, level, pretty_print)
        outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = set()
        self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Address')
        if self.hasContent_():
            outfile.write('>%s' % (eol_, ))
            self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Address', pretty_print=pretty_print)
            showIndent(outfile, level, pretty_print)
            outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
        else:
            outfile.write('/>%s' % (eol_, ))
    def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Address'):
        pass
    def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Address', fromsubclass_=False, pretty_print=True):
        if pretty_print:
            eol_ = '\n'
        else:
            eol_ = ''
        for StreetLines_ in self.StreetLines:
            namespaceprefix_ = self.StreetLines_nsprefix_ + ':' if (UseCapturedNS_ and self.StreetLines_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sStreetLines>%s</%sStreetLines>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(StreetLines_), input_name='StreetLines')), namespaceprefix_ , eol_))
        if self.City is not None:
            namespaceprefix_ = self.City_nsprefix_ + ':' if (UseCapturedNS_ and self.City_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sCity>%s</%sCity>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.City), input_name='City')), namespaceprefix_ , eol_))
        if self.StateOrProvinceCode is not None:
            namespaceprefix_ = self.StateOrProvinceCode_nsprefix_ + ':' if (UseCapturedNS_ and self.StateOrProvinceCode_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sStateOrProvinceCode>%s</%sStateOrProvinceCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.StateOrProvinceCode), input_name='StateOrProvinceCode')), namespaceprefix_ , eol_))
        if self.PostalCode is not None:
            namespaceprefix_ = self.PostalCode_nsprefix_ + ':' if (UseCapturedNS_ and self.PostalCode_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sPostalCode>%s</%sPostalCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.PostalCode), input_name='PostalCode')), namespaceprefix_ , eol_))
        if self.UrbanizationCode is not None:
            namespaceprefix_ = self.UrbanizationCode_nsprefix_ + ':' if (UseCapturedNS_ and self.UrbanizationCode_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sUrbanizationCode>%s</%sUrbanizationCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.UrbanizationCode), input_name='UrbanizationCode')), namespaceprefix_ , eol_))
        if self.CountryCode is not None:
            namespaceprefix_ = self.CountryCode_nsprefix_ + ':' if (UseCapturedNS_ and self.CountryCode_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sCountryCode>%s</%sCountryCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CountryCode), input_name='CountryCode')), namespaceprefix_ , eol_))
        if self.CountryName is not None:
            namespaceprefix_ = self.CountryName_nsprefix_ + ':' if (UseCapturedNS_ and self.CountryName_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sCountryName>%s</%sCountryName>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CountryName), input_name='CountryName')), namespaceprefix_ , eol_))
        if self.Residential is not None:
            namespaceprefix_ = self.Residential_nsprefix_ + ':' if (UseCapturedNS_ and self.Residential_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sResidential>%s</%sResidential>%s' % (namespaceprefix_ , self.gds_format_boolean(self.Residential, input_name='Residential'), namespaceprefix_ , eol_))
        if self.GeographicCoordinates is not None:
            namespaceprefix_ = self.GeographicCoordinates_nsprefix_ + ':' if (UseCapturedNS_ and self.GeographicCoordinates_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sGeographicCoordinates>%s</%sGeographicCoordinates>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.GeographicCoordinates), input_name='GeographicCoordinates')), namespaceprefix_ , eol_))
    def build(self, node, gds_collector_=None):
        self.gds_collector_ = gds_collector_
        if SaveElementTreeNode:
            self.gds_elementtree_node_ = node
        already_processed = set()
        self.ns_prefix_ = node.prefix
        self.buildAttributes(node, node.attrib, already_processed)
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
        return self
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
        if nodeName_ == 'StreetLines':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'StreetLines')
            value_ = self.gds_validate_string(value_, node, 'StreetLines')
            self.StreetLines.append(value_)
            self.StreetLines_nsprefix_ = child_.prefix
        elif nodeName_ == 'City':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'City')
            value_ = self.gds_validate_string(value_, node, 'City')
            self.City = value_
            self.City_nsprefix_ = child_.prefix
        elif nodeName_ == 'StateOrProvinceCode':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'StateOrProvinceCode')
            value_ = self.gds_validate_string(value_, node, 'StateOrProvinceCode')
            self.StateOrProvinceCode = value_
            self.StateOrProvinceCode_nsprefix_ = child_.prefix
        elif nodeName_ == 'PostalCode':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'PostalCode')
            value_ = self.gds_validate_string(value_, node, 'PostalCode')
            self.PostalCode = value_
            self.PostalCode_nsprefix_ = child_.prefix
        elif nodeName_ == 'UrbanizationCode':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'UrbanizationCode')
            value_ = self.gds_validate_string(value_, node, 'UrbanizationCode')
            self.UrbanizationCode = value_
            self.UrbanizationCode_nsprefix_ = child_.prefix
        elif nodeName_ == 'CountryCode':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'CountryCode')
            value_ = self.gds_validate_string(value_, node, 'CountryCode')
            self.CountryCode = value_
            self.CountryCode_nsprefix_ = child_.prefix
        elif nodeName_ == 'CountryName':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'CountryName')
            value_ = self.gds_validate_string(value_, node, 'CountryName')
            self.CountryName = value_
            self.CountryName_nsprefix_ = child_.prefix
        elif nodeName_ == 'Residential':
            sval_ = child_.text
            ival_ = self.gds_parse_boolean(sval_, node, 'Residential')
            ival_ = self.gds_validate_boolean(ival_, node, 'Residential')
            self.Residential = ival_
            self.Residential_nsprefix_ = child_.prefix
        elif nodeName_ == 'GeographicCoordinates':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'GeographicCoordinates')
            value_ = self.gds_validate_string(value_, node, 'GeographicCoordinates')
            self.GeographicCoordinates = value_
            self.GeographicCoordinates_nsprefix_ = child_.prefix
# end class Address
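# The build() methods above derive each child's local element name with a
# module-level Tag_pattern_ regex, which strips the Clark-notation
# "{namespace-uri}" prefix that ElementTree prepends to tags. A minimal,
# self-contained sketch of that mechanism (the pattern shown is an assumption
# modeled on typical generateDS output, not necessarily this module's exact
# definition):

```python
import re

# Assumed equivalent of the generated module's Tag_pattern_: an optional
# "{namespace-uri}" group followed by the local element name.
demo_tag_pattern = re.compile(r'({.*})?(.*)')

def local_name(tag):
    """Return the local element name, dropping any Clark-notation namespace."""
    return demo_tag_pattern.match(tag).groups()[-1]

print(local_name('{http://fedex.com/ws/locs/v9}City'))  # City
print(local_name('City'))                               # City
```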
class AddressAncillaryDetail(GeneratedsSuper):
    """Additional information about a physical location, such as suite number,
    cross street, floor number in a building. These details are not
    typically a part of a standard address definition; however, these
    details might help locate the address."""
    __hash__ = GeneratedsSuper.__hash__
    subclass = None
    superclass = None
    def __init__(self, LocationInCity=None, LocationInProperty=None, Accessibility=None, Building=None, Department=None, RoomFloor=None, Suite=None, Apartment=None, Room=None, CrossStreet=None, AdditionalDescriptions=None, gds_collector_=None, **kwargs_):
        self.gds_collector_ = gds_collector_
        self.gds_elementtree_node_ = None
        self.original_tagname_ = None
        self.parent_object_ = kwargs_.get('parent_object_')
        self.ns_prefix_ = None
        self.LocationInCity = LocationInCity
        self.LocationInCity_nsprefix_ = None
        self.LocationInProperty = LocationInProperty
        self.LocationInProperty_nsprefix_ = None
        self.Accessibility = Accessibility
        self.validate_LocationAccessibilityType(self.Accessibility)
        self.Accessibility_nsprefix_ = None
        self.Building = Building
        self.Building_nsprefix_ = None
        self.Department = Department
        self.Department_nsprefix_ = None
        self.RoomFloor = RoomFloor
        self.RoomFloor_nsprefix_ = None
        self.Suite = Suite
        self.Suite_nsprefix_ = None
        self.Apartment = Apartment
        self.Apartment_nsprefix_ = None
        self.Room = Room
        self.Room_nsprefix_ = None
        self.CrossStreet = CrossStreet
        self.CrossStreet_nsprefix_ = None
        if AdditionalDescriptions is None:
            self.AdditionalDescriptions = []
        else:
            self.AdditionalDescriptions = AdditionalDescriptions
        self.AdditionalDescriptions_nsprefix_ = None
    def factory(*args_, **kwargs_):
        if CurrentSubclassModule_ is not None:
            subclass = getSubclassFromModule_(
                CurrentSubclassModule_, AddressAncillaryDetail)
            if subclass is not None:
                return subclass(*args_, **kwargs_)
        if AddressAncillaryDetail.subclass:
            return AddressAncillaryDetail.subclass(*args_, **kwargs_)
        else:
            return AddressAncillaryDetail(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_ns_prefix_(self):
        return self.ns_prefix_
    def set_ns_prefix_(self, ns_prefix):
        self.ns_prefix_ = ns_prefix
    def get_LocationInCity(self):
        return self.LocationInCity
    def set_LocationInCity(self, LocationInCity):
        self.LocationInCity = LocationInCity
    def get_LocationInProperty(self):
        return self.LocationInProperty
    def set_LocationInProperty(self, LocationInProperty):
        self.LocationInProperty = LocationInProperty
    def get_Accessibility(self):
        return self.Accessibility
    def set_Accessibility(self, Accessibility):
        self.Accessibility = Accessibility
    def get_Building(self):
        return self.Building
    def set_Building(self, Building):
        self.Building = Building
    def get_Department(self):
        return self.Department
    def set_Department(self, Department):
        self.Department = Department
    def get_RoomFloor(self):
        return self.RoomFloor
    def set_RoomFloor(self, RoomFloor):
        self.RoomFloor = RoomFloor
    def get_Suite(self):
        return self.Suite
    def set_Suite(self, Suite):
        self.Suite = Suite
    def get_Apartment(self):
        return self.Apartment
    def set_Apartment(self, Apartment):
        self.Apartment = Apartment
    def get_Room(self):
        return self.Room
    def set_Room(self, Room):
        self.Room = Room
    def get_CrossStreet(self):
        return self.CrossStreet
    def set_CrossStreet(self, CrossStreet):
        self.CrossStreet = CrossStreet
    def get_AdditionalDescriptions(self):
        return self.AdditionalDescriptions
    def set_AdditionalDescriptions(self, AdditionalDescriptions):
        self.AdditionalDescriptions = AdditionalDescriptions
    def add_AdditionalDescriptions(self, value):
        self.AdditionalDescriptions.append(value)
    def insert_AdditionalDescriptions_at(self, index, value):
        self.AdditionalDescriptions.insert(index, value)
    def replace_AdditionalDescriptions_at(self, index, value):
        self.AdditionalDescriptions[index] = value
    def validate_LocationAccessibilityType(self, value):
        result = True
        # Validate type LocationAccessibilityType, a restriction on xs:string.
        if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
            if not isinstance(value, str):
                lineno = self.gds_get_node_lineno_()
                self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
                return False
            enumerations = ['INSIDE', 'OUTSIDE']
            if value not in enumerations:
                lineno = self.gds_get_node_lineno_()
                self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationAccessibilityType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
                result = False
        return result
    def hasContent_(self):
        if (
            self.LocationInCity is not None or
            self.LocationInProperty is not None or
            self.Accessibility is not None or
            self.Building is not None or
            self.Department is not None or
            self.RoomFloor is not None or
            self.Suite is not None or
            self.Apartment is not None or
            self.Room is not None or
            self.CrossStreet is not None or
            self.AdditionalDescriptions
        ):
            return True
        else:
            return False
    def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='AddressAncillaryDetail', pretty_print=True):
        imported_ns_def_ = GenerateDSNamespaceDefs_.get('AddressAncillaryDetail')
        if imported_ns_def_ is not None:
            namespacedef_ = imported_ns_def_
        if pretty_print:
            eol_ = '\n'
        else:
            eol_ = ''
        if self.original_tagname_ is not None and name_ == 'AddressAncillaryDetail':
            name_ = self.original_tagname_
        if UseCapturedNS_ and self.ns_prefix_:
            namespaceprefix_ = self.ns_prefix_ + ':'
        showIndent(outfile, level, pretty_print)
        outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = set()
        self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='AddressAncillaryDetail')
        if self.hasContent_():
            outfile.write('>%s' % (eol_, ))
            self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='AddressAncillaryDetail', pretty_print=pretty_print)
            showIndent(outfile, level, pretty_print)
            outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
        else:
            outfile.write('/>%s' % (eol_, ))
    def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='AddressAncillaryDetail'):
        pass
    def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='AddressAncillaryDetail', fromsubclass_=False, pretty_print=True):
        if pretty_print:
            eol_ = '\n'
        else:
            eol_ = ''
        if self.LocationInCity is not None:
            namespaceprefix_ = self.LocationInCity_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationInCity_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sLocationInCity>%s</%sLocationInCity>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocationInCity), input_name='LocationInCity')), namespaceprefix_ , eol_))
        if self.LocationInProperty is not None:
            namespaceprefix_ = self.LocationInProperty_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationInProperty_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sLocationInProperty>%s</%sLocationInProperty>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocationInProperty), input_name='LocationInProperty')), namespaceprefix_ , eol_))
        if self.Accessibility is not None:
            namespaceprefix_ = self.Accessibility_nsprefix_ + ':' if (UseCapturedNS_ and self.Accessibility_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sAccessibility>%s</%sAccessibility>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Accessibility), input_name='Accessibility')), namespaceprefix_ , eol_))
        if self.Building is not None:
            namespaceprefix_ = self.Building_nsprefix_ + ':' if (UseCapturedNS_ and self.Building_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sBuilding>%s</%sBuilding>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Building), input_name='Building')), namespaceprefix_ , eol_))
        if self.Department is not None:
            namespaceprefix_ = self.Department_nsprefix_ + ':' if (UseCapturedNS_ and self.Department_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sDepartment>%s</%sDepartment>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Department), input_name='Department')), namespaceprefix_ , eol_))
        if self.RoomFloor is not None:
            namespaceprefix_ = self.RoomFloor_nsprefix_ + ':' if (UseCapturedNS_ and self.RoomFloor_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sRoomFloor>%s</%sRoomFloor>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.RoomFloor), input_name='RoomFloor')), namespaceprefix_ , eol_))
        if self.Suite is not None:
            namespaceprefix_ = self.Suite_nsprefix_ + ':' if (UseCapturedNS_ and self.Suite_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sSuite>%s</%sSuite>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Suite), input_name='Suite')), namespaceprefix_ , eol_))
        if self.Apartment is not None:
            namespaceprefix_ = self.Apartment_nsprefix_ + ':' if (UseCapturedNS_ and self.Apartment_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sApartment>%s</%sApartment>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Apartment), input_name='Apartment')), namespaceprefix_ , eol_))
        if self.Room is not None:
            namespaceprefix_ = self.Room_nsprefix_ + ':' if (UseCapturedNS_ and self.Room_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sRoom>%s</%sRoom>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Room), input_name='Room')), namespaceprefix_ , eol_))
        if self.CrossStreet is not None:
            namespaceprefix_ = self.CrossStreet_nsprefix_ + ':' if (UseCapturedNS_ and self.CrossStreet_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sCrossStreet>%s</%sCrossStreet>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CrossStreet), input_name='CrossStreet')), namespaceprefix_ , eol_))
        for AdditionalDescriptions_ in self.AdditionalDescriptions:
            namespaceprefix_ = self.AdditionalDescriptions_nsprefix_ + ':' if (UseCapturedNS_ and self.AdditionalDescriptions_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sAdditionalDescriptions>%s</%sAdditionalDescriptions>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(AdditionalDescriptions_), input_name='AdditionalDescriptions')), namespaceprefix_ , eol_))
    def build(self, node, gds_collector_=None):
        self.gds_collector_ = gds_collector_
        if SaveElementTreeNode:
            self.gds_elementtree_node_ = node
        already_processed = set()
        self.ns_prefix_ = node.prefix
        self.buildAttributes(node, node.attrib, already_processed)
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
        return self
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
        if nodeName_ == 'LocationInCity':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'LocationInCity')
            value_ = self.gds_validate_string(value_, node, 'LocationInCity')
            self.LocationInCity = value_
            self.LocationInCity_nsprefix_ = child_.prefix
        elif nodeName_ == 'LocationInProperty':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'LocationInProperty')
            value_ = self.gds_validate_string(value_, node, 'LocationInProperty')
            self.LocationInProperty = value_
            self.LocationInProperty_nsprefix_ = child_.prefix
        elif nodeName_ == 'Accessibility':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'Accessibility')
            value_ = self.gds_validate_string(value_, node, 'Accessibility')
            self.Accessibility = value_
            self.Accessibility_nsprefix_ = child_.prefix
            # validate type LocationAccessibilityType
            self.validate_LocationAccessibilityType(self.Accessibility)
        elif nodeName_ == 'Building':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'Building')
            value_ = self.gds_validate_string(value_, node, 'Building')
            self.Building = value_
            self.Building_nsprefix_ = child_.prefix
        elif nodeName_ == 'Department':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'Department')
            value_ = self.gds_validate_string(value_, node, 'Department')
            self.Department = value_
            self.Department_nsprefix_ = child_.prefix
        elif nodeName_ == 'RoomFloor':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'RoomFloor')
            value_ = self.gds_validate_string(value_, node, 'RoomFloor')
            self.RoomFloor = value_
            self.RoomFloor_nsprefix_ = child_.prefix
        elif nodeName_ == 'Suite':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'Suite')
            value_ = self.gds_validate_string(value_, node, 'Suite')
            self.Suite = value_
            self.Suite_nsprefix_ = child_.prefix
        elif nodeName_ == 'Apartment':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'Apartment')
            value_ = self.gds_validate_string(value_, node, 'Apartment')
            self.Apartment = value_
            self.Apartment_nsprefix_ = child_.prefix
        elif nodeName_ == 'Room':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'Room')
            value_ = self.gds_validate_string(value_, node, 'Room')
            self.Room = value_
            self.Room_nsprefix_ = child_.prefix
        elif nodeName_ == 'CrossStreet':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'CrossStreet')
            value_ = self.gds_validate_string(value_, node, 'CrossStreet')
            self.CrossStreet = value_
            self.CrossStreet_nsprefix_ = child_.prefix
        elif nodeName_ == 'AdditionalDescriptions':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'AdditionalDescriptions')
            value_ = self.gds_validate_string(value_, node, 'AdditionalDescriptions')
            self.AdditionalDescriptions.append(value_)
            self.AdditionalDescriptions_nsprefix_ = child_.prefix
# end class AddressAncillaryDetail
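# The generated validate_* methods above all follow one pattern: skip absent
# values, reject values that are not strings, then check membership in the
# schema's enumeration. A minimal, self-contained sketch of that check (the
# function name is illustrative, not part of this module; the collector and
# line-number reporting are omitted):

```python
def validate_enum(value, enumerations):
    """Sketch of the enumeration check in the generated validate_* methods."""
    if value is None:
        return True           # absent optional values are not validated
    if not isinstance(value, str):
        return False          # wrong base simple type (xs:string expected)
    return value in enumerations

print(validate_enum('INSIDE', ['INSIDE', 'OUTSIDE']))  # True
print(validate_enum('LOBBY', ['INSIDE', 'OUTSIDE']))   # False
print(validate_enum(None, ['INSIDE', 'OUTSIDE']))      # True
```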
class AddressToLocationRelationshipDetail(GeneratedsSuper):
    """Specifies the relationship between the address specified and the
    address of the FedEx Location in terms of distance."""
    __hash__ = GeneratedsSuper.__hash__
    subclass = None
    superclass = None
    def __init__(self, MatchedAddress=None, MatchedAddressGeographicCoordinates=None, DistanceAndLocationDetails=None, gds_collector_=None, **kwargs_):
        self.gds_collector_ = gds_collector_
        self.gds_elementtree_node_ = None
        self.original_tagname_ = None
        self.parent_object_ = kwargs_.get('parent_object_')
        self.ns_prefix_ = None
        self.MatchedAddress = MatchedAddress
        self.MatchedAddress_nsprefix_ = None
        self.MatchedAddressGeographicCoordinates = MatchedAddressGeographicCoordinates
        self.MatchedAddressGeographicCoordinates_nsprefix_ = None
        if DistanceAndLocationDetails is None:
            self.DistanceAndLocationDetails = []
        else:
            self.DistanceAndLocationDetails = DistanceAndLocationDetails
        self.DistanceAndLocationDetails_nsprefix_ = None
    def factory(*args_, **kwargs_):
        if CurrentSubclassModule_ is not None:
            subclass = getSubclassFromModule_(
                CurrentSubclassModule_, AddressToLocationRelationshipDetail)
            if subclass is not None:
                return subclass(*args_, **kwargs_)
        if AddressToLocationRelationshipDetail.subclass:
            return AddressToLocationRelationshipDetail.subclass(*args_, **kwargs_)
        else:
            return AddressToLocationRelationshipDetail(*args_, **kwargs_)
    factory = staticmethod(factory)
    def get_ns_prefix_(self):
        return self.ns_prefix_
    def set_ns_prefix_(self, ns_prefix):
        self.ns_prefix_ = ns_prefix
    def get_MatchedAddress(self):
        return self.MatchedAddress
    def set_MatchedAddress(self, MatchedAddress):
        self.MatchedAddress = MatchedAddress
    def get_MatchedAddressGeographicCoordinates(self):
        return self.MatchedAddressGeographicCoordinates
    def set_MatchedAddressGeographicCoordinates(self, MatchedAddressGeographicCoordinates):
        self.MatchedAddressGeographicCoordinates = MatchedAddressGeographicCoordinates
    def get_DistanceAndLocationDetails(self):
        return self.DistanceAndLocationDetails
    def set_DistanceAndLocationDetails(self, DistanceAndLocationDetails):
        self.DistanceAndLocationDetails = DistanceAndLocationDetails
    def add_DistanceAndLocationDetails(self, value):
        self.DistanceAndLocationDetails.append(value)
    def insert_DistanceAndLocationDetails_at(self, index, value):
        self.DistanceAndLocationDetails.insert(index, value)
    def replace_DistanceAndLocationDetails_at(self, index, value):
        self.DistanceAndLocationDetails[index] = value
    def hasContent_(self):
        if (
            self.MatchedAddress is not None or
            self.MatchedAddressGeographicCoordinates is not None or
            self.DistanceAndLocationDetails
        ):
            return True
        else:
            return False
    def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='AddressToLocationRelationshipDetail', pretty_print=True):
        imported_ns_def_ = GenerateDSNamespaceDefs_.get('AddressToLocationRelationshipDetail')
        if imported_ns_def_ is not None:
            namespacedef_ = imported_ns_def_
        if pretty_print:
            eol_ = '\n'
        else:
            eol_ = ''
        if self.original_tagname_ is not None and name_ == 'AddressToLocationRelationshipDetail':
            name_ = self.original_tagname_
        if UseCapturedNS_ and self.ns_prefix_:
            namespaceprefix_ = self.ns_prefix_ + ':'
        showIndent(outfile, level, pretty_print)
        outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
        already_processed = set()
        self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='AddressToLocationRelationshipDetail')
        if self.hasContent_():
            outfile.write('>%s' % (eol_, ))
            self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='AddressToLocationRelationshipDetail', pretty_print=pretty_print)
            showIndent(outfile, level, pretty_print)
            outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
        else:
            outfile.write('/>%s' % (eol_, ))
    def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='AddressToLocationRelationshipDetail'):
        pass
    def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='AddressToLocationRelationshipDetail', fromsubclass_=False, pretty_print=True):
        if pretty_print:
            eol_ = '\n'
        else:
            eol_ = ''
        if self.MatchedAddress is not None:
            namespaceprefix_ = self.MatchedAddress_nsprefix_ + ':' if (UseCapturedNS_ and self.MatchedAddress_nsprefix_) else ''
            self.MatchedAddress.export(outfile, level, namespaceprefix_, namespacedef_='', name_='MatchedAddress', pretty_print=pretty_print)
        if self.MatchedAddressGeographicCoordinates is not None:
            namespaceprefix_ = self.MatchedAddressGeographicCoordinates_nsprefix_ + ':' if (UseCapturedNS_ and self.MatchedAddressGeographicCoordinates_nsprefix_) else ''
            showIndent(outfile, level, pretty_print)
            outfile.write('<%sMatchedAddressGeographicCoordinates>%s</%sMatchedAddressGeographicCoordinates>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.MatchedAddressGeographicCoordinates), input_name='MatchedAddressGeographicCoordinates')), namespaceprefix_ , eol_))
        for DistanceAndLocationDetails_ in self.DistanceAndLocationDetails:
            namespaceprefix_ = self.DistanceAndLocationDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.DistanceAndLocationDetails_nsprefix_) else ''
            DistanceAndLocationDetails_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='DistanceAndLocationDetails', pretty_print=pretty_print)
    def build(self, node, gds_collector_=None):
        self.gds_collector_ = gds_collector_
        if SaveElementTreeNode:
            self.gds_elementtree_node_ = node
        already_processed = set()
        self.ns_prefix_ = node.prefix
        self.buildAttributes(node, node.attrib, already_processed)
        for child in node:
            nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
            self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
        return self
    def buildAttributes(self, node, attrs, already_processed):
        pass
    def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
        if nodeName_ == 'MatchedAddress':
            obj_ = Address.factory(parent_object_=self)
            obj_.build(child_, gds_collector_=gds_collector_)
            self.MatchedAddress = obj_
            obj_.original_tagname_ = 'MatchedAddress'
        elif nodeName_ == 'MatchedAddressGeographicCoordinates':
            value_ = child_.text
            value_ = self.gds_parse_string(value_, node, 'MatchedAddressGeographicCoordinates')
            value_ = self.gds_validate_string(value_, node, 'MatchedAddressGeographicCoordinates')
            self.MatchedAddressGeographicCoordinates = value_
            self.MatchedAddressGeographicCoordinates_nsprefix_ = child_.prefix
        elif nodeName_ == 'DistanceAndLocationDetails':
            obj_ = DistanceAndLocationDetail.factory(parent_object_=self)
            obj_.build(child_, gds_collector_=gds_collector_)
            self.DistanceAndLocationDetails.append(obj_)
            obj_.original_tagname_ = 'DistanceAndLocationDetails'
# end class AddressToLocationRelationshipDetail
class CarrierDetail(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Carrier=None, ServiceCategory=None, ServiceType=None, CountryRelationship=None, NormalLatestDropOffDetails=None, ExceptionalLatestDropOffDetails=None, EffectiveLatestDropOffDetails=None, ShippingHolidays=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Carrier = Carrier
self.validate_CarrierCodeType(self.Carrier)
self.Carrier_nsprefix_ = None
self.ServiceCategory = ServiceCategory
self.validate_ServiceCategoryType(self.ServiceCategory)
self.ServiceCategory_nsprefix_ = None
self.ServiceType = ServiceType
self.ServiceType_nsprefix_ = None
self.CountryRelationship = CountryRelationship
self.validate_CountryRelationshipType(self.CountryRelationship)
self.CountryRelationship_nsprefix_ = None
if NormalLatestDropOffDetails is None:
self.NormalLatestDropOffDetails = []
else:
self.NormalLatestDropOffDetails = NormalLatestDropOffDetails
self.NormalLatestDropOffDetails_nsprefix_ = None
if ExceptionalLatestDropOffDetails is None:
self.ExceptionalLatestDropOffDetails = []
else:
self.ExceptionalLatestDropOffDetails = ExceptionalLatestDropOffDetails
self.ExceptionalLatestDropOffDetails_nsprefix_ = None
self.EffectiveLatestDropOffDetails = EffectiveLatestDropOffDetails
self.EffectiveLatestDropOffDetails_nsprefix_ = None
if ShippingHolidays is None:
self.ShippingHolidays = []
else:
self.ShippingHolidays = ShippingHolidays
self.ShippingHolidays_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, CarrierDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if CarrierDetail.subclass:
return CarrierDetail.subclass(*args_, **kwargs_)
else:
return CarrierDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Carrier(self):
return self.Carrier
def set_Carrier(self, Carrier):
self.Carrier = Carrier
def get_ServiceCategory(self):
return self.ServiceCategory
def set_ServiceCategory(self, ServiceCategory):
self.ServiceCategory = ServiceCategory
def get_ServiceType(self):
return self.ServiceType
def set_ServiceType(self, ServiceType):
self.ServiceType = ServiceType
def get_CountryRelationship(self):
return self.CountryRelationship
def set_CountryRelationship(self, CountryRelationship):
self.CountryRelationship = CountryRelationship
def get_NormalLatestDropOffDetails(self):
return self.NormalLatestDropOffDetails
def set_NormalLatestDropOffDetails(self, NormalLatestDropOffDetails):
self.NormalLatestDropOffDetails = NormalLatestDropOffDetails
def add_NormalLatestDropOffDetails(self, value):
self.NormalLatestDropOffDetails.append(value)
def insert_NormalLatestDropOffDetails_at(self, index, value):
self.NormalLatestDropOffDetails.insert(index, value)
def replace_NormalLatestDropOffDetails_at(self, index, value):
self.NormalLatestDropOffDetails[index] = value
def get_ExceptionalLatestDropOffDetails(self):
return self.ExceptionalLatestDropOffDetails
def set_ExceptionalLatestDropOffDetails(self, ExceptionalLatestDropOffDetails):
self.ExceptionalLatestDropOffDetails = ExceptionalLatestDropOffDetails
def add_ExceptionalLatestDropOffDetails(self, value):
self.ExceptionalLatestDropOffDetails.append(value)
def insert_ExceptionalLatestDropOffDetails_at(self, index, value):
self.ExceptionalLatestDropOffDetails.insert(index, value)
def replace_ExceptionalLatestDropOffDetails_at(self, index, value):
self.ExceptionalLatestDropOffDetails[index] = value
def get_EffectiveLatestDropOffDetails(self):
return self.EffectiveLatestDropOffDetails
def set_EffectiveLatestDropOffDetails(self, EffectiveLatestDropOffDetails):
self.EffectiveLatestDropOffDetails = EffectiveLatestDropOffDetails
def get_ShippingHolidays(self):
return self.ShippingHolidays
def set_ShippingHolidays(self, ShippingHolidays):
self.ShippingHolidays = ShippingHolidays
def add_ShippingHolidays(self, value):
self.ShippingHolidays.append(value)
def insert_ShippingHolidays_at(self, index, value):
self.ShippingHolidays.insert(index, value)
def replace_ShippingHolidays_at(self, index, value):
self.ShippingHolidays[index] = value
def validate_CarrierCodeType(self, value):
result = True
# Validate type CarrierCodeType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FDXC', 'FDXE', 'FDXG', 'FXCC', 'FXFR', 'FXSP']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on CarrierCodeType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_ServiceCategoryType(self, value):
result = True
# Validate type ServiceCategoryType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['EXPRESS_FREIGHT', 'EXPRESS_PARCEL', 'GROUND_HOME_DELIVERY']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on ServiceCategoryType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_CountryRelationshipType(self, value):
result = True
# Validate type CountryRelationshipType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['DOMESTIC', 'INTERNATIONAL']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on CountryRelationshipType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
    def hasContent_(self):
        return bool(
            self.Carrier is not None or
            self.ServiceCategory is not None or
            self.ServiceType is not None or
            self.CountryRelationship is not None or
            self.NormalLatestDropOffDetails or
            self.ExceptionalLatestDropOffDetails or
            self.EffectiveLatestDropOffDetails is not None or
            self.ShippingHolidays
        )
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='CarrierDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('CarrierDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'CarrierDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='CarrierDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='CarrierDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='CarrierDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='CarrierDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Carrier is not None:
namespaceprefix_ = self.Carrier_nsprefix_ + ':' if (UseCapturedNS_ and self.Carrier_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCarrier>%s</%sCarrier>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Carrier), input_name='Carrier')), namespaceprefix_ , eol_))
if self.ServiceCategory is not None:
namespaceprefix_ = self.ServiceCategory_nsprefix_ + ':' if (UseCapturedNS_ and self.ServiceCategory_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sServiceCategory>%s</%sServiceCategory>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ServiceCategory), input_name='ServiceCategory')), namespaceprefix_ , eol_))
if self.ServiceType is not None:
namespaceprefix_ = self.ServiceType_nsprefix_ + ':' if (UseCapturedNS_ and self.ServiceType_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sServiceType>%s</%sServiceType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ServiceType), input_name='ServiceType')), namespaceprefix_ , eol_))
if self.CountryRelationship is not None:
namespaceprefix_ = self.CountryRelationship_nsprefix_ + ':' if (UseCapturedNS_ and self.CountryRelationship_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCountryRelationship>%s</%sCountryRelationship>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CountryRelationship), input_name='CountryRelationship')), namespaceprefix_ , eol_))
for NormalLatestDropOffDetails_ in self.NormalLatestDropOffDetails:
namespaceprefix_ = self.NormalLatestDropOffDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.NormalLatestDropOffDetails_nsprefix_) else ''
NormalLatestDropOffDetails_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='NormalLatestDropOffDetails', pretty_print=pretty_print)
for ExceptionalLatestDropOffDetails_ in self.ExceptionalLatestDropOffDetails:
namespaceprefix_ = self.ExceptionalLatestDropOffDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.ExceptionalLatestDropOffDetails_nsprefix_) else ''
ExceptionalLatestDropOffDetails_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ExceptionalLatestDropOffDetails', pretty_print=pretty_print)
if self.EffectiveLatestDropOffDetails is not None:
namespaceprefix_ = self.EffectiveLatestDropOffDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.EffectiveLatestDropOffDetails_nsprefix_) else ''
self.EffectiveLatestDropOffDetails.export(outfile, level, namespaceprefix_, namespacedef_='', name_='EffectiveLatestDropOffDetails', pretty_print=pretty_print)
for ShippingHolidays_ in self.ShippingHolidays:
namespaceprefix_ = self.ShippingHolidays_nsprefix_ + ':' if (UseCapturedNS_ and self.ShippingHolidays_nsprefix_) else ''
ShippingHolidays_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ShippingHolidays', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Carrier':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Carrier')
value_ = self.gds_validate_string(value_, node, 'Carrier')
self.Carrier = value_
self.Carrier_nsprefix_ = child_.prefix
# validate type CarrierCodeType
self.validate_CarrierCodeType(self.Carrier)
elif nodeName_ == 'ServiceCategory':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ServiceCategory')
value_ = self.gds_validate_string(value_, node, 'ServiceCategory')
self.ServiceCategory = value_
self.ServiceCategory_nsprefix_ = child_.prefix
# validate type ServiceCategoryType
self.validate_ServiceCategoryType(self.ServiceCategory)
elif nodeName_ == 'ServiceType':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ServiceType')
value_ = self.gds_validate_string(value_, node, 'ServiceType')
self.ServiceType = value_
self.ServiceType_nsprefix_ = child_.prefix
elif nodeName_ == 'CountryRelationship':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'CountryRelationship')
value_ = self.gds_validate_string(value_, node, 'CountryRelationship')
self.CountryRelationship = value_
self.CountryRelationship_nsprefix_ = child_.prefix
# validate type CountryRelationshipType
self.validate_CountryRelationshipType(self.CountryRelationship)
elif nodeName_ == 'NormalLatestDropOffDetails':
obj_ = LatestDropOffDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.NormalLatestDropOffDetails.append(obj_)
obj_.original_tagname_ = 'NormalLatestDropOffDetails'
elif nodeName_ == 'ExceptionalLatestDropOffDetails':
obj_ = LatestDropOffDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ExceptionalLatestDropOffDetails.append(obj_)
obj_.original_tagname_ = 'ExceptionalLatestDropOffDetails'
elif nodeName_ == 'EffectiveLatestDropOffDetails':
obj_ = LatestDropOffDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.EffectiveLatestDropOffDetails = obj_
obj_.original_tagname_ = 'EffectiveLatestDropOffDetails'
elif nodeName_ == 'ShippingHolidays':
obj_ = ShippingHoliday.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ShippingHolidays.append(obj_)
obj_.original_tagname_ = 'ShippingHolidays'
# end class CarrierDetail
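The validate_*Type methods above all share one generated pattern: confirm the value is a str, then check it against the xsd enumeration list, reporting failures through the collector. A minimal standalone sketch of that pattern (the helper name and the list-based collector are illustrative, not part of the generated bindings):

```python
def validate_enumeration(value, enumerations, collector):
    """Return True when value is None or a str contained in enumerations.

    Mirrors the shape of the generated validate_*Type methods: a wrong
    base type or a value outside the enumeration records a message in
    the collector and fails the check.
    """
    if value is None:
        return True  # absent optional elements are not validated
    if not isinstance(value, str):
        collector.append('Value %r is not of the correct base simple type (str)' % (value,))
        return False
    if value not in enumerations:
        collector.append('Value %r does not match the xsd enumeration restriction' % (value,))
        return False
    return True
```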
class ClearanceCountryDetail(GeneratedsSuper):
"""Specifies the special services supported at the clearance location for
an individual destination country."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, ClearanceCountry=None, ServicesSupported=None, SpecialServicesSupported=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.ClearanceCountry = ClearanceCountry
self.ClearanceCountry_nsprefix_ = None
if ServicesSupported is None:
self.ServicesSupported = []
else:
self.ServicesSupported = ServicesSupported
self.ServicesSupported_nsprefix_ = None
if SpecialServicesSupported is None:
self.SpecialServicesSupported = []
else:
self.SpecialServicesSupported = SpecialServicesSupported
self.SpecialServicesSupported_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, ClearanceCountryDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if ClearanceCountryDetail.subclass:
return ClearanceCountryDetail.subclass(*args_, **kwargs_)
else:
return ClearanceCountryDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_ClearanceCountry(self):
return self.ClearanceCountry
def set_ClearanceCountry(self, ClearanceCountry):
self.ClearanceCountry = ClearanceCountry
def get_ServicesSupported(self):
return self.ServicesSupported
def set_ServicesSupported(self, ServicesSupported):
self.ServicesSupported = ServicesSupported
def add_ServicesSupported(self, value):
self.ServicesSupported.append(value)
def insert_ServicesSupported_at(self, index, value):
self.ServicesSupported.insert(index, value)
def replace_ServicesSupported_at(self, index, value):
self.ServicesSupported[index] = value
def get_SpecialServicesSupported(self):
return self.SpecialServicesSupported
def set_SpecialServicesSupported(self, SpecialServicesSupported):
self.SpecialServicesSupported = SpecialServicesSupported
def add_SpecialServicesSupported(self, value):
self.SpecialServicesSupported.append(value)
def insert_SpecialServicesSupported_at(self, index, value):
self.SpecialServicesSupported.insert(index, value)
def replace_SpecialServicesSupported_at(self, index, value):
self.SpecialServicesSupported[index] = value
    def hasContent_(self):
        return bool(
            self.ClearanceCountry is not None or
            self.ServicesSupported or
            self.SpecialServicesSupported
        )
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ClearanceCountryDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('ClearanceCountryDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'ClearanceCountryDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='ClearanceCountryDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='ClearanceCountryDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='ClearanceCountryDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ClearanceCountryDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.ClearanceCountry is not None:
namespaceprefix_ = self.ClearanceCountry_nsprefix_ + ':' if (UseCapturedNS_ and self.ClearanceCountry_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sClearanceCountry>%s</%sClearanceCountry>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ClearanceCountry), input_name='ClearanceCountry')), namespaceprefix_ , eol_))
for ServicesSupported_ in self.ServicesSupported:
namespaceprefix_ = self.ServicesSupported_nsprefix_ + ':' if (UseCapturedNS_ and self.ServicesSupported_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sServicesSupported>%s</%sServicesSupported>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(ServicesSupported_), input_name='ServicesSupported')), namespaceprefix_ , eol_))
for SpecialServicesSupported_ in self.SpecialServicesSupported:
namespaceprefix_ = self.SpecialServicesSupported_nsprefix_ + ':' if (UseCapturedNS_ and self.SpecialServicesSupported_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sSpecialServicesSupported>%s</%sSpecialServicesSupported>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(SpecialServicesSupported_), input_name='SpecialServicesSupported')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'ClearanceCountry':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ClearanceCountry')
value_ = self.gds_validate_string(value_, node, 'ClearanceCountry')
self.ClearanceCountry = value_
self.ClearanceCountry_nsprefix_ = child_.prefix
elif nodeName_ == 'ServicesSupported':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ServicesSupported')
value_ = self.gds_validate_string(value_, node, 'ServicesSupported')
self.ServicesSupported.append(value_)
self.ServicesSupported_nsprefix_ = child_.prefix
elif nodeName_ == 'SpecialServicesSupported':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'SpecialServicesSupported')
value_ = self.gds_validate_string(value_, node, 'SpecialServicesSupported')
self.SpecialServicesSupported.append(value_)
self.SpecialServicesSupported_nsprefix_ = child_.prefix
# end class ClearanceCountryDetail
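The build() methods resolve each child's local tag name via Tag_pattern_, stripping the `{namespace-uri}` prefix that ElementTree folds into `child.tag`. A self-contained sketch of that lookup (the regular expression matches the one generateDS emits; the helper name is illustrative):

```python
import re

# ElementTree reports namespaced tags as '{uri}LocalName'; the optional
# first group swallows the '{uri}' part, leaving the local name as the
# last group.
Tag_pattern = re.compile(r'({.*})?(.*)')

def local_tag_name(tag):
    # Equivalent to Tag_pattern_.match(child.tag).groups()[-1] in build().
    return Tag_pattern.match(tag).groups()[-1]
```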
class ClearanceLocationDetail(GeneratedsSuper):
"""Specifies the details about the countries supported by this location."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, ServicesSupported=None, ConsolidationType=None, ClearanceLocationType=None, SpecialServicesSupported=None, ClearanceCountries=None, ClearanceRoutingCode=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
if ServicesSupported is None:
self.ServicesSupported = []
else:
self.ServicesSupported = ServicesSupported
self.ServicesSupported_nsprefix_ = None
self.ConsolidationType = ConsolidationType
self.validate_ConsolidationType(self.ConsolidationType)
self.ConsolidationType_nsprefix_ = None
self.ClearanceLocationType = ClearanceLocationType
self.validate_DistributionClearanceType(self.ClearanceLocationType)
self.ClearanceLocationType_nsprefix_ = None
if SpecialServicesSupported is None:
self.SpecialServicesSupported = []
else:
self.SpecialServicesSupported = SpecialServicesSupported
self.SpecialServicesSupported_nsprefix_ = None
if ClearanceCountries is None:
self.ClearanceCountries = []
else:
self.ClearanceCountries = ClearanceCountries
self.ClearanceCountries_nsprefix_ = None
self.ClearanceRoutingCode = ClearanceRoutingCode
self.ClearanceRoutingCode_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, ClearanceLocationDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if ClearanceLocationDetail.subclass:
return ClearanceLocationDetail.subclass(*args_, **kwargs_)
else:
return ClearanceLocationDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_ServicesSupported(self):
return self.ServicesSupported
def set_ServicesSupported(self, ServicesSupported):
self.ServicesSupported = ServicesSupported
def add_ServicesSupported(self, value):
self.ServicesSupported.append(value)
def insert_ServicesSupported_at(self, index, value):
self.ServicesSupported.insert(index, value)
def replace_ServicesSupported_at(self, index, value):
self.ServicesSupported[index] = value
def get_ConsolidationType(self):
return self.ConsolidationType
def set_ConsolidationType(self, ConsolidationType):
self.ConsolidationType = ConsolidationType
def get_ClearanceLocationType(self):
return self.ClearanceLocationType
def set_ClearanceLocationType(self, ClearanceLocationType):
self.ClearanceLocationType = ClearanceLocationType
def get_SpecialServicesSupported(self):
return self.SpecialServicesSupported
def set_SpecialServicesSupported(self, SpecialServicesSupported):
self.SpecialServicesSupported = SpecialServicesSupported
def add_SpecialServicesSupported(self, value):
self.SpecialServicesSupported.append(value)
def insert_SpecialServicesSupported_at(self, index, value):
self.SpecialServicesSupported.insert(index, value)
def replace_SpecialServicesSupported_at(self, index, value):
self.SpecialServicesSupported[index] = value
def get_ClearanceCountries(self):
return self.ClearanceCountries
def set_ClearanceCountries(self, ClearanceCountries):
self.ClearanceCountries = ClearanceCountries
def add_ClearanceCountries(self, value):
self.ClearanceCountries.append(value)
def insert_ClearanceCountries_at(self, index, value):
self.ClearanceCountries.insert(index, value)
def replace_ClearanceCountries_at(self, index, value):
self.ClearanceCountries[index] = value
def get_ClearanceRoutingCode(self):
return self.ClearanceRoutingCode
def set_ClearanceRoutingCode(self, ClearanceRoutingCode):
self.ClearanceRoutingCode = ClearanceRoutingCode
def validate_ConsolidationType(self, value):
result = True
# Validate type ConsolidationType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['INTERNATIONAL_DISTRIBUTION_FREIGHT', 'INTERNATIONAL_ECONOMY_DISTRIBUTION', 'INTERNATIONAL_GROUND_DISTRIBUTION', 'INTERNATIONAL_PRIORITY_DISTRIBUTION', 'TRANSBORDER_DISTRIBUTION']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on ConsolidationType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_DistributionClearanceType(self, value):
result = True
# Validate type DistributionClearanceType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['DESTINATION_COUNTRY_CLEARANCE', 'SINGLE_POINT_OF_CLEARANCE']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on DistributionClearanceType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
    def hasContent_(self):
        return bool(
            self.ServicesSupported or
            self.ConsolidationType is not None or
            self.ClearanceLocationType is not None or
            self.SpecialServicesSupported or
            self.ClearanceCountries or
            self.ClearanceRoutingCode is not None
        )
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ClearanceLocationDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('ClearanceLocationDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'ClearanceLocationDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='ClearanceLocationDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='ClearanceLocationDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='ClearanceLocationDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ClearanceLocationDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
for ServicesSupported_ in self.ServicesSupported:
namespaceprefix_ = self.ServicesSupported_nsprefix_ + ':' if (UseCapturedNS_ and self.ServicesSupported_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sServicesSupported>%s</%sServicesSupported>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(ServicesSupported_), input_name='ServicesSupported')), namespaceprefix_ , eol_))
if self.ConsolidationType is not None:
namespaceprefix_ = self.ConsolidationType_nsprefix_ + ':' if (UseCapturedNS_ and self.ConsolidationType_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sConsolidationType>%s</%sConsolidationType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ConsolidationType), input_name='ConsolidationType')), namespaceprefix_ , eol_))
if self.ClearanceLocationType is not None:
namespaceprefix_ = self.ClearanceLocationType_nsprefix_ + ':' if (UseCapturedNS_ and self.ClearanceLocationType_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sClearanceLocationType>%s</%sClearanceLocationType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ClearanceLocationType), input_name='ClearanceLocationType')), namespaceprefix_ , eol_))
for SpecialServicesSupported_ in self.SpecialServicesSupported:
namespaceprefix_ = self.SpecialServicesSupported_nsprefix_ + ':' if (UseCapturedNS_ and self.SpecialServicesSupported_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sSpecialServicesSupported>%s</%sSpecialServicesSupported>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(SpecialServicesSupported_), input_name='SpecialServicesSupported')), namespaceprefix_ , eol_))
for ClearanceCountries_ in self.ClearanceCountries:
namespaceprefix_ = self.ClearanceCountries_nsprefix_ + ':' if (UseCapturedNS_ and self.ClearanceCountries_nsprefix_) else ''
ClearanceCountries_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ClearanceCountries', pretty_print=pretty_print)
if self.ClearanceRoutingCode is not None:
namespaceprefix_ = self.ClearanceRoutingCode_nsprefix_ + ':' if (UseCapturedNS_ and self.ClearanceRoutingCode_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sClearanceRoutingCode>%s</%sClearanceRoutingCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ClearanceRoutingCode), input_name='ClearanceRoutingCode')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'ServicesSupported':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ServicesSupported')
value_ = self.gds_validate_string(value_, node, 'ServicesSupported')
self.ServicesSupported.append(value_)
self.ServicesSupported_nsprefix_ = child_.prefix
elif nodeName_ == 'ConsolidationType':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ConsolidationType')
value_ = self.gds_validate_string(value_, node, 'ConsolidationType')
self.ConsolidationType = value_
self.ConsolidationType_nsprefix_ = child_.prefix
# validate type ConsolidationType
self.validate_ConsolidationType(self.ConsolidationType)
elif nodeName_ == 'ClearanceLocationType':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ClearanceLocationType')
value_ = self.gds_validate_string(value_, node, 'ClearanceLocationType')
self.ClearanceLocationType = value_
self.ClearanceLocationType_nsprefix_ = child_.prefix
# validate type DistributionClearanceType
self.validate_DistributionClearanceType(self.ClearanceLocationType)
elif nodeName_ == 'SpecialServicesSupported':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'SpecialServicesSupported')
value_ = self.gds_validate_string(value_, node, 'SpecialServicesSupported')
self.SpecialServicesSupported.append(value_)
self.SpecialServicesSupported_nsprefix_ = child_.prefix
elif nodeName_ == 'ClearanceCountries':
obj_ = ClearanceCountryDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ClearanceCountries.append(obj_)
obj_.original_tagname_ = 'ClearanceCountries'
elif nodeName_ == 'ClearanceRoutingCode':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ClearanceRoutingCode')
value_ = self.gds_validate_string(value_, node, 'ClearanceRoutingCode')
self.ClearanceRoutingCode = value_
self.ClearanceRoutingCode_nsprefix_ = child_.prefix
# end class ClearanceLocationDetail
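exportChildren writes each simple-typed child as an indented `<Name>text</Name>` line, with quote_xml escaping markup characters in the text. A standalone approximation using the stdlib escaper (the helper is illustrative; the generated quote_xml additionally handles non-string values and CDATA content):

```python
import io
from xml.sax.saxutils import escape

def write_simple_element(outfile, name, value, level=0, eol='\n'):
    # Indent, then emit <Name>escaped text</Name>, much as exportChildren
    # does for string-valued children such as AccountNumber and MeterNumber.
    outfile.write('    ' * level)
    outfile.write('<%s>%s</%s>%s' % (name, escape(str(value)), name, eol))
```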
class ClientDetail(GeneratedsSuper):
"""Descriptive data for the client submitting a transaction."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, AccountNumber=None, MeterNumber=None, MeterInstance=None, IntegratorId=None, Region=None, Localization=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.AccountNumber = AccountNumber
self.AccountNumber_nsprefix_ = None
self.MeterNumber = MeterNumber
self.MeterNumber_nsprefix_ = None
self.MeterInstance = MeterInstance
self.MeterInstance_nsprefix_ = None
self.IntegratorId = IntegratorId
self.IntegratorId_nsprefix_ = None
self.Region = Region
self.validate_ExpressRegionCode(self.Region)
self.Region_nsprefix_ = None
self.Localization = Localization
self.Localization_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, ClientDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if ClientDetail.subclass:
return ClientDetail.subclass(*args_, **kwargs_)
else:
return ClientDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_AccountNumber(self):
return self.AccountNumber
def set_AccountNumber(self, AccountNumber):
self.AccountNumber = AccountNumber
def get_MeterNumber(self):
return self.MeterNumber
def set_MeterNumber(self, MeterNumber):
self.MeterNumber = MeterNumber
def get_MeterInstance(self):
return self.MeterInstance
def set_MeterInstance(self, MeterInstance):
self.MeterInstance = MeterInstance
def get_IntegratorId(self):
return self.IntegratorId
def set_IntegratorId(self, IntegratorId):
self.IntegratorId = IntegratorId
def get_Region(self):
return self.Region
def set_Region(self, Region):
self.Region = Region
def get_Localization(self):
return self.Localization
def set_Localization(self, Localization):
self.Localization = Localization
def validate_ExpressRegionCode(self, value):
result = True
# Validate type ExpressRegionCode, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['APAC', 'CA', 'EMEA', 'LAC', 'US']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on ExpressRegionCode' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
    def hasContent_(self):
        return bool(
            self.AccountNumber is not None or
            self.MeterNumber is not None or
            self.MeterInstance is not None or
            self.IntegratorId is not None or
            self.Region is not None or
            self.Localization is not None
        )
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ClientDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('ClientDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'ClientDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='ClientDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='ClientDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='ClientDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ClientDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.AccountNumber is not None:
namespaceprefix_ = self.AccountNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.AccountNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sAccountNumber>%s</%sAccountNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.AccountNumber), input_name='AccountNumber')), namespaceprefix_ , eol_))
if self.MeterNumber is not None:
namespaceprefix_ = self.MeterNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.MeterNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sMeterNumber>%s</%sMeterNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.MeterNumber), input_name='MeterNumber')), namespaceprefix_ , eol_))
if self.MeterInstance is not None:
namespaceprefix_ = self.MeterInstance_nsprefix_ + ':' if (UseCapturedNS_ and self.MeterInstance_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sMeterInstance>%s</%sMeterInstance>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.MeterInstance), input_name='MeterInstance')), namespaceprefix_ , eol_))
if self.IntegratorId is not None:
namespaceprefix_ = self.IntegratorId_nsprefix_ + ':' if (UseCapturedNS_ and self.IntegratorId_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sIntegratorId>%s</%sIntegratorId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.IntegratorId), input_name='IntegratorId')), namespaceprefix_ , eol_))
if self.Region is not None:
namespaceprefix_ = self.Region_nsprefix_ + ':' if (UseCapturedNS_ and self.Region_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sRegion>%s</%sRegion>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Region), input_name='Region')), namespaceprefix_ , eol_))
if self.Localization is not None:
namespaceprefix_ = self.Localization_nsprefix_ + ':' if (UseCapturedNS_ and self.Localization_nsprefix_) else ''
self.Localization.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Localization', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'AccountNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'AccountNumber')
value_ = self.gds_validate_string(value_, node, 'AccountNumber')
self.AccountNumber = value_
self.AccountNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'MeterNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'MeterNumber')
value_ = self.gds_validate_string(value_, node, 'MeterNumber')
self.MeterNumber = value_
self.MeterNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'MeterInstance':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'MeterInstance')
value_ = self.gds_validate_string(value_, node, 'MeterInstance')
self.MeterInstance = value_
self.MeterInstance_nsprefix_ = child_.prefix
elif nodeName_ == 'IntegratorId':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'IntegratorId')
value_ = self.gds_validate_string(value_, node, 'IntegratorId')
self.IntegratorId = value_
self.IntegratorId_nsprefix_ = child_.prefix
elif nodeName_ == 'Region':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Region')
value_ = self.gds_validate_string(value_, node, 'Region')
self.Region = value_
self.Region_nsprefix_ = child_.prefix
# validate type ExpressRegionCode
self.validate_ExpressRegionCode(self.Region)
elif nodeName_ == 'Localization':
obj_ = Localization.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Localization = obj_
obj_.original_tagname_ = 'Localization'
# end class ClientDetail
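# Illustrative usage (a hedged sketch, not part of the generated schema code):
# a ClientDetail is typically populated through its setters and serialized
# with export(). Region must be one of the ExpressRegionCode enumeration
# values ('APAC', 'CA', 'EMEA', 'LAC', 'US'); otherwise
# validate_ExpressRegionCode() records a message on the collector when
# simple-type validation is enabled.
#
#     import sys
#     detail = ClientDetail.factory()
#     detail.set_AccountNumber('123456789')
#     detail.set_MeterNumber('987654')
#     detail.set_Region('US')
#     detail.export(sys.stdout, 0, name_='ClientDetail')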
class Contact(GeneratedsSuper):
"""The descriptive data for a point-of-contact person."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, ContactId=None, PersonName=None, Title=None, CompanyName=None, PhoneNumber=None, PhoneExtension=None, TollFreePhoneNumber=None, PagerNumber=None, FaxNumber=None, EMailAddress=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.ContactId = ContactId
self.ContactId_nsprefix_ = None
self.PersonName = PersonName
self.PersonName_nsprefix_ = None
self.Title = Title
self.Title_nsprefix_ = None
self.CompanyName = CompanyName
self.CompanyName_nsprefix_ = None
self.PhoneNumber = PhoneNumber
self.PhoneNumber_nsprefix_ = None
self.PhoneExtension = PhoneExtension
self.PhoneExtension_nsprefix_ = None
self.TollFreePhoneNumber = TollFreePhoneNumber
self.TollFreePhoneNumber_nsprefix_ = None
self.PagerNumber = PagerNumber
self.PagerNumber_nsprefix_ = None
self.FaxNumber = FaxNumber
self.FaxNumber_nsprefix_ = None
self.EMailAddress = EMailAddress
self.EMailAddress_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Contact)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Contact.subclass:
return Contact.subclass(*args_, **kwargs_)
else:
return Contact(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_ContactId(self):
return self.ContactId
def set_ContactId(self, ContactId):
self.ContactId = ContactId
def get_PersonName(self):
return self.PersonName
def set_PersonName(self, PersonName):
self.PersonName = PersonName
def get_Title(self):
return self.Title
def set_Title(self, Title):
self.Title = Title
def get_CompanyName(self):
return self.CompanyName
def set_CompanyName(self, CompanyName):
self.CompanyName = CompanyName
def get_PhoneNumber(self):
return self.PhoneNumber
def set_PhoneNumber(self, PhoneNumber):
self.PhoneNumber = PhoneNumber
def get_PhoneExtension(self):
return self.PhoneExtension
def set_PhoneExtension(self, PhoneExtension):
self.PhoneExtension = PhoneExtension
def get_TollFreePhoneNumber(self):
return self.TollFreePhoneNumber
def set_TollFreePhoneNumber(self, TollFreePhoneNumber):
self.TollFreePhoneNumber = TollFreePhoneNumber
def get_PagerNumber(self):
return self.PagerNumber
def set_PagerNumber(self, PagerNumber):
self.PagerNumber = PagerNumber
def get_FaxNumber(self):
return self.FaxNumber
def set_FaxNumber(self, FaxNumber):
self.FaxNumber = FaxNumber
def get_EMailAddress(self):
return self.EMailAddress
def set_EMailAddress(self, EMailAddress):
self.EMailAddress = EMailAddress
def hasContent_(self):
if (
self.ContactId is not None or
self.PersonName is not None or
self.Title is not None or
self.CompanyName is not None or
self.PhoneNumber is not None or
self.PhoneExtension is not None or
self.TollFreePhoneNumber is not None or
self.PagerNumber is not None or
self.FaxNumber is not None or
self.EMailAddress is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Contact', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('Contact')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'Contact':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Contact')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Contact', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Contact'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Contact', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.ContactId is not None:
namespaceprefix_ = self.ContactId_nsprefix_ + ':' if (UseCapturedNS_ and self.ContactId_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sContactId>%s</%sContactId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ContactId), input_name='ContactId')), namespaceprefix_ , eol_))
if self.PersonName is not None:
namespaceprefix_ = self.PersonName_nsprefix_ + ':' if (UseCapturedNS_ and self.PersonName_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sPersonName>%s</%sPersonName>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.PersonName), input_name='PersonName')), namespaceprefix_ , eol_))
if self.Title is not None:
namespaceprefix_ = self.Title_nsprefix_ + ':' if (UseCapturedNS_ and self.Title_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTitle>%s</%sTitle>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Title), input_name='Title')), namespaceprefix_ , eol_))
if self.CompanyName is not None:
namespaceprefix_ = self.CompanyName_nsprefix_ + ':' if (UseCapturedNS_ and self.CompanyName_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCompanyName>%s</%sCompanyName>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CompanyName), input_name='CompanyName')), namespaceprefix_ , eol_))
if self.PhoneNumber is not None:
namespaceprefix_ = self.PhoneNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.PhoneNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sPhoneNumber>%s</%sPhoneNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.PhoneNumber), input_name='PhoneNumber')), namespaceprefix_ , eol_))
if self.PhoneExtension is not None:
namespaceprefix_ = self.PhoneExtension_nsprefix_ + ':' if (UseCapturedNS_ and self.PhoneExtension_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sPhoneExtension>%s</%sPhoneExtension>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.PhoneExtension), input_name='PhoneExtension')), namespaceprefix_ , eol_))
if self.TollFreePhoneNumber is not None:
namespaceprefix_ = self.TollFreePhoneNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.TollFreePhoneNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTollFreePhoneNumber>%s</%sTollFreePhoneNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.TollFreePhoneNumber), input_name='TollFreePhoneNumber')), namespaceprefix_ , eol_))
if self.PagerNumber is not None:
namespaceprefix_ = self.PagerNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.PagerNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sPagerNumber>%s</%sPagerNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.PagerNumber), input_name='PagerNumber')), namespaceprefix_ , eol_))
if self.FaxNumber is not None:
namespaceprefix_ = self.FaxNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.FaxNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sFaxNumber>%s</%sFaxNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.FaxNumber), input_name='FaxNumber')), namespaceprefix_ , eol_))
if self.EMailAddress is not None:
namespaceprefix_ = self.EMailAddress_nsprefix_ + ':' if (UseCapturedNS_ and self.EMailAddress_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sEMailAddress>%s</%sEMailAddress>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.EMailAddress), input_name='EMailAddress')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'ContactId':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ContactId')
value_ = self.gds_validate_string(value_, node, 'ContactId')
self.ContactId = value_
self.ContactId_nsprefix_ = child_.prefix
elif nodeName_ == 'PersonName':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'PersonName')
value_ = self.gds_validate_string(value_, node, 'PersonName')
self.PersonName = value_
self.PersonName_nsprefix_ = child_.prefix
elif nodeName_ == 'Title':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Title')
value_ = self.gds_validate_string(value_, node, 'Title')
self.Title = value_
self.Title_nsprefix_ = child_.prefix
elif nodeName_ == 'CompanyName':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'CompanyName')
value_ = self.gds_validate_string(value_, node, 'CompanyName')
self.CompanyName = value_
self.CompanyName_nsprefix_ = child_.prefix
elif nodeName_ == 'PhoneNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'PhoneNumber')
value_ = self.gds_validate_string(value_, node, 'PhoneNumber')
self.PhoneNumber = value_
self.PhoneNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'PhoneExtension':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'PhoneExtension')
value_ = self.gds_validate_string(value_, node, 'PhoneExtension')
self.PhoneExtension = value_
self.PhoneExtension_nsprefix_ = child_.prefix
elif nodeName_ == 'TollFreePhoneNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'TollFreePhoneNumber')
value_ = self.gds_validate_string(value_, node, 'TollFreePhoneNumber')
self.TollFreePhoneNumber = value_
self.TollFreePhoneNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'PagerNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'PagerNumber')
value_ = self.gds_validate_string(value_, node, 'PagerNumber')
self.PagerNumber = value_
self.PagerNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'FaxNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'FaxNumber')
value_ = self.gds_validate_string(value_, node, 'FaxNumber')
self.FaxNumber = value_
self.FaxNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'EMailAddress':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'EMailAddress')
value_ = self.gds_validate_string(value_, node, 'EMailAddress')
self.EMailAddress = value_
self.EMailAddress_nsprefix_ = child_.prefix
# end class Contact
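# Illustrative usage (a hedged sketch): Contact accepts all of its fields as
# keyword arguments, and hasContent_() reports whether any field is set,
# which determines whether export() writes a paired tag or a self-closing
# one. The name and address below are placeholder example values.
#
#     contact = Contact(PersonName='Jane Doe',
#                       EMailAddress='jane@example.com')
#     assert contact.hasContent_()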
class Dimensions(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Length=None, Width=None, Height=None, Units=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Length = Length
self.Length_nsprefix_ = None
self.Width = Width
self.Width_nsprefix_ = None
self.Height = Height
self.Height_nsprefix_ = None
self.Units = Units
self.validate_LinearUnits(self.Units)
self.Units_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Dimensions)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Dimensions.subclass:
return Dimensions.subclass(*args_, **kwargs_)
else:
return Dimensions(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Length(self):
return self.Length
def set_Length(self, Length):
self.Length = Length
def get_Width(self):
return self.Width
def set_Width(self, Width):
self.Width = Width
def get_Height(self):
return self.Height
def set_Height(self, Height):
self.Height = Height
def get_Units(self):
return self.Units
def set_Units(self, Units):
self.Units = Units
def validate_LinearUnits(self, value):
result = True
# Validate type LinearUnits, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['CM', 'IN']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LinearUnits' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Length is not None or
self.Width is not None or
self.Height is not None or
self.Units is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Dimensions', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('Dimensions')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'Dimensions':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Dimensions')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Dimensions', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Dimensions'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Dimensions', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Length is not None:
namespaceprefix_ = self.Length_nsprefix_ + ':' if (UseCapturedNS_ and self.Length_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLength>%s</%sLength>%s' % (namespaceprefix_ , self.gds_format_integer(self.Length, input_name='Length'), namespaceprefix_ , eol_))
if self.Width is not None:
namespaceprefix_ = self.Width_nsprefix_ + ':' if (UseCapturedNS_ and self.Width_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sWidth>%s</%sWidth>%s' % (namespaceprefix_ , self.gds_format_integer(self.Width, input_name='Width'), namespaceprefix_ , eol_))
if self.Height is not None:
namespaceprefix_ = self.Height_nsprefix_ + ':' if (UseCapturedNS_ and self.Height_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sHeight>%s</%sHeight>%s' % (namespaceprefix_ , self.gds_format_integer(self.Height, input_name='Height'), namespaceprefix_ , eol_))
if self.Units is not None:
namespaceprefix_ = self.Units_nsprefix_ + ':' if (UseCapturedNS_ and self.Units_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sUnits>%s</%sUnits>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Units), input_name='Units')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Length' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'Length')
if ival_ < 0:
raise_parse_error(child_, 'requires nonNegativeInteger')
ival_ = self.gds_validate_integer(ival_, node, 'Length')
self.Length = ival_
self.Length_nsprefix_ = child_.prefix
elif nodeName_ == 'Width' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'Width')
if ival_ < 0:
raise_parse_error(child_, 'requires nonNegativeInteger')
ival_ = self.gds_validate_integer(ival_, node, 'Width')
self.Width = ival_
self.Width_nsprefix_ = child_.prefix
elif nodeName_ == 'Height' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'Height')
if ival_ < 0:
raise_parse_error(child_, 'requires nonNegativeInteger')
ival_ = self.gds_validate_integer(ival_, node, 'Height')
self.Height = ival_
self.Height_nsprefix_ = child_.prefix
elif nodeName_ == 'Units':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Units')
value_ = self.gds_validate_string(value_, node, 'Units')
self.Units = value_
self.Units_nsprefix_ = child_.prefix
# validate type LinearUnits
self.validate_LinearUnits(self.Units)
# end class Dimensions
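# Illustrative usage (a hedged sketch): Dimensions validates Units against
# the LinearUnits enumeration ('CM' or 'IN') at construction time; a message
# is recorded only when simple-type validation is enabled and a collector is
# supplied. GdsCollector_ below assumes the standard generateDS collector
# class defined elsewhere in this module.
#
#     collector = GdsCollector_()
#     dims = Dimensions(Length=10, Width=5, Height=3, Units='IN',
#                       gds_collector_=collector)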
class Distance(GeneratedsSuper):
"""Driving or other transportation distances, distinct from dimension
measurements."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Value=None, Units=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Value = Value
self.Value_nsprefix_ = None
self.Units = Units
self.validate_DistanceUnits(self.Units)
self.Units_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Distance)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Distance.subclass:
return Distance.subclass(*args_, **kwargs_)
else:
return Distance(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Value(self):
return self.Value
def set_Value(self, Value):
self.Value = Value
def get_Units(self):
return self.Units
def set_Units(self, Units):
self.Units = Units
def validate_DistanceUnits(self, value):
result = True
# Validate type DistanceUnits, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['KM', 'MI']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on DistanceUnits' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Value is not None or
self.Units is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Distance', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('Distance')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'Distance':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Distance')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Distance', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Distance'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Distance', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Value is not None:
namespaceprefix_ = self.Value_nsprefix_ + ':' if (UseCapturedNS_ and self.Value_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sValue>%s</%sValue>%s' % (namespaceprefix_ , self.gds_format_decimal(self.Value, input_name='Value'), namespaceprefix_ , eol_))
if self.Units is not None:
namespaceprefix_ = self.Units_nsprefix_ + ':' if (UseCapturedNS_ and self.Units_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sUnits>%s</%sUnits>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Units), input_name='Units')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Value' and child_.text:
sval_ = child_.text
fval_ = self.gds_parse_decimal(sval_, node, 'Value')
fval_ = self.gds_validate_decimal(fval_, node, 'Value')
self.Value = fval_
self.Value_nsprefix_ = child_.prefix
elif nodeName_ == 'Units':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Units')
value_ = self.gds_validate_string(value_, node, 'Units')
self.Units = value_
self.Units_nsprefix_ = child_.prefix
# validate type DistanceUnits
self.validate_DistanceUnits(self.Units)
# end class Distance
class DistanceAndLocationDetail(GeneratedsSuper):
"""Specifies the location details and other information relevant to the
location that is derived from the inputs provided in the request."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Distance=None, ReservationAvailabilityDetail=None, SupportedRedirectToHoldServices=None, LocationDetail=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Distance = Distance
self.Distance_nsprefix_ = None
self.ReservationAvailabilityDetail = ReservationAvailabilityDetail
self.ReservationAvailabilityDetail_nsprefix_ = None
if SupportedRedirectToHoldServices is None:
self.SupportedRedirectToHoldServices = []
else:
self.SupportedRedirectToHoldServices = SupportedRedirectToHoldServices
self.SupportedRedirectToHoldServices_nsprefix_ = None
self.LocationDetail = LocationDetail
self.LocationDetail_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, DistanceAndLocationDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if DistanceAndLocationDetail.subclass:
return DistanceAndLocationDetail.subclass(*args_, **kwargs_)
else:
return DistanceAndLocationDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Distance(self):
return self.Distance
def set_Distance(self, Distance):
self.Distance = Distance
def get_ReservationAvailabilityDetail(self):
return self.ReservationAvailabilityDetail
def set_ReservationAvailabilityDetail(self, ReservationAvailabilityDetail):
self.ReservationAvailabilityDetail = ReservationAvailabilityDetail
def get_SupportedRedirectToHoldServices(self):
return self.SupportedRedirectToHoldServices
def set_SupportedRedirectToHoldServices(self, SupportedRedirectToHoldServices):
self.SupportedRedirectToHoldServices = SupportedRedirectToHoldServices
def add_SupportedRedirectToHoldServices(self, value):
self.SupportedRedirectToHoldServices.append(value)
def insert_SupportedRedirectToHoldServices_at(self, index, value):
self.SupportedRedirectToHoldServices.insert(index, value)
def replace_SupportedRedirectToHoldServices_at(self, index, value):
self.SupportedRedirectToHoldServices[index] = value
def get_LocationDetail(self):
return self.LocationDetail
def set_LocationDetail(self, LocationDetail):
self.LocationDetail = LocationDetail
def validate_SupportedRedirectToHoldServiceType(self, value):
result = True
# Validate type SupportedRedirectToHoldServiceType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FEDEX_EXPRESS', 'FEDEX_GROUND', 'FEDEX_GROUND_HOME_DELIVERY']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on SupportedRedirectToHoldServiceType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Distance is not None or
self.ReservationAvailabilityDetail is not None or
self.SupportedRedirectToHoldServices or
self.LocationDetail is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='DistanceAndLocationDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('DistanceAndLocationDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'DistanceAndLocationDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='DistanceAndLocationDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='DistanceAndLocationDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='DistanceAndLocationDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='DistanceAndLocationDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Distance is not None:
namespaceprefix_ = self.Distance_nsprefix_ + ':' if (UseCapturedNS_ and self.Distance_nsprefix_) else ''
self.Distance.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Distance', pretty_print=pretty_print)
if self.ReservationAvailabilityDetail is not None:
namespaceprefix_ = self.ReservationAvailabilityDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.ReservationAvailabilityDetail_nsprefix_) else ''
self.ReservationAvailabilityDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ReservationAvailabilityDetail', pretty_print=pretty_print)
for SupportedRedirectToHoldServices_ in self.SupportedRedirectToHoldServices:
namespaceprefix_ = self.SupportedRedirectToHoldServices_nsprefix_ + ':' if (UseCapturedNS_ and self.SupportedRedirectToHoldServices_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sSupportedRedirectToHoldServices>%s</%sSupportedRedirectToHoldServices>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(SupportedRedirectToHoldServices_), input_name='SupportedRedirectToHoldServices')), namespaceprefix_ , eol_))
if self.LocationDetail is not None:
namespaceprefix_ = self.LocationDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationDetail_nsprefix_) else ''
self.LocationDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='LocationDetail', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Distance':
obj_ = Distance.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Distance = obj_
obj_.original_tagname_ = 'Distance'
elif nodeName_ == 'ReservationAvailabilityDetail':
obj_ = ReservationAvailabilityDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ReservationAvailabilityDetail = obj_
obj_.original_tagname_ = 'ReservationAvailabilityDetail'
elif nodeName_ == 'SupportedRedirectToHoldServices':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'SupportedRedirectToHoldServices')
value_ = self.gds_validate_string(value_, node, 'SupportedRedirectToHoldServices')
self.SupportedRedirectToHoldServices.append(value_)
self.SupportedRedirectToHoldServices_nsprefix_ = child_.prefix
# validate type SupportedRedirectToHoldServiceType
self.validate_SupportedRedirectToHoldServiceType(self.SupportedRedirectToHoldServices[-1])
elif nodeName_ == 'LocationDetail':
obj_ = LocationDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.LocationDetail = obj_
obj_.original_tagname_ = 'LocationDetail'
# end class DistanceAndLocationDetail
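The `validate_*Type` methods above all follow one generated pattern: confirm the base simple type, then check membership in an XSD enumeration, collecting messages rather than raising. A standalone sketch of that pattern (the `messages` list stands in for the module's `gds_collector_`; the function name is illustrative, not part of the generated API):

```python
# Standalone sketch of the generated enum-validation pattern.
# `messages` plays the role of gds_collector_; names are illustrative.

def validate_enum(value, enumerations, type_name, messages):
    """Return False and record a message when value is not a valid member."""
    if value is None:
        return True  # optional elements validate vacuously
    if not isinstance(value, str):
        messages.append(
            'Value "%s" is not of the correct base simple type (str)' % (value,))
        return False
    if value not in enumerations:
        messages.append(
            'Value "%s" does not match xsd enumeration restriction on %s'
            % (value, type_name))
        return False
    return True

services = ['FEDEX_EXPRESS', 'FEDEX_GROUND', 'FEDEX_GROUND_HOME_DELIVERY']
errors = []
assert validate_enum('FEDEX_GROUND', services,
                     'SupportedRedirectToHoldServiceType', errors)
assert not validate_enum('FEDEX_FREIGHT', services,
                         'SupportedRedirectToHoldServiceType', errors)
assert len(errors) == 1
```

Note that validation only reports; the invalid value is still stored on the object, matching how `buildChildren` appends the value before calling the validator.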
class Holiday(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Name=None, Date=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Name = Name
self.Name_nsprefix_ = None
if isinstance(Date, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(Date, '%Y-%m-%d').date()
else:
initvalue_ = Date
self.Date = initvalue_
self.Date_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Holiday)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Holiday.subclass:
return Holiday.subclass(*args_, **kwargs_)
else:
return Holiday(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Name(self):
return self.Name
def set_Name(self, Name):
self.Name = Name
def get_Date(self):
return self.Date
def set_Date(self, Date):
self.Date = Date
def hasContent_(self):
if (
self.Name is not None or
self.Date is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Holiday', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('Holiday')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'Holiday':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Holiday')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Holiday', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Holiday'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Holiday', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Name is not None:
namespaceprefix_ = self.Name_nsprefix_ + ':' if (UseCapturedNS_ and self.Name_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sName>%s</%sName>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Name), input_name='Name')), namespaceprefix_ , eol_))
if self.Date is not None:
namespaceprefix_ = self.Date_nsprefix_ + ':' if (UseCapturedNS_ and self.Date_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sDate>%s</%sDate>%s' % (namespaceprefix_ , self.gds_format_date(self.Date, input_name='Date'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Name':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Name')
value_ = self.gds_validate_string(value_, node, 'Name')
self.Name = value_
self.Name_nsprefix_ = child_.prefix
elif nodeName_ == 'Date':
sval_ = child_.text
dval_ = self.gds_parse_date(sval_)
self.Date = dval_
self.Date_nsprefix_ = child_.prefix
# end class Holiday
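`Holiday.__init__` accepts its `Date` argument either as an ISO `'%Y-%m-%d'` string or as an already-constructed `datetime.date`, coercing the former via `strptime`. A minimal sketch of that coercion (the function name is illustrative; the generated code inlines this logic):

```python
import datetime

def coerce_date(value):
    """Accept a 'YYYY-MM-DD' string or a datetime.date, mirroring how
    Holiday.__init__ normalizes its Date argument. Name is illustrative."""
    if isinstance(value, str):
        return datetime.datetime.strptime(value, '%Y-%m-%d').date()
    return value  # already a date (or None for an absent element)

assert coerce_date('2023-12-25') == datetime.date(2023, 12, 25)
assert coerce_date(datetime.date(2023, 12, 25)) == datetime.date(2023, 12, 25)
assert coerce_date(None) is None
```

`LatestDropOffDetail` and `LatestDropoffOverlayDetail` apply the same idea to their `Time` fields with the `'%H:%M:%S'` format and `.time()` instead of `.date()`.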
class LatestDropOffDetail(GeneratedsSuper):
"""Specifies the latest time by which a package can be dropped off at a
FedEx location."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, DayOfWeek=None, Time=None, Overlays=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.DayOfWeek = DayOfWeek
self.validate_DayOfWeekType(self.DayOfWeek)
self.DayOfWeek_nsprefix_ = None
if isinstance(Time, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(Time, '%H:%M:%S').time()
else:
initvalue_ = Time
self.Time = initvalue_
self.Time_nsprefix_ = None
if Overlays is None:
self.Overlays = []
else:
self.Overlays = Overlays
self.Overlays_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LatestDropOffDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LatestDropOffDetail.subclass:
return LatestDropOffDetail.subclass(*args_, **kwargs_)
else:
return LatestDropOffDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_DayOfWeek(self):
return self.DayOfWeek
def set_DayOfWeek(self, DayOfWeek):
self.DayOfWeek = DayOfWeek
def get_Time(self):
return self.Time
def set_Time(self, Time):
self.Time = Time
def get_Overlays(self):
return self.Overlays
def set_Overlays(self, Overlays):
self.Overlays = Overlays
def add_Overlays(self, value):
self.Overlays.append(value)
def insert_Overlays_at(self, index, value):
self.Overlays.insert(index, value)
def replace_Overlays_at(self, index, value):
self.Overlays[index] = value
def validate_DayOfWeekType(self, value):
result = True
# Validate type DayOfWeekType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FRI', 'MON', 'SAT', 'SUN', 'THU', 'TUE', 'WED']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on DayOfWeekType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.DayOfWeek is not None or
self.Time is not None or
self.Overlays
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LatestDropOffDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LatestDropOffDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LatestDropOffDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LatestDropOffDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LatestDropOffDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LatestDropOffDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LatestDropOffDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.DayOfWeek is not None:
namespaceprefix_ = self.DayOfWeek_nsprefix_ + ':' if (UseCapturedNS_ and self.DayOfWeek_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sDayOfWeek>%s</%sDayOfWeek>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.DayOfWeek), input_name='DayOfWeek')), namespaceprefix_ , eol_))
if self.Time is not None:
namespaceprefix_ = self.Time_nsprefix_ + ':' if (UseCapturedNS_ and self.Time_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTime>%s</%sTime>%s' % (namespaceprefix_ , self.gds_format_time(self.Time, input_name='Time'), namespaceprefix_ , eol_))
for Overlays_ in self.Overlays:
namespaceprefix_ = self.Overlays_nsprefix_ + ':' if (UseCapturedNS_ and self.Overlays_nsprefix_) else ''
Overlays_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Overlays', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'DayOfWeek':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'DayOfWeek')
value_ = self.gds_validate_string(value_, node, 'DayOfWeek')
self.DayOfWeek = value_
self.DayOfWeek_nsprefix_ = child_.prefix
# validate type DayOfWeekType
self.validate_DayOfWeekType(self.DayOfWeek)
elif nodeName_ == 'Time':
sval_ = child_.text
dval_ = self.gds_parse_time(sval_)
self.Time = dval_
self.Time_nsprefix_ = child_.prefix
elif nodeName_ == 'Overlays':
obj_ = LatestDropoffOverlayDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Overlays.append(obj_)
obj_.original_tagname_ = 'Overlays'
# end class LatestDropOffDetail
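Each generated class exposes a `factory` staticmethod that consults the class-level `subclass` hook (and, when set, `CurrentSubclassModule_`), letting an application substitute its own subclass everywhere the parser would otherwise instantiate the generated class. A minimal sketch of the hook, with illustrative class names and without the module-lookup branch:

```python
# Minimal sketch of the generateDS factory/subclass hook: assigning the
# class attribute `subclass` reroutes factory() to an application class.
# Class names are illustrative.

class Base:
    subclass = None

    def __init__(self, value=None):
        self.value = value

    @staticmethod
    def factory(*args_, **kwargs_):
        if Base.subclass:
            return Base.subclass(*args_, **kwargs_)
        return Base(*args_, **kwargs_)

class Custom(Base):
    def doubled(self):
        return self.value * 2

assert type(Base.factory(1)) is Base   # no override installed yet
Base.subclass = Custom                  # install the override
obj = Base.factory(21)
assert type(obj) is Custom and obj.doubled() == 42
```

Because `buildChildren` constructs nested objects through `factory` (e.g. `Distance.factory(parent_object_=self)`), the override takes effect for parsed documents as well as hand-built trees.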
class LatestDropoffOverlayDetail(GeneratedsSuper):
"""Specifies the time and reason to overlay the last drop off time for a
carrier at a FedEx location."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Type=None, Time=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Type = Type
self.validate_LatestDropOffOverlayType(self.Type)
self.Type_nsprefix_ = None
if isinstance(Time, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(Time, '%H:%M:%S').time()
else:
initvalue_ = Time
self.Time = initvalue_
self.Time_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LatestDropoffOverlayDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LatestDropoffOverlayDetail.subclass:
return LatestDropoffOverlayDetail.subclass(*args_, **kwargs_)
else:
return LatestDropoffOverlayDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Type(self):
return self.Type
def set_Type(self, Type):
self.Type = Type
def get_Time(self):
return self.Time
def set_Time(self, Time):
self.Time = Time
def validate_LatestDropOffOverlayType(self, value):
result = True
# Validate type LatestDropOffOverlayType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['US_WEST_COAST']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LatestDropOffOverlayType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Type is not None or
self.Time is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LatestDropoffOverlayDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LatestDropoffOverlayDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LatestDropoffOverlayDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LatestDropoffOverlayDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LatestDropoffOverlayDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LatestDropoffOverlayDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LatestDropoffOverlayDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Type is not None:
namespaceprefix_ = self.Type_nsprefix_ + ':' if (UseCapturedNS_ and self.Type_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sType>%s</%sType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Type), input_name='Type')), namespaceprefix_ , eol_))
if self.Time is not None:
namespaceprefix_ = self.Time_nsprefix_ + ':' if (UseCapturedNS_ and self.Time_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTime>%s</%sTime>%s' % (namespaceprefix_ , self.gds_format_time(self.Time, input_name='Time'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Type':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Type')
value_ = self.gds_validate_string(value_, node, 'Type')
self.Type = value_
self.Type_nsprefix_ = child_.prefix
# validate type LatestDropOffOverlayType
self.validate_LatestDropOffOverlayType(self.Type)
elif nodeName_ == 'Time':
sval_ = child_.text
dval_ = self.gds_parse_time(sval_)
self.Time = dval_
self.Time_nsprefix_ = child_.prefix
# end class LatestDropoffOverlayDetail
class Localization(GeneratedsSuper):
"""Identifies the representation of human-readable text."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, LanguageCode=None, LocaleCode=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.LanguageCode = LanguageCode
self.LanguageCode_nsprefix_ = None
self.LocaleCode = LocaleCode
self.LocaleCode_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Localization)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Localization.subclass:
return Localization.subclass(*args_, **kwargs_)
else:
return Localization(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_LanguageCode(self):
return self.LanguageCode
def set_LanguageCode(self, LanguageCode):
self.LanguageCode = LanguageCode
def get_LocaleCode(self):
return self.LocaleCode
def set_LocaleCode(self, LocaleCode):
self.LocaleCode = LocaleCode
def hasContent_(self):
if (
self.LanguageCode is not None or
self.LocaleCode is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Localization', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('Localization')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'Localization':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Localization')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Localization', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Localization'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Localization', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.LanguageCode is not None:
namespaceprefix_ = self.LanguageCode_nsprefix_ + ':' if (UseCapturedNS_ and self.LanguageCode_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLanguageCode>%s</%sLanguageCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LanguageCode), input_name='LanguageCode')), namespaceprefix_ , eol_))
if self.LocaleCode is not None:
namespaceprefix_ = self.LocaleCode_nsprefix_ + ':' if (UseCapturedNS_ and self.LocaleCode_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocaleCode>%s</%sLocaleCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocaleCode), input_name='LocaleCode')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'LanguageCode':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LanguageCode')
value_ = self.gds_validate_string(value_, node, 'LanguageCode')
self.LanguageCode = value_
self.LanguageCode_nsprefix_ = child_.prefix
elif nodeName_ == 'LocaleCode':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocaleCode')
value_ = self.gds_validate_string(value_, node, 'LocaleCode')
self.LocaleCode = value_
self.LocaleCode_nsprefix_ = child_.prefix
# end class Localization
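The `build` methods above iterate over an ElementTree node's children and derive each local element name with `Tag_pattern_.match(child.tag).groups()[-1]`, which strips ElementTree's `{namespace}` prefix before dispatching to `buildChildren`. A standalone sketch of that dispatch (the namespace URI is illustrative; the regex matches generateDS's published `Tag_pattern_` definition):

```python
import re
import xml.etree.ElementTree as etree

# generateDS strips ElementTree's '{namespace}' tag prefix with this
# pattern; groups()[-1] is the local name whether or not a namespace
# prefix is present.
Tag_pattern_ = re.compile(r'({.*})?(.*)')

xml = ('<Localization xmlns="http://example.com/ns">'  # namespace is illustrative
       '<LanguageCode>en</LanguageCode>'
       '<LocaleCode>US</LocaleCode>'
       '</Localization>')
node = etree.fromstring(xml)

fields = {}
for child in node:
    # Compute nodeName_ exactly as the build() methods above do.
    name = Tag_pattern_.match(child.tag).groups()[-1]
    fields[name] = child.text

assert fields == {'LanguageCode': 'en', 'LocaleCode': 'US'}
```

`buildChildren` then branches on that local name, so namespace prefixes in the input document never affect field dispatch.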
class LocationCapabilityDetail(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, CarrierCode=None, ServiceType=None, ServiceCategory=None, TransferOfPossessionType=None, DaysOfWeek=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.CarrierCode = CarrierCode
self.validate_CarrierCodeType(self.CarrierCode)
self.CarrierCode_nsprefix_ = None
self.ServiceType = ServiceType
self.ServiceType_nsprefix_ = None
self.ServiceCategory = ServiceCategory
self.validate_ServiceCategoryType(self.ServiceCategory)
self.ServiceCategory_nsprefix_ = None
self.TransferOfPossessionType = TransferOfPossessionType
self.validate_LocationTransferOfPossessionType(self.TransferOfPossessionType)
self.TransferOfPossessionType_nsprefix_ = None
if DaysOfWeek is None:
self.DaysOfWeek = []
else:
self.DaysOfWeek = DaysOfWeek
self.DaysOfWeek_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationCapabilityDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationCapabilityDetail.subclass:
return LocationCapabilityDetail.subclass(*args_, **kwargs_)
else:
return LocationCapabilityDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_CarrierCode(self):
return self.CarrierCode
def set_CarrierCode(self, CarrierCode):
self.CarrierCode = CarrierCode
def get_ServiceType(self):
return self.ServiceType
def set_ServiceType(self, ServiceType):
self.ServiceType = ServiceType
def get_ServiceCategory(self):
return self.ServiceCategory
def set_ServiceCategory(self, ServiceCategory):
self.ServiceCategory = ServiceCategory
def get_TransferOfPossessionType(self):
return self.TransferOfPossessionType
def set_TransferOfPossessionType(self, TransferOfPossessionType):
self.TransferOfPossessionType = TransferOfPossessionType
def get_DaysOfWeek(self):
return self.DaysOfWeek
def set_DaysOfWeek(self, DaysOfWeek):
self.DaysOfWeek = DaysOfWeek
def add_DaysOfWeek(self, value):
self.DaysOfWeek.append(value)
def insert_DaysOfWeek_at(self, index, value):
self.DaysOfWeek.insert(index, value)
def replace_DaysOfWeek_at(self, index, value):
self.DaysOfWeek[index] = value
def validate_CarrierCodeType(self, value):
result = True
# Validate type CarrierCodeType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FDXC', 'FDXE', 'FDXG', 'FXCC', 'FXFR', 'FXSP']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on CarrierCodeType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_ServiceCategoryType(self, value):
result = True
# Validate type ServiceCategoryType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['EXPRESS_FREIGHT', 'EXPRESS_PARCEL', 'GROUND_HOME_DELIVERY']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on ServiceCategoryType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_LocationTransferOfPossessionType(self, value):
result = True
# Validate type LocationTransferOfPossessionType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['DROPOFF', 'HOLD_AT_LOCATION', 'REDIRECT_TO_HOLD_AT_LOCATION']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationTransferOfPossessionType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_DayOfWeekType(self, value):
result = True
# Validate type DayOfWeekType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FRI', 'MON', 'SAT', 'SUN', 'THU', 'TUE', 'WED']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on DayOfWeekType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.CarrierCode is not None or
self.ServiceType is not None or
self.ServiceCategory is not None or
self.TransferOfPossessionType is not None or
self.DaysOfWeek
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationCapabilityDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationCapabilityDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationCapabilityDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationCapabilityDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationCapabilityDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationCapabilityDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationCapabilityDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.CarrierCode is not None:
namespaceprefix_ = self.CarrierCode_nsprefix_ + ':' if (UseCapturedNS_ and self.CarrierCode_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCarrierCode>%s</%sCarrierCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CarrierCode), input_name='CarrierCode')), namespaceprefix_ , eol_))
if self.ServiceType is not None:
namespaceprefix_ = self.ServiceType_nsprefix_ + ':' if (UseCapturedNS_ and self.ServiceType_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sServiceType>%s</%sServiceType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ServiceType), input_name='ServiceType')), namespaceprefix_ , eol_))
if self.ServiceCategory is not None:
namespaceprefix_ = self.ServiceCategory_nsprefix_ + ':' if (UseCapturedNS_ and self.ServiceCategory_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sServiceCategory>%s</%sServiceCategory>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ServiceCategory), input_name='ServiceCategory')), namespaceprefix_ , eol_))
if self.TransferOfPossessionType is not None:
namespaceprefix_ = self.TransferOfPossessionType_nsprefix_ + ':' if (UseCapturedNS_ and self.TransferOfPossessionType_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTransferOfPossessionType>%s</%sTransferOfPossessionType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.TransferOfPossessionType), input_name='TransferOfPossessionType')), namespaceprefix_ , eol_))
for DaysOfWeek_ in self.DaysOfWeek:
namespaceprefix_ = self.DaysOfWeek_nsprefix_ + ':' if (UseCapturedNS_ and self.DaysOfWeek_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sDaysOfWeek>%s</%sDaysOfWeek>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(DaysOfWeek_), input_name='DaysOfWeek')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'CarrierCode':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'CarrierCode')
value_ = self.gds_validate_string(value_, node, 'CarrierCode')
self.CarrierCode = value_
self.CarrierCode_nsprefix_ = child_.prefix
# validate type CarrierCodeType
self.validate_CarrierCodeType(self.CarrierCode)
elif nodeName_ == 'ServiceType':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ServiceType')
value_ = self.gds_validate_string(value_, node, 'ServiceType')
self.ServiceType = value_
self.ServiceType_nsprefix_ = child_.prefix
elif nodeName_ == 'ServiceCategory':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ServiceCategory')
value_ = self.gds_validate_string(value_, node, 'ServiceCategory')
self.ServiceCategory = value_
self.ServiceCategory_nsprefix_ = child_.prefix
# validate type ServiceCategoryType
self.validate_ServiceCategoryType(self.ServiceCategory)
elif nodeName_ == 'TransferOfPossessionType':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'TransferOfPossessionType')
value_ = self.gds_validate_string(value_, node, 'TransferOfPossessionType')
self.TransferOfPossessionType = value_
self.TransferOfPossessionType_nsprefix_ = child_.prefix
# validate type LocationTransferOfPossessionType
self.validate_LocationTransferOfPossessionType(self.TransferOfPossessionType)
elif nodeName_ == 'DaysOfWeek':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'DaysOfWeek')
value_ = self.gds_validate_string(value_, node, 'DaysOfWeek')
self.DaysOfWeek.append(value_)
self.DaysOfWeek_nsprefix_ = child_.prefix
# validate type DayOfWeekType
self.validate_DayOfWeekType(self.DaysOfWeek[-1])
# end class LocationCapabilityDetail
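The `validate_*` methods above all follow the same shape: check the base simple type, then check membership in an xsd enumeration, reporting problems through `gds_collector_`. A minimal standalone sketch of that enumeration check (without the collector/lineno machinery, and with an illustrative function name not present in the generated module):

```python
# Sketch of the xsd-enumeration validation pattern used by
# validate_DayOfWeekType above. The enumeration values are taken
# from the generated code; the helper name is illustrative.
DAY_OF_WEEK_ENUM = ['FRI', 'MON', 'SAT', 'SUN', 'THU', 'TUE', 'WED']

def validate_day_of_week(value):
    """Return True when value is a str inside the xsd enumeration.

    Mirrors the generated check: a non-str value fails the base simple
    type test; a str outside the enumeration fails the restriction.
    """
    if not isinstance(value, str):
        return False
    return value in DAY_OF_WEEK_ENUM
```

Unlike the generated method, this sketch only returns a boolean; the real code also appends a diagnostic message (with the source line number) to the collector so multiple validation errors can be reported in one parse.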
class LocationContactAndAddress(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Contact=None, Address=None, AddressAncillaryDetail=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Contact = Contact
self.Contact_nsprefix_ = None
self.Address = Address
self.Address_nsprefix_ = None
self.AddressAncillaryDetail = AddressAncillaryDetail
self.AddressAncillaryDetail_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationContactAndAddress)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationContactAndAddress.subclass:
return LocationContactAndAddress.subclass(*args_, **kwargs_)
else:
return LocationContactAndAddress(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Contact(self):
return self.Contact
def set_Contact(self, Contact):
self.Contact = Contact
def get_Address(self):
return self.Address
def set_Address(self, Address):
self.Address = Address
def get_AddressAncillaryDetail(self):
return self.AddressAncillaryDetail
def set_AddressAncillaryDetail(self, AddressAncillaryDetail):
self.AddressAncillaryDetail = AddressAncillaryDetail
def hasContent_(self):
if (
self.Contact is not None or
self.Address is not None or
self.AddressAncillaryDetail is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationContactAndAddress', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationContactAndAddress')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationContactAndAddress':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationContactAndAddress')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationContactAndAddress', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationContactAndAddress'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationContactAndAddress', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Contact is not None:
namespaceprefix_ = self.Contact_nsprefix_ + ':' if (UseCapturedNS_ and self.Contact_nsprefix_) else ''
self.Contact.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Contact', pretty_print=pretty_print)
if self.Address is not None:
namespaceprefix_ = self.Address_nsprefix_ + ':' if (UseCapturedNS_ and self.Address_nsprefix_) else ''
self.Address.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Address', pretty_print=pretty_print)
if self.AddressAncillaryDetail is not None:
namespaceprefix_ = self.AddressAncillaryDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.AddressAncillaryDetail_nsprefix_) else ''
self.AddressAncillaryDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='AddressAncillaryDetail', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Contact':
obj_ = Contact.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Contact = obj_
obj_.original_tagname_ = 'Contact'
elif nodeName_ == 'Address':
obj_ = Address.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Address = obj_
obj_.original_tagname_ = 'Address'
elif nodeName_ == 'AddressAncillaryDetail':
obj_ = AddressAncillaryDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.AddressAncillaryDetail = obj_
obj_.original_tagname_ = 'AddressAncillaryDetail'
# end class LocationContactAndAddress
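Each generated class exposes the same `factory()` staticmethod, which lets user code substitute a subclass for every instance the parser builds (via the module hook or the class-level `subclass` attribute). A minimal sketch of that dispatch pattern, with illustrative names rather than the generated module's real globals:

```python
# Sketch of the generateDS factory/subclass hook: setting Base.subclass
# redirects every Base.factory() call to the user's subclass, which is
# how parsed trees get populated with customized objects.
class Base:
    subclass = None  # user code may point this at a subclass

    @staticmethod
    def factory(*args, **kwargs):
        if Base.subclass:
            return Base.subclass(*args, **kwargs)
        return Base(*args, **kwargs)

class Custom(Base):
    """A user subclass that could add convenience methods."""

Base.subclass = Custom
obj = Base.factory()  # builds a Custom, not a Base
```

In the generated code there is an additional layer (`CurrentSubclassModule_` / `getSubclassFromModule_`) that looks the subclass up in a separate module first, so whole files of overrides can be swapped in without touching the generated source.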
class LocationDetail(GeneratedsSuper):
"""Describes an individual location providing a set of customer service
features."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, LocationId=None, StoreNumber=None, LocationContactAndAddress=None, SpecialInstructions=None, TimeZoneOffset=None, LocationType=None, LocationTypeForDisplay=None, Attributes=None, LocationCapabilities=None, PackageMaximumLimits=None, ClearanceLocationDetail=None, ServicingLocationDetails=None, AcceptedCurrency=None, LocationHolidays=None, MapUrl=None, EntityId=None, NormalHours=None, ExceptionalHours=None, HoursForEffectiveDate=None, CarrierDetails=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.LocationId = LocationId
self.LocationId_nsprefix_ = None
self.StoreNumber = StoreNumber
self.StoreNumber_nsprefix_ = None
self.LocationContactAndAddress = LocationContactAndAddress
self.LocationContactAndAddress_nsprefix_ = None
self.SpecialInstructions = SpecialInstructions
self.SpecialInstructions_nsprefix_ = None
self.TimeZoneOffset = TimeZoneOffset
self.TimeZoneOffset_nsprefix_ = None
self.LocationType = LocationType
self.validate_FedExLocationType(self.LocationType)
self.LocationType_nsprefix_ = None
self.LocationTypeForDisplay = LocationTypeForDisplay
self.LocationTypeForDisplay_nsprefix_ = None
if Attributes is None:
self.Attributes = []
else:
self.Attributes = Attributes
self.Attributes_nsprefix_ = None
if LocationCapabilities is None:
self.LocationCapabilities = []
else:
self.LocationCapabilities = LocationCapabilities
self.LocationCapabilities_nsprefix_ = None
self.PackageMaximumLimits = PackageMaximumLimits
self.PackageMaximumLimits_nsprefix_ = None
self.ClearanceLocationDetail = ClearanceLocationDetail
self.ClearanceLocationDetail_nsprefix_ = None
if ServicingLocationDetails is None:
self.ServicingLocationDetails = []
else:
self.ServicingLocationDetails = ServicingLocationDetails
self.ServicingLocationDetails_nsprefix_ = None
self.AcceptedCurrency = AcceptedCurrency
self.AcceptedCurrency_nsprefix_ = None
if LocationHolidays is None:
self.LocationHolidays = []
else:
self.LocationHolidays = LocationHolidays
self.LocationHolidays_nsprefix_ = None
self.MapUrl = MapUrl
self.MapUrl_nsprefix_ = None
self.EntityId = EntityId
self.EntityId_nsprefix_ = None
if NormalHours is None:
self.NormalHours = []
else:
self.NormalHours = NormalHours
self.NormalHours_nsprefix_ = None
if ExceptionalHours is None:
self.ExceptionalHours = []
else:
self.ExceptionalHours = ExceptionalHours
self.ExceptionalHours_nsprefix_ = None
if HoursForEffectiveDate is None:
self.HoursForEffectiveDate = []
else:
self.HoursForEffectiveDate = HoursForEffectiveDate
self.HoursForEffectiveDate_nsprefix_ = None
if CarrierDetails is None:
self.CarrierDetails = []
else:
self.CarrierDetails = CarrierDetails
self.CarrierDetails_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationDetail.subclass:
return LocationDetail.subclass(*args_, **kwargs_)
else:
return LocationDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_LocationId(self):
return self.LocationId
def set_LocationId(self, LocationId):
self.LocationId = LocationId
def get_StoreNumber(self):
return self.StoreNumber
def set_StoreNumber(self, StoreNumber):
self.StoreNumber = StoreNumber
def get_LocationContactAndAddress(self):
return self.LocationContactAndAddress
def set_LocationContactAndAddress(self, LocationContactAndAddress):
self.LocationContactAndAddress = LocationContactAndAddress
def get_SpecialInstructions(self):
return self.SpecialInstructions
def set_SpecialInstructions(self, SpecialInstructions):
self.SpecialInstructions = SpecialInstructions
def get_TimeZoneOffset(self):
return self.TimeZoneOffset
def set_TimeZoneOffset(self, TimeZoneOffset):
self.TimeZoneOffset = TimeZoneOffset
def get_LocationType(self):
return self.LocationType
def set_LocationType(self, LocationType):
self.LocationType = LocationType
def get_LocationTypeForDisplay(self):
return self.LocationTypeForDisplay
def set_LocationTypeForDisplay(self, LocationTypeForDisplay):
self.LocationTypeForDisplay = LocationTypeForDisplay
def get_Attributes(self):
return self.Attributes
def set_Attributes(self, Attributes):
self.Attributes = Attributes
def add_Attributes(self, value):
self.Attributes.append(value)
def insert_Attributes_at(self, index, value):
self.Attributes.insert(index, value)
def replace_Attributes_at(self, index, value):
self.Attributes[index] = value
def get_LocationCapabilities(self):
return self.LocationCapabilities
def set_LocationCapabilities(self, LocationCapabilities):
self.LocationCapabilities = LocationCapabilities
def add_LocationCapabilities(self, value):
self.LocationCapabilities.append(value)
def insert_LocationCapabilities_at(self, index, value):
self.LocationCapabilities.insert(index, value)
def replace_LocationCapabilities_at(self, index, value):
self.LocationCapabilities[index] = value
def get_PackageMaximumLimits(self):
return self.PackageMaximumLimits
def set_PackageMaximumLimits(self, PackageMaximumLimits):
self.PackageMaximumLimits = PackageMaximumLimits
def get_ClearanceLocationDetail(self):
return self.ClearanceLocationDetail
def set_ClearanceLocationDetail(self, ClearanceLocationDetail):
self.ClearanceLocationDetail = ClearanceLocationDetail
def get_ServicingLocationDetails(self):
return self.ServicingLocationDetails
def set_ServicingLocationDetails(self, ServicingLocationDetails):
self.ServicingLocationDetails = ServicingLocationDetails
def add_ServicingLocationDetails(self, value):
self.ServicingLocationDetails.append(value)
def insert_ServicingLocationDetails_at(self, index, value):
self.ServicingLocationDetails.insert(index, value)
def replace_ServicingLocationDetails_at(self, index, value):
self.ServicingLocationDetails[index] = value
def get_AcceptedCurrency(self):
return self.AcceptedCurrency
def set_AcceptedCurrency(self, AcceptedCurrency):
self.AcceptedCurrency = AcceptedCurrency
def get_LocationHolidays(self):
return self.LocationHolidays
def set_LocationHolidays(self, LocationHolidays):
self.LocationHolidays = LocationHolidays
def add_LocationHolidays(self, value):
self.LocationHolidays.append(value)
def insert_LocationHolidays_at(self, index, value):
self.LocationHolidays.insert(index, value)
def replace_LocationHolidays_at(self, index, value):
self.LocationHolidays[index] = value
def get_MapUrl(self):
return self.MapUrl
def set_MapUrl(self, MapUrl):
self.MapUrl = MapUrl
def get_EntityId(self):
return self.EntityId
def set_EntityId(self, EntityId):
self.EntityId = EntityId
def get_NormalHours(self):
return self.NormalHours
def set_NormalHours(self, NormalHours):
self.NormalHours = NormalHours
def add_NormalHours(self, value):
self.NormalHours.append(value)
def insert_NormalHours_at(self, index, value):
self.NormalHours.insert(index, value)
def replace_NormalHours_at(self, index, value):
self.NormalHours[index] = value
def get_ExceptionalHours(self):
return self.ExceptionalHours
def set_ExceptionalHours(self, ExceptionalHours):
self.ExceptionalHours = ExceptionalHours
def add_ExceptionalHours(self, value):
self.ExceptionalHours.append(value)
def insert_ExceptionalHours_at(self, index, value):
self.ExceptionalHours.insert(index, value)
def replace_ExceptionalHours_at(self, index, value):
self.ExceptionalHours[index] = value
def get_HoursForEffectiveDate(self):
return self.HoursForEffectiveDate
def set_HoursForEffectiveDate(self, HoursForEffectiveDate):
self.HoursForEffectiveDate = HoursForEffectiveDate
def add_HoursForEffectiveDate(self, value):
self.HoursForEffectiveDate.append(value)
def insert_HoursForEffectiveDate_at(self, index, value):
self.HoursForEffectiveDate.insert(index, value)
def replace_HoursForEffectiveDate_at(self, index, value):
self.HoursForEffectiveDate[index] = value
def get_CarrierDetails(self):
return self.CarrierDetails
def set_CarrierDetails(self, CarrierDetails):
self.CarrierDetails = CarrierDetails
def add_CarrierDetails(self, value):
self.CarrierDetails.append(value)
def insert_CarrierDetails_at(self, index, value):
self.CarrierDetails.insert(index, value)
def replace_CarrierDetails_at(self, index, value):
self.CarrierDetails[index] = value
def validate_FedExLocationType(self, value):
result = True
# Validate type FedExLocationType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FEDEX_AUTHORIZED_SHIP_CENTER', 'FEDEX_EXPRESS_STATION', 'FEDEX_FACILITY', 'FEDEX_FREIGHT_SERVICE_CENTER', 'FEDEX_GROUND_TERMINAL', 'FEDEX_HOME_DELIVERY_STATION', 'FEDEX_OFFICE', 'FEDEX_ONSITE', 'FEDEX_SELF_SERVICE_LOCATION', 'FEDEX_SHIPSITE', 'FEDEX_SHIP_AND_GET', 'FEDEX_SMART_POST_HUB']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on FedExLocationType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_LocationAttributesType(self, value):
result = True
# Validate type LocationAttributesType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['ACCEPTS_CASH', 'ALREADY_OPEN', 'CLEARANCE_SERVICES', 'COPY_AND_PRINT_SERVICES', 'DANGEROUS_GOODS_SERVICES', 'DIRECT_MAIL_SERVICES', 'DOMESTIC_SHIPPING_SERVICES', 'DROP_BOX', 'INTERNATIONAL_SHIPPING_SERVICES', 'LOCATION_IS_IN_AIRPORT', 'NOTARY_SERVICES', 'OBSERVES_DAY_LIGHT_SAVING_TIMES', 'OPEN_TWENTY_FOUR_HOURS', 'PACKAGING_SUPPLIES', 'PACK_AND_SHIP', 'PASSPORT_PHOTO_SERVICES', 'RETURNS_SERVICES', 'SIGNS_AND_BANNERS_SERVICE', 'SONY_PICTURE_STATION']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationAttributesType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.LocationId is not None or
self.StoreNumber is not None or
self.LocationContactAndAddress is not None or
self.SpecialInstructions is not None or
self.TimeZoneOffset is not None or
self.LocationType is not None or
self.LocationTypeForDisplay is not None or
self.Attributes or
self.LocationCapabilities or
self.PackageMaximumLimits is not None or
self.ClearanceLocationDetail is not None or
self.ServicingLocationDetails or
self.AcceptedCurrency is not None or
self.LocationHolidays or
self.MapUrl is not None or
self.EntityId is not None or
self.NormalHours or
self.ExceptionalHours or
self.HoursForEffectiveDate or
self.CarrierDetails
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.LocationId is not None:
namespaceprefix_ = self.LocationId_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationId_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocationId>%s</%sLocationId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocationId), input_name='LocationId')), namespaceprefix_ , eol_))
if self.StoreNumber is not None:
namespaceprefix_ = self.StoreNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.StoreNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sStoreNumber>%s</%sStoreNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.StoreNumber), input_name='StoreNumber')), namespaceprefix_ , eol_))
if self.LocationContactAndAddress is not None:
namespaceprefix_ = self.LocationContactAndAddress_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationContactAndAddress_nsprefix_) else ''
self.LocationContactAndAddress.export(outfile, level, namespaceprefix_, namespacedef_='', name_='LocationContactAndAddress', pretty_print=pretty_print)
if self.SpecialInstructions is not None:
namespaceprefix_ = self.SpecialInstructions_nsprefix_ + ':' if (UseCapturedNS_ and self.SpecialInstructions_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sSpecialInstructions>%s</%sSpecialInstructions>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.SpecialInstructions), input_name='SpecialInstructions')), namespaceprefix_ , eol_))
if self.TimeZoneOffset is not None:
namespaceprefix_ = self.TimeZoneOffset_nsprefix_ + ':' if (UseCapturedNS_ and self.TimeZoneOffset_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTimeZoneOffset>%s</%sTimeZoneOffset>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.TimeZoneOffset), input_name='TimeZoneOffset')), namespaceprefix_ , eol_))
if self.LocationType is not None:
namespaceprefix_ = self.LocationType_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationType_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocationType>%s</%sLocationType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocationType), input_name='LocationType')), namespaceprefix_ , eol_))
if self.LocationTypeForDisplay is not None:
namespaceprefix_ = self.LocationTypeForDisplay_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationTypeForDisplay_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocationTypeForDisplay>%s</%sLocationTypeForDisplay>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocationTypeForDisplay), input_name='LocationTypeForDisplay')), namespaceprefix_ , eol_))
for Attributes_ in self.Attributes:
namespaceprefix_ = self.Attributes_nsprefix_ + ':' if (UseCapturedNS_ and self.Attributes_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sAttributes>%s</%sAttributes>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(Attributes_), input_name='Attributes')), namespaceprefix_ , eol_))
for LocationCapabilities_ in self.LocationCapabilities:
namespaceprefix_ = self.LocationCapabilities_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationCapabilities_nsprefix_) else ''
LocationCapabilities_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='LocationCapabilities', pretty_print=pretty_print)
if self.PackageMaximumLimits is not None:
namespaceprefix_ = self.PackageMaximumLimits_nsprefix_ + ':' if (UseCapturedNS_ and self.PackageMaximumLimits_nsprefix_) else ''
self.PackageMaximumLimits.export(outfile, level, namespaceprefix_, namespacedef_='', name_='PackageMaximumLimits', pretty_print=pretty_print)
if self.ClearanceLocationDetail is not None:
namespaceprefix_ = self.ClearanceLocationDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.ClearanceLocationDetail_nsprefix_) else ''
self.ClearanceLocationDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ClearanceLocationDetail', pretty_print=pretty_print)
for ServicingLocationDetails_ in self.ServicingLocationDetails:
namespaceprefix_ = self.ServicingLocationDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.ServicingLocationDetails_nsprefix_) else ''
ServicingLocationDetails_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ServicingLocationDetails', pretty_print=pretty_print)
if self.AcceptedCurrency is not None:
namespaceprefix_ = self.AcceptedCurrency_nsprefix_ + ':' if (UseCapturedNS_ and self.AcceptedCurrency_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sAcceptedCurrency>%s</%sAcceptedCurrency>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.AcceptedCurrency), input_name='AcceptedCurrency')), namespaceprefix_ , eol_))
for LocationHolidays_ in self.LocationHolidays:
namespaceprefix_ = self.LocationHolidays_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationHolidays_nsprefix_) else ''
LocationHolidays_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='LocationHolidays', pretty_print=pretty_print)
if self.MapUrl is not None:
namespaceprefix_ = self.MapUrl_nsprefix_ + ':' if (UseCapturedNS_ and self.MapUrl_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sMapUrl>%s</%sMapUrl>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.MapUrl), input_name='MapUrl')), namespaceprefix_ , eol_))
if self.EntityId is not None:
namespaceprefix_ = self.EntityId_nsprefix_ + ':' if (UseCapturedNS_ and self.EntityId_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sEntityId>%s</%sEntityId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.EntityId), input_name='EntityId')), namespaceprefix_ , eol_))
for NormalHours_ in self.NormalHours:
namespaceprefix_ = self.NormalHours_nsprefix_ + ':' if (UseCapturedNS_ and self.NormalHours_nsprefix_) else ''
NormalHours_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='NormalHours', pretty_print=pretty_print)
for ExceptionalHours_ in self.ExceptionalHours:
namespaceprefix_ = self.ExceptionalHours_nsprefix_ + ':' if (UseCapturedNS_ and self.ExceptionalHours_nsprefix_) else ''
ExceptionalHours_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ExceptionalHours', pretty_print=pretty_print)
for HoursForEffectiveDate_ in self.HoursForEffectiveDate:
namespaceprefix_ = self.HoursForEffectiveDate_nsprefix_ + ':' if (UseCapturedNS_ and self.HoursForEffectiveDate_nsprefix_) else ''
HoursForEffectiveDate_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='HoursForEffectiveDate', pretty_print=pretty_print)
for CarrierDetails_ in self.CarrierDetails:
namespaceprefix_ = self.CarrierDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.CarrierDetails_nsprefix_) else ''
CarrierDetails_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='CarrierDetails', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'LocationId':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocationId')
value_ = self.gds_validate_string(value_, node, 'LocationId')
self.LocationId = value_
self.LocationId_nsprefix_ = child_.prefix
elif nodeName_ == 'StoreNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'StoreNumber')
value_ = self.gds_validate_string(value_, node, 'StoreNumber')
self.StoreNumber = value_
self.StoreNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'LocationContactAndAddress':
obj_ = LocationContactAndAddress.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.LocationContactAndAddress = obj_
obj_.original_tagname_ = 'LocationContactAndAddress'
elif nodeName_ == 'SpecialInstructions':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'SpecialInstructions')
value_ = self.gds_validate_string(value_, node, 'SpecialInstructions')
self.SpecialInstructions = value_
self.SpecialInstructions_nsprefix_ = child_.prefix
elif nodeName_ == 'TimeZoneOffset':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'TimeZoneOffset')
value_ = self.gds_validate_string(value_, node, 'TimeZoneOffset')
self.TimeZoneOffset = value_
self.TimeZoneOffset_nsprefix_ = child_.prefix
elif nodeName_ == 'LocationType':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocationType')
value_ = self.gds_validate_string(value_, node, 'LocationType')
self.LocationType = value_
self.LocationType_nsprefix_ = child_.prefix
# validate type FedExLocationType
self.validate_FedExLocationType(self.LocationType)
elif nodeName_ == 'LocationTypeForDisplay':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocationTypeForDisplay')
value_ = self.gds_validate_string(value_, node, 'LocationTypeForDisplay')
self.LocationTypeForDisplay = value_
self.LocationTypeForDisplay_nsprefix_ = child_.prefix
elif nodeName_ == 'Attributes':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Attributes')
value_ = self.gds_validate_string(value_, node, 'Attributes')
self.Attributes.append(value_)
self.Attributes_nsprefix_ = child_.prefix
# validate type LocationAttributesType
self.validate_LocationAttributesType(self.Attributes[-1])
elif nodeName_ == 'LocationCapabilities':
obj_ = LocationCapabilityDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.LocationCapabilities.append(obj_)
obj_.original_tagname_ = 'LocationCapabilities'
elif nodeName_ == 'PackageMaximumLimits':
obj_ = LocationPackageLimitsDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.PackageMaximumLimits = obj_
obj_.original_tagname_ = 'PackageMaximumLimits'
elif nodeName_ == 'ClearanceLocationDetail':
obj_ = ClearanceLocationDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ClearanceLocationDetail = obj_
obj_.original_tagname_ = 'ClearanceLocationDetail'
elif nodeName_ == 'ServicingLocationDetails':
obj_ = LocationIdentificationDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ServicingLocationDetails.append(obj_)
obj_.original_tagname_ = 'ServicingLocationDetails'
elif nodeName_ == 'AcceptedCurrency':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'AcceptedCurrency')
value_ = self.gds_validate_string(value_, node, 'AcceptedCurrency')
self.AcceptedCurrency = value_
self.AcceptedCurrency_nsprefix_ = child_.prefix
elif nodeName_ == 'LocationHolidays':
obj_ = Holiday.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.LocationHolidays.append(obj_)
obj_.original_tagname_ = 'LocationHolidays'
elif nodeName_ == 'MapUrl':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'MapUrl')
value_ = self.gds_validate_string(value_, node, 'MapUrl')
self.MapUrl = value_
self.MapUrl_nsprefix_ = child_.prefix
elif nodeName_ == 'EntityId':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'EntityId')
value_ = self.gds_validate_string(value_, node, 'EntityId')
self.EntityId = value_
self.EntityId_nsprefix_ = child_.prefix
elif nodeName_ == 'NormalHours':
obj_ = LocationHours.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.NormalHours.append(obj_)
obj_.original_tagname_ = 'NormalHours'
elif nodeName_ == 'ExceptionalHours':
obj_ = LocationHours.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ExceptionalHours.append(obj_)
obj_.original_tagname_ = 'ExceptionalHours'
elif nodeName_ == 'HoursForEffectiveDate':
obj_ = LocationHours.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.HoursForEffectiveDate.append(obj_)
obj_.original_tagname_ = 'HoursForEffectiveDate'
elif nodeName_ == 'CarrierDetails':
obj_ = CarrierDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.CarrierDetails.append(obj_)
obj_.original_tagname_ = 'CarrierDetails'
# end class LocationDetail
class LocationHours(GeneratedsSuper):
"""Specifies the location hours for a location."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, DayofWeek=None, OperationalHours=None, Hours=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.DayofWeek = DayofWeek
self.validate_DayOfWeekType(self.DayofWeek)
self.DayofWeek_nsprefix_ = None
self.OperationalHours = OperationalHours
self.validate_OperationalHoursType(self.OperationalHours)
self.OperationalHours_nsprefix_ = None
if Hours is None:
self.Hours = []
else:
self.Hours = Hours
self.Hours_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationHours)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationHours.subclass:
return LocationHours.subclass(*args_, **kwargs_)
else:
return LocationHours(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_DayofWeek(self):
return self.DayofWeek
def set_DayofWeek(self, DayofWeek):
self.DayofWeek = DayofWeek
def get_OperationalHours(self):
return self.OperationalHours
def set_OperationalHours(self, OperationalHours):
self.OperationalHours = OperationalHours
def get_Hours(self):
return self.Hours
def set_Hours(self, Hours):
self.Hours = Hours
def add_Hours(self, value):
self.Hours.append(value)
def insert_Hours_at(self, index, value):
self.Hours.insert(index, value)
def replace_Hours_at(self, index, value):
self.Hours[index] = value
def validate_DayOfWeekType(self, value):
result = True
# Validate type DayOfWeekType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FRI', 'MON', 'SAT', 'SUN', 'THU', 'TUE', 'WED']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on DayOfWeekType' % {"value": encode_str_2_3(value), "lineno": lineno})
result = False
return result
def validate_OperationalHoursType(self, value):
result = True
# Validate type OperationalHoursType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['CLOSED_ALL_DAY', 'OPEN_ALL_DAY', 'OPEN_BY_HOURS']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on OperationalHoursType' % {"value": encode_str_2_3(value), "lineno": lineno})
result = False
return result
def hasContent_(self):
if (
self.DayofWeek is not None or
self.OperationalHours is not None or
self.Hours
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationHours', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationHours')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationHours':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationHours')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationHours', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationHours'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationHours', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.DayofWeek is not None:
namespaceprefix_ = self.DayofWeek_nsprefix_ + ':' if (UseCapturedNS_ and self.DayofWeek_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sDayofWeek>%s</%sDayofWeek>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.DayofWeek), input_name='DayofWeek')), namespaceprefix_ , eol_))
if self.OperationalHours is not None:
namespaceprefix_ = self.OperationalHours_nsprefix_ + ':' if (UseCapturedNS_ and self.OperationalHours_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sOperationalHours>%s</%sOperationalHours>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.OperationalHours), input_name='OperationalHours')), namespaceprefix_ , eol_))
for Hours_ in self.Hours:
namespaceprefix_ = self.Hours_nsprefix_ + ':' if (UseCapturedNS_ and self.Hours_nsprefix_) else ''
Hours_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Hours', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'DayofWeek':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'DayofWeek')
value_ = self.gds_validate_string(value_, node, 'DayofWeek')
self.DayofWeek = value_
self.DayofWeek_nsprefix_ = child_.prefix
# validate type DayOfWeekType
self.validate_DayOfWeekType(self.DayofWeek)
elif nodeName_ == 'OperationalHours':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'OperationalHours')
value_ = self.gds_validate_string(value_, node, 'OperationalHours')
self.OperationalHours = value_
self.OperationalHours_nsprefix_ = child_.prefix
# validate type OperationalHoursType
self.validate_OperationalHoursType(self.OperationalHours)
elif nodeName_ == 'Hours':
obj_ = TimeRange.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Hours.append(obj_)
obj_.original_tagname_ = 'Hours'
# end class LocationHours
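# The enumeration validators above (validate_DayOfWeekType, validate_OperationalHoursType,
# and the others in this module) all follow one shape: skip None, type-check against the
# XSD base type, then compare against the enumeration, recording failures on the collector.
# A minimal standalone sketch of that pattern follows; check_day_of_week and the plain
# messages list are illustrative names, not part of the generated API.

```python
DAY_OF_WEEK_ENUM = ['FRI', 'MON', 'SAT', 'SUN', 'THU', 'TUE', 'WED']

def check_day_of_week(value, messages):
    """Return True when value is a valid DayOfWeekType; record failures in messages."""
    if value is None:
        return True  # generated validators skip unset values
    if not isinstance(value, str):
        # wrong base simple type (xs:string)
        messages.append('Value "%s" is not of the correct base simple type (str)' % (value,))
        return False
    if value not in DAY_OF_WEEK_ENUM:
        # value is a string but outside the xsd enumeration restriction
        messages.append('Value "%s" does not match xsd enumeration restriction on DayOfWeekType' % (value,))
        return False
    return True
```

# In the generated classes the same checks run against self.gds_collector_ instead of a list.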
class LocationIdentificationDetail(GeneratedsSuper):
"""Identifies a FedEx location by type, id and number."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Type=None, Id=None, Number=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Type = Type
self.validate_FedExLocationType(self.Type)
self.Type_nsprefix_ = None
self.Id = Id
self.Id_nsprefix_ = None
self.Number = Number
self.Number_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationIdentificationDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationIdentificationDetail.subclass:
return LocationIdentificationDetail.subclass(*args_, **kwargs_)
else:
return LocationIdentificationDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Type(self):
return self.Type
def set_Type(self, Type):
self.Type = Type
def get_Id(self):
return self.Id
def set_Id(self, Id):
self.Id = Id
def get_Number(self):
return self.Number
def set_Number(self, Number):
self.Number = Number
def validate_FedExLocationType(self, value):
result = True
# Validate type FedExLocationType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FEDEX_AUTHORIZED_SHIP_CENTER', 'FEDEX_EXPRESS_STATION', 'FEDEX_FACILITY', 'FEDEX_FREIGHT_SERVICE_CENTER', 'FEDEX_GROUND_TERMINAL', 'FEDEX_HOME_DELIVERY_STATION', 'FEDEX_OFFICE', 'FEDEX_ONSITE', 'FEDEX_SELF_SERVICE_LOCATION', 'FEDEX_SHIPSITE', 'FEDEX_SHIP_AND_GET', 'FEDEX_SMART_POST_HUB']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on FedExLocationType' % {"value": encode_str_2_3(value), "lineno": lineno})
result = False
return result
def hasContent_(self):
if (
self.Type is not None or
self.Id is not None or
self.Number is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationIdentificationDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationIdentificationDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationIdentificationDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationIdentificationDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationIdentificationDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationIdentificationDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationIdentificationDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Type is not None:
namespaceprefix_ = self.Type_nsprefix_ + ':' if (UseCapturedNS_ and self.Type_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sType>%s</%sType>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Type), input_name='Type')), namespaceprefix_ , eol_))
if self.Id is not None:
namespaceprefix_ = self.Id_nsprefix_ + ':' if (UseCapturedNS_ and self.Id_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sId>%s</%sId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Id), input_name='Id')), namespaceprefix_ , eol_))
if self.Number is not None:
namespaceprefix_ = self.Number_nsprefix_ + ':' if (UseCapturedNS_ and self.Number_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sNumber>%s</%sNumber>%s' % (namespaceprefix_ , self.gds_format_integer(self.Number, input_name='Number'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Type':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Type')
value_ = self.gds_validate_string(value_, node, 'Type')
self.Type = value_
self.Type_nsprefix_ = child_.prefix
# validate type FedExLocationType
self.validate_FedExLocationType(self.Type)
elif nodeName_ == 'Id':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Id')
value_ = self.gds_validate_string(value_, node, 'Id')
self.Id = value_
self.Id_nsprefix_ = child_.prefix
elif nodeName_ == 'Number' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'Number')
ival_ = self.gds_validate_integer(ival_, node, 'Number')
self.Number = ival_
self.Number_nsprefix_ = child_.prefix
# end class LocationIdentificationDetail
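# Each build() method above strips the XML namespace from a child tag with
# Tag_pattern_.match(child.tag).groups()[-1]. The standalone sketch below shows the
# idea with an equivalent-in-spirit regex; TAG_PATTERN and local_name are illustrative
# names, not the generated module's own objects.

```python
import re

# ElementTree reports namespaced tags as '{namespace-uri}localname';
# group 1 optionally captures the brace-wrapped namespace, group 2 the local name.
TAG_PATTERN = re.compile(r'({.*})?(.*)')

def local_name(tag):
    """Return the local part of an ElementTree tag, with or without a namespace."""
    return TAG_PATTERN.match(tag).groups()[-1]
```

# build() dispatches on this local name in buildChildren(), so the same parser works
# whether or not the input document qualifies elements with a namespace.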
class LocationPackageLimitsDetail(GeneratedsSuper):
"""Specifies the maximum package weight and dimensions accepted at a location."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Weight=None, Dimensions=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Weight = Weight
self.Weight_nsprefix_ = None
self.Dimensions = Dimensions
self.Dimensions_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationPackageLimitsDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationPackageLimitsDetail.subclass:
return LocationPackageLimitsDetail.subclass(*args_, **kwargs_)
else:
return LocationPackageLimitsDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Weight(self):
return self.Weight
def set_Weight(self, Weight):
self.Weight = Weight
def get_Dimensions(self):
return self.Dimensions
def set_Dimensions(self, Dimensions):
self.Dimensions = Dimensions
def hasContent_(self):
if (
self.Weight is not None or
self.Dimensions is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationPackageLimitsDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationPackageLimitsDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationPackageLimitsDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationPackageLimitsDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationPackageLimitsDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationPackageLimitsDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationPackageLimitsDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Weight is not None:
namespaceprefix_ = self.Weight_nsprefix_ + ':' if (UseCapturedNS_ and self.Weight_nsprefix_) else ''
self.Weight.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Weight', pretty_print=pretty_print)
if self.Dimensions is not None:
namespaceprefix_ = self.Dimensions_nsprefix_ + ':' if (UseCapturedNS_ and self.Dimensions_nsprefix_) else ''
self.Dimensions.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Dimensions', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Weight':
obj_ = Weight.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Weight = obj_
obj_.original_tagname_ = 'Weight'
elif nodeName_ == 'Dimensions':
obj_ = Dimensions.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Dimensions = obj_
obj_.original_tagname_ = 'Dimensions'
# end class LocationPackageLimitsDetail
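# Every generated class above repeats the same factory()/subclass hook: factory()
# returns an instance of a registered user subclass when one is set, otherwise the
# base class, which is how generateDS lets callers override generated behavior without
# editing this file. A minimal sketch of the hook; the class names are illustrative only.

```python
class GeneratedBase:
    # generated classes expose a 'subclass' slot that user code may point at an override
    subclass = None

    @staticmethod
    def factory(*args, **kwargs):
        if GeneratedBase.subclass:
            return GeneratedBase.subclass(*args, **kwargs)
        return GeneratedBase(*args, **kwargs)

class LocalOverride(GeneratedBase):
    """A user subclass that factory() hands back once registered."""

# Before registration factory() builds the base class; after, the override.
plain = GeneratedBase.factory()
GeneratedBase.subclass = LocalOverride
customized = GeneratedBase.factory()
```

# The real module also consults CurrentSubclassModule_ via getSubclassFromModule_
# before falling back to the subclass attribute shown here.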
class LocationSortDetail(GeneratedsSuper):
"""Specifies the criterion and order to be used to sort the location details."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Criterion=None, Order=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Criterion = Criterion
self.validate_LocationSortCriteriaType(self.Criterion)
self.Criterion_nsprefix_ = None
self.Order = Order
self.validate_LocationSortOrderType(self.Order)
self.Order_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationSortDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationSortDetail.subclass:
return LocationSortDetail.subclass(*args_, **kwargs_)
else:
return LocationSortDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Criterion(self):
return self.Criterion
def set_Criterion(self, Criterion):
self.Criterion = Criterion
def get_Order(self):
return self.Order
def set_Order(self, Order):
self.Order = Order
def validate_LocationSortCriteriaType(self, value):
result = True
# Validate type LocationSortCriteriaType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['DISTANCE', 'LATEST_EXPRESS_DROPOFF_TIME', 'LATEST_GROUND_DROPOFF_TIME', 'LOCATION_TYPE']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationSortCriteriaType' % {"value": encode_str_2_3(value), "lineno": lineno})
result = False
return result
def validate_LocationSortOrderType(self, value):
result = True
# Validate type LocationSortOrderType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['HIGHEST_TO_LOWEST', 'LOWEST_TO_HIGHEST']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationSortOrderType' % {"value": encode_str_2_3(value), "lineno": lineno})
result = False
return result
def hasContent_(self):
if (
self.Criterion is not None or
self.Order is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationSortDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationSortDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationSortDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationSortDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationSortDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationSortDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationSortDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Criterion is not None:
namespaceprefix_ = self.Criterion_nsprefix_ + ':' if (UseCapturedNS_ and self.Criterion_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCriterion>%s</%sCriterion>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Criterion), input_name='Criterion')), namespaceprefix_ , eol_))
if self.Order is not None:
namespaceprefix_ = self.Order_nsprefix_ + ':' if (UseCapturedNS_ and self.Order_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sOrder>%s</%sOrder>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Order), input_name='Order')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Criterion':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Criterion')
value_ = self.gds_validate_string(value_, node, 'Criterion')
self.Criterion = value_
self.Criterion_nsprefix_ = child_.prefix
# validate type LocationSortCriteriaType
self.validate_LocationSortCriteriaType(self.Criterion)
elif nodeName_ == 'Order':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Order')
value_ = self.gds_validate_string(value_, node, 'Order')
self.Order = value_
self.Order_nsprefix_ = child_.prefix
# validate type LocationSortOrderType
self.validate_LocationSortOrderType(self.Order)
# end class LocationSortDetail
class LocationSupportedPackageDetail(GeneratedsSuper):
"""Specifies the weight and dimensions of packages supported at a location."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Weight=None, Dimensions=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Weight = Weight
self.Weight_nsprefix_ = None
self.Dimensions = Dimensions
self.Dimensions_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationSupportedPackageDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationSupportedPackageDetail.subclass:
return LocationSupportedPackageDetail.subclass(*args_, **kwargs_)
else:
return LocationSupportedPackageDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Weight(self):
return self.Weight
def set_Weight(self, Weight):
self.Weight = Weight
def get_Dimensions(self):
return self.Dimensions
def set_Dimensions(self, Dimensions):
self.Dimensions = Dimensions
def hasContent_(self):
if (
self.Weight is not None or
self.Dimensions is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationSupportedPackageDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationSupportedPackageDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationSupportedPackageDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationSupportedPackageDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationSupportedPackageDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationSupportedPackageDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationSupportedPackageDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Weight is not None:
namespaceprefix_ = self.Weight_nsprefix_ + ':' if (UseCapturedNS_ and self.Weight_nsprefix_) else ''
self.Weight.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Weight', pretty_print=pretty_print)
if self.Dimensions is not None:
namespaceprefix_ = self.Dimensions_nsprefix_ + ':' if (UseCapturedNS_ and self.Dimensions_nsprefix_) else ''
self.Dimensions.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Dimensions', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Weight':
obj_ = Weight.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Weight = obj_
obj_.original_tagname_ = 'Weight'
elif nodeName_ == 'Dimensions':
obj_ = Dimensions.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Dimensions = obj_
obj_.original_tagname_ = 'Dimensions'
# end class LocationSupportedPackageDetail
class LocationSupportedShipmentDetail(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, PackageDetails=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
if PackageDetails is None:
self.PackageDetails = []
else:
self.PackageDetails = PackageDetails
self.PackageDetails_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, LocationSupportedShipmentDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if LocationSupportedShipmentDetail.subclass:
return LocationSupportedShipmentDetail.subclass(*args_, **kwargs_)
else:
return LocationSupportedShipmentDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_PackageDetails(self):
return self.PackageDetails
def set_PackageDetails(self, PackageDetails):
self.PackageDetails = PackageDetails
def add_PackageDetails(self, value):
self.PackageDetails.append(value)
def insert_PackageDetails_at(self, index, value):
self.PackageDetails.insert(index, value)
def replace_PackageDetails_at(self, index, value):
self.PackageDetails[index] = value
def hasContent_(self):
if (
self.PackageDetails
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationSupportedShipmentDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('LocationSupportedShipmentDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'LocationSupportedShipmentDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='LocationSupportedShipmentDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='LocationSupportedShipmentDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='LocationSupportedShipmentDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='LocationSupportedShipmentDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
for PackageDetails_ in self.PackageDetails:
namespaceprefix_ = self.PackageDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.PackageDetails_nsprefix_) else ''
PackageDetails_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='PackageDetails', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'PackageDetails':
obj_ = LocationSupportedPackageDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.PackageDetails.append(obj_)
obj_.original_tagname_ = 'PackageDetails'
# end class LocationSupportedShipmentDetail
class Notification(GeneratedsSuper):
"""The descriptive data regarding the result of the submitted
transaction."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Severity=None, Source=None, Code=None, Message=None, LocalizedMessage=None, MessageParameters=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Severity = Severity
self.validate_NotificationSeverityType(self.Severity)
self.Severity_nsprefix_ = None
self.Source = Source
self.Source_nsprefix_ = None
self.Code = Code
self.Code_nsprefix_ = None
self.Message = Message
self.Message_nsprefix_ = None
self.LocalizedMessage = LocalizedMessage
self.LocalizedMessage_nsprefix_ = None
if MessageParameters is None:
self.MessageParameters = []
else:
self.MessageParameters = MessageParameters
self.MessageParameters_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Notification)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Notification.subclass:
return Notification.subclass(*args_, **kwargs_)
else:
return Notification(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Severity(self):
return self.Severity
def set_Severity(self, Severity):
self.Severity = Severity
def get_Source(self):
return self.Source
def set_Source(self, Source):
self.Source = Source
def get_Code(self):
return self.Code
def set_Code(self, Code):
self.Code = Code
def get_Message(self):
return self.Message
def set_Message(self, Message):
self.Message = Message
def get_LocalizedMessage(self):
return self.LocalizedMessage
def set_LocalizedMessage(self, LocalizedMessage):
self.LocalizedMessage = LocalizedMessage
def get_MessageParameters(self):
return self.MessageParameters
def set_MessageParameters(self, MessageParameters):
self.MessageParameters = MessageParameters
def add_MessageParameters(self, value):
self.MessageParameters.append(value)
def insert_MessageParameters_at(self, index, value):
self.MessageParameters.insert(index, value)
def replace_MessageParameters_at(self, index, value):
self.MessageParameters[index] = value
def validate_NotificationSeverityType(self, value):
result = True
# Validate type NotificationSeverityType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['ERROR', 'FAILURE', 'NOTE', 'SUCCESS', 'WARNING']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on NotificationSeverityType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Severity is not None or
self.Source is not None or
self.Code is not None or
self.Message is not None or
self.LocalizedMessage is not None or
self.MessageParameters
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Notification', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('Notification')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'Notification':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Notification')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Notification', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Notification'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Notification', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Severity is not None:
namespaceprefix_ = self.Severity_nsprefix_ + ':' if (UseCapturedNS_ and self.Severity_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sSeverity>%s</%sSeverity>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Severity), input_name='Severity')), namespaceprefix_ , eol_))
if self.Source is not None:
namespaceprefix_ = self.Source_nsprefix_ + ':' if (UseCapturedNS_ and self.Source_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sSource>%s</%sSource>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Source), input_name='Source')), namespaceprefix_ , eol_))
if self.Code is not None:
namespaceprefix_ = self.Code_nsprefix_ + ':' if (UseCapturedNS_ and self.Code_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCode>%s</%sCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Code), input_name='Code')), namespaceprefix_ , eol_))
if self.Message is not None:
namespaceprefix_ = self.Message_nsprefix_ + ':' if (UseCapturedNS_ and self.Message_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sMessage>%s</%sMessage>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Message), input_name='Message')), namespaceprefix_ , eol_))
if self.LocalizedMessage is not None:
namespaceprefix_ = self.LocalizedMessage_nsprefix_ + ':' if (UseCapturedNS_ and self.LocalizedMessage_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocalizedMessage>%s</%sLocalizedMessage>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocalizedMessage), input_name='LocalizedMessage')), namespaceprefix_ , eol_))
for MessageParameters_ in self.MessageParameters:
namespaceprefix_ = self.MessageParameters_nsprefix_ + ':' if (UseCapturedNS_ and self.MessageParameters_nsprefix_) else ''
MessageParameters_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='MessageParameters', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Severity':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Severity')
value_ = self.gds_validate_string(value_, node, 'Severity')
self.Severity = value_
self.Severity_nsprefix_ = child_.prefix
# validate type NotificationSeverityType
self.validate_NotificationSeverityType(self.Severity)
elif nodeName_ == 'Source':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Source')
value_ = self.gds_validate_string(value_, node, 'Source')
self.Source = value_
self.Source_nsprefix_ = child_.prefix
elif nodeName_ == 'Code':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Code')
value_ = self.gds_validate_string(value_, node, 'Code')
self.Code = value_
self.Code_nsprefix_ = child_.prefix
elif nodeName_ == 'Message':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Message')
value_ = self.gds_validate_string(value_, node, 'Message')
self.Message = value_
self.Message_nsprefix_ = child_.prefix
elif nodeName_ == 'LocalizedMessage':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocalizedMessage')
value_ = self.gds_validate_string(value_, node, 'LocalizedMessage')
self.LocalizedMessage = value_
self.LocalizedMessage_nsprefix_ = child_.prefix
elif nodeName_ == 'MessageParameters':
obj_ = NotificationParameter.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.MessageParameters.append(obj_)
obj_.original_tagname_ = 'MessageParameters'
# end class Notification
class NotificationParameter(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Id=None, Value=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Id = Id
self.Id_nsprefix_ = None
self.Value = Value
self.Value_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, NotificationParameter)
if subclass is not None:
return subclass(*args_, **kwargs_)
if NotificationParameter.subclass:
return NotificationParameter.subclass(*args_, **kwargs_)
else:
return NotificationParameter(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Id(self):
return self.Id
def set_Id(self, Id):
self.Id = Id
def get_Value(self):
return self.Value
def set_Value(self, Value):
self.Value = Value
def hasContent_(self):
if (
self.Id is not None or
self.Value is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='NotificationParameter', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('NotificationParameter')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'NotificationParameter':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='NotificationParameter')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='NotificationParameter', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='NotificationParameter'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='NotificationParameter', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Id is not None:
namespaceprefix_ = self.Id_nsprefix_ + ':' if (UseCapturedNS_ and self.Id_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sId>%s</%sId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Id), input_name='Id')), namespaceprefix_ , eol_))
if self.Value is not None:
namespaceprefix_ = self.Value_nsprefix_ + ':' if (UseCapturedNS_ and self.Value_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sValue>%s</%sValue>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Value), input_name='Value')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Id':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Id')
value_ = self.gds_validate_string(value_, node, 'Id')
self.Id = value_
self.Id_nsprefix_ = child_.prefix
elif nodeName_ == 'Value':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Value')
value_ = self.gds_validate_string(value_, node, 'Value')
self.Value = value_
self.Value_nsprefix_ = child_.prefix
# end class NotificationParameter
class ReservationAvailabilityDetail(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Attributes=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
if Attributes is None:
self.Attributes = []
else:
self.Attributes = Attributes
self.Attributes_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, ReservationAvailabilityDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if ReservationAvailabilityDetail.subclass:
return ReservationAvailabilityDetail.subclass(*args_, **kwargs_)
else:
return ReservationAvailabilityDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Attributes(self):
return self.Attributes
def set_Attributes(self, Attributes):
self.Attributes = Attributes
def add_Attributes(self, value):
self.Attributes.append(value)
def insert_Attributes_at(self, index, value):
self.Attributes.insert(index, value)
def replace_Attributes_at(self, index, value):
self.Attributes[index] = value
def validate_ReservationAttributesType(self, value):
result = True
# Validate type ReservationAttributesType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['RESERVATION_AVAILABLE']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on ReservationAttributesType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Attributes
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ReservationAvailabilityDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('ReservationAvailabilityDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'ReservationAvailabilityDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='ReservationAvailabilityDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='ReservationAvailabilityDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='ReservationAvailabilityDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ReservationAvailabilityDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
for Attributes_ in self.Attributes:
namespaceprefix_ = self.Attributes_nsprefix_ + ':' if (UseCapturedNS_ and self.Attributes_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sAttributes>%s</%sAttributes>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(Attributes_), input_name='Attributes')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Attributes':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Attributes')
value_ = self.gds_validate_string(value_, node, 'Attributes')
self.Attributes.append(value_)
self.Attributes_nsprefix_ = child_.prefix
# validate type ReservationAttributesType
self.validate_ReservationAttributesType(self.Attributes[-1])
# end class ReservationAvailabilityDetail
class SearchLocationConstraints(GeneratedsSuper):
"""Specifies additional constraints on the attributes of the locations
being searched."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, RadiusDistance=None, DropOffTimeNeeded=None, ResultsFilters=None, SupportedRedirectToHoldServices=None, RequiredLocationAttributes=None, RequiredLocationCapabilities=None, ShipmentDetail=None, ResultsToSkip=None, ResultsRequested=None, LocationContentOptions=None, LocationTypesToInclude=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.RadiusDistance = RadiusDistance
self.RadiusDistance_nsprefix_ = None
if isinstance(DropOffTimeNeeded, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(DropOffTimeNeeded, '%H:%M:%S').time()
else:
initvalue_ = DropOffTimeNeeded
self.DropOffTimeNeeded = initvalue_
self.DropOffTimeNeeded_nsprefix_ = None
if ResultsFilters is None:
self.ResultsFilters = []
else:
self.ResultsFilters = ResultsFilters
self.ResultsFilters_nsprefix_ = None
if SupportedRedirectToHoldServices is None:
self.SupportedRedirectToHoldServices = []
else:
self.SupportedRedirectToHoldServices = SupportedRedirectToHoldServices
self.SupportedRedirectToHoldServices_nsprefix_ = None
if RequiredLocationAttributes is None:
self.RequiredLocationAttributes = []
else:
self.RequiredLocationAttributes = RequiredLocationAttributes
self.RequiredLocationAttributes_nsprefix_ = None
if RequiredLocationCapabilities is None:
self.RequiredLocationCapabilities = []
else:
self.RequiredLocationCapabilities = RequiredLocationCapabilities
self.RequiredLocationCapabilities_nsprefix_ = None
self.ShipmentDetail = ShipmentDetail
self.ShipmentDetail_nsprefix_ = None
self.ResultsToSkip = ResultsToSkip
self.ResultsToSkip_nsprefix_ = None
self.ResultsRequested = ResultsRequested
self.ResultsRequested_nsprefix_ = None
if LocationContentOptions is None:
self.LocationContentOptions = []
else:
self.LocationContentOptions = LocationContentOptions
self.LocationContentOptions_nsprefix_ = None
if LocationTypesToInclude is None:
self.LocationTypesToInclude = []
else:
self.LocationTypesToInclude = LocationTypesToInclude
self.LocationTypesToInclude_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, SearchLocationConstraints)
if subclass is not None:
return subclass(*args_, **kwargs_)
if SearchLocationConstraints.subclass:
return SearchLocationConstraints.subclass(*args_, **kwargs_)
else:
return SearchLocationConstraints(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_RadiusDistance(self):
return self.RadiusDistance
def set_RadiusDistance(self, RadiusDistance):
self.RadiusDistance = RadiusDistance
def get_DropOffTimeNeeded(self):
return self.DropOffTimeNeeded
def set_DropOffTimeNeeded(self, DropOffTimeNeeded):
self.DropOffTimeNeeded = DropOffTimeNeeded
def get_ResultsFilters(self):
return self.ResultsFilters
def set_ResultsFilters(self, ResultsFilters):
self.ResultsFilters = ResultsFilters
def add_ResultsFilters(self, value):
self.ResultsFilters.append(value)
def insert_ResultsFilters_at(self, index, value):
self.ResultsFilters.insert(index, value)
def replace_ResultsFilters_at(self, index, value):
self.ResultsFilters[index] = value
def get_SupportedRedirectToHoldServices(self):
return self.SupportedRedirectToHoldServices
def set_SupportedRedirectToHoldServices(self, SupportedRedirectToHoldServices):
self.SupportedRedirectToHoldServices = SupportedRedirectToHoldServices
def add_SupportedRedirectToHoldServices(self, value):
self.SupportedRedirectToHoldServices.append(value)
def insert_SupportedRedirectToHoldServices_at(self, index, value):
self.SupportedRedirectToHoldServices.insert(index, value)
def replace_SupportedRedirectToHoldServices_at(self, index, value):
self.SupportedRedirectToHoldServices[index] = value
def get_RequiredLocationAttributes(self):
return self.RequiredLocationAttributes
def set_RequiredLocationAttributes(self, RequiredLocationAttributes):
self.RequiredLocationAttributes = RequiredLocationAttributes
def add_RequiredLocationAttributes(self, value):
self.RequiredLocationAttributes.append(value)
def insert_RequiredLocationAttributes_at(self, index, value):
self.RequiredLocationAttributes.insert(index, value)
def replace_RequiredLocationAttributes_at(self, index, value):
self.RequiredLocationAttributes[index] = value
def get_RequiredLocationCapabilities(self):
return self.RequiredLocationCapabilities
def set_RequiredLocationCapabilities(self, RequiredLocationCapabilities):
self.RequiredLocationCapabilities = RequiredLocationCapabilities
def add_RequiredLocationCapabilities(self, value):
self.RequiredLocationCapabilities.append(value)
def insert_RequiredLocationCapabilities_at(self, index, value):
self.RequiredLocationCapabilities.insert(index, value)
def replace_RequiredLocationCapabilities_at(self, index, value):
self.RequiredLocationCapabilities[index] = value
def get_ShipmentDetail(self):
return self.ShipmentDetail
def set_ShipmentDetail(self, ShipmentDetail):
self.ShipmentDetail = ShipmentDetail
def get_ResultsToSkip(self):
return self.ResultsToSkip
def set_ResultsToSkip(self, ResultsToSkip):
self.ResultsToSkip = ResultsToSkip
def get_ResultsRequested(self):
return self.ResultsRequested
def set_ResultsRequested(self, ResultsRequested):
self.ResultsRequested = ResultsRequested
def get_LocationContentOptions(self):
return self.LocationContentOptions
def set_LocationContentOptions(self, LocationContentOptions):
self.LocationContentOptions = LocationContentOptions
def add_LocationContentOptions(self, value):
self.LocationContentOptions.append(value)
def insert_LocationContentOptions_at(self, index, value):
self.LocationContentOptions.insert(index, value)
def replace_LocationContentOptions_at(self, index, value):
self.LocationContentOptions[index] = value
def get_LocationTypesToInclude(self):
return self.LocationTypesToInclude
def set_LocationTypesToInclude(self, LocationTypesToInclude):
self.LocationTypesToInclude = LocationTypesToInclude
def add_LocationTypesToInclude(self, value):
self.LocationTypesToInclude.append(value)
def insert_LocationTypesToInclude_at(self, index, value):
self.LocationTypesToInclude.insert(index, value)
def replace_LocationTypesToInclude_at(self, index, value):
self.LocationTypesToInclude[index] = value
def validate_LocationSearchFilterType(self, value):
result = True
# Validate type LocationSearchFilterType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['EXCLUDE_LOCATIONS_OUTSIDE_COUNTRY', 'EXCLUDE_LOCATIONS_OUTSIDE_STATE_OR_PROVINCE', 'EXCLUDE_UNAVAILABLE_LOCATIONS']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationSearchFilterType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_SupportedRedirectToHoldServiceType(self, value):
result = True
# Validate type SupportedRedirectToHoldServiceType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FEDEX_EXPRESS', 'FEDEX_GROUND', 'FEDEX_GROUND_HOME_DELIVERY']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on SupportedRedirectToHoldServiceType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_LocationAttributesType(self, value):
result = True
# Validate type LocationAttributesType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['ACCEPTS_CASH', 'ALREADY_OPEN', 'CLEARANCE_SERVICES', 'COPY_AND_PRINT_SERVICES', 'DANGEROUS_GOODS_SERVICES', 'DIRECT_MAIL_SERVICES', 'DOMESTIC_SHIPPING_SERVICES', 'DROP_BOX', 'INTERNATIONAL_SHIPPING_SERVICES', 'LOCATION_IS_IN_AIRPORT', 'NOTARY_SERVICES', 'OBSERVES_DAY_LIGHT_SAVING_TIMES', 'OPEN_TWENTY_FOUR_HOURS', 'PACKAGING_SUPPLIES', 'PACK_AND_SHIP', 'PASSPORT_PHOTO_SERVICES', 'RETURNS_SERVICES', 'SIGNS_AND_BANNERS_SERVICE', 'SONY_PICTURE_STATION']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationAttributesType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_LocationContentOptionType(self, value):
result = True
# Validate type LocationContentOptionType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['HOLIDAYS', 'LOCATION_DROPOFF_TIMES', 'MAP_URL', 'TIMEZONE_OFFSET']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationContentOptionType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_FedExLocationType(self, value):
result = True
# Validate type FedExLocationType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FEDEX_AUTHORIZED_SHIP_CENTER', 'FEDEX_EXPRESS_STATION', 'FEDEX_FACILITY', 'FEDEX_FREIGHT_SERVICE_CENTER', 'FEDEX_GROUND_TERMINAL', 'FEDEX_HOME_DELIVERY_STATION', 'FEDEX_OFFICE', 'FEDEX_ONSITE', 'FEDEX_SELF_SERVICE_LOCATION', 'FEDEX_SHIPSITE', 'FEDEX_SHIP_AND_GET', 'FEDEX_SMART_POST_HUB']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on FedExLocationType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.RadiusDistance is not None or
self.DropOffTimeNeeded is not None or
self.ResultsFilters or
self.SupportedRedirectToHoldServices or
self.RequiredLocationAttributes or
self.RequiredLocationCapabilities or
self.ShipmentDetail is not None or
self.ResultsToSkip is not None or
self.ResultsRequested is not None or
self.LocationContentOptions or
self.LocationTypesToInclude
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='SearchLocationConstraints', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('SearchLocationConstraints')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'SearchLocationConstraints':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='SearchLocationConstraints')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='SearchLocationConstraints', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='SearchLocationConstraints'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='SearchLocationConstraints', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.RadiusDistance is not None:
namespaceprefix_ = self.RadiusDistance_nsprefix_ + ':' if (UseCapturedNS_ and self.RadiusDistance_nsprefix_) else ''
self.RadiusDistance.export(outfile, level, namespaceprefix_, namespacedef_='', name_='RadiusDistance', pretty_print=pretty_print)
if self.DropOffTimeNeeded is not None:
namespaceprefix_ = self.DropOffTimeNeeded_nsprefix_ + ':' if (UseCapturedNS_ and self.DropOffTimeNeeded_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sDropOffTimeNeeded>%s</%sDropOffTimeNeeded>%s' % (namespaceprefix_ , self.gds_format_time(self.DropOffTimeNeeded, input_name='DropOffTimeNeeded'), namespaceprefix_ , eol_))
for ResultsFilters_ in self.ResultsFilters:
namespaceprefix_ = self.ResultsFilters_nsprefix_ + ':' if (UseCapturedNS_ and self.ResultsFilters_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sResultsFilters>%s</%sResultsFilters>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(ResultsFilters_), input_name='ResultsFilters')), namespaceprefix_ , eol_))
for SupportedRedirectToHoldServices_ in self.SupportedRedirectToHoldServices:
namespaceprefix_ = self.SupportedRedirectToHoldServices_nsprefix_ + ':' if (UseCapturedNS_ and self.SupportedRedirectToHoldServices_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sSupportedRedirectToHoldServices>%s</%sSupportedRedirectToHoldServices>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(SupportedRedirectToHoldServices_), input_name='SupportedRedirectToHoldServices')), namespaceprefix_ , eol_))
for RequiredLocationAttributes_ in self.RequiredLocationAttributes:
namespaceprefix_ = self.RequiredLocationAttributes_nsprefix_ + ':' if (UseCapturedNS_ and self.RequiredLocationAttributes_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sRequiredLocationAttributes>%s</%sRequiredLocationAttributes>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(RequiredLocationAttributes_), input_name='RequiredLocationAttributes')), namespaceprefix_ , eol_))
for RequiredLocationCapabilities_ in self.RequiredLocationCapabilities:
namespaceprefix_ = self.RequiredLocationCapabilities_nsprefix_ + ':' if (UseCapturedNS_ and self.RequiredLocationCapabilities_nsprefix_) else ''
RequiredLocationCapabilities_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='RequiredLocationCapabilities', pretty_print=pretty_print)
if self.ShipmentDetail is not None:
namespaceprefix_ = self.ShipmentDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.ShipmentDetail_nsprefix_) else ''
self.ShipmentDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ShipmentDetail', pretty_print=pretty_print)
if self.ResultsToSkip is not None:
namespaceprefix_ = self.ResultsToSkip_nsprefix_ + ':' if (UseCapturedNS_ and self.ResultsToSkip_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sResultsToSkip>%s</%sResultsToSkip>%s' % (namespaceprefix_ , self.gds_format_integer(self.ResultsToSkip, input_name='ResultsToSkip'), namespaceprefix_ , eol_))
if self.ResultsRequested is not None:
namespaceprefix_ = self.ResultsRequested_nsprefix_ + ':' if (UseCapturedNS_ and self.ResultsRequested_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sResultsRequested>%s</%sResultsRequested>%s' % (namespaceprefix_ , self.gds_format_integer(self.ResultsRequested, input_name='ResultsRequested'), namespaceprefix_ , eol_))
for LocationContentOptions_ in self.LocationContentOptions:
namespaceprefix_ = self.LocationContentOptions_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationContentOptions_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocationContentOptions>%s</%sLocationContentOptions>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(LocationContentOptions_), input_name='LocationContentOptions')), namespaceprefix_ , eol_))
for LocationTypesToInclude_ in self.LocationTypesToInclude:
namespaceprefix_ = self.LocationTypesToInclude_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationTypesToInclude_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocationTypesToInclude>%s</%sLocationTypesToInclude>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(LocationTypesToInclude_), input_name='LocationTypesToInclude')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'RadiusDistance':
obj_ = Distance.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.RadiusDistance = obj_
obj_.original_tagname_ = 'RadiusDistance'
elif nodeName_ == 'DropOffTimeNeeded':
sval_ = child_.text
dval_ = self.gds_parse_time(sval_)
self.DropOffTimeNeeded = dval_
self.DropOffTimeNeeded_nsprefix_ = child_.prefix
elif nodeName_ == 'ResultsFilters':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ResultsFilters')
value_ = self.gds_validate_string(value_, node, 'ResultsFilters')
self.ResultsFilters.append(value_)
self.ResultsFilters_nsprefix_ = child_.prefix
# validate type LocationSearchFilterType
self.validate_LocationSearchFilterType(self.ResultsFilters[-1])
elif nodeName_ == 'SupportedRedirectToHoldServices':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'SupportedRedirectToHoldServices')
value_ = self.gds_validate_string(value_, node, 'SupportedRedirectToHoldServices')
self.SupportedRedirectToHoldServices.append(value_)
self.SupportedRedirectToHoldServices_nsprefix_ = child_.prefix
# validate type SupportedRedirectToHoldServiceType
self.validate_SupportedRedirectToHoldServiceType(self.SupportedRedirectToHoldServices[-1])
elif nodeName_ == 'RequiredLocationAttributes':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'RequiredLocationAttributes')
value_ = self.gds_validate_string(value_, node, 'RequiredLocationAttributes')
self.RequiredLocationAttributes.append(value_)
self.RequiredLocationAttributes_nsprefix_ = child_.prefix
# validate type LocationAttributesType
self.validate_LocationAttributesType(self.RequiredLocationAttributes[-1])
elif nodeName_ == 'RequiredLocationCapabilities':
obj_ = LocationCapabilityDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.RequiredLocationCapabilities.append(obj_)
obj_.original_tagname_ = 'RequiredLocationCapabilities'
elif nodeName_ == 'ShipmentDetail':
obj_ = LocationSupportedShipmentDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ShipmentDetail = obj_
obj_.original_tagname_ = 'ShipmentDetail'
elif nodeName_ == 'ResultsToSkip' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'ResultsToSkip')
if ival_ < 0:
raise_parse_error(child_, 'requires nonNegativeInteger')
ival_ = self.gds_validate_integer(ival_, node, 'ResultsToSkip')
self.ResultsToSkip = ival_
self.ResultsToSkip_nsprefix_ = child_.prefix
elif nodeName_ == 'ResultsRequested' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'ResultsRequested')
if ival_ < 0:
raise_parse_error(child_, 'requires nonNegativeInteger')
ival_ = self.gds_validate_integer(ival_, node, 'ResultsRequested')
self.ResultsRequested = ival_
self.ResultsRequested_nsprefix_ = child_.prefix
elif nodeName_ == 'LocationContentOptions':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocationContentOptions')
value_ = self.gds_validate_string(value_, node, 'LocationContentOptions')
self.LocationContentOptions.append(value_)
self.LocationContentOptions_nsprefix_ = child_.prefix
# validate type LocationContentOptionType
self.validate_LocationContentOptionType(self.LocationContentOptions[-1])
elif nodeName_ == 'LocationTypesToInclude':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocationTypesToInclude')
value_ = self.gds_validate_string(value_, node, 'LocationTypesToInclude')
self.LocationTypesToInclude.append(value_)
self.LocationTypesToInclude_nsprefix_ = child_.prefix
# validate type FedExLocationType
self.validate_FedExLocationType(self.LocationTypesToInclude[-1])
# end class SearchLocationConstraints
# Request wrapper carrying a single RestrictionsAndPrivileges policy detail.
class ValidateLocationAvailabilityRequest(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, RestrictionsAndPrivileges=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.RestrictionsAndPrivileges = RestrictionsAndPrivileges
self.RestrictionsAndPrivileges_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, ValidateLocationAvailabilityRequest)
if subclass is not None:
return subclass(*args_, **kwargs_)
if ValidateLocationAvailabilityRequest.subclass:
return ValidateLocationAvailabilityRequest.subclass(*args_, **kwargs_)
else:
return ValidateLocationAvailabilityRequest(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_RestrictionsAndPrivileges(self):
return self.RestrictionsAndPrivileges
def set_RestrictionsAndPrivileges(self, RestrictionsAndPrivileges):
self.RestrictionsAndPrivileges = RestrictionsAndPrivileges
def hasContent_(self):
if (
self.RestrictionsAndPrivileges is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ValidateLocationAvailabilityRequest', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('ValidateLocationAvailabilityRequest')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'ValidateLocationAvailabilityRequest':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='ValidateLocationAvailabilityRequest')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='ValidateLocationAvailabilityRequest', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='ValidateLocationAvailabilityRequest'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ValidateLocationAvailabilityRequest', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.RestrictionsAndPrivileges is not None:
namespaceprefix_ = self.RestrictionsAndPrivileges_nsprefix_ + ':' if (UseCapturedNS_ and self.RestrictionsAndPrivileges_nsprefix_) else ''
self.RestrictionsAndPrivileges.export(outfile, level, namespaceprefix_, namespacedef_='', name_='RestrictionsAndPrivileges', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'RestrictionsAndPrivileges':
obj_ = RestrictionsAndPrivilegesPolicyDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.RestrictionsAndPrivileges = obj_
obj_.original_tagname_ = 'RestrictionsAndPrivileges'
# end class ValidateLocationAvailabilityRequest
# Container for a repeatable list of EnterprisePrivilegeDetail children (PrivilegeDetails).
class RestrictionsAndPrivilegesPolicyDetail(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, PrivilegeDetails=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
if PrivilegeDetails is None:
self.PrivilegeDetails = []
else:
self.PrivilegeDetails = PrivilegeDetails
self.PrivilegeDetails_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, RestrictionsAndPrivilegesPolicyDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if RestrictionsAndPrivilegesPolicyDetail.subclass:
return RestrictionsAndPrivilegesPolicyDetail.subclass(*args_, **kwargs_)
else:
return RestrictionsAndPrivilegesPolicyDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_PrivilegeDetails(self):
return self.PrivilegeDetails
def set_PrivilegeDetails(self, PrivilegeDetails):
self.PrivilegeDetails = PrivilegeDetails
def add_PrivilegeDetails(self, value):
self.PrivilegeDetails.append(value)
def insert_PrivilegeDetails_at(self, index, value):
self.PrivilegeDetails.insert(index, value)
def replace_PrivilegeDetails_at(self, index, value):
self.PrivilegeDetails[index] = value
def hasContent_(self):
if (
self.PrivilegeDetails
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='RestrictionsAndPrivilegesPolicyDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('RestrictionsAndPrivilegesPolicyDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'RestrictionsAndPrivilegesPolicyDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='RestrictionsAndPrivilegesPolicyDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='RestrictionsAndPrivilegesPolicyDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='RestrictionsAndPrivilegesPolicyDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='RestrictionsAndPrivilegesPolicyDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
for PrivilegeDetails_ in self.PrivilegeDetails:
namespaceprefix_ = self.PrivilegeDetails_nsprefix_ + ':' if (UseCapturedNS_ and self.PrivilegeDetails_nsprefix_) else ''
PrivilegeDetails_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='PrivilegeDetails', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'PrivilegeDetails':
obj_ = EnterprisePrivilegeDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.PrivilegeDetails.append(obj_)
obj_.original_tagname_ = 'PrivilegeDetails'
# end class RestrictionsAndPrivilegesPolicyDetail
# Begins/Ends date pair; string constructor arguments are parsed as ISO dates ('%Y-%m-%d').
class DateRange(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Begins=None, Ends=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
if isinstance(Begins, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(Begins, '%Y-%m-%d').date()
else:
initvalue_ = Begins
self.Begins = initvalue_
self.Begins_nsprefix_ = None
if isinstance(Ends, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(Ends, '%Y-%m-%d').date()
else:
initvalue_ = Ends
self.Ends = initvalue_
self.Ends_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, DateRange)
if subclass is not None:
return subclass(*args_, **kwargs_)
if DateRange.subclass:
return DateRange.subclass(*args_, **kwargs_)
else:
return DateRange(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Begins(self):
return self.Begins
def set_Begins(self, Begins):
self.Begins = Begins
def get_Ends(self):
return self.Ends
def set_Ends(self, Ends):
self.Ends = Ends
def hasContent_(self):
if (
self.Begins is not None or
self.Ends is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='DateRange', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('DateRange')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'DateRange':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='DateRange')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='DateRange', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='DateRange'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='DateRange', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Begins is not None:
namespaceprefix_ = self.Begins_nsprefix_ + ':' if (UseCapturedNS_ and self.Begins_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sBegins>%s</%sBegins>%s' % (namespaceprefix_ , self.gds_format_date(self.Begins, input_name='Begins'), namespaceprefix_ , eol_))
if self.Ends is not None:
namespaceprefix_ = self.Ends_nsprefix_ + ':' if (UseCapturedNS_ and self.Ends_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sEnds>%s</%sEnds>%s' % (namespaceprefix_ , self.gds_format_date(self.Ends, input_name='Ends'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Begins':
sval_ = child_.text
dval_ = self.gds_parse_date(sval_)
self.Begins = dval_
self.Begins_nsprefix_ = child_.prefix
elif nodeName_ == 'Ends':
sval_ = child_.text
dval_ = self.gds_parse_date(sval_)
self.Ends = dval_
self.Ends_nsprefix_ = child_.prefix
# end class DateRange
# Privilege entry: Id, Permission (EnterprisePermissionType), CarrierCode (CarrierCodeType),
# and an optional EffectiveDateRange.
class EnterprisePrivilegeDetail(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Id=None, Permission=None, CarrierCode=None, EffectiveDateRange=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Id = Id
self.Id_nsprefix_ = None
self.Permission = Permission
self.validate_EnterprisePermissionType(self.Permission)
self.Permission_nsprefix_ = None
self.CarrierCode = CarrierCode
self.validate_CarrierCodeType(self.CarrierCode)
self.CarrierCode_nsprefix_ = None
self.EffectiveDateRange = EffectiveDateRange
self.EffectiveDateRange_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, EnterprisePrivilegeDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if EnterprisePrivilegeDetail.subclass:
return EnterprisePrivilegeDetail.subclass(*args_, **kwargs_)
else:
return EnterprisePrivilegeDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Id(self):
return self.Id
def set_Id(self, Id):
self.Id = Id
def get_Permission(self):
return self.Permission
def set_Permission(self, Permission):
self.Permission = Permission
def get_CarrierCode(self):
return self.CarrierCode
def set_CarrierCode(self, CarrierCode):
self.CarrierCode = CarrierCode
def get_EffectiveDateRange(self):
return self.EffectiveDateRange
def set_EffectiveDateRange(self, EffectiveDateRange):
self.EffectiveDateRange = EffectiveDateRange
def validate_EnterprisePermissionType(self, value):
result = True
# Validate type EnterprisePermissionType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['ALLOWED', 'ALLOWED_BY_EXCEPTION', 'DISALLOWED']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on EnterprisePermissionType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_CarrierCodeType(self, value):
result = True
# Validate type CarrierCodeType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['FDXC', 'FDXE', 'FDXG', 'FXCC', 'FXFR', 'FXSP']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on CarrierCodeType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Id is not None or
self.Permission is not None or
self.CarrierCode is not None or
self.EffectiveDateRange is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='EnterprisePrivilegeDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('EnterprisePrivilegeDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'EnterprisePrivilegeDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='EnterprisePrivilegeDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='EnterprisePrivilegeDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='EnterprisePrivilegeDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='EnterprisePrivilegeDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Id is not None:
namespaceprefix_ = self.Id_nsprefix_ + ':' if (UseCapturedNS_ and self.Id_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sId>%s</%sId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Id), input_name='Id')), namespaceprefix_ , eol_))
if self.Permission is not None:
namespaceprefix_ = self.Permission_nsprefix_ + ':' if (UseCapturedNS_ and self.Permission_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sPermission>%s</%sPermission>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Permission), input_name='Permission')), namespaceprefix_ , eol_))
if self.CarrierCode is not None:
namespaceprefix_ = self.CarrierCode_nsprefix_ + ':' if (UseCapturedNS_ and self.CarrierCode_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCarrierCode>%s</%sCarrierCode>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CarrierCode), input_name='CarrierCode')), namespaceprefix_ , eol_))
if self.EffectiveDateRange is not None:
namespaceprefix_ = self.EffectiveDateRange_nsprefix_ + ':' if (UseCapturedNS_ and self.EffectiveDateRange_nsprefix_) else ''
self.EffectiveDateRange.export(outfile, level, namespaceprefix_, namespacedef_='', name_='EffectiveDateRange', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Id':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Id')
value_ = self.gds_validate_string(value_, node, 'Id')
self.Id = value_
self.Id_nsprefix_ = child_.prefix
elif nodeName_ == 'Permission':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Permission')
value_ = self.gds_validate_string(value_, node, 'Permission')
self.Permission = value_
self.Permission_nsprefix_ = child_.prefix
# validate type EnterprisePermissionType
self.validate_EnterprisePermissionType(self.Permission)
elif nodeName_ == 'CarrierCode':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'CarrierCode')
value_ = self.gds_validate_string(value_, node, 'CarrierCode')
self.CarrierCode = value_
self.CarrierCode_nsprefix_ = child_.prefix
# validate type CarrierCodeType
self.validate_CarrierCodeType(self.CarrierCode)
elif nodeName_ == 'EffectiveDateRange':
obj_ = DateRange.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.EffectiveDateRange = obj_
obj_.original_tagname_ = 'EffectiveDateRange'
# end class EnterprisePrivilegeDetail
class SearchLocationsReply(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, HighestSeverity=None, Notifications=None, TransactionDetail=None, Version=None, TotalResultsAvailable=None, ResultsReturned=None, FormattedAddress=None, AddressToLocationRelationships=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.HighestSeverity = HighestSeverity
self.validate_NotificationSeverityType(self.HighestSeverity)
self.HighestSeverity_nsprefix_ = None
if Notifications is None:
self.Notifications = []
else:
self.Notifications = Notifications
self.Notifications_nsprefix_ = None
self.TransactionDetail = TransactionDetail
self.TransactionDetail_nsprefix_ = None
self.Version = Version
self.Version_nsprefix_ = None
self.TotalResultsAvailable = TotalResultsAvailable
self.TotalResultsAvailable_nsprefix_ = None
self.ResultsReturned = ResultsReturned
self.ResultsReturned_nsprefix_ = None
self.FormattedAddress = FormattedAddress
self.FormattedAddress_nsprefix_ = None
if AddressToLocationRelationships is None:
self.AddressToLocationRelationships = []
else:
self.AddressToLocationRelationships = AddressToLocationRelationships
self.AddressToLocationRelationships_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, SearchLocationsReply)
if subclass is not None:
return subclass(*args_, **kwargs_)
if SearchLocationsReply.subclass:
return SearchLocationsReply.subclass(*args_, **kwargs_)
else:
return SearchLocationsReply(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_HighestSeverity(self):
return self.HighestSeverity
def set_HighestSeverity(self, HighestSeverity):
self.HighestSeverity = HighestSeverity
def get_Notifications(self):
return self.Notifications
def set_Notifications(self, Notifications):
self.Notifications = Notifications
def add_Notifications(self, value):
self.Notifications.append(value)
def insert_Notifications_at(self, index, value):
self.Notifications.insert(index, value)
def replace_Notifications_at(self, index, value):
self.Notifications[index] = value
def get_TransactionDetail(self):
return self.TransactionDetail
def set_TransactionDetail(self, TransactionDetail):
self.TransactionDetail = TransactionDetail
def get_Version(self):
return self.Version
def set_Version(self, Version):
self.Version = Version
def get_TotalResultsAvailable(self):
return self.TotalResultsAvailable
def set_TotalResultsAvailable(self, TotalResultsAvailable):
self.TotalResultsAvailable = TotalResultsAvailable
def get_ResultsReturned(self):
return self.ResultsReturned
def set_ResultsReturned(self, ResultsReturned):
self.ResultsReturned = ResultsReturned
def get_FormattedAddress(self):
return self.FormattedAddress
def set_FormattedAddress(self, FormattedAddress):
self.FormattedAddress = FormattedAddress
def get_AddressToLocationRelationships(self):
return self.AddressToLocationRelationships
def set_AddressToLocationRelationships(self, AddressToLocationRelationships):
self.AddressToLocationRelationships = AddressToLocationRelationships
def add_AddressToLocationRelationships(self, value):
self.AddressToLocationRelationships.append(value)
def insert_AddressToLocationRelationships_at(self, index, value):
self.AddressToLocationRelationships.insert(index, value)
def replace_AddressToLocationRelationships_at(self, index, value):
self.AddressToLocationRelationships[index] = value
def validate_NotificationSeverityType(self, value):
result = True
# Validate type NotificationSeverityType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['ERROR', 'FAILURE', 'NOTE', 'SUCCESS', 'WARNING']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on NotificationSeverityType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.HighestSeverity is not None or
self.Notifications or
self.TransactionDetail is not None or
self.Version is not None or
self.TotalResultsAvailable is not None or
self.ResultsReturned is not None or
self.FormattedAddress is not None or
self.AddressToLocationRelationships
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='SearchLocationsReply', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('SearchLocationsReply')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'SearchLocationsReply':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='SearchLocationsReply')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='SearchLocationsReply', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='SearchLocationsReply'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='SearchLocationsReply', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.HighestSeverity is not None:
namespaceprefix_ = self.HighestSeverity_nsprefix_ + ':' if (UseCapturedNS_ and self.HighestSeverity_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sHighestSeverity>%s</%sHighestSeverity>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.HighestSeverity), input_name='HighestSeverity')), namespaceprefix_ , eol_))
for Notifications_ in self.Notifications:
namespaceprefix_ = self.Notifications_nsprefix_ + ':' if (UseCapturedNS_ and self.Notifications_nsprefix_) else ''
Notifications_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Notifications', pretty_print=pretty_print)
if self.TransactionDetail is not None:
namespaceprefix_ = self.TransactionDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.TransactionDetail_nsprefix_) else ''
self.TransactionDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='TransactionDetail', pretty_print=pretty_print)
if self.Version is not None:
namespaceprefix_ = self.Version_nsprefix_ + ':' if (UseCapturedNS_ and self.Version_nsprefix_) else ''
self.Version.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Version', pretty_print=pretty_print)
if self.TotalResultsAvailable is not None:
namespaceprefix_ = self.TotalResultsAvailable_nsprefix_ + ':' if (UseCapturedNS_ and self.TotalResultsAvailable_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTotalResultsAvailable>%s</%sTotalResultsAvailable>%s' % (namespaceprefix_ , self.gds_format_integer(self.TotalResultsAvailable, input_name='TotalResultsAvailable'), namespaceprefix_ , eol_))
if self.ResultsReturned is not None:
namespaceprefix_ = self.ResultsReturned_nsprefix_ + ':' if (UseCapturedNS_ and self.ResultsReturned_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sResultsReturned>%s</%sResultsReturned>%s' % (namespaceprefix_ , self.gds_format_integer(self.ResultsReturned, input_name='ResultsReturned'), namespaceprefix_ , eol_))
if self.FormattedAddress is not None:
namespaceprefix_ = self.FormattedAddress_nsprefix_ + ':' if (UseCapturedNS_ and self.FormattedAddress_nsprefix_) else ''
self.FormattedAddress.export(outfile, level, namespaceprefix_, namespacedef_='', name_='FormattedAddress', pretty_print=pretty_print)
for AddressToLocationRelationships_ in self.AddressToLocationRelationships:
namespaceprefix_ = self.AddressToLocationRelationships_nsprefix_ + ':' if (UseCapturedNS_ and self.AddressToLocationRelationships_nsprefix_) else ''
AddressToLocationRelationships_.export(outfile, level, namespaceprefix_, namespacedef_='', name_='AddressToLocationRelationships', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'HighestSeverity':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'HighestSeverity')
value_ = self.gds_validate_string(value_, node, 'HighestSeverity')
self.HighestSeverity = value_
self.HighestSeverity_nsprefix_ = child_.prefix
# validate type NotificationSeverityType
self.validate_NotificationSeverityType(self.HighestSeverity)
elif nodeName_ == 'Notifications':
obj_ = Notification.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Notifications.append(obj_)
obj_.original_tagname_ = 'Notifications'
elif nodeName_ == 'TransactionDetail':
obj_ = TransactionDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.TransactionDetail = obj_
obj_.original_tagname_ = 'TransactionDetail'
elif nodeName_ == 'Version':
obj_ = VersionId.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Version = obj_
obj_.original_tagname_ = 'Version'
elif nodeName_ == 'TotalResultsAvailable' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'TotalResultsAvailable')
if ival_ < 0:
raise_parse_error(child_, 'requires nonNegativeInteger')
ival_ = self.gds_validate_integer(ival_, node, 'TotalResultsAvailable')
self.TotalResultsAvailable = ival_
self.TotalResultsAvailable_nsprefix_ = child_.prefix
elif nodeName_ == 'ResultsReturned' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'ResultsReturned')
if ival_ < 0:
raise_parse_error(child_, 'requires nonNegativeInteger')
ival_ = self.gds_validate_integer(ival_, node, 'ResultsReturned')
self.ResultsReturned = ival_
self.ResultsReturned_nsprefix_ = child_.prefix
elif nodeName_ == 'FormattedAddress':
obj_ = Address.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.FormattedAddress = obj_
obj_.original_tagname_ = 'FormattedAddress'
elif nodeName_ == 'AddressToLocationRelationships':
obj_ = AddressToLocationRelationshipDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.AddressToLocationRelationships.append(obj_)
obj_.original_tagname_ = 'AddressToLocationRelationships'
# end class SearchLocationsReply
class SearchLocationsRequest(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, WebAuthenticationDetail=None, ClientDetail=None, TransactionDetail=None, Version=None, EffectiveDate=None, LocationsSearchCriterion=None, ShipperAccountNumber=None, UniqueTrackingNumber=None, Address=None, PhoneNumber=None, GeographicCoordinates=None, MultipleMatchesAction=None, SortDetail=None, Constraints=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.WebAuthenticationDetail = WebAuthenticationDetail
self.WebAuthenticationDetail_nsprefix_ = None
self.ClientDetail = ClientDetail
self.ClientDetail_nsprefix_ = None
self.TransactionDetail = TransactionDetail
self.TransactionDetail_nsprefix_ = None
self.Version = Version
self.Version_nsprefix_ = None
if isinstance(EffectiveDate, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(EffectiveDate, '%Y-%m-%d').date()
else:
initvalue_ = EffectiveDate
self.EffectiveDate = initvalue_
self.EffectiveDate_nsprefix_ = None
self.LocationsSearchCriterion = LocationsSearchCriterion
self.validate_LocationsSearchCriteriaType(self.LocationsSearchCriterion)
self.LocationsSearchCriterion_nsprefix_ = None
self.ShipperAccountNumber = ShipperAccountNumber
self.ShipperAccountNumber_nsprefix_ = None
self.UniqueTrackingNumber = UniqueTrackingNumber
self.UniqueTrackingNumber_nsprefix_ = None
self.Address = Address
self.Address_nsprefix_ = None
self.PhoneNumber = PhoneNumber
self.PhoneNumber_nsprefix_ = None
self.GeographicCoordinates = GeographicCoordinates
self.GeographicCoordinates_nsprefix_ = None
self.MultipleMatchesAction = MultipleMatchesAction
self.validate_MultipleMatchesActionType(self.MultipleMatchesAction)
self.MultipleMatchesAction_nsprefix_ = None
self.SortDetail = SortDetail
self.SortDetail_nsprefix_ = None
self.Constraints = Constraints
self.Constraints_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, SearchLocationsRequest)
if subclass is not None:
return subclass(*args_, **kwargs_)
if SearchLocationsRequest.subclass:
return SearchLocationsRequest.subclass(*args_, **kwargs_)
else:
return SearchLocationsRequest(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_WebAuthenticationDetail(self):
return self.WebAuthenticationDetail
def set_WebAuthenticationDetail(self, WebAuthenticationDetail):
self.WebAuthenticationDetail = WebAuthenticationDetail
def get_ClientDetail(self):
return self.ClientDetail
def set_ClientDetail(self, ClientDetail):
self.ClientDetail = ClientDetail
def get_TransactionDetail(self):
return self.TransactionDetail
def set_TransactionDetail(self, TransactionDetail):
self.TransactionDetail = TransactionDetail
def get_Version(self):
return self.Version
def set_Version(self, Version):
self.Version = Version
def get_EffectiveDate(self):
return self.EffectiveDate
def set_EffectiveDate(self, EffectiveDate):
self.EffectiveDate = EffectiveDate
def get_LocationsSearchCriterion(self):
return self.LocationsSearchCriterion
def set_LocationsSearchCriterion(self, LocationsSearchCriterion):
self.LocationsSearchCriterion = LocationsSearchCriterion
def get_ShipperAccountNumber(self):
return self.ShipperAccountNumber
def set_ShipperAccountNumber(self, ShipperAccountNumber):
self.ShipperAccountNumber = ShipperAccountNumber
def get_UniqueTrackingNumber(self):
return self.UniqueTrackingNumber
def set_UniqueTrackingNumber(self, UniqueTrackingNumber):
self.UniqueTrackingNumber = UniqueTrackingNumber
def get_Address(self):
return self.Address
def set_Address(self, Address):
self.Address = Address
def get_PhoneNumber(self):
return self.PhoneNumber
def set_PhoneNumber(self, PhoneNumber):
self.PhoneNumber = PhoneNumber
def get_GeographicCoordinates(self):
return self.GeographicCoordinates
def set_GeographicCoordinates(self, GeographicCoordinates):
self.GeographicCoordinates = GeographicCoordinates
def get_MultipleMatchesAction(self):
return self.MultipleMatchesAction
def set_MultipleMatchesAction(self, MultipleMatchesAction):
self.MultipleMatchesAction = MultipleMatchesAction
def get_SortDetail(self):
return self.SortDetail
def set_SortDetail(self, SortDetail):
self.SortDetail = SortDetail
def get_Constraints(self):
return self.Constraints
def set_Constraints(self, Constraints):
self.Constraints = Constraints
def validate_LocationsSearchCriteriaType(self, value):
result = True
# Validate type LocationsSearchCriteriaType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['ADDRESS', 'GEOGRAPHIC_COORDINATES', 'PHONE_NUMBER']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on LocationsSearchCriteriaType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def validate_MultipleMatchesActionType(self, value):
result = True
# Validate type MultipleMatchesActionType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['RETURN_ALL', 'RETURN_ERROR', 'RETURN_FIRST']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on MultipleMatchesActionType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.WebAuthenticationDetail is not None or
self.ClientDetail is not None or
self.TransactionDetail is not None or
self.Version is not None or
self.EffectiveDate is not None or
self.LocationsSearchCriterion is not None or
self.ShipperAccountNumber is not None or
self.UniqueTrackingNumber is not None or
self.Address is not None or
self.PhoneNumber is not None or
self.GeographicCoordinates is not None or
self.MultipleMatchesAction is not None or
self.SortDetail is not None or
self.Constraints is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='SearchLocationsRequest', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('SearchLocationsRequest')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'SearchLocationsRequest':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='SearchLocationsRequest')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='SearchLocationsRequest', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='SearchLocationsRequest'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='SearchLocationsRequest', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.WebAuthenticationDetail is not None:
namespaceprefix_ = self.WebAuthenticationDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.WebAuthenticationDetail_nsprefix_) else ''
self.WebAuthenticationDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='WebAuthenticationDetail', pretty_print=pretty_print)
if self.ClientDetail is not None:
namespaceprefix_ = self.ClientDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.ClientDetail_nsprefix_) else ''
self.ClientDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ClientDetail', pretty_print=pretty_print)
if self.TransactionDetail is not None:
namespaceprefix_ = self.TransactionDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.TransactionDetail_nsprefix_) else ''
self.TransactionDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='TransactionDetail', pretty_print=pretty_print)
if self.Version is not None:
namespaceprefix_ = self.Version_nsprefix_ + ':' if (UseCapturedNS_ and self.Version_nsprefix_) else ''
self.Version.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Version', pretty_print=pretty_print)
if self.EffectiveDate is not None:
namespaceprefix_ = self.EffectiveDate_nsprefix_ + ':' if (UseCapturedNS_ and self.EffectiveDate_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sEffectiveDate>%s</%sEffectiveDate>%s' % (namespaceprefix_ , self.gds_format_date(self.EffectiveDate, input_name='EffectiveDate'), namespaceprefix_ , eol_))
if self.LocationsSearchCriterion is not None:
namespaceprefix_ = self.LocationsSearchCriterion_nsprefix_ + ':' if (UseCapturedNS_ and self.LocationsSearchCriterion_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sLocationsSearchCriterion>%s</%sLocationsSearchCriterion>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.LocationsSearchCriterion), input_name='LocationsSearchCriterion')), namespaceprefix_ , eol_))
if self.ShipperAccountNumber is not None:
namespaceprefix_ = self.ShipperAccountNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.ShipperAccountNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sShipperAccountNumber>%s</%sShipperAccountNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ShipperAccountNumber), input_name='ShipperAccountNumber')), namespaceprefix_ , eol_))
if self.UniqueTrackingNumber is not None:
namespaceprefix_ = self.UniqueTrackingNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.UniqueTrackingNumber_nsprefix_) else ''
self.UniqueTrackingNumber.export(outfile, level, namespaceprefix_, namespacedef_='', name_='UniqueTrackingNumber', pretty_print=pretty_print)
if self.Address is not None:
namespaceprefix_ = self.Address_nsprefix_ + ':' if (UseCapturedNS_ and self.Address_nsprefix_) else ''
self.Address.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Address', pretty_print=pretty_print)
if self.PhoneNumber is not None:
namespaceprefix_ = self.PhoneNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.PhoneNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sPhoneNumber>%s</%sPhoneNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.PhoneNumber), input_name='PhoneNumber')), namespaceprefix_ , eol_))
if self.GeographicCoordinates is not None:
namespaceprefix_ = self.GeographicCoordinates_nsprefix_ + ':' if (UseCapturedNS_ and self.GeographicCoordinates_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sGeographicCoordinates>%s</%sGeographicCoordinates>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.GeographicCoordinates), input_name='GeographicCoordinates')), namespaceprefix_ , eol_))
if self.MultipleMatchesAction is not None:
namespaceprefix_ = self.MultipleMatchesAction_nsprefix_ + ':' if (UseCapturedNS_ and self.MultipleMatchesAction_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sMultipleMatchesAction>%s</%sMultipleMatchesAction>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.MultipleMatchesAction), input_name='MultipleMatchesAction')), namespaceprefix_ , eol_))
if self.SortDetail is not None:
namespaceprefix_ = self.SortDetail_nsprefix_ + ':' if (UseCapturedNS_ and self.SortDetail_nsprefix_) else ''
self.SortDetail.export(outfile, level, namespaceprefix_, namespacedef_='', name_='SortDetail', pretty_print=pretty_print)
if self.Constraints is not None:
namespaceprefix_ = self.Constraints_nsprefix_ + ':' if (UseCapturedNS_ and self.Constraints_nsprefix_) else ''
self.Constraints.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Constraints', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'WebAuthenticationDetail':
obj_ = WebAuthenticationDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.WebAuthenticationDetail = obj_
obj_.original_tagname_ = 'WebAuthenticationDetail'
elif nodeName_ == 'ClientDetail':
obj_ = ClientDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ClientDetail = obj_
obj_.original_tagname_ = 'ClientDetail'
elif nodeName_ == 'TransactionDetail':
obj_ = TransactionDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.TransactionDetail = obj_
obj_.original_tagname_ = 'TransactionDetail'
elif nodeName_ == 'Version':
obj_ = VersionId.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Version = obj_
obj_.original_tagname_ = 'Version'
elif nodeName_ == 'EffectiveDate':
sval_ = child_.text
dval_ = self.gds_parse_date(sval_)
self.EffectiveDate = dval_
self.EffectiveDate_nsprefix_ = child_.prefix
elif nodeName_ == 'LocationsSearchCriterion':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'LocationsSearchCriterion')
value_ = self.gds_validate_string(value_, node, 'LocationsSearchCriterion')
self.LocationsSearchCriterion = value_
self.LocationsSearchCriterion_nsprefix_ = child_.prefix
# validate type LocationsSearchCriteriaType
self.validate_LocationsSearchCriteriaType(self.LocationsSearchCriterion)
elif nodeName_ == 'ShipperAccountNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ShipperAccountNumber')
value_ = self.gds_validate_string(value_, node, 'ShipperAccountNumber')
self.ShipperAccountNumber = value_
self.ShipperAccountNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'UniqueTrackingNumber':
obj_ = UniqueTrackingNumber.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.UniqueTrackingNumber = obj_
obj_.original_tagname_ = 'UniqueTrackingNumber'
elif nodeName_ == 'Address':
obj_ = Address.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Address = obj_
obj_.original_tagname_ = 'Address'
elif nodeName_ == 'PhoneNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'PhoneNumber')
value_ = self.gds_validate_string(value_, node, 'PhoneNumber')
self.PhoneNumber = value_
self.PhoneNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'GeographicCoordinates':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'GeographicCoordinates')
value_ = self.gds_validate_string(value_, node, 'GeographicCoordinates')
self.GeographicCoordinates = value_
self.GeographicCoordinates_nsprefix_ = child_.prefix
elif nodeName_ == 'MultipleMatchesAction':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'MultipleMatchesAction')
value_ = self.gds_validate_string(value_, node, 'MultipleMatchesAction')
self.MultipleMatchesAction = value_
self.MultipleMatchesAction_nsprefix_ = child_.prefix
# validate type MultipleMatchesActionType
self.validate_MultipleMatchesActionType(self.MultipleMatchesAction)
elif nodeName_ == 'SortDetail':
obj_ = LocationSortDetail.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.SortDetail = obj_
obj_.original_tagname_ = 'SortDetail'
elif nodeName_ == 'Constraints':
obj_ = SearchLocationConstraints.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Constraints = obj_
obj_.original_tagname_ = 'Constraints'
# end class SearchLocationsRequest
class ShippingHoliday(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Holiday=None, UnavailableActions=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Holiday = Holiday
self.Holiday_nsprefix_ = None
if UnavailableActions is None:
self.UnavailableActions = []
else:
self.UnavailableActions = UnavailableActions
self.UnavailableActions_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, ShippingHoliday)
if subclass is not None:
return subclass(*args_, **kwargs_)
if ShippingHoliday.subclass:
return ShippingHoliday.subclass(*args_, **kwargs_)
else:
return ShippingHoliday(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Holiday(self):
return self.Holiday
def set_Holiday(self, Holiday):
self.Holiday = Holiday
def get_UnavailableActions(self):
return self.UnavailableActions
def set_UnavailableActions(self, UnavailableActions):
self.UnavailableActions = UnavailableActions
def add_UnavailableActions(self, value):
self.UnavailableActions.append(value)
def insert_UnavailableActions_at(self, index, value):
self.UnavailableActions.insert(index, value)
def replace_UnavailableActions_at(self, index, value):
self.UnavailableActions[index] = value
def validate_ShippingActionType(self, value):
result = True
# Validate type ShippingActionType, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['DELIVERIES', 'PICKUPS']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on ShippingActionType' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Holiday is not None or
self.UnavailableActions
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ShippingHoliday', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('ShippingHoliday')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'ShippingHoliday':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='ShippingHoliday')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='ShippingHoliday', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='ShippingHoliday'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='ShippingHoliday', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Holiday is not None:
namespaceprefix_ = self.Holiday_nsprefix_ + ':' if (UseCapturedNS_ and self.Holiday_nsprefix_) else ''
self.Holiday.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Holiday', pretty_print=pretty_print)
for UnavailableActions_ in self.UnavailableActions:
namespaceprefix_ = self.UnavailableActions_nsprefix_ + ':' if (UseCapturedNS_ and self.UnavailableActions_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sUnavailableActions>%s</%sUnavailableActions>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(UnavailableActions_), input_name='UnavailableActions')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Holiday':
obj_ = Holiday.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Holiday = obj_
obj_.original_tagname_ = 'Holiday'
elif nodeName_ == 'UnavailableActions':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'UnavailableActions')
value_ = self.gds_validate_string(value_, node, 'UnavailableActions')
self.UnavailableActions.append(value_)
self.UnavailableActions_nsprefix_ = child_.prefix
# validate type ShippingActionType
self.validate_ShippingActionType(self.UnavailableActions[-1])
# end class ShippingHoliday
class TimeRange(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Begins=None, Ends=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
if isinstance(Begins, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(Begins, '%H:%M:%S').time()
else:
initvalue_ = Begins
self.Begins = initvalue_
self.Begins_nsprefix_ = None
if isinstance(Ends, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(Ends, '%H:%M:%S').time()
else:
initvalue_ = Ends
self.Ends = initvalue_
self.Ends_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, TimeRange)
if subclass is not None:
return subclass(*args_, **kwargs_)
if TimeRange.subclass:
return TimeRange.subclass(*args_, **kwargs_)
else:
return TimeRange(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Begins(self):
return self.Begins
def set_Begins(self, Begins):
self.Begins = Begins
def get_Ends(self):
return self.Ends
def set_Ends(self, Ends):
self.Ends = Ends
def hasContent_(self):
if (
self.Begins is not None or
self.Ends is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='TimeRange', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('TimeRange')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'TimeRange':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='TimeRange')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='TimeRange', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='TimeRange'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='TimeRange', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Begins is not None:
namespaceprefix_ = self.Begins_nsprefix_ + ':' if (UseCapturedNS_ and self.Begins_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sBegins>%s</%sBegins>%s' % (namespaceprefix_ , self.gds_format_time(self.Begins, input_name='Begins'), namespaceprefix_ , eol_))
if self.Ends is not None:
namespaceprefix_ = self.Ends_nsprefix_ + ':' if (UseCapturedNS_ and self.Ends_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sEnds>%s</%sEnds>%s' % (namespaceprefix_ , self.gds_format_time(self.Ends, input_name='Ends'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Begins':
sval_ = child_.text
dval_ = self.gds_parse_time(sval_)
self.Begins = dval_
self.Begins_nsprefix_ = child_.prefix
elif nodeName_ == 'Ends':
sval_ = child_.text
dval_ = self.gds_parse_time(sval_)
self.Ends = dval_
self.Ends_nsprefix_ = child_.prefix
# end class TimeRange
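# Illustrative sketch (editorial addition, not part of the generated bindings):
# TimeRange.__init__ above accepts either datetime.time objects or strings; a
# string Begins/Ends value is parsed with the '%H:%M:%S' format. The stand-in
# helper below mirrors that parsing without depending on the generated classes.
def _example_parse_time_range(begins='08:30:00', ends='17:00:00'):
    import datetime
    fmt = '%H:%M:%S'  # same format string used by TimeRange.__init__
    return (datetime.datetime.strptime(begins, fmt).time(),
            datetime.datetime.strptime(ends, fmt).time())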
class TransactionDetail(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, CustomerTransactionId=None, Localization=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.CustomerTransactionId = CustomerTransactionId
self.CustomerTransactionId_nsprefix_ = None
self.Localization = Localization
self.Localization_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, TransactionDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if TransactionDetail.subclass:
return TransactionDetail.subclass(*args_, **kwargs_)
else:
return TransactionDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_CustomerTransactionId(self):
return self.CustomerTransactionId
def set_CustomerTransactionId(self, CustomerTransactionId):
self.CustomerTransactionId = CustomerTransactionId
def get_Localization(self):
return self.Localization
def set_Localization(self, Localization):
self.Localization = Localization
def hasContent_(self):
if (
self.CustomerTransactionId is not None or
self.Localization is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='TransactionDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('TransactionDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'TransactionDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='TransactionDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='TransactionDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='TransactionDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='TransactionDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.CustomerTransactionId is not None:
namespaceprefix_ = self.CustomerTransactionId_nsprefix_ + ':' if (UseCapturedNS_ and self.CustomerTransactionId_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sCustomerTransactionId>%s</%sCustomerTransactionId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.CustomerTransactionId), input_name='CustomerTransactionId')), namespaceprefix_ , eol_))
if self.Localization is not None:
namespaceprefix_ = self.Localization_nsprefix_ + ':' if (UseCapturedNS_ and self.Localization_nsprefix_) else ''
self.Localization.export(outfile, level, namespaceprefix_, namespacedef_='', name_='Localization', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'CustomerTransactionId':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'CustomerTransactionId')
value_ = self.gds_validate_string(value_, node, 'CustomerTransactionId')
self.CustomerTransactionId = value_
self.CustomerTransactionId_nsprefix_ = child_.prefix
elif nodeName_ == 'Localization':
obj_ = Localization.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.Localization = obj_
obj_.original_tagname_ = 'Localization'
# end class TransactionDetail
class UniqueTrackingNumber(GeneratedsSuper):
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, TrackingNumber=None, TrackingNumberUniqueIdentifier=None, ShipDate=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.TrackingNumber = TrackingNumber
self.TrackingNumber_nsprefix_ = None
self.TrackingNumberUniqueIdentifier = TrackingNumberUniqueIdentifier
self.TrackingNumberUniqueIdentifier_nsprefix_ = None
if isinstance(ShipDate, BaseStrType_):
initvalue_ = datetime_.datetime.strptime(ShipDate, '%Y-%m-%d').date()
else:
initvalue_ = ShipDate
self.ShipDate = initvalue_
self.ShipDate_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, UniqueTrackingNumber)
if subclass is not None:
return subclass(*args_, **kwargs_)
if UniqueTrackingNumber.subclass:
return UniqueTrackingNumber.subclass(*args_, **kwargs_)
else:
return UniqueTrackingNumber(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_TrackingNumber(self):
return self.TrackingNumber
def set_TrackingNumber(self, TrackingNumber):
self.TrackingNumber = TrackingNumber
def get_TrackingNumberUniqueIdentifier(self):
return self.TrackingNumberUniqueIdentifier
def set_TrackingNumberUniqueIdentifier(self, TrackingNumberUniqueIdentifier):
self.TrackingNumberUniqueIdentifier = TrackingNumberUniqueIdentifier
def get_ShipDate(self):
return self.ShipDate
def set_ShipDate(self, ShipDate):
self.ShipDate = ShipDate
def hasContent_(self):
if (
self.TrackingNumber is not None or
self.TrackingNumberUniqueIdentifier is not None or
self.ShipDate is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='UniqueTrackingNumber', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('UniqueTrackingNumber')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'UniqueTrackingNumber':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='UniqueTrackingNumber')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='UniqueTrackingNumber', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='UniqueTrackingNumber'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='UniqueTrackingNumber', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.TrackingNumber is not None:
namespaceprefix_ = self.TrackingNumber_nsprefix_ + ':' if (UseCapturedNS_ and self.TrackingNumber_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTrackingNumber>%s</%sTrackingNumber>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.TrackingNumber), input_name='TrackingNumber')), namespaceprefix_ , eol_))
if self.TrackingNumberUniqueIdentifier is not None:
namespaceprefix_ = self.TrackingNumberUniqueIdentifier_nsprefix_ + ':' if (UseCapturedNS_ and self.TrackingNumberUniqueIdentifier_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sTrackingNumberUniqueIdentifier>%s</%sTrackingNumberUniqueIdentifier>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.TrackingNumberUniqueIdentifier), input_name='TrackingNumberUniqueIdentifier')), namespaceprefix_ , eol_))
if self.ShipDate is not None:
namespaceprefix_ = self.ShipDate_nsprefix_ + ':' if (UseCapturedNS_ and self.ShipDate_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sShipDate>%s</%sShipDate>%s' % (namespaceprefix_ , self.gds_format_date(self.ShipDate, input_name='ShipDate'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'TrackingNumber':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'TrackingNumber')
value_ = self.gds_validate_string(value_, node, 'TrackingNumber')
self.TrackingNumber = value_
self.TrackingNumber_nsprefix_ = child_.prefix
elif nodeName_ == 'TrackingNumberUniqueIdentifier':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'TrackingNumberUniqueIdentifier')
value_ = self.gds_validate_string(value_, node, 'TrackingNumberUniqueIdentifier')
self.TrackingNumberUniqueIdentifier = value_
self.TrackingNumberUniqueIdentifier_nsprefix_ = child_.prefix
elif nodeName_ == 'ShipDate':
sval_ = child_.text
dval_ = self.gds_parse_date(sval_)
self.ShipDate = dval_
self.ShipDate_nsprefix_ = child_.prefix
# end class UniqueTrackingNumber
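# Illustrative sketch (editorial addition, not part of the generated bindings):
# UniqueTrackingNumber.buildChildren above reads the text of a ShipDate child
# element and parses it as an ISO date ('%Y-%m-%d'). This hypothetical helper
# shows the same extraction using only the standard library.
def _example_parse_ship_date(
        xml_text='<UniqueTrackingNumber><ShipDate>2021-06-01</ShipDate>'
                 '</UniqueTrackingNumber>'):
    import datetime
    import xml.etree.ElementTree as ET
    node = ET.fromstring(xml_text)
    sval = node.find('ShipDate').text  # child text, as in buildChildren
    return datetime.datetime.strptime(sval, '%Y-%m-%d').date()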
class Weight(GeneratedsSuper):
"""The descriptive data for the heaviness of an object."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Units=None, Value=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Units = Units
self.validate_WeightUnits(self.Units)
self.Units_nsprefix_ = None
self.Value = Value
self.Value_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, Weight)
if subclass is not None:
return subclass(*args_, **kwargs_)
if Weight.subclass:
return Weight.subclass(*args_, **kwargs_)
else:
return Weight(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Units(self):
return self.Units
def set_Units(self, Units):
self.Units = Units
def get_Value(self):
return self.Value
def set_Value(self, Value):
self.Value = Value
def validate_WeightUnits(self, value):
result = True
# Validate type WeightUnits, a restriction on xs:string.
if value is not None and Validate_simpletypes_ and self.gds_collector_ is not None:
if not isinstance(value, str):
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s is not of the correct base simple type (str)' % {"value": value, "lineno": lineno, })
return False
enumerations = ['KG', 'LB']
if value not in enumerations:
lineno = self.gds_get_node_lineno_()
self.gds_collector_.add_message('Value "%(value)s"%(lineno)s does not match xsd enumeration restriction on WeightUnits' % {"value" : encode_str_2_3(value), "lineno": lineno} )
result = False
return result
def hasContent_(self):
if (
self.Units is not None or
self.Value is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Weight', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('Weight')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'Weight':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='Weight')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='Weight', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='Weight'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='Weight', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Units is not None:
namespaceprefix_ = self.Units_nsprefix_ + ':' if (UseCapturedNS_ and self.Units_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sUnits>%s</%sUnits>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Units), input_name='Units')), namespaceprefix_ , eol_))
if self.Value is not None:
namespaceprefix_ = self.Value_nsprefix_ + ':' if (UseCapturedNS_ and self.Value_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sValue>%s</%sValue>%s' % (namespaceprefix_ , self.gds_format_decimal(self.Value, input_name='Value'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Units':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Units')
value_ = self.gds_validate_string(value_, node, 'Units')
self.Units = value_
self.Units_nsprefix_ = child_.prefix
# validate type WeightUnits
self.validate_WeightUnits(self.Units)
elif nodeName_ == 'Value' and child_.text:
sval_ = child_.text
fval_ = self.gds_parse_decimal(sval_, node, 'Value')
fval_ = self.gds_validate_decimal(fval_, node, 'Value')
self.Value = fval_
self.Value_nsprefix_ = child_.prefix
# end class Weight
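# Illustrative sketch (editorial addition, not part of the generated bindings):
# validate_WeightUnits above enforces the xsd:enumeration restriction on the
# WeightUnits simple type; a value is valid only if it is a string equal to
# one of the enumeration members. This stand-in captures just that check,
# without the gds_collector_ error-reporting machinery.
def _example_weight_units_valid(value):
    enumerations = ['KG', 'LB']  # same members as validate_WeightUnits
    return isinstance(value, str) and value in enumerations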
class WebAuthenticationDetail(GeneratedsSuper):
"""Used in authentication of the sender's identity."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, ParentCredential=None, UserCredential=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.ParentCredential = ParentCredential
self.ParentCredential_nsprefix_ = None
self.UserCredential = UserCredential
self.UserCredential_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, WebAuthenticationDetail)
if subclass is not None:
return subclass(*args_, **kwargs_)
if WebAuthenticationDetail.subclass:
return WebAuthenticationDetail.subclass(*args_, **kwargs_)
else:
return WebAuthenticationDetail(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_ParentCredential(self):
return self.ParentCredential
def set_ParentCredential(self, ParentCredential):
self.ParentCredential = ParentCredential
def get_UserCredential(self):
return self.UserCredential
def set_UserCredential(self, UserCredential):
self.UserCredential = UserCredential
def hasContent_(self):
if (
self.ParentCredential is not None or
self.UserCredential is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='WebAuthenticationDetail', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('WebAuthenticationDetail')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'WebAuthenticationDetail':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='WebAuthenticationDetail')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='WebAuthenticationDetail', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='WebAuthenticationDetail'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='WebAuthenticationDetail', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.ParentCredential is not None:
namespaceprefix_ = self.ParentCredential_nsprefix_ + ':' if (UseCapturedNS_ and self.ParentCredential_nsprefix_) else ''
self.ParentCredential.export(outfile, level, namespaceprefix_, namespacedef_='', name_='ParentCredential', pretty_print=pretty_print)
if self.UserCredential is not None:
namespaceprefix_ = self.UserCredential_nsprefix_ + ':' if (UseCapturedNS_ and self.UserCredential_nsprefix_) else ''
self.UserCredential.export(outfile, level, namespaceprefix_, namespacedef_='', name_='UserCredential', pretty_print=pretty_print)
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'ParentCredential':
obj_ = WebAuthenticationCredential.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.ParentCredential = obj_
obj_.original_tagname_ = 'ParentCredential'
elif nodeName_ == 'UserCredential':
obj_ = WebAuthenticationCredential.factory(parent_object_=self)
obj_.build(child_, gds_collector_=gds_collector_)
self.UserCredential = obj_
obj_.original_tagname_ = 'UserCredential'
# end class WebAuthenticationDetail
class WebAuthenticationCredential(GeneratedsSuper):
"""Two part authentication string used for the sender's identity"""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, Key=None, Password=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.Key = Key
self.Key_nsprefix_ = None
self.Password = Password
self.Password_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, WebAuthenticationCredential)
if subclass is not None:
return subclass(*args_, **kwargs_)
if WebAuthenticationCredential.subclass:
return WebAuthenticationCredential.subclass(*args_, **kwargs_)
else:
return WebAuthenticationCredential(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_Key(self):
return self.Key
def set_Key(self, Key):
self.Key = Key
def get_Password(self):
return self.Password
def set_Password(self, Password):
self.Password = Password
def hasContent_(self):
if (
self.Key is not None or
self.Password is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='WebAuthenticationCredential', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('WebAuthenticationCredential')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'WebAuthenticationCredential':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='WebAuthenticationCredential')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='WebAuthenticationCredential', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='WebAuthenticationCredential'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='WebAuthenticationCredential', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.Key is not None:
namespaceprefix_ = self.Key_nsprefix_ + ':' if (UseCapturedNS_ and self.Key_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sKey>%s</%sKey>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Key), input_name='Key')), namespaceprefix_ , eol_))
if self.Password is not None:
namespaceprefix_ = self.Password_nsprefix_ + ':' if (UseCapturedNS_ and self.Password_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sPassword>%s</%sPassword>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.Password), input_name='Password')), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'Key':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Key')
value_ = self.gds_validate_string(value_, node, 'Key')
self.Key = value_
self.Key_nsprefix_ = child_.prefix
elif nodeName_ == 'Password':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'Password')
value_ = self.gds_validate_string(value_, node, 'Password')
self.Password = value_
self.Password_nsprefix_ = child_.prefix
# end class WebAuthenticationCredential
class VersionId(GeneratedsSuper):
"""Identifies the version/level of a service operation expected by a caller
(in each request) and performed by the callee (in each reply)."""
__hash__ = GeneratedsSuper.__hash__
subclass = None
superclass = None
def __init__(self, ServiceId=None, Major=None, Intermediate=None, Minor=None, gds_collector_=None, **kwargs_):
self.gds_collector_ = gds_collector_
self.gds_elementtree_node_ = None
self.original_tagname_ = None
self.parent_object_ = kwargs_.get('parent_object_')
self.ns_prefix_ = None
self.ServiceId = ServiceId
self.ServiceId_nsprefix_ = None
self.Major = Major
self.Major_nsprefix_ = None
self.Intermediate = Intermediate
self.Intermediate_nsprefix_ = None
self.Minor = Minor
self.Minor_nsprefix_ = None
def factory(*args_, **kwargs_):
if CurrentSubclassModule_ is not None:
subclass = getSubclassFromModule_(
CurrentSubclassModule_, VersionId)
if subclass is not None:
return subclass(*args_, **kwargs_)
if VersionId.subclass:
return VersionId.subclass(*args_, **kwargs_)
else:
return VersionId(*args_, **kwargs_)
factory = staticmethod(factory)
def get_ns_prefix_(self):
return self.ns_prefix_
def set_ns_prefix_(self, ns_prefix):
self.ns_prefix_ = ns_prefix
def get_ServiceId(self):
return self.ServiceId
def set_ServiceId(self, ServiceId):
self.ServiceId = ServiceId
def get_Major(self):
return self.Major
def set_Major(self, Major):
self.Major = Major
def get_Intermediate(self):
return self.Intermediate
def set_Intermediate(self, Intermediate):
self.Intermediate = Intermediate
def get_Minor(self):
return self.Minor
def set_Minor(self, Minor):
self.Minor = Minor
def hasContent_(self):
if (
self.ServiceId is not None or
self.Major is not None or
self.Intermediate is not None or
self.Minor is not None
):
return True
else:
return False
def export(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='VersionId', pretty_print=True):
imported_ns_def_ = GenerateDSNamespaceDefs_.get('VersionId')
if imported_ns_def_ is not None:
namespacedef_ = imported_ns_def_
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.original_tagname_ is not None and name_ == 'VersionId':
name_ = self.original_tagname_
if UseCapturedNS_ and self.ns_prefix_:
namespaceprefix_ = self.ns_prefix_ + ':'
showIndent(outfile, level, pretty_print)
outfile.write('<%s%s%s' % (namespaceprefix_, name_, namespacedef_ and ' ' + namespacedef_ or '', ))
already_processed = set()
self.exportAttributes(outfile, level, already_processed, namespaceprefix_, name_='VersionId')
if self.hasContent_():
outfile.write('>%s' % (eol_, ))
self.exportChildren(outfile, level + 1, namespaceprefix_, namespacedef_, name_='VersionId', pretty_print=pretty_print)
showIndent(outfile, level, pretty_print)
outfile.write('</%s%s>%s' % (namespaceprefix_, name_, eol_))
else:
outfile.write('/>%s' % (eol_, ))
def exportAttributes(self, outfile, level, already_processed, namespaceprefix_='', name_='VersionId'):
pass
def exportChildren(self, outfile, level, namespaceprefix_='', namespacedef_='', name_='VersionId', fromsubclass_=False, pretty_print=True):
if pretty_print:
eol_ = '\n'
else:
eol_ = ''
if self.ServiceId is not None:
namespaceprefix_ = self.ServiceId_nsprefix_ + ':' if (UseCapturedNS_ and self.ServiceId_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sServiceId>%s</%sServiceId>%s' % (namespaceprefix_ , self.gds_encode(self.gds_format_string(quote_xml(self.ServiceId), input_name='ServiceId')), namespaceprefix_ , eol_))
if self.Major is not None:
namespaceprefix_ = self.Major_nsprefix_ + ':' if (UseCapturedNS_ and self.Major_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sMajor>%s</%sMajor>%s' % (namespaceprefix_ , self.gds_format_integer(self.Major, input_name='Major'), namespaceprefix_ , eol_))
if self.Intermediate is not None:
namespaceprefix_ = self.Intermediate_nsprefix_ + ':' if (UseCapturedNS_ and self.Intermediate_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sIntermediate>%s</%sIntermediate>%s' % (namespaceprefix_ , self.gds_format_integer(self.Intermediate, input_name='Intermediate'), namespaceprefix_ , eol_))
if self.Minor is not None:
namespaceprefix_ = self.Minor_nsprefix_ + ':' if (UseCapturedNS_ and self.Minor_nsprefix_) else ''
showIndent(outfile, level, pretty_print)
outfile.write('<%sMinor>%s</%sMinor>%s' % (namespaceprefix_ , self.gds_format_integer(self.Minor, input_name='Minor'), namespaceprefix_ , eol_))
def build(self, node, gds_collector_=None):
self.gds_collector_ = gds_collector_
if SaveElementTreeNode:
self.gds_elementtree_node_ = node
already_processed = set()
self.ns_prefix_ = node.prefix
self.buildAttributes(node, node.attrib, already_processed)
for child in node:
nodeName_ = Tag_pattern_.match(child.tag).groups()[-1]
self.buildChildren(child, node, nodeName_, gds_collector_=gds_collector_)
return self
def buildAttributes(self, node, attrs, already_processed):
pass
def buildChildren(self, child_, node, nodeName_, fromsubclass_=False, gds_collector_=None):
if nodeName_ == 'ServiceId':
value_ = child_.text
value_ = self.gds_parse_string(value_, node, 'ServiceId')
value_ = self.gds_validate_string(value_, node, 'ServiceId')
self.ServiceId = value_
self.ServiceId_nsprefix_ = child_.prefix
elif nodeName_ == 'Major' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'Major')
ival_ = self.gds_validate_integer(ival_, node, 'Major')
self.Major = ival_
self.Major_nsprefix_ = child_.prefix
elif nodeName_ == 'Intermediate' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'Intermediate')
ival_ = self.gds_validate_integer(ival_, node, 'Intermediate')
self.Intermediate = ival_
self.Intermediate_nsprefix_ = child_.prefix
elif nodeName_ == 'Minor' and child_.text:
sval_ = child_.text
ival_ = self.gds_parse_integer(sval_, node, 'Minor')
ival_ = self.gds_validate_integer(ival_, node, 'Minor')
self.Minor = ival_
self.Minor_nsprefix_ = child_.prefix
# end class VersionId
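The generated `build()`/`buildChildren()` methods above walk each child element of a node and record its text. A standard-library-only sketch of that walk (using `xml.etree.ElementTree` rather than the generated classes, so none of the generateDS helpers are assumed):

```python
import xml.etree.ElementTree as ET

# Visit each child element of a <VersionId> node and record its text content,
# mirroring what VersionId.build()/buildChildren() do with the generated fields.
node = ET.fromstring('<VersionId><ServiceId>locs</ServiceId><Major>11</Major></VersionId>')
fields = {child.tag: child.text for child in node}
print(fields)
```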
GDSClassesMapping = {
'SearchLocationsReply': SearchLocationsReply,
'SearchLocationsRequest': SearchLocationsRequest,
'ValidateLocationAvailabilityRequest': ValidateLocationAvailabilityRequest,
}
USAGE_TEXT = """
Usage: python <Parser>.py [ -s ] <in_xml_file>
"""
def usage():
print(USAGE_TEXT)
sys.exit(1)
def get_root_tag(node):
tag = Tag_pattern_.match(node.tag).groups()[-1]
rootClass = GDSClassesMapping.get(tag)
if rootClass is None:
rootClass = globals().get(tag)
return tag, rootClass
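`get_root_tag` relies on `Tag_pattern_` to strip the XML namespace from a tag like `{uri}LocalName` before looking up the matching class. A hypothetical re-creation of that regex (the real `Tag_pattern_` is defined elsewhere in the generated module, so this is an assumption about its shape):

```python
import re

# Split an lxml-style tag "{namespace-uri}LocalName" and keep only the local name,
# as get_root_tag does via Tag_pattern_.match(node.tag).groups()[-1].
Tag_pattern_demo = re.compile(r'({.*})?(.*)')

def local_name(tag):
    return Tag_pattern_demo.match(tag).groups()[-1]

print(local_name('{http://fedex.com/ws/locs/v11}SearchLocationsReply'))
print(local_name('VersionId'))
```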
def get_required_ns_prefix_defs(rootNode):
'''Get all name space prefix definitions required in this XML doc.
Return a dictionary of definitions and a char string of definitions.
'''
nsmap = {
prefix: uri
for node in rootNode.iter()
for (prefix, uri) in node.nsmap.items()
if prefix is not None
}
namespacedefs = ' '.join([
'xmlns:{}="{}"'.format(prefix, uri)
for prefix, uri in nsmap.items()
])
return nsmap, namespacedefs
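The join at the end of `get_required_ns_prefix_defs` turns the prefix-to-URI map into the `xmlns:` attribute string that `export()` later writes on the root element. A self-contained sketch with a hand-written `nsmap` in place of lxml's `node.nsmap`:

```python
# Turn a prefix -> URI mapping into a single xmlns attribute string,
# the same formatting step used by get_required_ns_prefix_defs above.
nsmap = {'ns': 'http://fedex.com/ws/locs/v11'}
namespacedefs = ' '.join('xmlns:{}="{}"'.format(prefix, uri) for prefix, uri in nsmap.items())
print(namespacedefs)
```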
def parse(inFileName, silence=False, print_warnings=True):
global CapturedNsmap_
gds_collector = GdsCollector_()
parser = None
doc = parsexml_(inFileName, parser)
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'SearchLocationsReply'
rootClass = SearchLocationsReply
rootObj = rootClass.factory()
rootObj.build(rootNode, gds_collector_=gds_collector)
CapturedNsmap_, namespacedefs = get_required_ns_prefix_defs(rootNode)
if not SaveElementTreeNode:
doc = None
rootNode = None
if not silence:
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(
sys.stdout, 0, name_=rootTag,
namespacedef_=namespacedefs,
pretty_print=True)
if print_warnings and len(gds_collector.get_messages()) > 0:
separator = ('-' * 50) + '\n'
sys.stderr.write(separator)
sys.stderr.write('----- Warnings -- count: {} -----\n'.format(
len(gds_collector.get_messages()), ))
gds_collector.write_messages(sys.stderr)
sys.stderr.write(separator)
return rootObj
def parseEtree(inFileName, silence=False, print_warnings=True):
parser = None
doc = parsexml_(inFileName, parser)
gds_collector = GdsCollector_()
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'SearchLocationsReply'
rootClass = SearchLocationsReply
rootObj = rootClass.factory()
rootObj.build(rootNode, gds_collector_=gds_collector)
# Enable Python to collect the space used by the DOM.
mapping = {}
rootElement = rootObj.to_etree(None, name_=rootTag, mapping_=mapping)
reverse_mapping = rootObj.gds_reverse_node_mapping(mapping)
if not SaveElementTreeNode:
doc = None
rootNode = None
if not silence:
content = etree_.tostring(
rootElement, pretty_print=True,
xml_declaration=True, encoding="utf-8")
sys.stdout.write(str(content))
sys.stdout.write('\n')
if print_warnings and len(gds_collector.get_messages()) > 0:
separator = ('-' * 50) + '\n'
sys.stderr.write(separator)
sys.stderr.write('----- Warnings -- count: {} -----\n'.format(
len(gds_collector.get_messages()), ))
gds_collector.write_messages(sys.stderr)
sys.stderr.write(separator)
return rootObj, rootElement, mapping, reverse_mapping
def parseString(inString, silence=False, print_warnings=True):
'''Parse a string, create the object tree, and export it.
Arguments:
- inString -- A string. This XML fragment should not start
with an XML declaration containing an encoding.
- silence -- A boolean. If False, export the object.
Returns -- The root object in the tree.
'''
parser = None
    rootNode = parsexmlstring_(inString, parser)
gds_collector = GdsCollector_()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'SearchLocationsReply'
rootClass = SearchLocationsReply
rootObj = rootClass.factory()
rootObj.build(rootNode, gds_collector_=gds_collector)
if not SaveElementTreeNode:
rootNode = None
if not silence:
sys.stdout.write('<?xml version="1.0" ?>\n')
rootObj.export(
sys.stdout, 0, name_=rootTag,
namespacedef_='xmlns:ns="http://fedex.com/ws/locs/v11"')
if print_warnings and len(gds_collector.get_messages()) > 0:
separator = ('-' * 50) + '\n'
sys.stderr.write(separator)
sys.stderr.write('----- Warnings -- count: {} -----\n'.format(
len(gds_collector.get_messages()), ))
gds_collector.write_messages(sys.stderr)
sys.stderr.write(separator)
return rootObj
def parseLiteral(inFileName, silence=False, print_warnings=True):
parser = None
doc = parsexml_(inFileName, parser)
gds_collector = GdsCollector_()
rootNode = doc.getroot()
rootTag, rootClass = get_root_tag(rootNode)
if rootClass is None:
rootTag = 'SearchLocationsReply'
rootClass = SearchLocationsReply
rootObj = rootClass.factory()
rootObj.build(rootNode, gds_collector_=gds_collector)
# Enable Python to collect the space used by the DOM.
if not SaveElementTreeNode:
doc = None
rootNode = None
if not silence:
sys.stdout.write('#from location_service_v11 import *\n\n')
sys.stdout.write('import location_service_v11 as model_\n\n')
sys.stdout.write('rootObj = model_.rootClass(\n')
rootObj.exportLiteral(sys.stdout, 0, name_=rootTag)
sys.stdout.write(')\n')
if print_warnings and len(gds_collector.get_messages()) > 0:
separator = ('-' * 50) + '\n'
sys.stderr.write(separator)
sys.stderr.write('----- Warnings -- count: {} -----\n'.format(
len(gds_collector.get_messages()), ))
gds_collector.write_messages(sys.stderr)
sys.stderr.write(separator)
return rootObj
def main():
args = sys.argv[1:]
if len(args) == 1:
parse(args[0])
else:
usage()
if __name__ == '__main__':
#import pdb; pdb.set_trace()
main()
RenameMappings_ = {
}
__all__ = [
"Address",
"AddressAncillaryDetail",
"AddressToLocationRelationshipDetail",
"CarrierDetail",
"ClearanceCountryDetail",
"ClearanceLocationDetail",
"ClientDetail",
"Contact",
"DateRange",
"Dimensions",
"Distance",
"DistanceAndLocationDetail",
"EnterprisePrivilegeDetail",
"Holiday",
"LatestDropOffDetail",
"LatestDropoffOverlayDetail",
"Localization",
"LocationCapabilityDetail",
"LocationContactAndAddress",
"LocationDetail",
"LocationHours",
"LocationIdentificationDetail",
"LocationPackageLimitsDetail",
"LocationSortDetail",
"LocationSupportedPackageDetail",
"LocationSupportedShipmentDetail",
"Notification",
"NotificationParameter",
"ReservationAvailabilityDetail",
"RestrictionsAndPrivilegesPolicyDetail",
"SearchLocationConstraints",
"SearchLocationsReply",
"SearchLocationsRequest",
"ShippingHoliday",
"TimeRange",
"TransactionDetail",
"UniqueTrackingNumber",
"ValidateLocationAvailabilityRequest",
"VersionId",
"WebAuthenticationCredential",
"WebAuthenticationDetail",
"Weight"
]
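The `__all__` list above controls which names `from location_service_v11 import *` exposes. A small demonstration of that mechanism using a synthetic module (the names here are placeholders, not the real generated classes):

```python
import types

# "from module import *" copies only the names listed in the module's __all__;
# here _internal stays private even though it exists on the module.
mod = types.ModuleType('demo')
mod.Address = object()
mod._internal = object()
mod.__all__ = ['Address']
star_imported = {name: getattr(mod, name) for name in mod.__all__}
print(sorted(star_imported))
```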
# ---------------------------------------------------------------------------
# File: src/beanmachine/applications/hme/interface.py
# Repo: horizon-blue/beanmachine-1 (MIT license)
# ---------------------------------------------------------------------------
# Copyright (c) Meta Platforms, Inc. and affiliates.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
from typing import Tuple
import pandas as pd
from .configs import InferConfig, ModelConfig
from .null_mixture_model import NullMixtureMixedEffectModel
class HME:
"""The Hierarchical Mixed Effect model interface.
:param data: observed train data
:param model_config: HME model configuration parameters
"""
def __init__(self, data: pd.DataFrame, model_config: ModelConfig) -> None:
self.model = NullMixtureMixedEffectModel(data, model_config)
self.posterior_samples = None
self.posterior_diagnostics = None
    def infer(self, infer_config: InferConfig) -> Tuple[pd.DataFrame, pd.DataFrame]:
"""Performs MCMC posterior inference on HME model parameters and
returns MCMC samples for those parameters registered in the query.
:param infer_config: configuration settings of posterior inference
:return: posterior samples and their diagnostic summary statistics
"""
self.posterior_samples, self.posterior_diagnostics = self.model.infer(
infer_config
)
return self.posterior_samples, self.posterior_diagnostics
def predict(self, new_data: pd.DataFrame) -> pd.DataFrame:
"""Computes predictive distributions on the new test data according to
MCMC posterior samples.
:param new_data: test data for prediction
:return: predictive distributions on the new test data
"""
return self.model.predict(new_data, self.posterior_samples)
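The `HME` class is a facade: `infer()` caches posterior output on the instance, and `predict()` reuses it. A plain-Python analogue of that caching pattern (no beanmachine or pandas imports; the "model" is just a running mean, purely for illustration):

```python
# Minimal stand-in for the HME facade: infer() caches a fitted quantity,
# predict() refuses to run before inference has happened.
class FacadeSketch:
    def __init__(self, data):
        self.data = data
        self.posterior_samples = None

    def infer(self):
        self.posterior_samples = sum(self.data) / len(self.data)
        return self.posterior_samples

    def predict(self, new_data):
        if self.posterior_samples is None:
            raise RuntimeError('call infer() before predict()')
        return [x + self.posterior_samples for x in new_data]

model = FacadeSketch([1.0, 2.0, 3.0])
model.infer()
print(model.predict([0.0, 1.0]))
```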
# ---------------------------------------------------------------------------
# File: api/migrations/0002_auto_20181108_2243.py
# Repo: apigram/jade-api (Apache-2.0 license)
# ---------------------------------------------------------------------------
# Generated by Django 2.1.3 on 2018-11-08 11:43
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('api', '0001_initial'),
]
operations = [
migrations.RenameField(
model_name='company',
old_name='contact',
new_name='contacts',
),
]
# ---------------------------------------------------------------------------
# File: settings/settings.py
# Repo: guilhermegch/telegram-twitter-bot (MIT license)
# ---------------------------------------------------------------------------
import os
from dotenv import load_dotenv
load_dotenv()
TELEGRAM_TOKEN = os.getenv("TELEGRAM_TOKEN")
CHAT_ID = os.getenv("CHAT_ID")
API_KEY = os.getenv("API_KEY")
API_SECRET_KEY = os.getenv("API_SECRET_KEY")
ACCESS_TOKEN = os.getenv("ACCESS_TOKEN")
ACCESS_TOKEN_SECRET = os.getenv("ACCESS_TOKEN_SECRET")
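The settings above read every credential with `os.getenv`, which returns `None` for any variable the `.env` file did not define. A small illustration with hypothetical variable names (not the real tokens used by this bot):

```python
import os

# A set environment variable comes back as its string value;
# an unset one silently comes back as None, so missing .env entries
# only surface later when the credential is actually used.
os.environ['DEMO_TELEGRAM_TOKEN'] = '123:abc'
token = os.getenv('DEMO_TELEGRAM_TOKEN')
missing = os.getenv('DEMO_CHAT_ID_NOT_SET')
print(token, missing)
```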
# ---------------------------------------------------------------------------
# File: Python Basics/Week 4/assess_ac4_1_1_8.py
# Repo: sasathornt/Python-3-Programming-Specialization (MIT license)
# ---------------------------------------------------------------------------
# Write code to switch the order of the winners list so that it is now Z to A. Assign this list to the variable z_winners.
winners = ['Alice Munro', 'Alvin E. Roth', 'Kazuo Ishiguro', 'Malala Yousafzai', 'Rainer Weiss', 'Youyou Tu']
z_winners = winners[::-1]  # reversed copy; calling winners.reverse() here would also flip the original list
print(z_winners)
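Since `list.reverse()` mutates a list in place while slicing returns a fresh copy, the two approaches behave differently when other names still point at the original. A side-by-side comparison with a short stand-in list:

```python
# Slicing leaves the original list untouched; reverse() flips it in place.
names = ['Alice Munro', 'Youyou Tu']
reversed_copy = names[::-1]
print(reversed_copy)   # new list in Z-to-A order
print(names)           # original order preserved
names.reverse()
print(names)           # now mutated in place
```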
# ---------------------------------------------------------------------------
# File: sinfo/perifericos/views.py
# Repo: webdesigncuba/Sinfo (MIT license)
# ---------------------------------------------------------------------------
#
# Created on Sat Dec 25 2021
#
# The MIT License (MIT)
# Copyright (c) 2021 David Cordero Rosales
#
# Permission is hereby granted, free of charge, to any person obtaining a copy of this software
# and associated documentation files (the "Software"), to deal in the Software without restriction,
# including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,
# and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so,
# subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in all copies or substantial
# portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED
# TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
# THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
#
# Django
from django.shortcuts import render
from django.shortcuts import render, HttpResponseRedirect
from django.views.generic import ListView, CreateView, UpdateView, DeleteView
from django.shortcuts import reverse
from django.urls import reverse_lazy
from django.http import HttpResponse
from django.template.loader import get_template
from django.template import context
from django_renderpdf.views import PDFView
from django.contrib import messages
# Models
from .models import *
# Forms
from .forms import *
class ChasisListView(ListView):
model = Chasis
paginate_by = 10
def get_context_data(self, *, object_list=None, **kwargs):
context = super().get_context_data(**kwargs)
context['title']='Listado de Chasis'
return context
class ChasisCreateView(CreateView):
model = Chasis
form_class = ChasisForm
template_name = 'perifericos/chasis_form.html'
success_url = reverse_lazy('chasislist')
    def post(self, request, *args, **kwargs):
        form = ChasisForm(request.POST)
        if form.is_valid():
            form.save()
            messages.success(request, 'Guardado exitoso')
            return HttpResponseRedirect(self.success_url)
        # Invalid form: re-render the template so the validation errors are shown.
        self.object = None
        context = self.get_context_data(**kwargs)
        context['form'] = form
        return render(request, self.template_name, context)
def get_context_data(self, *, object_list=None, **kwargs):
context = super().get_context_data(**kwargs)
context['title']='Creacion de Chasis'
return context
# def get_success_url(self):
# return reverse('marcalist')
class ChasisUpdateView(UpdateView):
model = Chasis
form_class = ChasisForm
template_name = 'perifericos/chasis_update.html'
success_url = reverse_lazy('chasislist')
def get_context_data(self, *, object_list=None, **kwargs):
context = super().get_context_data(**kwargs)
context['title'] = 'Edicion de Chasis'
return context
class ChasisDeleteView(DeleteView):
model = Chasis
success_url = reverse_lazy('chasislist')
class ChasisPDF(PDFView):
template_name = 'report.html'
def get_context_data(self, *args, **kwargs):
"""Pass some extra context to the template."""
context = super().get_context_data(*args, **kwargs)
context['chasis'] = Chasis.objects.all()
return context
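Every view above follows the same `get_context_data` pattern: ask the parent class for the base context, then add a `title` entry. A framework-free sketch of that cooperative-`super()` pattern (plain classes standing in for Django's generic views):

```python
# Each subclass extends the parent's context dict rather than replacing it,
# mirroring the get_context_data overrides in the views above.
class BaseView:
    def get_context_data(self, **kwargs):
        return dict(kwargs)

class ChasisListSketch(BaseView):
    def get_context_data(self, **kwargs):
        context = super().get_context_data(**kwargs)
        context['title'] = 'Listado de Chasis'
        return context

ctx = ChasisListSketch().get_context_data(page=1)
print(ctx)
```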
# ---------------------------------------------------------------------------
# File: autogluon/utils/tabular/ml/models/tab_transformer/hyperparameters/parameters.py
# Repo: joshr17/autogluon (Apache-2.0 license)
# ---------------------------------------------------------------------------
from ....constants import BINARY, MULTICLASS, REGRESSION
def get_fixed_params():
""" Parameters that currently cannot be searched during HPO
TODO: HPO NOT CURRENTLY IMPLEMENTED FOR TABTRANSFORMER
Will need to figure out what (in future PR) is "fixed" and what is searchable. """
fixed_params = {'batch_size': 512,
'tab_kwargs': {'n_cont_embeddings': 0,
'n_layers': 1,
'n_heads': 8,
'hidden_dim': 128,
'norm_class_name': 'LayerNorm',
'tab_readout': 'none',
'column_embedding': True,
'shared_embedding': False,
#'n_shared_embs': 8, #8, #careful
'p_dropout': 0.1,
'orig_emb_resid': False,
'one_hot_embeddings': False,
'drop_whole_embeddings': False,
'max_emb_dim': 8,
'lr': 1e-3,
'weight_decay': 1e-6,
'base_exp_decay': 0.95},
'encoders': {'CATEGORICAL': 'CategoricalOrdinalEnc',
'DATETIME' : 'DatetimeOrdinalEnc',
'LATLONG' : 'LatLongQuantileOrdinalEnc',
'SCALAR' : 'ScalarQuantileOrdinalEnc',
'TEXT' : 'TextSummaryScalarEnc'},
'augmentation': {'mask_prob': 0.4,
'num_augs' : 1},
'pretext': 'BERT_pretext',
'n_cont_features': 8,
'fix_attention': False,
'freq': 1,
'pretrain_freq': 100,
'feature_dim': 64,
'epochs': 100,
'pretrain_epochs': 200,
'epochs_wo_improve': 10}
return fixed_params
def get_default_param(problem_type, nunique=None):
    params = get_fixed_params()
    params['problem_type'] = problem_type
    if problem_type == REGRESSION:
        params['n_classes'] = 1
    elif problem_type == BINARY:
        params['n_classes'] = 2
    elif problem_type == MULTICLASS:
        params['n_classes'] = nunique
    return params
# ---------------------------------------------------------------------------
# File: Accounts/migrations/0014_remove_timetable_class_time.py
# Repo: Anand911/E-LEARNING-SCLMAXO- (MIT license)
# ---------------------------------------------------------------------------
# Generated by Django 3.1.3 on 2020-12-18 12:59
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('Accounts', '0013_timetable'),
]
operations = [
migrations.RemoveField(
model_name='timetable',
name='class_time',
),
]
# ---------------------------------------------------------------------------
# File: fastflix/encoders/copy/settings_panel.py
# Repo: AwesomeGitHubRepos/FastFlix (MIT license)
# ---------------------------------------------------------------------------
# -*- coding: utf-8 -*-
import logging
from qtpy import QtWidgets
from fastflix.encoders.common.setting_panel import SettingPanel
from fastflix.language import t
from fastflix.models.encode import CopySettings
from fastflix.models.fastflix_app import FastFlixApp
logger = logging.getLogger("fastflix")
class Copy(SettingPanel):
profile_name = "copy_settings"
def __init__(self, parent, main, app: FastFlixApp):
super().__init__(parent, main, app)
self.main = main
self.app = app
grid = QtWidgets.QGridLayout()
grid.addWidget(QtWidgets.QLabel(t("This will just copy the video track as is.")), 0, 0)
grid.addWidget(
QtWidgets.QLabel(t("No crop, scale, rotation,flip nor any other filters will be applied.")), 1, 0
)
grid.addWidget(QtWidgets.QWidget(), 2, 0, 10, 1)
grid.addLayout(self._add_custom(disable_both_passes=True), 11, 0, 1, 6)
self.setLayout(grid)
self.hide()
def update_video_encoder_settings(self):
self.app.fastflix.current_video.video_settings.video_encoder_settings = CopySettings()
self.app.fastflix.current_video.video_settings.video_encoder_settings.extra = self.ffmpeg_extras
self.app.fastflix.current_video.video_settings.video_encoder_settings.extra_both_passes = False
| 36.054054 | 109 | 0.715142 | 173 | 1,334 | 5.317919 | 0.462428 | 0.052174 | 0.086957 | 0.071739 | 0.269565 | 0.206522 | 0.206522 | 0.206522 | 0.206522 | 0.206522 | 0 | 0.013799 | 0.185157 | 1,334 | 36 | 110 | 37.055556 | 0.832567 | 0.015742 | 0 | 0 | 0 | 0 | 0.099924 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0.076923 | 0.230769 | 0 | 0.384615 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
# ---------------------------------------------------------------------------
# File: tests/test_params.py
# Repo: matthewjohnpayne/MPCData (MIT license)
# ---------------------------------------------------------------------------
# mpcdata/tests/test_params.py
# import pytest
# Third-party imports
import os
# Import the specific package/module/function we are testing
import mpcdata.params as params
# from .context import mpcdata
def test_required_dictionaries_exist():
"""
Does params.py contain all of the required dictionaries ?
"""
assert hasattr(params, 'urlIDDict') # This is also testing that it's been pulled in from params_masterlists
assert hasattr(params, 'dirDict')
assert hasattr(params, 'fileDict')
assert hasattr(params, 'downloadSpecDict')
def test_required_directory_paths_exist():
"""
Does dirDict contain the required directory paths ?
"""
for item in ['top','code','share','external','internal','test']:
assert item in params.dirDict
def test_expected_directory_paths():
"""
Does dirDict contain the expected directory paths ?
"""
testDir = os.path.realpath(os.path.dirname( __file__ ))
topDir = os.path.realpath(os.path.dirname( testDir ))
shareDir = os.path.join(topDir, 'share')
externalDir = os.path.join(topDir, 'share','data_external')
internalDir = os.path.join(topDir, 'share','data_internal')
testDir = os.path.join(topDir, 'share','data_test')
devDir = os.path.join(topDir, 'share','data_dev')
assert topDir == params.dirDict['top']
assert shareDir == params.dirDict['share']
assert externalDir == params.dirDict['external']
assert internalDir == params.dirDict['internal']
assert testDir == params.dirDict['test']
assert devDir == params.dirDict['dev']
def test_required_filepaths_are_defined():
"""
Does fileDict contain the required directory paths ?
"""
for item in ['external','internal']:#,'test','dev']:
assert item in params.fileDict
def test_required_specs_exist_for_data_downloads():
"""
Does downloadSpecDict contain the required paths ?
"""
for item in ['attemptsMax']:
assert item in params.downloadSpecDict
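The path assertions above pin down the directory layout that `params.dirDict` is expected to expose: everything hangs off the package top level, with the data directories under `share/`. A minimal sketch of how such a dict could be assembled (the builder function is mine, and the value for the `'code'` entry is an assumption; only the `share/data_*` layout comes from the tests):

```python
import os

def build_dir_dict(top_dir):
    """Hypothetical reconstruction of params.dirDict so the path tests pass."""
    share = os.path.join(top_dir, 'share')
    return {
        'top': top_dir,
        'code': os.path.join(top_dir, 'mpcdata'),  # assumed location, not checked by the tests
        'share': share,
        'external': os.path.join(share, 'data_external'),
        'internal': os.path.join(share, 'data_internal'),
        'test': os.path.join(share, 'data_test'),
        'dev': os.path.join(share, 'data_dev'),
    }

d = build_dir_dict('/tmp/project')
assert d['external'] == os.path.join('/tmp/project', 'share', 'data_external')
```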
| 32 | 111 | 0.679199 | 244 | 2,048 | 5.577869 | 0.307377 | 0.039677 | 0.036738 | 0.05878 | 0.188832 | 0.173402 | 0.06025 | 0.06025 | 0 | 0 | 0 | 0 | 0.202148 | 2,048 | 63 | 112 | 32.507937 | 0.832925 | 0.244629 | 0 | 0 | 0 | 0 | 0.135154 | 0 | 0 | 0 | 0 | 0 | 0.433333 | 1 | 0.166667 | false | 0 | 0.066667 | 0 | 0.233333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5b3c6f097c1609e8407e35a3ba36afcad1ca179f | 1,290 | py | Python | python/exercicios mundo 2/ex36_45.py/ex011.py | LEXW3B/PYTHON | 1ae54ea709c008bd7fab7602e034773610e7985e | [
"MIT"
] | 1 | 2022-01-05T08:51:16.000Z | 2022-01-05T08:51:16.000Z | python/exercicios mundo 2/ex36_45.py/ex011.py | LEXW3B/PYTHON | 1ae54ea709c008bd7fab7602e034773610e7985e | [
"MIT"
] | null | null | null | python/exercicios mundo 2/ex36_45.py/ex011.py | LEXW3B/PYTHON | 1ae54ea709c008bd7fab7602e034773610e7985e | [
"MIT"
] | null | null | null | #45-crie um programa que faça o computador jogar jokenpo com voce.
print('=====JOKENPO=====')
print('')
from random import randint
from time import sleep
itens = ('pedra','papel','tesoura')
computador = randint(0, 2)
print('''FAÇA SUA ESCOLHA
[ 0 ] pedra
[ 1 ] papel
[ 2 ] tesoura
''')
jogador = int(input('Qual a sua jogada ? '))
if jogador not in (0, 1, 2):
    print('jogada invalida')
    raise SystemExit  # an out-of-range choice would crash itens[jogador] below
print('JO')
sleep(1)
print('KEN')
sleep(1)
print('PO')
sleep(1)
print('computador jogou {}.'.format(itens[computador]))
print('jogador jogou {}.'.format(itens[jogador]))
if computador == 0:  # computer played rock
if jogador == 0:
print('EMPATE')
elif jogador == 1:
print('JOGADOR VENCE')
elif jogador == 2:
print('COMPUTADOR VENCE')
else:
print('jogada invalida')
elif computador == 1:  # computer played paper
if jogador == 0:
print('COMPUTADOR VENCE')
elif jogador == 1:
print('EMPATE')
elif jogador == 2:
print('JOGADOR VENCE')
else:
print('jogada invalida')
elif computador == 2:  # computer played scissors
if jogador == 0:
print('JOGADOR VENCE')
elif jogador == 1:
print('COMPUTADOR VENCE')
elif jogador == 2:
print('EMPATE')
else:
print('jogada invalida')
#FIM//A\\
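The chain of `if/elif` comparisons above can be collapsed with modular arithmetic: with rock=0, paper=1, scissors=2, the value of `(jogador - computador) % 3` is 0 for a draw, 1 when the player wins, and 2 when the computer wins. A sketch (the function name is mine, not part of the exercise):

```python
def resultado(jogador, computador):
    """Decide a jokenpo round: inputs are 0/1/2 = rock/paper/scissors."""
    diff = (jogador - computador) % 3
    if diff == 0:
        return 'EMPATE'
    # diff == 1 means the player's choice beats the computer's choice.
    return 'JOGADOR VENCE' if diff == 1 else 'COMPUTADOR VENCE'

assert resultado(1, 0) == 'JOGADOR VENCE'   # paper beats rock
assert resultado(0, 2) == 'JOGADOR VENCE'   # rock beats scissors
```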
from django.db import models
from rest_framework import serializers
from django.utils import timezone
from django.contrib.auth.models import User
class Empresa(models.Model):
nombre = models.CharField(max_length=150, null=True)
domicilio = models.CharField(max_length=150, null=True)
ciudad = models.CharField(max_length=150, null=True)
class Estado(models.Model):
codigo = models.CharField(max_length=8, unique=True)
descripcion = models.CharField(max_length=60)
def __str__(self):
return self.descripcion
class Cliente(models.Model):
nombre_completo = models.CharField(max_length=50)
domicilio = models.CharField(max_length=60)
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
def __str__(self):
return self.nombre_completo
class PuntoLimpiezaCliente(models.Model):
nombre_completo = models.CharField(max_length=50)
domicilio = models.CharField(max_length=60, null=True)
cliente = models.ForeignKey(Cliente, on_delete=models.CASCADE)
# Definition of the PROVEEDOR (supplier) class
class Proveedor(models.Model):
nombre_completo = models.CharField(max_length=50)
razon_social = models.CharField(max_length=60)
domicilio = models.CharField(max_length=70)
ingresos_brutos = models.CharField(max_length=10)
fecha_inicio_actividades = models.DateField(auto_now=False)
CONDICION_IVA_CHOICES = (
('RI', 'Responsable Inscripto'),
('MO', 'Monotributista'),
('EX', 'Exento'),
('NR', 'No Responsable'),
('CF', 'Consumidor Final'),
)
condicionIVA = models.CharField(max_length=2,choices=CONDICION_IVA_CHOICES)
cuit = models.CharField(max_length=11)
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
def __str__(self):
return self.nombre_completo
# Definition of the PRODUCT TYPES
class TipoProducto(models.Model):
descripcion = models.CharField(max_length=50)
codigo = models.CharField(max_length=8, unique=True)
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
def __str__(self):
return self.descripcion
# Definition of the PRODUCT FAMILIES
class FamiliaProducto(models.Model):
descripcion = models.CharField(max_length=50)
codigo = models.CharField(max_length=8, unique=True)
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
def __str__(self):
return self.descripcion
# Definition of the ACCION (action) class
class Accion(models.Model):
nombre_completo = models.CharField(max_length=50)
codigo_accion = models.CharField(max_length=8, unique=True)
# Definition of the PRODUCTO (product) class
class Producto(models.Model):
nombre_completo = models.CharField(max_length=50)
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
proveedor = models.ForeignKey(Proveedor, related_name='productos', on_delete=models.SET_NULL, null=True)
tipoProducto = models.ForeignKey(TipoProducto, on_delete=models.SET_NULL, null=True)
familiaProducto = models.ForeignKey(FamiliaProducto, on_delete=models.SET_NULL, null=True)
factor_multiplicacion = models.FloatField(default=1)
def __str__(self):
return self.nombre_completo
# Definition of the USUARIO (user) class
class Usuario(models.Model):
ROL_USUARIO_CHOICES = (
('EMP', 'Empleado'),
('REP', 'Repartidor'),
('SUP', 'Supervisor'),
('ADM', 'Administrador'),
)
rol_usuario = models.CharField(max_length=3,choices=ROL_USUARIO_CHOICES)
nombre_completo = models.CharField(max_length=50)
accionesPermitidas = models.ManyToManyField(Accion)
class EntregaCliente(models.Model):
fecha_entrega = models.DateField()
fecha_alta_entrega = models.DateField(auto_now=True)
punto_limpieza_cliente = models.ForeignKey(PuntoLimpiezaCliente, on_delete=models.PROTECT)
usuario_alta = models.ForeignKey(User, on_delete=models.DO_NOTHING)
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
class ItemEntregaCliente(models.Model):
entregaCliente = models.ForeignKey(EntregaCliente, related_name='itemsEntrega', on_delete=models.CASCADE)
fecha_alta_item_entrega = models.DateField(auto_now=True)
producto = models.ForeignKey(Producto, related_name='productosEntrega', on_delete=models.PROTECT)
cantidad = models.IntegerField()
esEntrega = models.BooleanField()
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
# Definition of the PURCHASE INVOICES
class FacturaCompra(models.Model):
fecha_factura_compra = models.DateField()
fecha_alta_factura = models.DateField()
proveedor = models.ForeignKey(Proveedor, on_delete=models.PROTECT)
usuario_alta = models.ForeignKey(User, on_delete=models.SET_NULL, null=True)
importe_neto_gravado = models.FloatField()
importe_total = models.FloatField()
iva27 = models.FloatField()
iva21 = models.FloatField()
iva105 = models.FloatField()
iva0 = models.FloatField()
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
class ItemsFactura(models.Model):
facturaCompra = models.ForeignKey(FacturaCompra, related_name='itemsFactura', on_delete=models.CASCADE)
producto = models.ForeignKey(Producto, on_delete=models.PROTECT)
cantidad = models.IntegerField()
precio_compra = models.FloatField()
unidad_medida = models.CharField(max_length=10)
alicuotaIVA = models.FloatField()
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
class OrdenCompra(models.Model):
fecha_orden_compra = models.DateField()
fecha_alta_orden = models.DateField()
proveedor = models.ForeignKey(Proveedor, on_delete=models.CASCADE)
usuario_alta = models.ForeignKey(User, on_delete=models.DO_NOTHING)
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
class ItemsOrdenCompra(models.Model):
ordenCompra = models.ForeignKey(OrdenCompra, related_name='itemsOrdenCompra', on_delete=models.CASCADE)
producto = models.ForeignKey(Producto, on_delete=models.CASCADE)
cantidad = models.IntegerField()
estado = models.ForeignKey(Estado, on_delete=models.SET_NULL, null=True)
# Definition of the product STOCK
class StockHistoricoProducto(models.Model):
producto = models.ForeignKey(Producto, on_delete=models.CASCADE, related_name='stocks')
#itemFactura = models.ForeignKey(ItemsFactura, unique=True, on_delete=models.CASCADE)
itemFactura = models.OneToOneField(ItemsFactura, on_delete=models.CASCADE, null=True)
itemEntrega = models.OneToOneField(ItemEntregaCliente, on_delete=models.CASCADE, null=True)
fecha_alta = models.DateField()
cantidad = models.IntegerField()
KANBAN_STATIONS = (
('OC_IN', 'Ingreso de Orden de Compra'),
('OC_OUT', 'Egreso de Orden de Compra'),
('ST_IN', 'Ingreso de Stock'),
('ST_OUT', 'Egreso de Stock'),
)
estacion_kanban = models.CharField(null=True, max_length=6, choices=KANBAN_STATIONS)
estado = models.IntegerField(null=True)
comments = models.CharField(null=True, max_length=100)
# Definition of the product PRICES
class PrecioHistoricoProducto(models.Model):
fecha_inicio = models.DateField()
importe = models.FloatField()
#isCurrent = models.BooleanField()
producto = models.ForeignKey(Producto, related_name='preciosProducto', on_delete=models.CASCADE)
#itemFactura = models.ForeignKey(ItemsFactura, unique=True, on_delete=models.CASCADE)
itemFactura = models.OneToOneField(ItemsFactura, on_delete=models.CASCADE, null=True)
class Meta:
ordering = ['-fecha_inicio']
# Definition of the actions performed by users
class AccionesRealizadasUsuario(models.Model):
accion = models.ForeignKey(Accion, on_delete=models.DO_NOTHING)
usuario = models.ForeignKey(User, on_delete=models.DO_NOTHING)
fecha_accion = models.DateField(auto_now=True)
descripcion = models.CharField(max_length=150)
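Several models above use the `choices=` pattern (`CONDICION_IVA_CHOICES`, `ROL_USUARIO_CHOICES`, `KANBAN_STATIONS`): a tuple of `(stored_code, display_label)` pairs, from which Django derives a `get_<field>_display()` method on each instance. The mechanics reduce to a dict lookup, sketched here without Django (the helper name is mine):

```python
# The same tuple-of-pairs shape used by the Proveedor model above.
CONDICION_IVA_CHOICES = (
    ('RI', 'Responsable Inscripto'),
    ('MO', 'Monotributista'),
    ('EX', 'Exento'),
    ('NR', 'No Responsable'),
    ('CF', 'Consumidor Final'),
)

def display_label(code, choices):
    """Map a stored choice code to its label; fall back to the raw code."""
    return dict(choices).get(code, code)

assert display_label('MO', CONDICION_IVA_CHOICES) == 'Monotributista'
```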
# uncompyle6 version 2.9.10
# Python bytecode 2.7 (62211)
# Decompiled from: Python 3.6.0b2 (default, Oct 11 2016, 05:27:10)
# [GCC 6.2.0 20161005]
# Embedded file name: errors.py
import mcl.status
ERR_SUCCESS = mcl.status.MCL_SUCCESS
ERR_INVALID_PARAM = mcl.status.framework.ERR_START
ERR_NOT_IMPLEMENTED = mcl.status.framework.ERR_START + 1
ERR_MARSHAL_FAILED = mcl.status.framework.ERR_START + 2
ERR_GET_API_FAILED = mcl.status.framework.ERR_START + 3
ERR_OPEN_DATA_PIPE_FAILED = mcl.status.framework.ERR_START + 4
ERR_INJECT_SETUP_FAILED = mcl.status.framework.ERR_START + 5
ERR_INJECT_FAILED = mcl.status.framework.ERR_START + 6
ERR_FAILED_TO_FIND_PROCESS = mcl.status.framework.ERR_START + 7
ERR_OPEN_PROCESS_FAILED = mcl.status.framework.ERR_START + 8
ERR_CONNECT_PIPE_FAILED = mcl.status.framework.ERR_START + 9
ERR_READ_PIPE_FAILED = mcl.status.framework.ERR_START + 10
ERR_EXCEPTION = mcl.status.framework.ERR_START + 11
ERR_ALLOC_FAILED = mcl.status.framework.ERR_START + 12
ERR_INJECTION_FINISHED = mcl.status.framework.ERR_START + 13
ERR_GET_EXIT_CODE_FAILED = mcl.status.framework.ERR_START + 14
ERR_MISSING_BUFFER_DATA = mcl.status.framework.ERR_START + 15
ERR_INJECT_WRITE_FAILED = mcl.status.framework.ERR_START + 16
ERR_INJECT_THREAD_ENDED = mcl.status.framework.ERR_START + 17
ERR_INJECT_OPEN_PIPE_FAILED = mcl.status.framework.ERR_START + 18
ERR_INJECT_LOAD_LIBRARY_FAILED = mcl.status.framework.ERR_START + 19
ERR_INJECT_LSA_OPEN_FAILED = mcl.status.framework.ERR_START + 20
ERR_INJECT_LSA_QUERY_FAILED = mcl.status.framework.ERR_START + 21
ERR_INJECT_SAMI_CONNECT_FAILED = mcl.status.framework.ERR_START + 22
ERR_INJECT_SAMR_OPEN_DOMAIN_FAILED = mcl.status.framework.ERR_START + 23
ERR_INJECT_SAMR_ENUM_USERS_FAILED = mcl.status.framework.ERR_START + 24
ERR_INJECT_SAMR_OPEN_USER_FAILED = mcl.status.framework.ERR_START + 25
ERR_INJECT_SAMR_QUERY_USER_FAILED = mcl.status.framework.ERR_START + 26
ERR_INJECT_LSAI_OPEN_POLICY_FAILED = mcl.status.framework.ERR_START + 27
ERR_INJECT_REG_OPEN_FAILED = mcl.status.framework.ERR_START + 28
ERR_INJECT_LSAR_OPEN_SECRET_FAILED = mcl.status.framework.ERR_START + 29
ERR_INJECT_LSAR_QUERY_SECRET_FAILED = mcl.status.framework.ERR_START + 30
ERR_INJECT_POINTER_NULL = mcl.status.framework.ERR_START + 31
ERR_INJECT_EXCEPTION = mcl.status.framework.ERR_START + 32
ERR_REQUIRED_LIBRARY_NOT_LOADED = mcl.status.framework.ERR_START + 33
ERR_INJECT_DIGEST_ENUM_FAILED = mcl.status.framework.ERR_START + 34
ERR_INJECT_DIGEST_GET_LOGON_DATA_FAILED = mcl.status.framework.ERR_START + 35
ERR_INJECT_DIGEST_LOGON_TO_ID_FAILED = mcl.status.framework.ERR_START + 36
ERR_INJECT_DIGEST_LOG_SESS_PASSWD_GET_FAILED = mcl.status.framework.ERR_START + 37
ERR_INJECT_FIND_FUNCTION_1 = mcl.status.framework.ERR_START + 38
ERR_INJECT_FIND_FUNCTION_2 = mcl.status.framework.ERR_START + 39
ERR_INJECT_FIND_FUNCTION_3 = mcl.status.framework.ERR_START + 40
ERR_INJECT_FIND_FUNCTION_4 = mcl.status.framework.ERR_START + 41
ERR_INJECT_FIND_FUNCTION_5 = mcl.status.framework.ERR_START + 42
ERR_INJECT_GPA_FAILED_1 = mcl.status.framework.ERR_START + 50
ERR_INJECT_GPA_FAILED_2 = mcl.status.framework.ERR_START + 51
ERR_INJECT_GPA_FAILED_3 = mcl.status.framework.ERR_START + 52
ERR_INJECT_GPA_FAILED_4 = mcl.status.framework.ERR_START + 53
ERR_INJECT_GPA_FAILED_5 = mcl.status.framework.ERR_START + 54
ERR_INJECT_GPA_FAILED_6 = mcl.status.framework.ERR_START + 55
ERR_INJECT_GPA_FAILED_7 = mcl.status.framework.ERR_START + 56
ERR_INJECT_GPA_FAILED_8 = mcl.status.framework.ERR_START + 57
ERR_INJECT_GPA_FAILED_9 = mcl.status.framework.ERR_START + 58
ERR_INJECT_GPA_FAILED_10 = mcl.status.framework.ERR_START + 59
ERR_UNSUPPORTED_PLATFORM = mcl.status.framework.ERR_START + 60
ERR_INJECT_FUNCTION_INVALID = mcl.status.framework.ERR_START + 61
errorStrings = {ERR_INVALID_PARAM: 'Invalid parameter(s)',
ERR_NOT_IMPLEMENTED: 'Not implemented on this platform',
ERR_MARSHAL_FAILED: 'Marshaling data failed',
ERR_GET_API_FAILED: 'Failed to get required API',
ERR_OPEN_DATA_PIPE_FAILED: 'Open of data pipe for transfer failed',
ERR_INJECT_SETUP_FAILED: 'Setup of necessary injection functions failed',
ERR_INJECT_FAILED: 'Injection into process failed',
ERR_FAILED_TO_FIND_PROCESS: 'Failed to find required process',
ERR_OPEN_PROCESS_FAILED: 'Unable to open process for injection',
ERR_CONNECT_PIPE_FAILED: 'Connect to data pipe failed',
ERR_READ_PIPE_FAILED: 'Read from data pipe failed',
ERR_EXCEPTION: 'Exception encountered',
ERR_ALLOC_FAILED: 'Memory allocation failed',
ERR_INJECTION_FINISHED: 'Injection finished',
ERR_GET_EXIT_CODE_FAILED: 'Get of injected thread exit code failed',
ERR_MISSING_BUFFER_DATA: 'Data returned from injected thread is invalid',
ERR_INJECT_WRITE_FAILED: 'Write of data to pipe failed',
ERR_INJECT_THREAD_ENDED: 'Injection thread has closed abnormally',
ERR_INJECT_OPEN_PIPE_FAILED: 'InjectThread: Open of data pipe failed',
ERR_INJECT_LOAD_LIBRARY_FAILED: 'InjectThread: Failed to load required library',
ERR_INJECT_LSA_OPEN_FAILED: 'InjectThread: LsaOpenPolicy call failed',
ERR_INJECT_LSA_QUERY_FAILED: 'InjectThread: LsaQueryInformationPolicy call failed',
ERR_INJECT_SAMI_CONNECT_FAILED: 'InjectThread: SamIConnect call failed',
ERR_INJECT_SAMR_OPEN_DOMAIN_FAILED: 'InjectThread: SamrOpenDomain call failed',
ERR_INJECT_SAMR_ENUM_USERS_FAILED: 'InjectThread: SamrEnumerateUsersInDomain call failed',
ERR_INJECT_SAMR_OPEN_USER_FAILED: 'InjectThread: SamrOpenUser call failed',
ERR_INJECT_SAMR_QUERY_USER_FAILED: 'InjectThread: SamrQueryInformationUser call failed',
ERR_INJECT_LSAI_OPEN_POLICY_FAILED: 'InjectThread: LsaIOpenPolicyTrusted call failed',
ERR_INJECT_REG_OPEN_FAILED: 'InjectThread: Failed to open registry key',
ERR_INJECT_LSAR_OPEN_SECRET_FAILED: 'InjectThread: LsarOpenSecret call failed',
ERR_INJECT_LSAR_QUERY_SECRET_FAILED: 'InjectThread: LsarQuerySecret call failed',
ERR_INJECT_POINTER_NULL: 'InjectThread: Internal pointer is NULL',
ERR_INJECT_EXCEPTION: 'InjectThread: Exception encountered',
ERR_REQUIRED_LIBRARY_NOT_LOADED: 'Library required for operation not loaded',
ERR_INJECT_DIGEST_ENUM_FAILED: 'InjectThread: LsaEnumerateLogonSessions failed',
ERR_INJECT_DIGEST_GET_LOGON_DATA_FAILED: 'InjectThread: LsaGetLogonSessionData failed',
ERR_INJECT_DIGEST_LOGON_TO_ID_FAILED: 'InjectThread: LogSessHandlerLogonIdToPtr failed',
ERR_INJECT_DIGEST_LOG_SESS_PASSWD_GET_FAILED: 'InjectThread: LogSessHandlerPasswdGet failed',
ERR_INJECT_FIND_FUNCTION_1: 'InjectThread: Pattern match for function LogSessHandlerLogonIdToPtr failed',
ERR_INJECT_FIND_FUNCTION_2: 'InjectThread: Pattern match for function LogSessHandlerPasswdGet failed',
ERR_INJECT_FIND_FUNCTION_3: 'InjectThread: Pattern match for function LogSessHandlerRelease failed',
ERR_INJECT_FIND_FUNCTION_4: 'InjectThread: Pattern match for function StringFree failed',
ERR_INJECT_FIND_FUNCTION_5: 'InjectThread: Pattern match for function LsaEncryptMemory failed',
ERR_INJECT_GPA_FAILED_1: 'InjectThread: Failed to get required procedure address (1)',
ERR_INJECT_GPA_FAILED_2: 'InjectThread: Failed to get required procedure address (2)',
ERR_INJECT_GPA_FAILED_3: 'InjectThread: Failed to get required procedure address (3)',
ERR_INJECT_GPA_FAILED_4: 'InjectThread: Failed to get required procedure address (4)',
ERR_INJECT_GPA_FAILED_5: 'InjectThread: Failed to get required procedure address (5)',
ERR_INJECT_GPA_FAILED_6: 'InjectThread: Failed to get required procedure address (6)',
ERR_INJECT_GPA_FAILED_7: 'InjectThread: Failed to get required procedure address (7)',
ERR_INJECT_GPA_FAILED_8: 'InjectThread: Failed to get required procedure address (8)',
ERR_INJECT_GPA_FAILED_9: 'InjectThread: Failed to get required procedure address (9)',
ERR_INJECT_GPA_FAILED_10: 'InjectThread: Failed to get required procedure address (10)',
ERR_UNSUPPORTED_PLATFORM: 'The desired operation is not supported on this platform.',
ERR_INJECT_FUNCTION_INVALID: 'The function to inject cannot be located'
}
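The `errorStrings` table above maps each status code to a human-readable message. A small lookup helper with a fallback keeps callers from raising `KeyError` on unmapped codes; the helper name and the numeric codes in the sample table below are illustrative, not part of the original module (real codes are offsets from `mcl.status.framework.ERR_START`):

```python
def describe_error(code, table):
    """Return the message for a status code, or a generic fallback."""
    return table.get(code, 'Unknown error code: {}'.format(code))

# Illustrative table with made-up numeric codes, mirroring errorStrings' shape.
sample_table = {
    100: 'Invalid parameter(s)',
    101: 'Not implemented on this platform',
}

assert describe_error(100, sample_table) == 'Invalid parameter(s)'
```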
#!/usr/bin/env python
# Copyright (c) 2015 Matthew Earl
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included
# in all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
# MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN
# NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,
# DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR
# OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE
# USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
Invert OpenCV haar cascades.
See http://matthewearl.github.io/2016/01/14/inverse-haar/ for an overview of
how the code works.
"""
__all__ = (
'Cascade',
'inverse_haar',
)
import collections
import sys
import xml.etree.ElementTree
import cv2
import numpy
from docplex.mp.context import DOcloudContext
from docplex.mp.environment import Environment
from docplex.mp.model import Model
# Constants
# URL to connect to DoCloud with. This may need changing to your particular
# URL.
DOCLOUD_URL = 'https://api-oaas.docloud.ibmcloud.com/job_manager/rest/v1/'
# OpenCV preprocesses analysed regions by dividing by the standard deviation.
# Unfortunately this step cannot be modelled with LP constraints, so we just
# allow a reasonably high pixel value. This value should be at least 2, seeing
# as the maximum standard deviation of a set of values between 0 and 1 is 0.5.
MAX_PIXEL_VALUE = 2.0
# Grid classes
class Grid(object):
"""
A division of an image area into cells.
For example, `SquareGrid` divides the image into pixels.
Cell values are represented with "cell vectors", so for example,
`Grid.render_cell_vec` will take a cell vector and produce an image.
"""
@property
def num_cells(self):
"""The number of cells in this grid"""
raise NotImplementedError
def rect_to_cell_vec(self, r):
"""
Return a boolean cell vector corresponding with the input rectangle.
Elements of the returned vector are True if and only if the
corresponding cells fall within the input rectangle
"""
raise NotImplementedError
def render_cell_vec(self, vec, im_width, im_height):
"""Render an image, using a cell vector and image dimensions."""
raise NotImplementedError
class SquareGrid(Grid):
"""
A grid where cells correspond with pixels.
This grid type is used for cascades which do not contain diagonal features.
"""
def __init__(self, width, height):
self._width = width
self._height = height
self.cell_names = ["pixel_{}_{}".format(x, y)
for y in range(height) for x in range(width)]
@property
def num_cells(self):
return self._width * self._height
def rect_to_cell_vec(self, r):
assert not r.tilted
out = numpy.zeros((self._width, self._height), dtype=numpy.bool)
out[r.y:r.y + r.h, r.x:r.x + r.w] = True
return out.flatten()
def render_cell_vec(self, vec, im_width, im_height):
im = vec.reshape(self._height, self._width)
return cv2.resize(im, (im_width, im_height),
interpolation=cv2.INTER_NEAREST)
class TiltedGrid(Grid):
"""
A square grid, but each square consists of 4 cells.
The squares are cut diagonally, resulting in a north, east, south and west
triangle for each cell.
This grid type is used for cascades which contain diagonal features: The
idea is that the area which a diagonal feature should be integrated can be
represented exactly by this structure.
Unfortunately, this is not quite accurate: OpenCV's trainer and detector
always resizes its images so that pixels correspond with one grid cell. As
    such, cascades which contain diagonal features will not be accurately
    inverted by this script; however, they will have more detail as a result
    of the grid square subdivision.
"""
def __init__(self, width, height):
self._width = width
self._height = height
self._cell_indices = {(d, x, y): 4 * ((width * y) + x) + d
for y in range(height)
for x in range(width)
for d in range(4)}
self.cell_names = ['cell_{}_{}_{}'.format(x, y, "NESW"[d])
for y in range(height)
for x in range(width)
for d in range(4)]
self._cell_points = numpy.zeros((width * height * 4, 2))
for y in range(height):
for x in range(width):
self._cell_points[self._cell_indices[0, x, y], :] = \
numpy.array([x + 0.5, y + 0.25])
self._cell_points[self._cell_indices[1, x, y], :] = \
numpy.array([x + 0.75, y + 0.5])
self._cell_points[self._cell_indices[2, x, y], :] = \
numpy.array([x + 0.5, y + 0.75])
self._cell_points[self._cell_indices[3, x, y], :] = \
numpy.array([x + 0.25, y + 0.5])
@property
def num_cells(self):
return self._width * self._height * 4
def _rect_to_bounds(self, r):
if not r.tilted:
dirs = numpy.matrix([[0, 1], [-1, 0], [0, -1], [1, 0]])
limits = numpy.matrix([[r.y, -(r.x + r.w), -(r.y + r.h), r.x]]).T
else:
dirs = numpy.matrix([[-1, 1], [-1, -1], [1, -1], [1, 1]])
limits = numpy.matrix([[r.y - r.x,
2 + -r.x -r.y - 2 * r.w,
r.x - r.y - 2 * r.h,
-2 + r.x + r.y]]).T
return dirs, limits
def rect_to_cell_vec(self, r):
dirs, limits = self._rect_to_bounds(r)
out = numpy.all(numpy.array(dirs * numpy.matrix(self._cell_points).T)
>= limits,
axis=0)
return numpy.array(out)[0]
def render_cell_vec(self, vec, im_width, im_height):
out_im = numpy.zeros((im_height, im_width), dtype=vec.dtype)
tris = numpy.array([[[0, 0], [1, 0], [0.5, 0.5]],
[[1, 0], [1, 1], [0.5, 0.5]],
[[1, 1], [0, 1], [0.5, 0.5]],
[[0, 1], [0, 0], [0.5, 0.5]]])
scale_factor = numpy.array([im_width / self._width,
im_height / self._height])
for y in reversed(range(self._height)):
for x in range(self._width):
for d in (2, 3, 1, 0):
points = (tris[d] + numpy.array([x, y])) * scale_factor
cv2.fillConvexPoly(
img=out_im,
points=points.astype(numpy.int32),
color=vec[self._cell_indices[d, x, y]])
return out_im
# Cascade definition
class Stage(collections.namedtuple('_StageBase',
['threshold', 'weak_classifiers'])):
"""
A stage in an OpenCV cascade.
.. attribute:: weak_classifiers
A list of weak classifiers in this stage.
.. attribute:: threshold
The value that the weak classifiers must exceed for this stage to pass.
"""
class WeakClassifier(collections.namedtuple('_WeakClassifierBase',
['feature_idx', 'threshold', 'fail_val', 'pass_val'])):
"""
A weak classifier in an OpenCV cascade.
.. attribute:: feature_idx
Feature associated with this classifier.
.. attribute:: threshold
The value that this feature dotted with the input image must exceed for the
feature to have passed.
.. attribute:: fail_val
The value contributed to the stage threshold if this classifier fails.
.. attribute:: pass_val
The value contributed to the stage threshold if this classifier passes.
"""
class Rect(collections.namedtuple('_RectBase',
['x', 'y', 'w', 'h', 'tilted', 'weight'])):
"""
A rectangle in an OpenCV cascade.
Two or more of these make up a feature.
.. attribute:: x, y
Coordinates of the rectangle.
.. attribute:: w, h
Width and height of the rectangle, respectively.
.. attribute:: tilted
If true, the rectangle is to be considered rotated 45 degrees clockwise
about its top-left corner. (+X is right, +Y is down.)
.. attribute:: weight
The value this rectangle contributes to the feature.
"""
class Cascade(collections.namedtuple('_CascadeBase',
['width', 'height', 'stages', 'features', 'tilted', 'grid'])):
"""
Pythonic interface to an OpenCV cascade file.
.. attribute:: width
Width of the cascade grid.
.. attribute:: height
Height of the cascade grid.
.. attribute:: stages
List of :class:`.Stage` objects.
.. attribute:: features
List of features. Each feature is in turn a list of :class:`.Rect`s.
.. attribute:: tilted
True if any of the features are tilted.
.. attribute:: grid
A :class:`.Grid` object suitable for use with the cascade.
"""
    @staticmethod
    def _split_text_content(n):
        return n.text.strip().split(' ')

    @classmethod
    def load(cls, fname):
        """
        Parse an OpenCV haar cascade XML file.
        """
        root = xml.etree.ElementTree.parse(fname)
        width = int(root.find('./cascade/width').text.strip())
        height = int(root.find('./cascade/height').text.strip())

        stages = []
        for stage_node in root.findall('./cascade/stages/_'):
            stage_threshold = float(
                stage_node.find('./stageThreshold').text.strip())
            weak_classifiers = []
            for classifier_node in stage_node.findall('weakClassifiers/_'):
                sp = cls._split_text_content(
                    classifier_node.find('./internalNodes'))
                if sp[0] != "0" or sp[1] != "-1":
                    raise Exception("Only simple cascade files are supported")
                feature_idx = int(sp[2])
                threshold = float(sp[3])
                sp = cls._split_text_content(
                    classifier_node.find('./leafValues'))
                fail_val = float(sp[0])
                pass_val = float(sp[1])
                weak_classifiers.append(
                    WeakClassifier(feature_idx, threshold, fail_val, pass_val))
            stages.append(Stage(stage_threshold, weak_classifiers))

        features = []
        for feature_node in root.findall('./cascade/features/_'):
            feature = []
            tilted_node = feature_node.find('./tilted')
            if tilted_node is not None:
                tilted = bool(int(tilted_node.text))
            else:
                tilted = False
            for rect_node in feature_node.findall('./rects/_'):
                sp = cls._split_text_content(rect_node)
                x, y, w, h = (int(x) for x in sp[:4])
                weight = float(sp[4])
                feature.append(Rect(x, y, w, h, tilted, weight))
            features.append(feature)

        tilted = any(r.tilted for f in features for r in f)
        if tilted:
            grid = TiltedGrid(width, height)
        else:
            grid = SquareGrid(width, height)
        stages = stages[:]

        return cls(width, height, stages, features, tilted, grid)
    def detect(self, im, epsilon=0.00001, scale_by_std_dev=False):
        """
        Apply the cascade forwards on a potential face image.

        The algorithm is relatively slow compared to the integral image
        implementation, but is relatively terse and consequently useful for
        debugging.

        :param im:
            Image to apply the detector to.
        :param epsilon:
            Maximum rounding error to account for. This biases the classifier
            and stage thresholds towards passing. As a result, passing too
            large a value may result in false positive detections.
        :param scale_by_std_dev:
            If true, divide the input image by its standard deviation before
            processing. This simulates OpenCV's algorithm; however, the
            reverse haar mapping implemented by this script does not account
            for the standard deviation divide, so to get the forward version
            of `inverse_haar`, pass False.
        """
        im = im.astype(numpy.float64)
        im = cv2.resize(im, (self.width, self.height),
                        interpolation=cv2.INTER_AREA)
        scale_factor = numpy.std(im) if scale_by_std_dev else 256.
        im /= scale_factor * (im.shape[1] * im.shape[0])

        for stage_idx, stage in enumerate(self.stages):
            total = 0
            for classifier in stage.weak_classifiers:
                feature_array = self.grid.render_cell_vec(
                    sum(self.grid.rect_to_cell_vec(r) * r.weight
                        for r in self.features[classifier.feature_idx]),
                    im.shape[1], im.shape[0])
                if classifier.pass_val > classifier.fail_val:
                    thr = classifier.threshold - epsilon
                else:
                    thr = classifier.threshold + epsilon
                if numpy.sum(feature_array * im) >= thr:
                    total += classifier.pass_val
                else:
                    total += classifier.fail_val
            if total < stage.threshold - epsilon:
                return -stage_idx
        return 1
class CascadeModel(Model):
    """
    Model of the variables and constraints associated with a Haar cascade.

    This is in fact a wrapper around a docplex model.

    .. attribute:: cell_vars

        List of variables corresponding with the cells in the cascade's grid.

    .. attribute:: feature_vars

        Dict of feature indices to binary variables. Each variable represents
        whether the corresponding feature is present.

    .. attribute:: cascade

        The underlying :class:`.Cascade`.
    """
    def __init__(self, cascade, docloud_context):
        """Make a model from a :class:`.Cascade`."""
        super(CascadeModel, self).__init__("Inverse haar cascade",
                                           docloud_context=docloud_context)
        cell_vars = [self.continuous_var(
                         name=cascade.grid.cell_names[i],
                         lb=0., ub=MAX_PIXEL_VALUE)
                     for i in range(cascade.grid.num_cells)]
        feature_vars = {idx: self.binary_var(name="feature_{}".format(idx))
                        for idx in range(len(cascade.features))}

        for stage in cascade.stages:
            # Add constraints for the feature vars.
            #
            # If the classifier's pass value is greater than its fail value,
            # then add a constraint equivalent to the following:
            #
            #   feature var set => corresponding feature is present in image
            #
            # Conversely, if the classifier's pass value is less than its fail
            # value, add a constraint equivalent to:
            #
            #   corresponding feature is present in image => feature var set
            for classifier in stage.weak_classifiers:
                feature_vec = numpy.sum(
                    cascade.grid.rect_to_cell_vec(r) * r.weight
                    for r in cascade.features[classifier.feature_idx])
                feature_vec /= (cascade.width * cascade.height)
                thr = classifier.threshold
                feature_var = feature_vars[classifier.feature_idx]
                feature_val = sum(cell_vars[i] * feature_vec[i]
                                  for i in numpy.argwhere(
                                      feature_vec != 0.).flatten())
                if classifier.pass_val >= classifier.fail_val:
                    big_num = 0.1 + thr - numpy.sum(numpy.min(
                        [feature_vec, numpy.zeros(feature_vec.shape)],
                        axis=0))
                    self.add_constraint(feature_val - feature_var * big_num >=
                                        thr - big_num)
                else:
                    big_num = 0.1 + numpy.sum(numpy.max(
                        [feature_vec, numpy.zeros(feature_vec.shape)],
                        axis=0)) - thr
                    self.add_constraint(feature_val - feature_var * big_num <=
                                        thr)

            # Enforce that the sum of features present in this stage exceeds
            # the stage threshold.
            fail_val_total = sum(c.fail_val for c in stage.weak_classifiers)
            adjusted_stage_threshold = stage.threshold
            self.add_constraint(sum((c.pass_val - c.fail_val) *
                                    feature_vars[c.feature_idx]
                                    for c in stage.weak_classifiers) >=
                                adjusted_stage_threshold - fail_val_total)

        self.cascade = cascade
        self.cell_vars = cell_vars
        self.feature_vars = feature_vars

    def set_best_objective(self, minimize=False):
        """
        Amend the model with an objective.

        The objective used is to maximise the score from each stage of the
        cascade.
        """
        self.set_objective("min" if minimize else "max",
                           sum((c.pass_val - c.fail_val) *
                               self.feature_vars[c.feature_idx]
                               for s in self.cascade.stages
                               for c in s.weak_classifiers))
def inverse_haar(cascade, min_optimize=False, max_optimize=False,
                 time_limit=None, docloud_context=None, lp_path=None):
    """
    Invert a haar cascade.

    :param cascade:
        A :class:`.Cascade` to invert.
    :param min_optimize:
        Attempt to find the solution which exceeds the stage constraints as
        little as possible.
    :param max_optimize:
        Attempt to find the solution which exceeds the stage constraints as
        much as possible.
    :param time_limit:
        Maximum time to allow the solver to work, in seconds.
    :param docloud_context:
        :class:`docplex.mp.context.DOcloudContext` to use for solving.
    :param lp_path:
        File to write the LP constraints to. Useful for debugging. (Optional.)
    """
    if min_optimize and max_optimize:
        raise ValueError("Cannot pass both min_optimize and max_optimize")

    cascade_model = CascadeModel(cascade, docloud_context)
    if min_optimize or max_optimize:
        cascade_model.set_best_objective(minimize=min_optimize)
    if time_limit is not None:
        cascade_model.set_time_limit(time_limit)
    cascade_model.print_information()
    if lp_path:
        cascade_model.export_as_lp(path=lp_path)
    if not cascade_model.solve():
        raise Exception("Failed to find solution")

    sol_vec = numpy.array([v.solution_value / MAX_PIXEL_VALUE
                           for v in cascade_model.cell_vars])
    im = cascade_model.cascade.grid.render_cell_vec(sol_vec,
                                                    10 * cascade.width,
                                                    10 * cascade.height)
    im = (im * 255.).astype(numpy.uint8)

    return im
if __name__ == "__main__":
    import argparse

    parser = argparse.ArgumentParser(
        description='Inverse haar feature object detection')
    parser.add_argument('-c', '--cascade', type=str, required=True,
                        help='OpenCV cascade file to be reversed.')
    parser.add_argument('-o', '--output', type=str, required=True,
                        help='Output image name')
    parser.add_argument('-t', '--time-limit', type=float, default=None,
                        help='Maximum time to allow the solver to work, in '
                             'seconds.')
    parser.add_argument('-O', '--optimize', nargs='?', type=str, const='max',
                        help='Try and find the "best" solution, rather than '
                             'just a feasible solution. Pass "min" to find '
                             'the least best solution.')
    parser.add_argument('-C', '--check', action='store_true',
                        help='Check the result against the (forward) cascade.')
    parser.add_argument('-l', '--lp-path', type=str, default=None,
                        help='File to write LP constraints to.')
    args = parser.parse_args()

    print "Loading cascade..."
    cascade = Cascade.load(args.cascade)

    docloud_context = DOcloudContext.make_default_context(DOCLOUD_URL)
    docloud_context.print_information()
    env = Environment()
    env.print_information()

    print "Solving..."
    im = inverse_haar(cascade,
                      min_optimize=(args.optimize == "min"),
                      max_optimize=(args.optimize == "max"),
                      time_limit=args.time_limit,
                      docloud_context=docloud_context,
                      lp_path=args.lp_path)
    cv2.imwrite(args.output, im)
    print "Wrote {}".format(args.output)

    if args.check:
        print "Checking..."
        ret = cascade.detect(im)
        if ret != 1:
            print "Image failed the forward cascade at stage {}".format(-ret)
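The `Cascade.load` method above walks the cascade XML with `find`/`findall` and splits the whitespace-separated `internalNodes`/`leafValues` text. A standalone, stdlib-only sketch of the same parsing on a toy snippet (the XML below is made up for illustration, not a real OpenCV cascade):

```python
import xml.etree.ElementTree

# Toy snippet in the OpenCV cascade layout that Cascade.load() traverses.
snippet = """
<opencv_storage><cascade>
  <width>20</width><height>20</height>
  <stages>
    <_><stageThreshold>-1.25</stageThreshold>
      <weakClassifiers>
        <_><internalNodes>0 -1 3 0.5</internalNodes>
           <leafValues>-0.9 0.8</leafValues></_>
      </weakClassifiers></_>
  </stages>
</cascade></opencv_storage>
"""

root = xml.etree.ElementTree.fromstring(snippet)
width = int(root.find('./cascade/width').text.strip())
node = root.find('./cascade/stages/_/weakClassifiers/_/internalNodes')
sp = node.text.strip().split(' ')   # same split as _split_text_content()
print(width, sp)  # → 20 ['0', '-1', '3', '0.5']
```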


# ===== File: ex048.py (CarlosEduardoAS/Python-exercicios, MIT) =====
s = 0
cont = 0
for c in range(1, 501, 2):
    if c % 3 == 0:
        cont += 1
        s += c
print('The sum of all {} odd multiples of 3 between 1 and 500 is {}.'.format(cont, s))
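The same count and sum can be computed in one pass with a comprehension; a minimal equivalent sketch:

```python
# Comprehension-based version of the loop: odd multiples of 3 below 501.
odds_div3 = [c for c in range(1, 501, 2) if c % 3 == 0]
print(len(odds_div3), sum(odds_div3))  # → 83 20667
```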


# ===== File: app.py (ruturajshete1008/Heart-health-prediction, MIT) =====
from flask import Flask, jsonify, render_template
import pandas as pd
import os
import pymongo
from flask import send_from_directory
from pymongo import MongoClient

# initialize flask app
app = Flask(__name__)
app.config['JSON_SORT_KEYS'] = False

# read the data and merge it
df_labels = pd.read_csv('train_labels.csv')
df_values = pd.read_csv('train_values.csv')
merged_df = pd.merge(df_values, df_labels, how='inner', on='patient_id')

# filter dataframe for patients with and without heart disease
merged_df_1 = merged_df.drop(merged_df.index[(merged_df.heart_disease_present.eq(0))])
merged_df_0 = merged_df.drop(merged_df.index[(merged_df.heart_disease_present.eq(1))])

conn = os.environ.get('MONGODB_URI')
if not conn:
    conn = 'mongodb://localhost:27017/'
client = MongoClient(conn)
db = client.heart_data
collection = db.train_values

listt = []
for obj in collection.find():
    obj.pop("_id")
    listt.append(obj)

# build out the routes
@app.route('/')
def home():
    return render_template('index.html')

@app.route('/favicon.ico')
def favicon():
    return send_from_directory(os.path.join(app.root_path, 'static', 'images'),
                               'favicon.ico', mimetype='image/png')

@app.route('/analysis')
def analysis():
    return render_template('analysis.html')

@app.route('/prediction')
def predict():
    return render_template('health-prediction.html')

@app.route('/data')
def data():
    return render_template('data.html')

@app.route('/chart')
def chart():
    # build a dictionary to jsonify into a route
    my_data = {"age_hd": list(merged_df_1['age']), "age_no_hd": list(merged_df_0['age'])}
    return jsonify(my_data)

@app.route('/table')
def tab_content():
    return jsonify(listt)

if __name__ == '__main__':
    app.run(debug=True)
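Documents fetched from MongoDB carry an `_id` field (a BSON ObjectId) that the standard JSON encoder cannot serialize, which is why the loop above pops it before the docs reach `jsonify`. A framework-free illustration (`FakeObjectId` is a made-up stand-in for `bson.ObjectId`):

```python
import json

class FakeObjectId:              # hypothetical stand-in for bson.ObjectId
    pass

docs = [{"_id": FakeObjectId(), "patient_id": "p1", "age": 45}]
cleaned = []
for doc in docs:
    doc.pop("_id")               # drop the key json.dumps would choke on
    cleaned.append(doc)
print(json.dumps(cleaned))       # → [{"patient_id": "p1", "age": 45}]
```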


# ===== File: train.py (ex7763/pytorch-HED, MIT) =====
import torch
import yaml
import argparse

from dataset.BSD500 import BSD500Dataset
from models.HED import HED

###############
# parse cfg
###############
parser = argparse.ArgumentParser()
parser.add_argument('--cfg', dest='cfg', required=True, help='path to config file')
args = parser.parse_args()
# print(args)

cfg_file = args.cfg
print('cfg_file: ', cfg_file)
with open('config/' + cfg_file, 'r') as f:
    cfg = yaml.safe_load(f)  # safe_load: required Loader-less form in PyYAML >= 6
print(cfg)

########################################
model = HED(cfg)


# ===== File: geinos/app/core/radius/radius.py (falhenaki/GEINOS, MIT) =====
from sqlalchemy import *
from sqlalchemy import Column, String
from sqlalchemy.ext.declarative import declarative_base

from app.core.sqlalchemy_base.augmented_base import CustomMixin

Base = declarative_base()


class Radius(CustomMixin, Base):
    __tablename__ = "Radius"

    host = Column(String, primary_key=True)
    port = Column(Integer)
    secret = Column(String)

    # ----------------------------------------------------------------------
    def __init__(self, secret, host, port):
        self.secret = secret
        self.host = host
        self.port = port


# ===== File: regtests/list/slice.py (ahakingdom/Rusthon, BSD-3-Clause) =====
from runtime import *
"""list slice"""
class XXX:
def __init__(self):
self.v = range(10)
def method(self, a):
return a
def main():
a = range(10)[:-5]
assert( len(a)==5 )
assert( a[4]==4 )
print '--------'
b = range(10)[::2]
print b
assert( len(b)==5 )
assert( b[0]==0 )
assert( b[1]==2 )
assert( b[2]==4 )
assert( b[3]==6 )
assert( b[4]==8 )
#if BACKEND=='DART':
# print(b[...])
#else:
# print(b)
c = range(20)
d = c[ len(b) : ]
#if BACKEND=='DART':
# print(d[...])
#else:
# print(d)
assert( len(d)==15 )
x = XXX()
e = x.v[ len(b) : ]
assert( len(e)==5 )
f = x.method( x.v[len(b):] )
assert( len(f)==5 )
main()
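The regtest above exercises Python 2 slice semantics (`[:-5]`, `[::2]`, `[n:]`). The same slices behave identically in Python 3, except that `range` is lazy and must be wrapped in `list()` first; a minimal sketch:

```python
# Python 3 version of the first two slice checks above.
a = list(range(10))[:-5]   # drop the last five elements
b = list(range(10))[::2]   # every second element
print(a, b)  # → [0, 1, 2, 3, 4] [0, 2, 4, 6, 8]
```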


# ===== File: forums/migrations/0007_auto_20191203_0820.py (phiratio/django-forums-app, MIT) =====
# Generated by Django 2.2.5 on 2019-12-03 08:20
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    dependencies = [
        ('forums', '0006_auto_20191203_0758'),
    ]

    operations = [
        migrations.AlterField(
            model_name='post',
            name='thread',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE,
                                    related_name='posts', to='forums.Thread'),
        ),
    ]


# ===== File: app/middlewares/apikey_auth.py (meongbego/IOT_ADRINI, MIT) =====
from functools import wraps
from app.helpers.rest import *
from app import redis_store
from flask import request
from app.models import model as db
import hashlib


def apikey_required(f):
    @wraps(f)
    def decorated_function(*args, **kwargs):
        if 'apikey' not in request.headers:
            return response(400, message=" Invalid access apikey ")
        else:
            access_token = db.get_by_id(
                table="tb_channels",
                field="channels_key",
                value=request.headers['apikey']
            )
            if not access_token:
                return response(400, message=" Invalid access apikey ")
        return f(*args, **kwargs)
    return decorated_function
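The `@wraps(f)` call above preserves the wrapped view's `__name__` and docstring, which Flask relies on for endpoint names. A dependency-free sketch of the same guard structure (the `key_required`/`fake_headers` names are illustrative, standing in for the Flask `request.headers` lookup):

```python
from functools import wraps

def key_required(headers):               # hypothetical framework-free analogue
    def decorator(f):
        @wraps(f)
        def decorated_function(*args, **kwargs):
            if 'apikey' not in headers:
                return (400, "Invalid access apikey")
            return f(*args, **kwargs)
        return decorated_function
    return decorator

fake_headers = {'apikey': 'secret'}

@key_required(fake_headers)
def handler():
    """Protected endpoint."""
    return (200, "ok")

print(handler(), handler.__name__)  # → (200, 'ok') handler
```

Without `wraps`, `handler.__name__` would report `decorated_function`, breaking route registration when the decorator is applied to several views.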


# ===== File: rel2/bluecat_app/bin/bluecat/entity.py (mheidir/BlueCatSG-SplunkApp-UnOfficial, Apache-2.0) =====
from suds import WebFault
from api_exception import api_exception
from util import *
from version import version
from wrappers.generic_setters import *


class entity(object):
    """Instantiate an entity. Entities are hashable and comparable with the = operator.

    :param api: API instance used by the entity to communicate with BAM.
    :param soap_entity: the SOAP (suds) entity returned by the BAM API.
    :param soap_client: the suds client instance.
    """
    def __init__(self, api, soap_entity, soap_client, ver=''):
        self._api = api
        if not ver:
            self._version = api.get_version()
        else:
            self._version = version(ver)
        if (self._version >= '8.1.0'):
            self._none_parameter = ''
        else:
            self._none_parameter = None
        self._soap_entity = soap_entity
        self._soap_client = soap_client
        self._properties = {}
        self._immutable_properties = ['parentId', 'parentType']
        if 'properties' in self._soap_entity and self._soap_entity['properties'] is not None:
            self._properties = properties_to_map(self._soap_entity['properties'])
    # BAM entity type name constants.
    Entity = 'Entity'
    Configuration = 'Configuration'
    View = 'View'
    Zone = 'Zone'
    InternalRootZone = 'InternalRootZone'
    ZoneTemplate = 'ZoneTemplate'
    EnumZone = 'EnumZone'
    EnumNumber = 'EnumNumber'
    RPZone = 'RPZone'
    HostRecord = 'HostRecord'
    AliasRecord = 'AliasRecord'
    MXRecord = 'MXRecord'
    TXTRecord = 'TXTRecord'
    SRVRecord = 'SRVRecord'
    GenericRecord = 'GenericRecord'
    HINFORecord = 'HINFORecord'
    NAPTRRecord = 'NAPTRRecord'
    RecordWithLink = 'RecordWithLink'
    ExternalHostRecord = 'ExternalHostRecord'
    StartOfAuthority = 'StartOfAuthority'
    IP4Block = 'IP4Block'
    IP4Network = 'IP4Network'
    IP6Block = 'IP6Block'
    IP6Network = 'IP6Network'
    IP4NetworkTemplate = 'IP4NetworkTemplate'
    DHCP4Range = 'DHCP4Range'
    IP4Address = 'IP4Address'
    IP6Address = 'IP6Address'
    InterfaceID = 'InterfaceID'
    MACPool = 'MACPool'
    DenyMACPool = 'DenyMACPool'
    MACAddress = 'MACAddress'
    TagGroup = 'TagGroup'
    Tag = 'Tag'
    User = 'User'
    UserGroup = 'UserGroup'
    Server = 'Server'
    NetworkServerInterface = 'NetworkServerInterface'
    PublishedServerInterface = 'PublishedServerInterface'
    NetworkInterface = 'NetworkInterface'
    VirtualInterface = 'VirtualInterface'
    LDAP = 'LDAP'
    Kerberos = 'Kerberos'
    Radius = 'Radius'
    TFTPGroup = 'TFTPGroup'
    TFTPFolder = 'TFTPFolder'
    TFTPFile = 'TFTPFile'
    TFTPDeploymentRole = 'TFTPDeploymentRole'
    DeploymentRole = 'DNSDeploymentRole'
    DHCPDeploymentRole = 'DHCPDeploymentRole'
    DNSOption = 'DNSOption'
    DHCPV4ClientOption = 'DHCPV4ClientOption'
    DHCPServiceOption = 'DHCPServiceOption'
    DHCPV6ClientOption = 'DHCPV6ClientOption'
    DHCPV6ServiceOption = 'DHCPV6ServiceOption'
    VendorProfile = 'VendorProfile'
    VendorOptionDef = 'VendorOptionDef'
    VendorClientOption = 'VendorClientOption'
    CustomOptionDef = 'CustomOptionDef'
    DHCPMatchClass = 'DHCPMatchClass'
    DHCPSubClass = 'DHCPSubClass'
    Device = 'Device'
    DeviceType = 'DeviceType'
    DeviceSubtype = 'DeviceSubtype'
    DeploymentScheduler = 'DeploymentScheduler'
    IP4ReconciliationPolicy = 'IP4ReconciliationPolicy'
    DNSSECSigningPolicy = 'DNSSECSigningPolicy'
    IP4IPGroup = 'IP4IPGroup'
    ResponsePolicy = 'ResponsePolicy'
    KerberosRealm = 'KerberosRealm'
    DHCPRawOption = 'DHCPRawOption'
    DHCPV6RawOption = 'DHCPV6RawOption'
    DNSRawOption = 'DNSRawOption'
    DHCP6Range = 'DHCP6Range'
    ACL = 'ACL'
    TSIGKey = 'TSIGKey'
    def __hash__(self):
        return hash(self.get_id())

    def __eq__(self, other):
        return self.get_id() == other.get_id()

    def get_url(self):
        return self._api.get_url()

    def get_id(self):
        """Get the BAM ID of an entity.
        """
        return self._soap_entity['id']

    def is_null(self):
        """Is this the null entity? (ID == 0).
        """
        return 'id' not in self._soap_entity or self._soap_entity['id'] == 0

    def get_name(self):
        """Get the BAM name of the entity.
        """
        if 'name' in self._soap_entity:
            return self._soap_entity['name']
        else:
            return None

    def get_type(self):
        """Get the BAM type of the entity.
        """
        return self._soap_entity['type']

    def get_properties(self):
        """Get the properties of the entity in the form of a dictionary containing one entry per property.
        """
        return self._properties

    def get_property(self, name):
        """Get a single named property for the entity or None if not defined.
        """
        if name in self._properties:
            return self._properties[name]
        else:
            return None

    def get_parent(self):
        """Get the parent entity or None if the entity is at the top of the hierarchy.
        """
        try:
            res = self._api.instantiate_entity(self._soap_client.service.getParent(self.get_id()), self._soap_client)
            return None if res.get_id() == 0 else res
        except WebFault as e:
            raise api_exception(e.message)

    def get_parent_of_type(self, type):
        """Walk up the entity hierarchy and return the first parent entity of the given type or, if none was found, None.
        """
        parent = self
        count = 0
        while count < 100:
            parent = parent.get_parent()
            if parent.is_null():
                raise api_exception('No parent of type %s found.' % type)
            if parent.get_type() == type:
                try:
                    return self._api.instantiate_entity(self._soap_client.service.getEntityById(parent.get_id()),
                                                        self._soap_client)
                except WebFault as e:
                    raise api_exception(e.message)
            count += 1  # bound the walk so a cyclic hierarchy cannot loop forever
        if count >= 100:
            raise api_exception('API failure, no parent of type %s found.' % type)
    def get_children_of_type(self, type, max_results=500):
        """Get all the immediate children of an entity of the given type.
        """
        try:
            res = []
            s = self._soap_client.service.getEntities(self.get_id(), type, 0, max_results)
            if not has_response(s):
                return res
            else:
                for dr in s.item:
                    res.append(self._api.instantiate_entity(dr, self._soap_client))
                return res
        except WebFault as e:
            raise api_exception(e.message)

    def get_linked_entities(self, type, max_results=500):
        """Get all the linked entities of a given type.
        """
        try:
            res = []
            s = self._soap_client.service.getLinkedEntities(self.get_id(), type, 0, max_results)
            if not has_response(s):
                return res
            else:
                for dr in s.item:
                    res.append(self._api.instantiate_entity(dr, self._soap_client))
                return res
        except WebFault as e:
            raise api_exception(e.message)

    def get_child_by_name(self, name, type):
        """Get a specific named immediate child entity of a given type.
        """
        try:
            res = self._soap_client.service.getEntityByName(self.get_id(), name, type)
            if not has_response(res):
                return None
            else:
                return self._api.instantiate_entity(res, self._soap_client)
        except WebFault as e:
            raise api_exception(e.message)

    def set_property(self, name, value):
        """Set a property value. The change is not persisted until update() is called.
        """
        self._properties[name] = value

    def update(self):
        """Persist any changes to the entity to the BAM database.
        """
        s = ''
        for k, v in self._properties.items():
            if k not in self._immutable_properties:
                s += k + '=' + v + '|'
        self._soap_entity['properties'] = s
        try:
            self._soap_client.service.update(self._soap_entity)
        except WebFault as e:
            raise api_exception(e.message)

    def delete(self):
        """Delete the entity from the BAM database.
        """
        try:
            delete_entity(self._soap_client, self.get_id(), self._version)
        except WebFault as e:
            raise api_exception(e.message)

    def dump(self):
        """Dump out details of the entity to stdout. Useful for debug.
        """
        print self._soap_entity

    def get_deployment_roles(self, types=[]):
        """Get deployment roles for the entity.

        :param types: An optional list of deployment role types (documented in the deployment_role class). If the list is empty all types are returned.
        """
        try:
            res = []
            s = self._soap_client.service.getDeploymentRoles(self.get_id())
            if has_response(s):
                for dr in self._soap_client.service.getDeploymentRoles(self.get_id()).item:
                    if len(types) > 0 and dr['type'] in types:
                        res.append(self._api.instantiate_entity(dr, self._soap_client))
            return res
        except WebFault as e:
            raise api_exception(e.message)
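`update()` flattens the properties dict back into BAM's `key=value|` wire format, the inverse of the `properties_to_map` helper imported from `util`. A self-contained sketch of that round trip (both helper functions here are illustrative stand-ins, not the real `util` code):

```python
def props_to_map(s):
    # Stand-in for util.properties_to_map: parse "k=v|k2=v2|" into a dict.
    return dict(pair.split('=', 1) for pair in s.split('|') if pair)

def map_to_props(d):
    # Mirrors the serialization loop inside entity.update().
    out = ''
    for k, v in d.items():
        out += k + '=' + v + '|'
    return out

wire = 'ttl=300|comment=web server|'
props = props_to_map(wire)
print(props)                 # → {'ttl': '300', 'comment': 'web server'}
print(map_to_props(props))   # → ttl=300|comment=web server|
```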


# ===== File: selenium/load-html-from-string-instead-of-url/main.py (whitmans-max/python-examples, MIT) =====
#!/usr/bin/env python3
# date: 2019.11.24

import selenium.webdriver

driver = selenium.webdriver.Firefox()

html_content = """
<div class=div1>
  <ul>
    <li>
      <a href='path/to/div1stuff/1'>Generic string 1</a>
      <a href='path/to/div1stuff/2'>Generic string 2</a>
      <a href='path/to/div1stuff/3'>Generic string 3</a>
    </li>
  </ul>
</div>
<div class=div2>
  <ul>
    <li>
      <a href='path/to/div2stuff/1'>Generic string 1</a>
      <a href='path/to/div2stuff/2'>Generic string 2</a>
      <a href='path/to/div2stuff/3'>Generic string 3</a>
    </li>
  </ul>
</div>
"""

driver.get("data:text/html;charset=utf-8," + html_content)

elements = driver.find_elements_by_css_selector("div.div2 a")
for x in elements:
    print(x.get_attribute('href'))

item = driver.find_element_by_xpath("//div[@class='div2']//a[contains(text(),'Generic string 2')]")
print(item.get_attribute('href'))
item.click()
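The script loads HTML by concatenating it straight into a `data:` URL. For HTML containing characters like `#`, `?`, or non-ASCII text, the payload should generally be percent-encoded first so the browser does not misparse the URL; a stdlib-only sketch (no browser needed, the `html` string is just an example):

```python
from urllib.parse import quote

html = "<p class='x'>hello #1</p>"
data_url = "data:text/html;charset=utf-8," + quote(html)
print(data_url)
```

`quote` escapes `#` as `%23`, which would otherwise be treated as a URL fragment delimiter and silently truncate the document.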


# ===== File: src/oci_cli/cli_clients.py (honzajavorek/oci-cli, Apache-2.0) =====
# coding: utf-8
# Copyright (c) 2016, 2019, Oracle and/or its affiliates. All rights reserved.
import os
import pkgutil
from os.path import abspath
from inspect import getsourcefile
CLIENT_MAP = {}
MODULE_TO_TYPE_MAPPINGS = {}
ALL_SERVICES_DIR = "services"
this_file_path = abspath(getsourcefile(lambda: 0))
if "site-packages" in this_file_path or "dist-packages" in this_file_path:
python_cli_root_dir = this_file_path[0:this_file_path.index("oci_cli")]
else:
python_cli_root_dir = this_file_path[0:this_file_path.index("/src/oci_cli")]
services_dir = os.path.join(python_cli_root_dir, ALL_SERVICES_DIR)
# Import client mappings from platformization directories.
# This imports the generated client_mappings which populates CLIENT_MAP and MODULE_TO_TYPE_MAPPINGS.
for importer1, modname1, ispkg1 in pkgutil.iter_modules(path=[services_dir]):
for importer, modname, ispkg in pkgutil.iter_modules(path=[services_dir + '/' + modname1 + '/src']):
if ispkg and modname.startswith("oci_cli_"):
oci_cli_module_name = modname.split(".")[0]
service_name = oci_cli_module_name[8:]
oci_cli_module = __import__(ALL_SERVICES_DIR + '.' + modname1 + '.src.' + oci_cli_module_name)
            # __import__('services.x.src.y') returns the top-level 'services'
            # package, so __path__[0] recomputes the services directory
            services_dir = oci_cli_module.__path__[0]
service_dir = os.path.join(services_dir, modname1, 'src', oci_cli_module_name)
generated_module = "client_mappings"
if os.path.isfile(os.path.join(service_dir, 'generated', generated_module + ".py")):
__import__(ALL_SERVICES_DIR + '.' + modname1 + '.src.' + oci_cli_module_name + ".generated." + generated_module)
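The discovery loop above hinges on `pkgutil.iter_modules`, which reports a directory as a package only when it contains an `__init__.py`. A self-contained sketch of the same two-level scan against a throwaway tree (the service name `compute` is made up for illustration):

```python
import os
import pkgutil
import tempfile

# Build a throwaway tree that mimics services/<service>/src/oci_cli_<service>/
root = tempfile.mkdtemp()
pkg_dir = os.path.join(root, "compute", "src", "oci_cli_compute")
os.makedirs(pkg_dir)
open(os.path.join(root, "compute", "__init__.py"), "w").close()
open(os.path.join(pkg_dir, "__init__.py"), "w").close()

found = []
for _, modname1, _ in pkgutil.iter_modules(path=[root]):
    src_dir = os.path.join(root, modname1, "src")
    for _, modname, ispkg in pkgutil.iter_modules(path=[src_dir]):
        if ispkg and modname.startswith("oci_cli_"):
            found.append((modname1, modname))

print(found)  # [('compute', 'oci_cli_compute')]
```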
# File: main.py (neohanju/GoogleImageSearchDownload @ 09344be0, MIT)
# reference: http://icrawler.readthedocs.io/en/latest/usage.html
from icrawler.builtin import GoogleImageCrawler
import os
dataset_base_dir = 'D:/Workspace/Dataset/fake_image_detection/task_2'
keyword_lists = ['snapchat face swap', 'MSQRD']
for keyword in keyword_lists:
folder_path = dataset_base_dir + '/' + keyword
if not os.path.exists(folder_path):
os.makedirs(folder_path)
print(folder_path + ' is created!')
else:
pass
google_crawler = GoogleImageCrawler(parser_threads=2, downloader_threads=4,
storage={'root_dir': folder_path})
    keyword_comma = keyword.replace(' ', ',')  # note: computed but never passed to crawl() below
google_crawler.crawl(keyword=keyword, max_num=10000)
print('Crawling ' + keyword + ' is done')
# ()()
# ('')HAANJU.YOO
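As a side note (not part of the original script), the existence check with the empty `else: pass` branch can be collapsed on Python 3 by passing `exist_ok=True` to `os.makedirs`:

```python
import os
import tempfile

folder_path = os.path.join(tempfile.mkdtemp(), "snapchat face swap")

# exist_ok=True makes the call idempotent, so no prior existence check is needed
os.makedirs(folder_path, exist_ok=True)
os.makedirs(folder_path, exist_ok=True)  # second call is a no-op, not an error
print(os.path.isdir(folder_path))  # True
```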
# File: src/daipeproject/silver/01_some_notebook.py (DataSentics/daipe-bad-practices-1 @ a23c1b42, MIT)
# Databricks notebook source
# MAGIC %run ../app/bootstrap
# COMMAND ----------
from pyspark.sql.dataframe import DataFrame
from datalakebundle.imports import transformation
# COMMAND ----------
datasets = [
{
"id": "123",
"name": "knihydobrovsky_cz",
"custom_attrs": {
105: "EXT_ID",
104: "ADFORM_ID",
2: "GA_ID",
},
},
{
"id": "4564",
"name": "knihomol_cz",
"custom_attrs": {
3: "EXT_ID",
2: "GA_ID",
},
},
]
# TODO 2: pulling the config and passing it around via a global variable
@transformation("%datalake.base_base_path%")
def get_config(base_base_path: str):
return base_base_path
base_path = get_config.result
# TODO 1: loop
for dataset in datasets:
# TODO 3: use logger instead of print
print(dataset['name'])
dataset_name = dataset['name']
@transformation()
def load_visits():
return spark.read.format("delta").load(base_path + "/bronze/raw/visits/" + dataset_name)
def load_custom_attrs():
return spark.read.format("delta").load(base_path + "/bronze/raw/custom_attrs/" + dataset_name)
# TODO 4: rule of thumb: one notebook should always produce/output one dataset
@transformation(load_visits)
def save_visits(df: DataFrame):
df.write.format("delta").save(base_path + "/silver/parsed/visits/" + dataset_name, mode="append")
@transformation(load_custom_attrs)
def save_custom_attrs(df: DataFrame):
df.write.format("delta").save(base_path + "/silver/parsed/custom_attrs/" + dataset_name, mode="append")
# File: tests/pyspark_utils/test_convert_cerberus_schema_to_pyspark.py (ONS-SST/cis_households @ e475df59, MIT)
from pyspark.sql.types import StructField
from cishouseholds.pyspark_utils import convert_cerberus_schema_to_pyspark
def test_conversion():
cerberus_schema = {"id": {"type": "string"}, "whole_number": {"type": "integer"}}
pyspark_schema = convert_cerberus_schema_to_pyspark(cerberus_schema)
assert len(pyspark_schema) == len(cerberus_schema)
assert sorted([column_schema.name for column_schema in pyspark_schema]) == sorted(cerberus_schema.keys())
assert all(isinstance(column_schema, StructField) for column_schema in pyspark_schema)
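A hypothetical sketch of what such a conversion involves (the real implementation lives in `cishouseholds.pyspark_utils` and is not shown here; plain strings stand in for pyspark type objects so the sketch has no Spark dependency):

```python
# Hypothetical mapping from cerberus type names to Spark SQL type names.
CERBERUS_TO_SPARK = {
    "string": "StringType",
    "integer": "IntegerType",
    "float": "DoubleType",
    "boolean": "BooleanType",
}

def convert_schema(cerberus_schema):
    """Return (column, spark_type) pairs for each field in a cerberus schema."""
    return [
        (name, CERBERUS_TO_SPARK[rules["type"]])
        for name, rules in sorted(cerberus_schema.items())
    ]

print(convert_schema({"id": {"type": "string"}, "whole_number": {"type": "integer"}}))
# [('id', 'StringType'), ('whole_number', 'IntegerType')]
```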
# File: inference.py (andreasr27/bidding_simulator @ c2e1665b, Apache-2.0)
#!/usr/bin/python
import regret as r
import sys
import os
n = int(sys.argv[1])
fout = open(sys.argv[3], 'w')
print >>fout, n
for i in range(0, n):
    print >>fout, i, r.mult_valuation(sys.argv[2], i)
# File: tests/__init__.py (jfardello/dyn53 @ bf40f9f9, MIT)
import unittest
from . import test_cli, test_client
def suite():
test_suite = unittest.TestSuite()
test_suite.addTests(unittest.makeSuite(test_cli.TestCli))
test_suite.addTests(unittest.makeSuite(test_client.TestClient))
return test_suite
if __name__ == '__main__':
unittest.TextTestRunner(verbosity=2).run(suite())
# File: tests/seahub/invitations/test_views.py (Xandersoft/seahub @ f75f238b, Apache-2.0 / BSD-3-Clause)
from django.utils import timezone
from django.core.urlresolvers import reverse
from seahub.invitations.models import Invitation
from seahub.test_utils import BaseTestCase
class TokenViewTest(BaseTestCase):
def setUp(self):
self.accepter = 'random@foo.com'
self.iv = Invitation.objects.add(inviter=self.user.username,
accepter=self.accepter)
self.url = reverse('invitations:token_view', args=[self.iv.token])
def tearDown(self):
self.remove_user(self.accepter)
def test_get(self):
resp = self.client.get(self.url)
self.assertEqual(200, resp.status_code)
self.assertRegexpMatches(resp.content, 'Set your password')
def test_expired_token(self):
self.iv.expire_time = timezone.now()
self.iv.save()
resp = self.client.get(self.url)
self.assertEqual(404, resp.status_code)
def test_post(self):
assert self.iv.accept_time is None
resp = self.client.post(self.url, {
'password': 'passwd'
})
self.assertEqual(302, resp.status_code)
assert Invitation.objects.get(pk=self.iv.pk).accept_time is not None
def test_post_empty_password(self):
assert self.iv.accept_time is None
resp = self.client.post(self.url, {
'password': '',
})
self.assertEqual(302, resp.status_code)
assert Invitation.objects.get(pk=self.iv.pk).accept_time is None
# File: django101/django101/urls.py (nrgxtra/web_basics @ 073ccb36, MIT)
from django.contrib import admin
from django.urls import path, include
from django101 import cities
from django101.cities.views import index, list_phones, test_index, create_person
urlpatterns = [
path('admin/', admin.site.urls),
path('test/', test_index),
path('create/', create_person, name='create person'),
path('cities/', include('django101.cities.urls')),
path('', include('django101.people.urls')),
]
# File: obj_sys/obj_tools.py (weijia/obj_sys @ 7654a84f, BSD-3-Clause)
import socket
import logging
from ufs_tools import format_path
def get_fs_protocol_separator():
try:
import configurationTools as config
return config.getFsProtocolSeparator()
except ImportError:
return "://"
gUfsObjUrlPrefix = u'ufs' + get_fs_protocol_separator()
gUfsObjUrlSeparator = u'/'
log = logging.getLogger(__name__)
def is_web_url(url):
# log.error(url)
if is_ufs_url(url):
protocol = get_protocol(url)
if protocol in ["https", "http", "ftp"]:
return True
return False
def get_protocol(url):
parse_res = parse_url(url)
protocol = parse_res[0]
return protocol
def get_formatted_full_path(full_path):
return format_path(full_path)
def parse_url(url):
return url.split(get_fs_protocol_separator(), 2)
def get_hostname():
return unicode(socket.gethostname())
def get_ufs_url_for_local_path(full_path):
return gUfsObjUrlPrefix + get_hostname() + gUfsObjUrlSeparator + format_path(full_path)
def get_full_path_from_ufs_url(ufs_url):
if not is_ufs_fs(ufs_url):
        raise ValueError("not ufs url")
objPath = parse_url(ufs_url)[1]
hostname, full_path = objPath.split(gUfsObjUrlSeparator, 1)
# print hostname, full_path
if unicode(hostname) != get_hostname():
        raise ValueError('not a local file')
return full_path
def get_full_path_for_local_os(ufs_url):
url_content = parse_url(ufs_url)[1]
if '/' == url_content[0]:
# The path returned by qt is file:///d:/xxxx, so we must remove the '/' char first
return url_content[1:]
return url_content
def is_uuid(url):
return url.find(u"uuid" + get_fs_protocol_separator()) == 0
def get_url_content(url):
protocol, content = parse_url(url)
return content
def get_path_for_ufs_url(url):
url_content = get_url_content(url)
return url_content.split(gUfsObjUrlSeparator, 1)[1]
def get_uuid(url):
return get_url_content(url)
def get_url_for_uuid(id):
return u"uuid" + get_fs_protocol_separator() + id
def is_ufs_url(url):
"""
In format of xxxx://xxxx
:param url:
"""
if url.find(get_fs_protocol_separator()) == -1:
return False
else:
return True
def get_ufs_local_root_url():
return gUfsObjUrlPrefix + get_hostname() + gUfsObjUrlSeparator
def is_ufs_fs(url):
return url.find(gUfsObjUrlPrefix) == 0
def get_ufs_basename(url):
return url.rsplit(gUfsObjUrlSeparator, 1)[1]
def get_host(ufs_url):
if is_ufs_fs(ufs_url):
path_with_host = parse_url(ufs_url)[1]
return path_with_host.split(u"/")[0]
    raise ValueError("Not Ufs URL")
def is_local(ufs_url):
"""
ufs_url in format ufs://hostname/D:/tmp/xxx.xxx
"""
if get_host(ufs_url) == get_hostname():
return True
else:
print "not local", get_host(ufs_url), get_hostname()
return False
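The helpers above all revolve around URLs of the form `ufs://<hostname>/<path>`. A Python 3 re-sketch of the parsing core (the module itself is Python 2 code, using `unicode` and print statements; the function names below mirror the originals but are simplified):

```python
SEPARATOR = "://"

def parse_url(url):
    # split the scheme from the rest, mirroring the module's parse_url
    return url.split(SEPARATOR, 1)

def is_ufs_url(url):
    return SEPARATOR in url

def get_host(ufs_url):
    scheme, rest = parse_url(ufs_url)
    if scheme != "ufs":
        raise ValueError("Not a ufs URL")
    return rest.split("/", 1)[0]

url = "ufs://myhost/D:/tmp/file.txt"
print(is_ufs_url(url))  # True
print(parse_url(url))   # ['ufs', 'myhost/D:/tmp/file.txt']
print(get_host(url))    # myhost
```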
# File: src/apps/recommendations/tests/x_test_comparable_inventories.py (Remy-TPP/q-api @ 761dd2d1, MIT)
from unittest import TestCase
from apps.profiles.models import Profile
from apps.recipes.models import Recipe, Ingredient
from apps.recommendations.utils import ComparableInventory
# TODO: written for manual testing with preloaded db; for general use should create resources in setUp()
class ComparableInventoryTest(TestCase):
def setUp(self):
prof = Profile.objects.get(pk=3)
place = prof.places.first()
inv = place.inventory.all().prefetch_related('product', 'unit')
        # leftover manual-testing probe: self.inv does not exist on the first
        # setUp call, so the AttributeError here is expected and swallowed
        try:
            print('creating')
            print(f'---- {self.inv}')
            self.inv.print_inventory(product_id=329)
        except AttributeError:
            pass
self.inv = ComparableInventory(inv)
self.inv.print_inventory(product_id=329)
def tearDown(self):
print('destroying')
self.inv.destroy()
self.inv.print_inventory()
self.inv = None
print('destroyed')
def test_print(self):
self.inv.print_inventory(product_id=329)
def test_substract_ingredient(self):
# Product ID 329: pepino
ing = Ingredient.objects.filter(product_id=329)[0]
print(ing)
self.inv.print_inventory(product_id=329)
self.inv.substract(ing)
print(self.inv.inventory.get(329))
self.inv.print_inventory(product_id=329)
def test_reset(self):
ing = Ingredient.objects.filter(product_id=329)[0]
self.assertEqual(self.inv.get(329).quantity, 3)
self.inv.substract(ing)
self.assertEqual(self.inv.get(329).quantity, 2)
self.inv.substract(ing)
self.assertEqual(self.inv.get(329).quantity, 1)
self.inv.reset()
self.assertEqual(self.inv.get(329).quantity, 3)
def test_can_make_recipe(self):
# Shouldn't be able to do this
recipe1 = Recipe.objects.get(pk=313)
self.assertFalse(self.inv.can_make(recipe1))
# Should be able to make this one
recipe2 = Recipe.objects.get(pk=291)
self.assertTrue(self.inv.can_make(recipe2))
def test_can_make_multiple_times(self):
recipe = Recipe.objects.get(pk=291)
self.assertTrue(self.inv.can_make(recipe))
self.assertTrue(self.inv.can_make(recipe))
self.assertTrue(self.inv.can_make(recipe))
# File: pacote-download/pythonProject/exercicios_python_guanabara/ex019.py (oliveirajonathas/python_estudos @ 28921672, MIT)
import random
aluno1 = input('Nome aluno 1: ')
aluno2 = input('Nome aluno 2: ')
aluno3 = input('Nome aluno 3: ')
aluno4 = input('Nome aluno 4: ')
sorteado = random.choice([aluno1, aluno2, aluno3, aluno4])
print('O sorteado para apagar o quadro foi: {}'.format(sorteado))
# File: tests/test_reset_plot.py (l-johnston/toolbag @ 1bd6ca61, MIT)
"""Test reset_plot"""
import matplotlib.pyplot as plt
from toolbag import reset_plot
plt.ion()
# pylint: disable = missing-function-docstring
def test_reset_plot():
fig, ax = plt.subplots()
ax.plot([1, 2, 3])
plt.close()
reset_plot(fig)
assert id(ax) == id(fig.gca())
plt.close()
# File: af/shovel/oonipl/popen.py (mimi89999/pipeline @ 3e9eaf74, BSD-3-Clause)
#!/usr/bin/env python2.7
# -*- coding: utf-8 -*-
from subprocess import Popen, PIPE
from contextlib import contextmanager
@contextmanager
def ScopedPopen(*args, **kwargs):
proc = Popen(*args, **kwargs)
try:
yield proc
finally:
try:
proc.kill()
except Exception:
pass
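A usage sketch of the pattern above, restated inline so it runs standalone under Python 3 (the child command is made up for illustration): leaving the `with` block kills the child even though it was told to sleep for a minute.

```python
import sys
from contextlib import contextmanager
from subprocess import Popen

@contextmanager
def scoped_popen(*args, **kwargs):
    # same pattern as ScopedPopen above, restated so this sketch is self-contained
    proc = Popen(*args, **kwargs)
    try:
        yield proc
    finally:
        try:
            proc.kill()
        except Exception:
            pass

with scoped_popen([sys.executable, "-c", "import time; time.sleep(60)"]) as proc:
    pass  # leaving the block kills the child instead of waiting a minute

proc.wait()
print(proc.returncode)  # nonzero: the child was killed, not exited cleanly
```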
# File: jobs/migrations/0055_savedfeatureselection_uid.py (hotosm/hot-exports-two @ d6053044, BSD-3-Clause)
# -*- coding: utf-8 -*-
# Generated by Django 1.9 on 2017-06-26 12:06
from __future__ import unicode_literals
from django.db import migrations, models
import uuid
class Migration(migrations.Migration):
dependencies = [
('jobs', '0054_savedfeatureselection'),
]
operations = [
migrations.AddField(
model_name='savedfeatureselection',
name='uid',
field=models.UUIDField(db_index=True, default=uuid.uuid4, editable=False, unique=True),
),
]
# File: test_client.py (lilydjwg/udt_py @ 90cb36f3, BSD-3-Clause)
#!/usr/bin/env python3
import udt
import socket
import time
s = udt.socket(socket.AF_INET, socket.SOCK_STREAM, 0)
s.connect(("localhost", 5555))
print("Sending...")
s.send(b"Hello", 0)
buf = s.recv(1024, 0)
print(repr(buf))
# File: obkey_parts/__version__.py (evyd13/obkey3 @ bb49ed6d, MIT)
"""
Obkey package information.
This file is a part of Openbox Key Editor.
Code under GPL (originally MIT) from version 1.3 - 2018.
See license information in ../obkey .
"""
MAJOR = 1
MINOR = 3
PATCH = 2
__version__ = "{0}.{1}.{2}".format(MAJOR, MINOR, PATCH)
__description__ = 'Openbox Key Editor'
__long_description__ = """
A keybinding editor for OpenBox; it includes launchers and window management keys.
It allows you to:
* check almost all keybinds in one second;
* add new keybinds; the default key associated will be 'a' and no action will be associated;
* add new child keybinds;
* set up existing keybinds:
 * add/remove/sort/set up actions in the actions list;
 * change the keybind by clicking on the item in the list;
* duplicate existing keybinds;
* remove keybinds.
The current drawbacks:
* XML inclusion is not managed. If you want to edit many files, then you shall open them with `obkey <config file>.xml`;
* the `if` conditional tag is not supported (but did you know it exists).
"""
# File: DevTools/lineCount.py (spiiin/CadEditor @ b28316dd, MIT)
#!/usr/bin/env python2
from ml.preprocessing.normalization import Normalizer
from category_encoders import *
import logging
logging.getLogger().setLevel(logging.INFO)
class Preprocessing:
"""
Class to perform data preprocessing before training
"""
def clean_data(self, df: pd.DataFrame):
"""
Perform data cleansing.
Parameters
----------
df : pd.Dataframe
Dataframe to be processed
Returns
-------
pd.Dataframe
Cleaned Data Frame
"""
logging.info("Cleaning data")
df_copy = df.copy()
df_copy['Pclass'] = df_copy.Pclass.astype('object')
df_copy = df_copy.dropna()
return df_copy
def categ_encoding(self, df: pd.DataFrame):
"""
Perform encoding of the categorical variables
Parameters
----------
df : pd.Dataframe
Dataframe to be processed
Returns
-------
pd.Dataframe
Cleaned Data Frame
"""
logging.info("Category encoding")
df_copy = df.copy()
df_copy = pd.get_dummies(df_copy)
return df_copy
| 23.596154 | 59 | 0.546862 | 123 | 1,227 | 5.341463 | 0.390244 | 0.109589 | 0.060883 | 0.091324 | 0.401826 | 0.328767 | 0.273973 | 0.273973 | 0.273973 | 0.273973 | 0 | 0 | 0.356153 | 1,227 | 51 | 60 | 24.058824 | 0.831646 | 0.331703 | 0 | 0.235294 | 0 | 0 | 0.066879 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.235294 | 0 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
5bc7db691db6bd3aeb10eeebc1044c688e5b67aa | 3,124 | py | Python | DevTools/lineCount.py | spiiin/CadEditor | b28316ddd9be4e79ac5e6adf89b7751df609d94e | [
"MIT"
] | 164 | 2015-01-16T17:09:45.000Z | 2022-03-17T03:39:08.000Z | DevTools/lineCount.py | spiiin/CadEditor | b28316ddd9be4e79ac5e6adf89b7751df609d94e | [
"MIT"
] | 13 | 2015-02-20T17:07:40.000Z | 2021-07-11T05:33:17.000Z | DevTools/lineCount.py | spiiin/CadEditor | b28316ddd9be4e79ac5e6adf89b7751df609d94e | [
"MIT"
] | 23 | 2015-09-01T22:29:11.000Z | 2022-01-10T01:32:32.000Z | #!/usr/bin/env python2
#Script for calculate LoC of all source files of project
import os,string
import sys
extension_list = ['h','hpp','cpp','c','pas','dpr','asm','py','q3asm','def','sh','bat','cs','java','cl','lisp','ui',"nut"]
comment_sims = {'asm' : ';', 'py' : '#', 'cl':';','lisp':';'}
source_files = { }
exclude_names = ["libs", "release", ".git"]
if len(sys.argv)!=2:
print "You must call script as 'lineCount.py <path_to_folder>'"
raw_input()
exit(-1)
path = sys.argv[1]
files_count = 0
def calc_files_count (arg,dirname, names):
global files_count
files_count+=len(names)
def calc_strings_count(arg, dirname, names):
#print "%32s"%dirname
if any(dirname.lower().find(exclude_name) != -1 for exclude_name in exclude_names):
return
for name in names:
full_name = os.path.join(dirname,name)
file_name,file_ext = os.path.splitext(full_name)
file_ext = file_ext[1:].lower()
if comment_sims.has_key(file_ext):
comment_sim = comment_sims[file_ext]
else:
comment_sim = "//"
if file_ext in extension_list:
#.designer.cs files don't count
if file_name.lower().find(".designer") != -1:
continue
f = file(full_name)
file_text = f.readlines()
empty_lines_count = 0
comment_lines = 0
for line in file_text :
line_without_spaces = line.lstrip(string.whitespace)
if line_without_spaces=="":
empty_lines_count += 1
elif line_without_spaces.startswith(comment_sim):
comment_lines +=1
source_files[full_name]= {"full" : len(file_text) ,"empty" :empty_lines_count, "comment":comment_lines}
f.close()
def calc(path_root):
os.path.walk(path_root,calc_files_count,0)
print "Found : %4i files"%files_count
print ""
#calculate line count
os.path.walk(path_root,calc_strings_count,0)
#convert to list and sort
lst = source_files.items()
lst.sort(key = lambda (key, val): val["full"])
strings_count=0
empty_lines_count=0
comment_lines_count=0
for name,val in lst:
l_f,l_e,l_c = val["full"],val["empty"],val["comment"]
dummy,short_name = os.path.split(name)
print "%-36s : %5i (%i/%i/%i)"%(short_name,l_f, l_f-l_c-l_e,l_c,l_e )
strings_count+=l_f
empty_lines_count+=l_e
comment_lines_count+=l_c
print "\nformat -\nfilename : full_lines_count (code_lines_count/comments_count/empty_lines_count)"
print 24*"-"
print "Found : %4i files"%files_count
print "Summary : %4i lines"%strings_count
print "Code : %4i lines"%(strings_count - comment_lines_count - empty_lines_count)
print "Comments: %4i lines"%comment_lines_count
print "Empty : %4i lines"%empty_lines_count
print 24*"-"
print "%s %s %s"%( "="*24, "================", "="*24)
print "%-24s %s %24s"%( "="*3, "Spiiin LineCounter", "="*3)
print "%s %s %s"%( "="*24, "================", "="*24)
calc(path)
raw_input() | 34.32967 | 121 | 0.604033 | 440 | 3,124 | 4.045455 | 0.297727 | 0.078652 | 0.067416 | 0.033708 | 0.147191 | 0.130337 | 0.035955 | 0 | 0 | 0 | 0 | 0.019674 | 0.235275 | 3,124 | 91 | 122 | 34.32967 | 0.725408 | 0.054417 | 0 | 0.140845 | 0 | 0 | 0.168871 | 0.017294 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.028169 | null | null | 0.211268 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5bcb08c67e2fb3285c52b357489d036e67867bc6 | 1,546 | py | Python | src/dataset.py | EmanuelSamir/adaptive-learning-qpcbfclf-elm | d913a8b98600a0d844adf156700b3b69c73c694a | [
"MIT"
] | null | null | null | src/dataset.py | EmanuelSamir/adaptive-learning-qpcbfclf-elm | d913a8b98600a0d844adf156700b3b69c73c694a | [
"MIT"
] | null | null | null | src/dataset.py | EmanuelSamir/adaptive-learning-qpcbfclf-elm | d913a8b98600a0d844adf156700b3b69c73c694a | [
"MIT"
] | null | null | null | from collections import deque, namedtuple
import random
class ELMDataset:
def __init__(self, dt, features = ('x'), time_th = 0.5, maxlen = 5):
self.time_th = time_th
self._maxlen = maxlen
self.D_pre = deque()
self.D_post = deque(maxlen = self._maxlen)
self.dt = dt
self.trans = namedtuple('trans',
features)
def reset(self):
self.D_pre = deque()
self.D_post = deque(maxlen = self._maxlen)
def update(self, t, *args):
if t/self.dt < self.time_th:
self.D_pre.append(self.trans(*args))
self.D_post.append(self.trans(*args))
else:
self.D_post.append(self.trans(*args))
def shuffle(self):
random.shuffle(self.D_post)
def get_D(self, t):
if t/self.dt < self.time_th:
return self.trans(*zip(*self.D_pre))
else:
return self.trans(*zip(*self.D_post))
class NNDataset:
def __init__(self, features = ('x'), maxlen = 5):
self.D = deque(maxlen = maxlen)
self.trans = namedtuple('trans',
features) # ('x', 'k', 'dh', 'dh_e')
def reset(self):
self.D = deque()
def update(self, *args):
self.D.append(self.trans(*args))
def shuffle(self):
random.shuffle(self.D)
def get_D(self):
sample = self.trans(*zip(*self.D))
return sample
| 28.62963 | 72 | 0.510996 | 190 | 1,546 | 4.005263 | 0.205263 | 0.098555 | 0.070959 | 0.099869 | 0.550591 | 0.406045 | 0.345598 | 0.247043 | 0.247043 | 0.247043 | 0 | 0.004044 | 0.360285 | 1,546 | 54 | 73 | 28.62963 | 0.76542 | 0.015524 | 0 | 0.439024 | 0 | 0 | 0.00789 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.243902 | false | 0 | 0.04878 | 0 | 0.414634 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
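The `self.trans(*zip(*self.D))` idiom in `get_D` transposes a deque of per-step namedtuples into one namedtuple of per-field sequences, ready for batch processing. A minimal stdlib sketch of that pattern (the `x`/`k` field values below are made up for illustration):

```python
from collections import deque, namedtuple

# Hypothetical two-field transition record, mirroring the namedtuple
# that NNDataset builds from its `features` argument.
Trans = namedtuple('Trans', ('x', 'k'))

buffer = deque(maxlen=5)
for step in range(3):
    # Each update() call appends one per-step record.
    buffer.append(Trans(x=step, k=step * 10))

# zip(*...) transposes N records of M fields into M field-tuples,
# and Trans(*...) re-wraps them so batch.x / batch.k stay named.
batch = Trans(*zip(*buffer))

print(batch.x)  # (0, 1, 2)
print(batch.k)  # (0, 10, 20)
```

The same transpose trick is why `get_D` can hand a whole replay window to a learner as column vectors without any per-field bookkeeping.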
5bcd4fe94307999ec0e963c569a27c56d4a42eff | 985 | py | Python | common/stat.py | 7workday/TT | 8887a85652c387a50a65e2598abdc833400e56f3 | [
"Apache-2.0"
] | null | null | null | common/stat.py | 7workday/TT | 8887a85652c387a50a65e2598abdc833400e56f3 | [
"Apache-2.0"
] | null | null | null | common/stat.py | 7workday/TT | 8887a85652c387a50a65e2598abdc833400e56f3 | [
"Apache-2.0"
] | null | null | null | '''程序的状态码'''
OK = 0
class LogicErr(Exception):
code = None
data = None
def __init__(self,data=None):
self.data = data or self.__class__.__name__ # if data is None, use the class name as the data value
def gen_logic_err(name, code):
'''Generate a new LogicErr subclass (a factory function for LogicErr)'''
return type(name, (LogicErr,), {'code': code})
SmsErr = gen_logic_err('SmsErr', 1000) # SMS sending failed
VcodeErr = gen_logic_err('VcodeErr', 1001) # wrong verification code
LoginRequired = gen_logic_err('LoginRequired', 1002) # user not logged in
UserFormErr = gen_logic_err('UserFormErr', 1003) # user form data error
ProfileFormErr = gen_logic_err('ProfileFormErr', 1004) # user profile form error
RepeatSwipeErr = gen_logic_err('RepeatSwipeErr', 1005) # duplicate swipe error
AreadyFriends = gen_logic_err('AreadyFriends',1006) # already friends
RewindLimited = gen_logic_err('RewindLimited',1007) #daily rewind limit reached
RewindTimeout = gen_logic_err('RewindTimeout',1008) #rewind timed out
PermRequired = gen_logic_err('PermRequired',1009) #missing a required permission | 35.178571 | 85 | 0.690355 | 116 | 985 | 5.568966 | 0.482759 | 0.136223 | 0.187307 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051443 | 0.190863 | 985 | 28 | 86 | 35.178571 | 0.759097 | 0.148223 | 0 | 0 | 0 | 0 | 0.147741 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
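The `gen_logic_err` factory above builds exception subclasses dynamically with the three-argument `type()`. A minimal, self-contained sketch of the same pattern (the `DemoErr` name and code `1234` are illustrative, not part of the original module):

```python
class LogicErr(Exception):
    code = None
    data = None

    def __init__(self, data=None):
        # Fall back to the class name when no payload is given.
        self.data = data or self.__class__.__name__


def gen_logic_err(name, code):
    """Create a new LogicErr subclass via the three-argument type()."""
    return type(name, (LogicErr,), {'code': code})


DemoErr = gen_logic_err('DemoErr', 1234)  # hypothetical error, for illustration only

try:
    raise DemoErr()
except LogicErr as err:          # every generated class shares the one base
    print(err.code, err.data)    # 1234 DemoErr
```

Because all generated classes inherit from `LogicErr`, a view layer can catch the single base class and serialize `err.code`/`err.data` uniformly.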
5bcff95678159476daf98ca938631d4f52d0298a | 380 | py | Python | Aula07/chef007.py | AdryanPablo/Python | d469d394b41f44dbd753bf9a7f7eebaa81096562 | [
"MIT"
] | null | null | null | Aula07/chef007.py | AdryanPablo/Python | d469d394b41f44dbd753bf9a7f7eebaa81096562 | [
"MIT"
] | null | null | null | Aula07/chef007.py | AdryanPablo/Python | d469d394b41f44dbd753bf9a7f7eebaa81096562 | [
"MIT"
] | null | null | null | # Desenvolva um programa que leia as duas notas de um aluno, calcule e mostre a sua média.
nome = str(input("Digite o nome do(a) aluno(a): "))
not1 = float(input("Digite a 1ª nota de {}: ".format(nome)))
not2 = float(input("Digite a 2ª nota de {}: ".format(nome)))
media = (not1 + not2) / 2
print("Já que {} tirou {} e {}, sua média é {:.2f}.".format(nome, not1, not2, media))
| 38 | 90 | 0.642105 | 65 | 380 | 3.753846 | 0.553846 | 0.135246 | 0.131148 | 0.139344 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032051 | 0.178947 | 380 | 9 | 91 | 42.222222 | 0.75 | 0.231579 | 0 | 0 | 0 | 0 | 0.42069 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5bd168e636c697b8dbaf73f30e8402fe9d300c82 | 1,035 | py | Python | mriutils/utils/tonii.py | kuangmeng/MRIUtils | 3a79e8104071deb0dc17c402ac878f94161d9b4a | [
"MIT"
] | null | null | null | mriutils/utils/tonii.py | kuangmeng/MRIUtils | 3a79e8104071deb0dc17c402ac878f94161d9b4a | [
"MIT"
] | null | null | null | mriutils/utils/tonii.py | kuangmeng/MRIUtils | 3a79e8104071deb0dc17c402ac878f94161d9b4a | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import skimage.io as skio
from skimage.transform import resize
import nibabel as nib
class SaveNiiFile():
def __init__(self, data, save_path, new_shape = (10, 256, 256), order = 3):
self.data = data
self.save_path = save_path
self.new_shape = new_shape
self.order = order # was never stored, but resizeData() reads self.order
def showSingleMRI(self, frame):
skio.imshow(self.data[int(frame)], cmap = 'gray')
skio.show()
def resizeData(self):
if len(self.data) > 0:
self.data = resize(self.data, self.new_shape, order = self.order, mode='edge')
def load_nii(self, img_path):
nimg = nib.load(img_path)
return nimg.get_data(), nimg.affine, nimg.header
def save_nii(self, data = None, save_path = None, affine = None, header = None):
if data is None: # `is None` avoids the ambiguous element-wise comparison when data is an ndarray
data = self.data
if save_path is None:
save_path = self.save_path
nimg = nib.Nifti1Image(data, affine = affine, header = header)
nimg.to_filename(save_path)
| 28.75 | 90 | 0.607729 | 141 | 1,035 | 4.304965 | 0.375887 | 0.105437 | 0.039539 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014745 | 0.279227 | 1,035 | 36 | 91 | 28.75 | 0.798928 | 0.019324 | 0 | 0 | 0 | 0 | 0.007882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.208333 | false | 0 | 0.125 | 0 | 0.416667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5bd77e4c61c0d5d7f63689e483c2e6eeec0b5053 | 310 | py | Python | kitsune/community/urls.py | yfdyh000/kitsune | 9a50b03b715bbabe0543b45d2146875995a6d1e4 | [
"BSD-3-Clause"
] | null | null | null | kitsune/community/urls.py | yfdyh000/kitsune | 9a50b03b715bbabe0543b45d2146875995a6d1e4 | [
"BSD-3-Clause"
] | null | null | null | kitsune/community/urls.py | yfdyh000/kitsune | 9a50b03b715bbabe0543b45d2146875995a6d1e4 | [
"BSD-3-Clause"
] | null | null | null | from django.conf.urls import patterns, url
urlpatterns = patterns(
'kitsune.community.views',
url(r'^/contributor_results$', 'contributor_results', name='community.contributor_results'),
url(r'^/view_all$', 'view_all', name='community.view_all'),
url(r'^$', 'home', name='community.home'),
)
| 31 | 96 | 0.693548 | 38 | 310 | 5.5 | 0.473684 | 0.057416 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.119355 | 310 | 9 | 97 | 34.444444 | 0.765568 | 0 | 0 | 0 | 0 | 0 | 0.483871 | 0.23871 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5bd81eabed49d75dd2407602a6036e1582495777 | 4,650 | py | Python | sw/groundstation/tools/imu-raw-logger.py | nzjrs/wasp | b763309af59e7784811baa6dd80e17dba1b27d81 | [
"MIT"
] | 2 | 2021-07-11T13:47:17.000Z | 2021-11-08T11:21:51.000Z | sw/groundstation/tools/imu-raw-logger.py | nzjrs/wasp | b763309af59e7784811baa6dd80e17dba1b27d81 | [
"MIT"
] | null | null | null | sw/groundstation/tools/imu-raw-logger.py | nzjrs/wasp | b763309af59e7784811baa6dd80e17dba1b27d81 | [
"MIT"
] | 2 | 2015-10-03T06:24:07.000Z | 2016-01-21T11:36:20.000Z | #!/usr/bin/env python
# vim: ai ts=4 sts=4 et sw=4
import time
import os.path
import optparse
import gobject
import wasp
import wasp.transport as transport
import wasp.communication as communication
import wasp.messages as messages
import calibration
import calibration.utils
FREQ = 25.0 #Hz
STEP_DELAY = 5 #Seconds
class IMUCalibrator:
def __init__(self, sensor, logfile, messages):
pass
class IMULogger:
STATES = \
(["WAIT"] * 2 ) +\
["%s %s" % (o,d) for o in ("ROLL", "PITCH") for d in (90, 180, 270, 360)] +\
(["WAIT"] * 2 )
def __init__(self, port, messages_file, sensor_name, log_file):
raw_imu_message_name = "IMU_%s_RAW" % sensor_name
self.sensor_name = sensor_name
self.m = messages.MessagesFile(path=messages_file, debug=False)
self.m.parse()
self.msg = self.m.get_message_by_name(raw_imu_message_name)
if not self.msg:
raise SystemExit("Could Not Find Message %s" % raw_imu_message_name)
self.loop = gobject.MainLoop()
self.state = 0
self.measurements = []
self.log = None
if log_file:
if os.path.exists(log_file):
self.measurements = calibration.utils.read_log(log_file, sensor_name)
self.capture = False
else:
self.log = open(log_file, "w")
self.capture = True
else:
self.capture = True
if self.capture:
self.s = communication.SerialCommunication(
transport.Transport(check_crc=True, debug=False),
self.m,
wasp.transport.TransportHeaderFooter(acid=wasp.ACID_GROUNDSTATION))
self.s.configure_connection(serial_port=port,serial_speed=57600,serial_timeout=1)
self.s.connect("message-received", self._on_message_received)
self.s.connect("uav-connected", self._on_uav_connected)
def _on_message_received(self, comm, msg, header, payload):
if msg == self.msg:
x,y,z = msg.unpack_values(payload)
if self.log:
self.log.write("%s %d %d %d\n" % (msg.name,x,y,z))
self.measurements.append( (float(x),float(y),float(z)) )
print ".",
def _request_message(self):
self.s.send_message(
self.m.get_message_by_name("REQUEST_TELEMETRY"),
(self.msg.id, FREQ))
return False
def _on_uav_connected(self, comm, connected):
if connected:
gobject.timeout_add(1000, self._request_message)
else:
self.loop.quit()
raise SystemExit("Not Connected")
def _rotate_state_machine(self):
try:
self.state += 1
print self.STATES[self.state]
return True
except IndexError:
self.loop.quit()
return False
def collect(self):
if self.capture:
self.s.connect_to_uav()
gobject.timeout_add(STEP_DELAY*1000, self._rotate_state_machine)
self.loop.run()
if self.log:
self.log.close()
self.measurements = calibration.utils.read_list(self.measurements)
return self.measurements
if __name__ == "__main__":
thisdir = os.path.abspath(os.path.dirname(__file__))
default_messages = os.path.join(thisdir, "..", "..", "onboard", "config", "messages.xml")
parser = optparse.OptionParser()
parser.add_option("-m", "--messages",
default=default_messages,
help="messages xml file", metavar="FILE")
parser.add_option("-p", "--port",
default="/dev/ttyUSB0",
help="Serial port")
parser.add_option("-l", "--log-file",
help="log file for analysis. "\
"If it exists it will be used. "\
"If it does not exist it will be created. "\
"If not supplied the data will be captured directly "\
"from the UAV for analysis", metavar="FILE")
parser.add_option("-s", "--sensor", choices=calibration.SENSORS,
help="sensor to calibrate",
metavar="[%s]" % ",".join(calibration.SENSORS))
options, args = parser.parse_args()
if not options.sensor:
parser.error("must supply sensor")
logger = IMULogger(options.port, options.messages, options.sensor, options.log_file)
measurements = logger.collect()
calibration.calibrate_sensor(options.sensor, measurements, True)
| 34.191176 | 93 | 0.585161 | 549 | 4,650 | 4.779599 | 0.324226 | 0.021341 | 0.022866 | 0.019436 | 0.089177 | 0.016006 | 0 | 0 | 0 | 0 | 0 | 0.011388 | 0.30129 | 4,650 | 135 | 94 | 34.444444 | 0.796245 | 0.012043 | 0 | 0.119266 | 0 | 0 | 0.102419 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.009174 | 0.091743 | null | null | 0.018349 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5be00ec4042c62d09e71fd43458751614ef646dd | 1,182 | py | Python | emendation_box/serializers.py | fga-eps-mds/2017.2-SiGI-Op_API | 4532019c15414fd17e06bb3aa78501886e00da1d | [
"BSD-3-Clause"
] | 6 | 2017-08-24T13:18:21.000Z | 2017-10-03T18:06:13.000Z | emendation_box/serializers.py | fga-gpp-mds/2017.2-Grupo9 | 4532019c15414fd17e06bb3aa78501886e00da1d | [
"BSD-3-Clause"
] | 173 | 2017-08-31T15:29:01.000Z | 2017-12-14T13:40:13.000Z | emendation_box/serializers.py | fga-gpp-mds/2017.2-SiGI-Op_API | 4532019c15414fd17e06bb3aa78501886e00da1d | [
"BSD-3-Clause"
] | 2 | 2018-11-19T10:33:00.000Z | 2019-06-19T22:35:43.000Z | from rest_framework import serializers
from .models import EmendationBoxStructure, EmendationBoxType, EmendationBox
class EmendationBoxTypeSerializer(serializers.ModelSerializer):
class Meta:
model = EmendationBoxType
fields = [
'id',
'description',
]
class EmendationBoxStructureSerializer(serializers.ModelSerializer):
class Meta:
model = EmendationBoxStructure
fields = [
'id',
'description',
]
class EmendationBoxSerializer(serializers.ModelSerializer):
# Nested serializers must be declared on the serializer class itself,
# not inside Meta, or DRF ignores them and falls back to default fields.
emendation_type = EmendationBoxTypeSerializer(many=True,
read_only=True)
emendation_structure = EmendationBoxStructureSerializer(many=True,
read_only=True)
class Meta:
model = EmendationBox
fields = [
'id',
'lattitude',
'longitude',
'designNumber',
'access_box',
'creation_date',
'extinction_date',
'emendation_type',
'emendation_structure',
]
| 28.829268 | 79 | 0.555838 | 72 | 1,182 | 8.986111 | 0.472222 | 0.120556 | 0.14374 | 0.162287 | 0.247295 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.375635 | 1,182 | 40 | 80 | 29.55 | 0.876694 | 0 | 0 | 0.382353 | 0 | 0 | 0.110829 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5bf36b81a5557f4a83bb24b4b66ced5aca960191 | 8,132 | py | Python | PDBtoDots.py | RMeli/sensaas | f9a23c65cf6b70d4a6ccf482b233bf17b86c7b3b | [
"BSD-3-Clause"
] | 12 | 2021-06-18T07:58:00.000Z | 2022-01-19T04:57:46.000Z | PDBtoDots.py | RMeli/sensaas | f9a23c65cf6b70d4a6ccf482b233bf17b86c7b3b | [
"BSD-3-Clause"
] | 11 | 2021-06-18T14:38:09.000Z | 2022-01-21T11:53:16.000Z | PDBtoDots.py | RMeli/sensaas | f9a23c65cf6b70d4a6ccf482b233bf17b86c7b3b | [
"BSD-3-Clause"
] | 3 | 2021-06-18T14:28:01.000Z | 2022-01-19T01:16:43.000Z | #!/usr/bin/python3.7
#Author: Lucas GRANDMOUGIN
import sys
import os
import math
import re
import numpy as np
#print('usage: <>.py <file.pdb> \nexecute nsc to generate point-based surface and create tables and if verbose==1 files dotslabel1.xyzrgb dotslabel2.xyzrgb dotslabel3.xyzrgb and dotslabel4.xyzrgb\n')
def pdbsurface(filepdb,nscexe):
verbose=0
#label1 {H, Cl, Br, I} white/grey 0.9 0.9 0.9
#label2 {O, N, S, F} red 1 0 0
#label3 {C, P, B} green 0 1 0
#label4 {others} blue 0 0 1
tabR= {'C':'%.2f' % 1.70, 'O':1.52, 'N':1.55, 'S':1.80, 'P':1.80, 'B':1.72, 'Br':1.85, 'Cl':1.75, 'I':1.98, 'F':1.47, 'H':'%.2f' % 1.20, 'Hp':'%.2f' % 1.10, 'X':'%.2f' % 1.10}
label= {'C':3, 'P':3, 'B':3, 'O':2, 'N':2, 'S':2, 'F':2, 'Hp':2, 'H':1, 'Cl':1, 'Br':1, 'I':1}
rgb= np.array([[0, 0, 0], [0.9, 0.9, 0.9], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
espace5=' '
espace6=' '
fichier2D=0
filepdb=open(filepdb,'r')
getstr=filepdb.read().split('\n')
filepdb.close()
tabLignesPdb=[]
tabLignesPdb.append('')
compt=1
while (compt < len(getstr)):
tabLignesPdb.append(re.split('\s+', getstr[compt].strip()))
compt=compt+1
compt=1
comptatomes=0
getx=[]
getx.append('')
gety=[]
gety.append('')
getz=[]
getz.append('')
getA=[]
getA.append('')
getRayon=[]
getRayon.append('')
while (compt < len(tabLignesPdb)):
if (tabLignesPdb[compt][0] == 'HETATM' or tabLignesPdb[compt][0] == 'ATOM'):
xAtome=float(tabLignesPdb[compt][5])
yAtome=float(tabLignesPdb[compt][6])
zAtome=float(tabLignesPdb[compt][7])
getx.append(xAtome)
gety.append(yAtome)
getz.append(zAtome)
if (float(zAtome) == 0):
fichier2D=fichier2D+1
getA.append(tabLignesPdb[compt][2])
if getA[compt] not in ('C', 'O', 'N', 'P', 'B', 'H', 'F', 'Br', 'Cl', 'S', 'I', 'X', 'Hp'):
print("Warning: atom %s set as C because it is not the tab (unusual in medchem)" % getA[compt])
getA[compt]='C'
getRayon.append(tabR[getA[compt]])
comptatomes=comptatomes+1
compt=compt+1
nbatomes=comptatomes
if (fichier2D==int(nbatomes)):
print("Warning: pdb file in 2D; SenSaaS needs 3D coordinates to work properly")
compt=1
while (compt <= nbatomes):
if (getA[compt] == 'H'):
compt2=1
while(compt2 <= nbatomes):
if (getA[compt2] == 'N' or getA[compt2] == 'O'):
distHp= math.sqrt((getx[compt] - getx[compt2])**2 + (gety[compt] - gety[compt2])**2 + (getz[compt] - getz[compt2])**2)
if (distHp <= 1.2):
getRayon[compt]=tabR['Hp']
compt2=compt2+1
compt=compt+1
#nsc:
compt=1
psaIn=open('psa.in','w')
psaIn.write('* XYZR\n')
psaIn.write(espace6+str(nbatomes)+'\n')
while (compt <= nbatomes):
x='%.2f' % getx[compt]
y='%.2f' % gety[compt]
z='%.2f' % getz[compt]
psaIn.write('%8s %8s %8s %8s %8s \n'%(x,y,z,getRayon[compt],getA[compt]))
compt=compt+1
psaIn.close()
cmd = '%s psa.in ' % (nscexe)
os.system(cmd)
psaOut=open('psa.out', 'r')
lignepsaOut= psaOut.readlines()
psaOut.close()
tabLignesPsaOut=[]
compt=3
while (compt < len(lignepsaOut)):
tabLignesPsaOut.append(re.split('\s+', lignepsaOut[compt].strip()))
compt=compt+1
nbDots= int(tabLignesPsaOut[0][2])
#print("nbDots= %6s" % (nbDots))
del tabLignesPsaOut[0]
del tabLignesPsaOut[0]
getDots=np.empty(shape=[nbDots,3], dtype='float64')
getrgb=np.empty(shape=[nbDots,3], dtype='float64')
compt=nbatomes+2
comptDots=0
ligneFicDots=[]
label1=[]
label2=[]
label3=[]
label4=[]
if(verbose==1):
dotsFic=open('dots.xyzrgb', 'w')
while (compt < nbatomes+nbDots+2):
xDot=float(tabLignesPsaOut[compt][2])
yDot=float(tabLignesPsaOut[compt][3])
zDot=float(tabLignesPsaOut[compt][4])
compt2=1
m=100
mi=0
while(compt2 <= nbatomes):
xa=getx[compt2]
ya=gety[compt2]
za=getz[compt2]
goodDots= math.sqrt((xDot - xa)**2 + (yDot - ya)**2 + (zDot - za)**2)
if(goodDots < m):
m=goodDots
mi=compt2
compt2=compt2+1
atomeCorrespondant=getA[mi]
rgbi=label[atomeCorrespondant]
if(getRayon[mi]==tabR['Hp']):
rgbi=label['O']
getrgb[comptDots,:]=[rgb[rgbi,0], rgb[rgbi,1], rgb[rgbi,2]]
getDots[comptDots,:]=[xDot,yDot,zDot]
if (rgbi == 1):
label1.append(np.vstack([getDots[comptDots], getrgb[comptDots]]))
elif (rgbi == 2):
label2.append(np.vstack([getDots[comptDots], getrgb[comptDots]]))
elif (rgbi == 3):
label3.append(np.vstack([getDots[comptDots], getrgb[comptDots]]))
elif (rgbi == 4):
label4.append(np.vstack([getDots[comptDots], getrgb[comptDots]]))
else:
print("no label for dot no %5s ?\n" %(comptDots))
if(verbose==1):
dotsFic.write('%8s'%xDot+'%8s'%yDot+'%8s'%zDot+espace5+'%5s'%(rgb[rgbi,0])+'%5s'%(rgb[rgbi,1])+'%5s'%(rgb[rgbi,2])+'\n')
comptDots=comptDots+1
compt=compt+1
if(verbose==1):
dotsFic.close()
dotslabel1=open('dotslabel1.xyzrgb', 'w')
dotslabel2=open('dotslabel2.xyzrgb', 'w')
dotslabel3=open('dotslabel3.xyzrgb', 'w')
dotslabel4=open('dotslabel4.xyzrgb', 'w')
getDots1=np.empty(shape=[len(label1),3], dtype='float64')
getrgb1=np.empty(shape=[len(label1),3], dtype='float64')
getDots2=np.empty(shape=[len(label2),3], dtype='float64')
getrgb2=np.empty(shape=[len(label2),3], dtype='float64')
getDots3=np.empty(shape=[len(label3),3], dtype='float64')
getrgb3=np.empty(shape=[len(label3),3], dtype='float64')
getDots4=np.empty(shape=[len(label4),3], dtype='float64')
getrgb4=np.empty(shape=[len(label4),3], dtype='float64')
compt=0
while(compt < len(label1)):
getDots1[compt]= label1[compt][0]
getrgb1[compt]= label1[compt][1]
if(verbose==1):
dotslabel1.write('%8s'%getDots1[compt,0]+'%8s'%getDots1[compt,1]+'%8s'%getDots1[compt,2]+espace5+'%5s'%getrgb1[compt,0]+'%5s'%getrgb1[compt,1]+'%5s'%getrgb1[compt,2]+'\n')
compt=compt+1
compt=0
while(compt < len(getDots2)):
getDots2[compt]= label2[compt][0]
getrgb2[compt]= label2[compt][1]
if(verbose==1):
dotslabel2.write('%8s'%getDots2[compt,0]+'%8s'%getDots2[compt,1]+'%8s'%getDots2[compt,2]+espace5+'%5s'%getrgb2[compt,0]+'%5s'%getrgb2[compt,1]+'%5s'%getrgb2[compt,2]+'\n')
compt=compt+1
compt=0
while(compt < len(getDots3)):
getDots3[compt]= label3[compt][0]
getrgb3[compt]= label3[compt][1]
if(verbose==1):
dotslabel3.write('%8s'%getDots3[compt,0]+'%8s'%getDots3[compt,1]+'%8s'%getDots3[compt,2]+espace5+'%5s'%getrgb3[compt,0]+'%5s'%getrgb3[compt,1]+'%5s'%getrgb3[compt,2]+'\n')
compt=compt+1
compt=0
while(compt < len(getDots4)):
getDots4[compt]= label4[compt][0]
getrgb4[compt]= label4[compt][1]
if(verbose==1):
dotslabel4.write('%8s'%getDots4[compt,0]+'%8s'%getDots4[compt,1]+'%8s'%getDots4[compt,2]+espace5+'%5s'%getrgb4[compt,0]+'%5s'%getrgb4[compt,1]+'%5s'%getrgb4[compt,2]+'\n')
compt=compt+1
if(verbose==1):
dotslabel1.close()
dotslabel2.close()
dotslabel3.close()
dotslabel4.close()
else:
os.remove("psa.in")
os.remove("psa.out")
return getDots, getrgb, getDots1, getrgb1, getDots2, getrgb2, getDots3, getrgb3, getDots4, getrgb4
| 34.457627 | 288 | 0.564191 | 1,083 | 8,132 | 4.23638 | 0.182825 | 0.034002 | 0.031386 | 0.026155 | 0.188317 | 0.162598 | 0.141456 | 0.118134 | 0.058849 | 0.024194 | 0 | 0.064454 | 0.236842 | 8,132 | 235 | 289 | 34.604255 | 0.674831 | 0.049803 | 0 | 0.203209 | 1 | 0 | 0.078259 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.005348 | false | 0 | 0.026738 | 0 | 0.037433 | 0.016043 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
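The dot-coloring loop in `PDBtoDots.py` assigns each surface dot the label of its nearest atom by a brute-force Euclidean scan over all atoms. A stripped-down stdlib sketch of that nearest-atom search (the coordinates below are dummy values, not taken from any PDB file):

```python
import math

def nearest_atom(dot, atoms):
    """Return the index of the atom closest to a surface dot.

    `dot` is an (x, y, z) tuple; `atoms` is a list of (x, y, z) tuples,
    mirroring the getx/gety/getz arrays in the script above.
    """
    best_i, best_d = -1, float('inf')
    for i, (xa, ya, za) in enumerate(atoms):
        d = math.sqrt((dot[0] - xa) ** 2 + (dot[1] - ya) ** 2 + (dot[2] - za) ** 2)
        if d < best_d:
            best_i, best_d = i, d
    return best_i

atoms = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]   # two dummy atom centers
print(nearest_atom((2.9, 0.1, 0.0), atoms))  # 1
```

This is O(dots x atoms), which matches the nested `while` loops in the script; for large surfaces a k-d tree (e.g. `scipy.spatial.cKDTree`) would be the usual speedup.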
5bfb60401fb3180662fe2cf6854036c39c7bffb8 | 3,352 | py | Python | tests/test_neofoodclub.py | diceroll123/neofoodclub.py | ac82ee32bb6ed4f98ebce222273910bc8acfcd95 | [
"MIT"
] | 2 | 2021-04-21T06:34:50.000Z | 2021-04-21T06:38:09.000Z | tests/test_neofoodclub.py | diceroll123/neofoodclub.py | ac82ee32bb6ed4f98ebce222273910bc8acfcd95 | [
"MIT"
] | 2 | 2021-12-14T00:19:42.000Z | 2022-02-19T02:15:52.000Z | tests/test_neofoodclub.py | diceroll123/neofoodclub.py | ac82ee32bb6ed4f98ebce222273910bc8acfcd95 | [
"MIT"
] | 1 | 2021-05-05T02:25:20.000Z | 2021-05-05T02:25:20.000Z | import unittest
from typing import Tuple
from neofoodclub import NeoFoodClub # type: ignore
from neofoodclub.types import RoundData # type: ignore
# i picked the smallest round I could quickly find
test_round_data: RoundData = {
"currentOdds": [
[1, 2, 13, 3, 5],
[1, 4, 2, 4, 6],
[1, 3, 13, 7, 2],
[1, 13, 2, 3, 3],
[1, 8, 2, 4, 13],
],
"foods": [
[26, 25, 4, 9, 21, 1, 33, 11, 7, 10],
[12, 9, 14, 35, 25, 6, 21, 19, 40, 37],
[17, 30, 21, 39, 37, 15, 29, 40, 31, 10],
[10, 18, 35, 9, 34, 23, 27, 32, 28, 12],
[11, 20, 9, 33, 7, 14, 4, 23, 31, 26],
],
"lastChange": "2021-02-16T23:47:18+00:00",
"openingOdds": [
[1, 2, 13, 3, 5],
[1, 4, 2, 4, 6],
[1, 3, 13, 7, 2],
[1, 13, 2, 3, 3],
[1, 8, 2, 4, 12],
],
"pirates": [
[2, 8, 14, 11],
[20, 7, 6, 10],
[19, 4, 12, 15],
[3, 1, 5, 13],
[17, 16, 18, 9],
],
"round": 7956,
"start": "2021-02-15T23:47:41+00:00",
"timestamp": "2021-02-16T23:47:37+00:00",
"winners": [1, 3, 4, 2, 4],
"changes": [
{"arena": 1, "new": 6, "old": 5, "pirate": 4, "t": "2021-02-16T23:47:18+00:00"},
{
"arena": 4,
"new": 8,
"old": 12,
"pirate": 1,
"t": "2021-02-16T23:47:18+00:00",
},
{"arena": 4, "new": 4, "old": 6, "pirate": 3, "t": "2021-02-16T23:47:18+00:00"},
{
"arena": 4,
"new": 12,
"old": 13,
"pirate": 4,
"t": "2021-02-16T23:47:18+00:00",
},
],
}
test_bet_hash = "ltqvqwgimhqtvrnywrwvijwnn"
test_indices: Tuple[Tuple[int, ...], ...] = (
(2, 1, 3, 4, 3),
(1, 4, 1, 3, 1),
(4, 2, 1, 1, 1),
(3, 2, 2, 1, 2),
(3, 1, 3, 4, 4),
(1, 3, 2, 2, 3),
(4, 4, 4, 2, 3),
(2, 4, 2, 4, 1),
(1, 3, 1, 4, 4),
(2, 2, 3, 2, 3),
)
test_binaries: Tuple[int, ...] = (
0x48212,
0x81828,
0x14888,
0x24484,
0x28211,
0x82442,
0x11142,
0x41418,
0x82811,
0x44242,
)
test_expected_results = (test_bet_hash, test_indices, test_binaries)
test_nfc = NeoFoodClub(test_round_data)
hash_bets = test_nfc.make_bets_from_hash(test_bet_hash)
indices_bets = test_nfc.make_bets_from_indices(test_indices) # type: ignore
binaries_bets = test_nfc.make_bets_from_binaries(*test_binaries)
########################################################################################################################
class BetDecodingTest(unittest.TestCase):
def test_bet_hash_encoding(self):
self.assertEqual(
(hash_bets.bets_hash, hash_bets.indices, tuple(hash_bets)),
test_expected_results,
)
def test_bet_indices_encoding(self):
self.assertEqual(
(indices_bets.bets_hash, indices_bets.indices, tuple(indices_bets)),
test_expected_results,
)
def test_bet_binary_encoding(self):
self.assertEqual(
(binaries_bets.bets_hash, binaries_bets.indices, tuple(binaries_bets)),
test_expected_results,
)
class BetEquivalenceTest(unittest.TestCase):
def test_bet_equivalence(self):
self.assertTrue(hash_bets == indices_bets and indices_bets == binaries_bets)
| 27.47541 | 120 | 0.498807 | 462 | 3,352 | 3.474026 | 0.24026 | 0.011215 | 0.041122 | 0.048598 | 0.237383 | 0.204984 | 0.161994 | 0.109034 | 0.109034 | 0.109034 | 0 | 0.187553 | 0.295346 | 3,352 | 121 | 121 | 27.702479 | 0.491956 | 0.025955 | 0 | 0.216981 | 0 | 0 | 0.111147 | 0.063694 | 0 | 0 | 0.022293 | 0 | 0.037736 | 1 | 0.037736 | false | 0 | 0.037736 | 0 | 0.09434 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
5bfcf311b8aa728cf923ee0a028bdc0a9bd5b881 | 384 | py | Python | backend/api/migrations/0023_auto_20190823_1611.py | yamamz/BRMI_LOANAPP | e6f79789855a633ee78a168452bca508622bcca8 | [
"MIT"
] | null | null | null | backend/api/migrations/0023_auto_20190823_1611.py | yamamz/BRMI_LOANAPP | e6f79789855a633ee78a168452bca508622bcca8 | [
"MIT"
] | 6 | 2020-06-05T22:43:22.000Z | 2022-02-10T12:32:19.000Z | backend/api/migrations/0023_auto_20190823_1611.py | yamamz/BRMI_LOANAPP | e6f79789855a633ee78a168452bca508622bcca8 | [
"MIT"
] | null | null | null | # Generated by Django 2.1.1 on 2019-08-23 08:11
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('api', '0022_auto_20190823_1553'),
]
operations = [
migrations.AlterField(
model_name='loan',
name='loan_period',
field=models.FloatField(default=0.0),
),
]
| 20.210526 | 49 | 0.59375 | 43 | 384 | 5.186047 | 0.767442 | 0.071749 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120879 | 0.289063 | 384 | 18 | 50 | 21.333333 | 0.695971 | 0.117188 | 0 | 0 | 1 | 0 | 0.121662 | 0.068249 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
75022887b6f82317b30d98bb409f69f029c65114 | 486 | py | Python | tests/frameworks/fast/test_fast_template.py | Jhsmit/awesome-panel-extensions | 41eba7cf84caa911be4ed0df2a96e16fc1e70263 | [
"CC-BY-4.0"
] | 3 | 2020-07-16T07:28:45.000Z | 2020-07-17T12:53:56.000Z | tests/frameworks/fast/test_fast_template.py | MarcSkovMadsen/panel-extensions-template | f41ad8d8fb8502f87de3a4992917cbffb6299012 | [
"CC-BY-4.0"
] | null | null | null | tests/frameworks/fast/test_fast_template.py | MarcSkovMadsen/panel-extensions-template | f41ad8d8fb8502f87de3a4992917cbffb6299012 | [
"CC-BY-4.0"
] | null | null | null | # pylint: disable=redefined-outer-name,protected-access
# pylint: disable=missing-function-docstring,missing-module-docstring,missing-class-docstring
import panel as pn
from panel import Template
from awesome_panel_extensions.frameworks.fast import FastTemplate
def test_constructor():
# Given
column = pn.Column()
main = [column]
# When
template = FastTemplate(main=main)
# Then
assert issubclass(FastTemplate, Template)
assert template.main == main
| 27 | 93 | 0.753086 | 57 | 486 | 6.368421 | 0.578947 | 0.071625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158436 | 486 | 17 | 94 | 28.588235 | 0.887531 | 0.331276 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.222222 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
7502dff8f0fd39e9a5e925ef11b5e1b9ca9564a0 | 2,371 | py | Python | doc2json/jats2json/pmc_utils/back_tag_utils.py | josephcc/s2orc-doc2json | 8a6a21b7a8a3c6ad11cd42bdd0d46ee32a5a990d | [
"Apache-2.0"
] | 132 | 2021-02-15T18:16:12.000Z | 2022-03-29T04:47:17.000Z | doc2json/jats2json/pmc_utils/back_tag_utils.py | josephcc/s2orc-doc2json | 8a6a21b7a8a3c6ad11cd42bdd0d46ee32a5a990d | [
"Apache-2.0"
] | 6 | 2021-02-21T09:52:11.000Z | 2022-02-01T17:45:43.000Z | doc2json/jats2json/pmc_utils/back_tag_utils.py | josephcc/s2orc-doc2json | 8a6a21b7a8a3c6ad11cd42bdd0d46ee32a5a990d | [
"Apache-2.0"
] | 18 | 2021-02-15T18:18:05.000Z | 2022-03-11T19:37:47.000Z | from typing import Dict, List
def _wrap_text(tag):
return tag.text if tag else ''
def parse_authors(authors_tag) -> List:
"""The PMC XML has a slightly different format than authors listed in front tag."""
if not authors_tag:
return []
authors = []
for name_tag in authors_tag.find_all('name', recursive=False):
surname = name_tag.find('surname')
given_names = name_tag.find('given-names')
given_names = given_names.text.split(' ') if given_names else None
suffix = name_tag.find('suffix')
authors.append({
'first': given_names[0] if given_names else '',
'middle': given_names[1:] if given_names else [],
'last': surname.text if surname else '',
'suffix': suffix.text if suffix else ''
})
return authors
def parse_bib_entries(back_tag) -> Dict:
bib_entries = {}
# TODO: PMC2778891 does not have 'ref-list' in its back_tag. do we even need this, or can directly .find_all('ref')?
ref_list_tag = back_tag.find('ref-list')
if ref_list_tag:
for ref_tag in ref_list_tag.find_all('ref'):
# The ref ID and label are semantically swapped between CORD-19 and PMC, lol
ref_label = ref_tag['id']
ref_id = ref_tag.find('label')
authors_tag = ref_tag.find('person-group', {'person-group-type': 'author'})
year = ref_tag.find('year')
fpage = ref_tag.find('fpage')
lpage = ref_tag.find('lpage')
pages = f'{fpage.text}-{lpage.text}' if fpage and lpage else None
dois = [tag.text for tag in ref_tag.find_all('pub-id', {'pub-id-type': 'doi'})]
bib_entries[ref_label] = {
'ref_id': _wrap_text(ref_id),
'title': _wrap_text(ref_tag.find('article-title')),
'authors': parse_authors(authors_tag),
'year': int(year.text) if year and year.text.isdigit() else None,
'venue': _wrap_text(ref_tag.find('source')),
'volume': _wrap_text(ref_tag.find('volume')),
                'issn': _wrap_text(ref_tag.find('issue')),  # note: holds the issue number, despite the 'issn' key name
'pages': pages,
'other_ids': {
'DOI': dois,
}
}
return bib_entries | 42.339286 | 122 | 0.564319 | 309 | 2,371 | 4.113269 | 0.307443 | 0.08812 | 0.078678 | 0.04406 | 0.056648 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00674 | 0.311683 | 2,371 | 56 | 123 | 42.339286 | 0.772059 | 0.113454 | 0 | 0 | 0 | 0 | 0.120529 | 0.012249 | 0 | 0 | 0 | 0.017857 | 0 | 1 | 0.065217 | false | 0 | 0.021739 | 0.021739 | 0.173913 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
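The name-splitting convention in `parse_authors` (first token of `given-names` is the first name, the rest become middle names) can be sketched without BeautifulSoup. This is a minimal illustration: `FakeTag` is a hypothetical stand-in for a parsed tag, exposing only the `.text` attribute the function relies on.

```python
from typing import List, Optional


class FakeTag:
    """Minimal stand-in for a BeautifulSoup tag (only .text is used here)."""

    def __init__(self, text: str):
        self.text = text


def split_author_name(surname: Optional[FakeTag],
                      given_names: Optional[FakeTag]) -> dict:
    """Mirror parse_authors' first/middle/last split for one <name> entry."""
    parts: Optional[List[str]] = (
        given_names.text.split(' ') if given_names else None
    )
    return {
        'first': parts[0] if parts else '',
        'middle': parts[1:] if parts else [],
        'last': surname.text if surname else '',
    }


author = split_author_name(FakeTag('Curie'), FakeTag('Marie S.'))
# the first token is the first name; remaining tokens are middle names
```

Missing tags degrade to empty strings/lists, matching the defensive defaults in `parse_authors`.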
75054ce723fd893b6681ad004d075813a927586a | 2,593 | py | Python | gachianalyzer/analysis_organizer.py | mitsu-ksgr/splatoon2-gachianalyzer | 9d964f514c85b626bc8e604a0bf3a2598ac91765 | [
"MIT"
] | null | null | null | gachianalyzer/analysis_organizer.py | mitsu-ksgr/splatoon2-gachianalyzer | 9d964f514c85b626bc8e604a0bf3a2598ac91765 | [
"MIT"
] | 1 | 2019-07-15T16:54:57.000Z | 2019-07-15T18:41:39.000Z | gachianalyzer/analysis_organizer.py | mitsu-ksgr/splatoon2-gachianalyzer | 9d964f514c85b626bc8e604a0bf3a2598ac91765 | [
"MIT"
] | null | null | null | import cv2
from .video_analyzer import VideoAnalyzer
class AnalysisOrganizer:
"""
    Organizes the results produced by VideoAnalyzer.
"""
def __organize(self):
"""
        Regroups self.result into per-event spans.
"""
events = []
prev = self.result[0]
for i in range(1, len(self.result)):
            # Frames with the same event as the previous frame belong to
            # the same event; record a span whenever the event changes.
if prev['event'] != self.result[i]['event']:
end = self.result[i - 1]
events.append({
'event': prev['event'],
'start_time': prev['time'],
'end_time': end['time'],
'start_frame': prev['frame'],
'end_frame': end['frame']
})
prev = self.result[i]
return events
def __init__(self, analyze_results):
"""
        analyze_results - a list of VideoAnalyzer result lists.
        #TODO: also accept a flat (one-dimensional) list for the non-parallel case
"""
self.result = sorted(
[x for sub in analyze_results for x in sub],
key = lambda x: x['frame']
)
self.events = self.__organize()
def dump(self):
for event in self.events:
print('Time:{:4d} ~ {:4d}, Frame:{:5.0f} ~ {:5.0f}, Event={}'.format(
event['start_time'], event['end_time'],
event['start_frame'], event['end_frame'],
event['event']
))
def extract_battles(self):
"""
        Returns the start time and duration of each battle
        (from ranked-match start to the result screen).
"""
        battles = []
        st_time = None
        battle_event = None
        prev = None  # previous event span, used to locate the battle start
for event in self.events:
e = event['event']
if e == 'Loading':
st_time = event['end_time']
            elif e == 'ResultUdemae':
                if prev:
                    battle_event = prev
                    st_time = battle_event['start_time']
elif e == 'ResultOkaneRank':
                # If no loading screen came right before, assume recording
                # started after the load screen (mid-battle) and use the
                # start time of battle_event instead.
if not st_time:
if not battle_event:
continue
st_time = battle_event['start_time']
diff = int(event['end_time']) - int(st_time)
battles.append((st_time, diff))
st_time, battle_event = None, None
elif e in ['LobbyStandby', 'LobbyModeSelect', 'LobbyFindBattle']:
st_time, battle_event = None, None
else:
pass
prev = event
return battles
| 30.505882 | 81 | 0.482838 | 245 | 2,593 | 4.942857 | 0.330612 | 0.044591 | 0.046243 | 0.056152 | 0.117258 | 0.084228 | 0 | 0 | 0 | 0 | 0 | 0.00646 | 0.403008 | 2,593 | 84 | 82 | 30.869048 | 0.775194 | 0.110297 | 0 | 0.107143 | 0 | 0.017857 | 0.133092 | 0 | 0 | 0 | 0 | 0.011905 | 0 | 1 | 0.071429 | false | 0.017857 | 0.035714 | 0 | 0.160714 | 0.017857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
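The run-length grouping that `__organize` performs can be shown in isolation. This is a minimal sketch using plain dicts, keeping only the time fields for brevity; like the original, it drops the final run because a span is only emitted when the event changes.

```python
def organize(frames):
    """Group consecutive frames with the same 'event' into event spans,
    mirroring AnalysisOrganizer.__organize (the trailing run is dropped)."""
    events = []
    prev = frames[0]
    for i in range(1, len(frames)):
        if prev['event'] != frames[i]['event']:
            end = frames[i - 1]
            events.append({
                'event': prev['event'],
                'start_time': prev['time'],
                'end_time': end['time'],
            })
            prev = frames[i]
    return events


frames = [
    {'event': 'Loading', 'time': 0},
    {'event': 'Loading', 'time': 1},
    {'event': 'Battle', 'time': 2},
    {'event': 'Battle', 'time': 3},
    {'event': 'ResultUdemae', 'time': 4},
]
spans = organize(frames)  # two spans: Loading 0-1, Battle 2-3
```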
7509211755a7245800dfbd3042066dfd9b209c87 | 288 | py | Python | answers/x_2_6.py | ofl/kuku | 76eefc0d3d859051473ee0d5f48b5d42d17d05a6 | [
"MIT"
] | null | null | null | answers/x_2_6.py | ofl/kuku | 76eefc0d3d859051473ee0d5f48b5d42d17d05a6 | [
"MIT"
] | 4 | 2021-09-23T03:19:52.000Z | 2021-11-13T10:38:21.000Z | answers/x_2_6.py | ofl/kuku | 76eefc0d3d859051473ee0d5f48b5d42d17d05a6 | [
"MIT"
] | null | null | null | # x_2_6
#
# Using the hints, predict what value each of "a", "b", "c", and "d" will be
# Hints
print(type('桃太郎'))
print(type(10))
print(type(12.3))
a = type('777') # => str
b = type(10 + 3.5) # => float
c = type(14 / 7) # => float
d = type(10_000_000) # => int
# print(a)
# print(b)
# print(c)
# print(d)
| 15.157895 | 43 | 0.541667 | 49 | 288 | 3.102041 | 0.489796 | 0.177632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111607 | 0.222222 | 288 | 18 | 44 | 16 | 0.566964 | 0.413194 | 0 | 0 | 0 | 0 | 0.038462 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
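The exercise's answers can be verified directly: mixed int/float arithmetic promotes to `float`, and `/` always yields a `float` in Python 3.

```python
a = type('777')         # str
b = type(10 + 3.5)      # int + float promotes to float
c = type(14 / 7)        # true division always returns float
d = type(10_000_000)    # underscores are just digit separators; still int
```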
75141fdc0d4a4c71c0178c6ccd7f200961cef89d | 8,204 | py | Python | analysis/gfp/seqtools.py | johli/genesis | 5424c1888d4330e505ad87412e7f1cc5dd828888 | [
"MIT"
] | 12 | 2020-02-02T14:29:15.000Z | 2021-09-12T08:05:43.000Z | analysis/gfp/seqtools.py | johli/genesis | 5424c1888d4330e505ad87412e7f1cc5dd828888 | [
"MIT"
] | 1 | 2022-01-04T08:04:00.000Z | 2022-01-10T08:49:04.000Z | analysis/gfp/seqtools.py | johli/genesis | 5424c1888d4330e505ad87412e7f1cc5dd828888 | [
"MIT"
] | 3 | 2020-03-10T22:24:05.000Z | 2021-05-05T13:23:01.000Z | import numpy as np
class SequenceTools(object):
dna2gray_ = {'c': (0, 0), 't': (1, 0), 'g': (1, 1), 'a': (0, 1)}
gray2dna_ = {(0, 0): 'c', (1, 0): 't', (1, 1): 'g', (0, 1): 'a'}
codon2protein_ = {'ttt': 'f', 'ttc': 'f', 'tta': 'l', 'ttg': 'l', 'tct': 's', 'tcc': 's', 'tca': 's',
'tcg': 's', 'tat': 'y', 'tac': 'y', 'taa': '!', 'tag': '!', 'tgt': 'c', 'tgc': 'c',
'tga': '!', 'tgg': 'w', 'ctt': 'l', 'ctc': 'l', 'cta': 'l', 'ctg': 'l', 'cct': 'p',
'ccc': 'p', 'cca': 'p', 'ccg': 'p', 'cat': 'h', 'cac': 'h', 'caa': 'q', 'cag': 'q',
'cgt': 'r', 'cgc': 'r', 'cga': 'r', 'cgg': 'r', 'att': 'i', 'atc': 'i', 'ata': 'i',
'atg': 'm', 'act': 't', 'acc': 't', 'aca': 't', 'acg': 't', 'aat': 'n', 'aac': 'n',
'aaa': 'k', 'aag': 'k', 'agt': 's', 'agc': 's', 'aga': 'r', 'agg': 'r', 'gtt': 'v',
'gtc': 'v', 'gta': 'v', 'gtg': 'v', 'gct': 'a', 'gcc': 'a', 'gca': 'a', 'gcg': 'a',
'gat': 'd', 'gac': 'd', 'gaa': 'e', 'gag': 'e', 'ggt': 'g', 'ggc': 'g', 'gga': 'g',
'ggg': 'g'}
protein2codon_ = {
'l': ['tta', 'ttg', 'ctt', 'ctc', 'cta', 'ctg'],
's': ['tct', 'tcc', 'tca', 'tcg', 'agt', 'agc'],
'r': ['cgt', 'cgc', 'cga', 'cgg', 'aga', 'agg'],
'v': ['gtt', 'gtc', 'gta', 'gtg'],
'a': ['gct', 'gcc', 'gca', 'gcg'],
'p': ['cct', 'ccc', 'cca', 'ccg'],
't': ['act', 'acc', 'aca', 'acg'],
'g': ['ggt', 'ggc', 'gga', 'ggg'],
'stop': ['taa', 'tag', 'tga'],
'i': ['att', 'atc', 'ata'],
'y': ['tat', 'tac'],
'f': ['ttt', 'ttc'],
'c': ['tgt', 'tgc'],
'h': ['cat', 'cac'],
'q': ['caa', 'cag'],
'n': ['aat', 'aac'],
'k': ['aaa', 'aag'],
'd': ['gat', 'gac'],
'e': ['gaa', 'gag'],
'w': ['tgg'],
'm': ['atg']
}
protein2constraint_ = {
'l': {(1,): {('t',)}, (0, 2): {('t', 'a'), ('t', 'g'), ('c', 't'), ('c', 'c'), ('c', 'a'), ('c', 'g')}},
's': {(0, 1, 2): {('t', 'c', 't'), ('t', 'c', 'c'), ('t', 'c', 'a'), ('t', 'c', 'g'), ('a', 'g', 't'),
('a', 'g', 'c')}},
'r': {(1,): {('g',)}, (0, 2): {('c', 't'), ('c', 'c'), ('c', 'a'), ('c', 'g'), ('a', 'a'), ('a', 'g')}},
'v': {(0,): {('g',)}, (1,): {('t',)}, (2,): {('g',), ('t',), ('a',), ('c',)}},
'a': {(0,): {('g',)}, (1,): {('c',)}, (2,): {('g',), ('t',), ('a',), ('c',)}},
'p': {(0,): {('c',)}, (1,): {('c',)}, (2,): {('g',), ('t',), ('a',), ('c',)}},
't': {(0,): {('a',)}, (1,): {('c',)}, (2,): {('g',), ('t',), ('a',), ('c',)}},
'g': {(0,): {('g',)}, (1,): {('g',)}, (2,): {('g',), ('t',), ('a',), ('c',)}},
'stop': {(0,): {('t',)}, (1, 2): {('a', 'a'), ('a', 'g'), ('g', 'a')}},
'i': {(0,): {('a',)}, (1,): {('t',)}, (2,): {('t',), ('a',), ('c',)}},
'y': {(0,): {('t',)}, (1,): {('a',)}, (2,): {('t',), ('c',)}},
'f': {(0,): {('t',)}, (1,): {('t',)}, (2,): {('t',), ('c',)}},
'c': {(0,): {('t',)}, (1,): {('g',)}, (2,): {('t',), ('c',)}},
'h': {(0,): {('c',)}, (1,): {('a',)}, (2,): {('t',), ('c',)}},
'q': {(0,): {('c',)}, (1,): {('a',)}, (2,): {('a',), ('g',)}},
'n': {(0,): {('a',)}, (1,): {('a',)}, (2,): {('t',), ('c',)}},
'k': {(0,): {('a',)}, (1,): {('a',)}, (2,): {('a',), ('g',)}},
'd': {(0,): {('g',)}, (1,): {('a',)}, (2,): {('t',), ('c',)}},
'e': {(0,): {('g',)}, (1,): {('a',)}, (2,): {('a',), ('g',)}},
'w': {(0,): {('t',)}, (1,): {('g',)}, (2,): {('g',)}},
'm': {(0,): {('a',)}, (1,): {('t',)}, (2,): {('g',)}},
}
# Integer mapping from Fernandes and Vinga (2016)
codon2idx_ = {'aaa': 1, 'aac': 2, 'aag': 3, 'aat': 4, 'aca': 5, 'acc': 6, 'acg': 7, 'act': 8, 'aga': 9,
'agc': 10, 'agg': 11, 'agt': 12, 'ata': 13, 'atc': 14, 'atg': 15, 'att': 16, 'caa': 17,
'cac': 18, 'cag': 19, 'cat': 20, 'cca': 21, 'ccc': 22, 'ccg': 23, 'cct': 24, 'cga': 25,
'cgc': 26, 'cgg': 27, 'cgt': 28, 'cta': 29, 'ctc': 30, 'ctg': 31, 'ctt': 32, 'gaa': 33,
'gac': 34, 'gag': 35, 'gat': 36, 'gca': 37, 'gcc': 38, 'gcg': 39, 'gct': 40, 'gga': 41,
'ggc': 42, 'ggg': 43, 'ggt': 44, 'gta': 45, 'gtc': 46, 'gtg': 47, 'gtt': 48, 'taa': 49,
'tac': 50, 'tag': 51, 'tat': 52, 'tca': 53, 'tcc': 54, 'tcg': 55, 'tct': 56, 'tga': 57,
'tgc': 58, 'tgg': 59, 'tgt': 60, 'tta': 61, 'ttc': 62, 'ttg': 63, 'ttt': 64}
@staticmethod
def convert_dna_to_rna(seq):
dna2rna = {'t': 'u', 'a': 'a', 'g': 'g', 'c': 'c'}
return "".join([dna2rna[s] for s in seq])
@staticmethod
def convert_dna_arr_to_str(dna_arr, base_order='ATCG'):
""" Convert N x 4 tokenized array into length N string """
dna_seq_str = ''
for i in range(dna_arr.shape[0]):
token = np.argmax(dna_arr[i, :])
dna_seq_str += base_order[token]
return dna_seq_str
@staticmethod
def get_aa_codons():
aa_list = sorted(list(SequenceTools.protein2codon_.keys()))
aa_codons = np.zeros((len(aa_list), 6, 3, 4))
i = 0
for aa in aa_list:
cods = SequenceTools.protein2codon_[aa]
j = 0
for c in cods:
cod_arr = SequenceTools.convert_dna_str_to_arr(c)
aa_codons[i, j] = cod_arr
j += 1
i += 1
return aa_codons
@staticmethod
def convert_dna_str_to_arr(dna_str, base_order='ATCG'):
""" Convert length N string into N x 4 tokenized array"""
dna_str = dna_str.upper()
N = len(dna_str)
dna_arr = np.zeros((N, 4))
for i in range(N):
idx = base_order.index(dna_str[i])
dna_arr[i, idx] = 1.
return dna_arr
@staticmethod
def convert_dna_arr_to_gray(dna_arr, base_order='ATCG'):
""" Convert N x 4 tokenized array into 2N x 2 tokenized gray code array"""
N = dna_arr.shape[0]
gray_arr = np.zeros((2 * N, 2))
for i in range(N):
token = np.argmax(dna_arr[i, :])
dna_i = base_order[token]
gray_i = SequenceTools.dna2gray_[dna_i]
for j in range(2):
gray_arr[2 * i + j, gray_i[j]] = 1
return gray_arr
@staticmethod
def convert_gray_to_dna_str(gray_arr):
Ngray = gray_arr.shape[0]
dna_str = ''
i = 0
while i < Ngray:
g1 = int(np.argmax(gray_arr[i, :]))
g2 = int(np.argmax(gray_arr[i + 1, :]))
dna_str += SequenceTools.gray2dna_[(g1, g2)]
i += 2
return dna_str
@staticmethod
def convert_dna_str_to_gray(dna_str):
"""Convert length N string into 2N x 2 tokenized gray code array"""
dna_str = dna_str.lower()
N = len(dna_str)
gray_arr = np.zeros((2 * N, 2))
for i in range(N):
gray_i = SequenceTools.dna2gray_[dna_str[i]]
for j in range(2):
gray_arr[2 * i + j, gray_i[j]] = 1
return gray_arr
@staticmethod
def convert_rna_to_dna(seq):
rna2dna = {'u': 't', 'a': 'a', 'g': 'g', 'c': 'c'}
return "".join([rna2dna[s] for s in seq])
@classmethod
def get_codon_from_idx(cls, idx):
idx2codon = {val: key for key, val in SequenceTools.codon2idx_.items()}
return idx2codon[idx]
@classmethod
def get_start_codon_int(cls):
return SequenceTools.codon2idx_['atg']
@classmethod
def get_stop_codon_ints(cls):
stop_codons = SequenceTools.protein2codon_['stop']
return [SequenceTools.codon2idx_[s] for s in stop_codons]
@classmethod
def translate_dna_str(cls, dna_seq):
dna_seq = dna_seq.lower()
prot_seq = []
i = 0
while i < len(dna_seq):
cod = dna_seq[i:i + 3]
prot_seq.append(SequenceTools.codon2protein_[cod])
i += 3
prot_seq = "".join(prot_seq)
return prot_seq
| 44.345946 | 112 | 0.384203 | 1,096 | 8,204 | 2.758212 | 0.213504 | 0.035726 | 0.006947 | 0.006616 | 0.292425 | 0.202117 | 0.149851 | 0.127688 | 0.093285 | 0.093285 | 0 | 0.046084 | 0.304364 | 8,204 | 184 | 113 | 44.586957 | 0.483617 | 0.034252 | 0 | 0.186335 | 0 | 0 | 0.108748 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074534 | false | 0 | 0.006211 | 0.006211 | 0.198758 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
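The gray-code encoding in `SequenceTools` maps each base to two bits and one-hot encodes each bit, so a length-N sequence becomes a 2N x 2 array. A minimal round-trip sketch, redefining the two small mapping dicts locally so it runs standalone:

```python
import numpy as np

DNA2GRAY = {'c': (0, 0), 't': (1, 0), 'g': (1, 1), 'a': (0, 1)}
GRAY2DNA = {v: k for k, v in DNA2GRAY.items()}


def dna_str_to_gray(dna_str):
    """Length-N string -> 2N x 2 one-hot gray-code array (as in SequenceTools)."""
    gray = np.zeros((2 * len(dna_str), 2))
    for i, base in enumerate(dna_str.lower()):
        bits = DNA2GRAY[base]
        for j in range(2):
            gray[2 * i + j, bits[j]] = 1
    return gray


def gray_to_dna_str(gray):
    """Inverse mapping: decode pairs of one-hot rows back to bases."""
    out = []
    for i in range(0, gray.shape[0], 2):
        g1 = int(np.argmax(gray[i]))
        g2 = int(np.argmax(gray[i + 1]))
        out.append(GRAY2DNA[(g1, g2)])
    return ''.join(out)
```

Because `GRAY2DNA` is the exact inverse of `DNA2GRAY`, encode followed by decode recovers the original sequence.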
751dc8bcc74a49ad7a489ad38470f4ef64fc97a5 | 12,715 | py | Python | launch/test/launch/test_logging.py | pal-robotics-forks/launch | 9a0a01924cd88f832c5a50903553ddcec6ef648d | [
"Apache-2.0"
] | null | null | null | launch/test/launch/test_logging.py | pal-robotics-forks/launch | 9a0a01924cd88f832c5a50903553ddcec6ef648d | [
"Apache-2.0"
] | null | null | null | launch/test/launch/test_logging.py | pal-robotics-forks/launch | 9a0a01924cd88f832c5a50903553ddcec6ef648d | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Open Source Robotics Foundation, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Tests for the launch.logging module."""
import logging
import os
import pathlib
import re
from unittest import mock
import launch.logging
import pytest
@pytest.fixture
def log_dir(tmpdir_factory):
"""Test fixture that generates a temporary directory for log files."""
return str(tmpdir_factory.mktemp('logs'))
def test_bad_logging_launch_config():
"""Tests that setup throws at bad configuration."""
launch.logging.reset()
with pytest.raises(ValueError):
launch.logging.launch_config.log_dir = 'not/a/real/dir'
with pytest.raises(ValueError):
launch.logging.launch_config.set_screen_format('default', screen_style='%')
with pytest.raises(ValueError):
launch.logging.launch_config.set_log_format(log_format='default', log_style='%')
def test_output_loggers_bad_configuration(log_dir):
"""Tests that output loggers setup throws at bad configuration."""
launch.logging.launch_config.reset()
launch.logging.launch_config.log_dir = log_dir
with pytest.raises(ValueError):
launch.logging.get_output_loggers('some-proc', 'not_an_alias')
with pytest.raises(ValueError):
launch.logging.get_output_loggers('some-proc', {'garbage': {'log'}})
with pytest.raises(ValueError):
launch.logging.get_output_loggers('some-proc', {'stdout': {'garbage'}})
@pytest.mark.parametrize('config,checks', [
('screen', {'stdout': {'screen'}, 'stderr': {'screen'}}),
('log', {'stdout': {'log'}, 'stderr': {'log', 'screen'}}),
('both', {'both': {'log', 'screen'}}),
('own_log', {
'stdout': {'own_log'},
'stderr': {'own_log'},
'both': {'own_log'},
}),
('full', {
'stdout': {'log', 'own_log', 'screen'},
'stderr': {'log', 'own_log', 'screen'},
'both': {'own_log'},
}),
(
{'stdout': {'screen', 'log'}, 'stderr': {'own_log'}},
{
'stdout': {'screen', 'log'},
'stderr': {'own_log'}
},
)
])
def test_output_loggers_configuration(capsys, log_dir, config, checks):
checks = {'stdout': set(), 'stderr': set(), 'both': set(), **checks}
launch.logging.reset()
launch.logging.launch_config.log_dir = log_dir
logger = launch.logging.get_logger('some-proc')
logger.addHandler(launch.logging.launch_config.get_screen_handler())
logger.addHandler(launch.logging.launch_config.get_log_file_handler())
logger.setLevel(logging.ERROR)
stdout_logger, stderr_logger = launch.logging.get_output_loggers('some-proc', config)
logger.debug('oops')
logger.error('baz')
stdout_logger.info('foo')
stderr_logger.info('bar')
capture = capsys.readouterr()
lines = list(reversed(capture.out.splitlines()))
assert '[ERROR] [some-proc]: baz' == lines.pop()
if 'screen' in (checks['stdout'] | checks['both']):
assert 'foo' == lines.pop()
if 'screen' in (checks['stderr'] | checks['both']):
assert 'bar' == lines.pop()
assert 0 == len(lines)
assert 0 == len(capture.err)
launch.logging.launch_config.get_log_file_handler().flush()
main_log_path = launch.logging.launch_config.get_log_file_path()
assert os.path.exists(main_log_path)
assert 0 != os.stat(main_log_path).st_size
with open(main_log_path, 'r') as f:
lines = list(reversed(f.readlines()))
assert re.match(r'[0-9]+\.[0-9]+ \[ERROR\] \[some-proc\]: baz', lines.pop()) is not None
if 'log' in (checks['stdout'] | checks['both']):
assert re.match(r'[0-9]+\.[0-9]+ foo', lines.pop()) is not None
if 'log' in (checks['stderr'] | checks['both']):
assert re.match(r'[0-9]+\.[0-9]+ bar', lines.pop()) is not None
assert 0 == len(lines)
if 'own_log' in (checks['stdout'] | checks['both']):
launch.logging.launch_config.get_log_file_handler('some-proc-stdout.log').flush()
own_log_path = launch.logging.launch_config.get_log_file_path('some-proc-stdout.log')
assert os.path.exists(own_log_path)
assert 0 != os.stat(own_log_path).st_size
with open(own_log_path, 'r') as f:
lines = f.read().splitlines()
assert 1 == len(lines)
assert 'foo' == lines[0]
else:
own_log_path = launch.logging.launch_config.get_log_file_path('some-proc-stdout.log')
assert (not os.path.exists(own_log_path) or 0 == os.stat(own_log_path).st_size)
if 'own_log' in (checks['stderr'] | checks['both']):
launch.logging.launch_config.get_log_file_handler('some-proc-stderr.log').flush()
own_log_path = launch.logging.launch_config.get_log_file_path('some-proc-stderr.log')
assert os.path.exists(own_log_path)
assert 0 != os.stat(own_log_path).st_size
with open(own_log_path, 'r') as f:
lines = f.read().splitlines()
assert 1 == len(lines)
assert 'bar' == lines[0]
else:
own_log_path = launch.logging.launch_config.get_log_file_path('some-proc-stderr.log')
assert (not os.path.exists(own_log_path) or 0 == os.stat(own_log_path).st_size)
if 'own_log' in checks['both']:
launch.logging.launch_config.get_log_file_handler('some-proc.log').flush()
own_log_path = launch.logging.launch_config.get_log_file_path('some-proc.log')
assert os.path.exists(own_log_path)
assert 0 != os.stat(own_log_path).st_size
with open(own_log_path, 'r') as f:
lines = f.read().splitlines()
assert 2 == len(lines)
assert 'foo' == lines[0]
assert 'bar' == lines[1]
else:
own_log_path = launch.logging.launch_config.get_log_file_path('some-proc.log')
assert (not os.path.exists(own_log_path) or 0 == os.stat(own_log_path).st_size)
def test_screen_default_format_with_timestamps(capsys, log_dir):
"""Test screen logging when using the default logs format with timestamps."""
launch.logging.reset()
launch.logging.launch_config.level = logging.DEBUG
launch.logging.launch_config.log_dir = log_dir
launch.logging.launch_config.set_screen_format('default_with_timestamp')
logger = launch.logging.get_logger('some-proc')
logger.addHandler(launch.logging.launch_config.get_screen_handler())
assert logger.getEffectiveLevel() == logging.DEBUG
logger.debug('foo')
capture = capsys.readouterr()
lines = capture.out.splitlines()
assert 1 == len(lines)
assert re.match(r'[0-9]+\.[0-9]+ \[DEBUG\] \[some-proc\]: foo', lines[0]) is not None
assert 0 == len(capture.err)
def test_screen_default_format(capsys):
"""Test screen logging when using the default logs format."""
launch.logging.reset()
logger = launch.logging.get_logger('some-proc')
logger.addHandler(launch.logging.launch_config.get_screen_handler())
assert logger.getEffectiveLevel() == logging.INFO
logger.info('bar')
capture = capsys.readouterr()
lines = capture.out.splitlines()
assert 1 == len(lines)
assert '[INFO] [some-proc]: bar' == lines[0]
assert 0 == len(capture.err)
def test_log_default_format(log_dir):
"""Test logging to the main log file when using the default logs format."""
launch.logging.reset()
launch.logging.launch_config.level = logging.WARN
launch.logging.launch_config.log_dir = log_dir
logger = launch.logging.get_logger('some-proc')
logger.addHandler(launch.logging.launch_config.get_log_file_handler())
assert logger.getEffectiveLevel() == logging.WARN
logger.error('baz')
launch.logging.launch_config.get_log_file_handler().flush()
assert os.path.exists(launch.logging.launch_config.get_log_file_path())
assert 0 != os.stat(launch.logging.launch_config.get_log_file_path()).st_size
with open(launch.logging.launch_config.get_log_file_path(), 'r') as f:
lines = f.readlines()
assert 1 == len(lines)
assert re.match(r'[0-9]+\.[0-9]+ \[ERROR\] \[some-proc\]: baz', lines[0]) is not None
def test_log_handler_factory(log_dir):
"""Test logging using a custom log handlers."""
class TestStreamHandler(launch.logging.handlers.Handler):
def __init__(self, output):
super().__init__()
self._output = output
def emit(self, record):
self._output.append(self.format(record))
import collections
outputs = collections.defaultdict(list)
launch.logging.reset()
launch.logging.launch_config.level = logging.WARN
launch.logging.launch_config.log_dir = log_dir
launch.logging.launch_config.log_handler_factory = (
lambda path, encoding=None: TestStreamHandler(
output=outputs[path]
)
)
logger = launch.logging.get_logger('some-proc')
logger.addHandler(launch.logging.launch_config.get_log_file_handler())
logger.debug('foo')
logger.error('baz')
path = launch.logging.launch_config.get_log_file_path()
assert path in outputs
assert len(outputs[path]) == 1
assert outputs[path][0].endswith('baz')
def fake_make_unique_log_dir(*, base_path):
# Passthrough; do not create the directory
return base_path
@mock.patch('launch.logging._make_unique_log_dir', mock.MagicMock(wraps=fake_make_unique_log_dir))
def test_get_logging_directory():
launch.logging.launch_config.reset()
os.environ.pop('ROS_LOG_DIR', None)
os.environ.pop('ROS_HOME', None)
home = pathlib.Path.home()
assert str(home)
# Default case without ROS_LOG_DIR or ROS_HOME being set (but with HOME)
default_dir = str(home / '.ros/log')
# This ensures that the launch config will check the environment again
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == default_dir
# Use $ROS_LOG_DIR if it is set
my_log_dir_raw = '/my/ros_log_dir'
my_log_dir = str(pathlib.Path(my_log_dir_raw))
os.environ['ROS_LOG_DIR'] = my_log_dir
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == my_log_dir
# Make sure it converts path separators when necessary
os.environ['ROS_LOG_DIR'] = my_log_dir_raw
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == my_log_dir
# Setting ROS_HOME won't change anything since ROS_LOG_DIR is used first
os.environ['ROS_HOME'] = '/this/wont/be/used'
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == my_log_dir
os.environ.pop('ROS_HOME', None)
# Empty is considered unset
os.environ['ROS_LOG_DIR'] = ''
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == default_dir
# Make sure '~' is expanded to the home directory
os.environ['ROS_LOG_DIR'] = '~/logdir'
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == str(home / 'logdir')
os.environ.pop('ROS_LOG_DIR', None)
# Without ROS_LOG_DIR, use $ROS_HOME/log
fake_ros_home = home / '.fakeroshome'
fake_ros_home_log_dir = str(fake_ros_home / 'log')
os.environ['ROS_HOME'] = str(fake_ros_home)
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == fake_ros_home_log_dir
# Make sure it converts path separators when necessary
my_ros_home_raw = '/my/ros/home'
my_ros_home_log_dir = str(pathlib.Path(my_ros_home_raw) / 'log')
os.environ['ROS_HOME'] = my_ros_home_raw
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == my_ros_home_log_dir
# Empty is considered unset
os.environ['ROS_HOME'] = ''
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == default_dir
# Make sure '~' is expanded to the home directory
os.environ['ROS_HOME'] = '~/.fakeroshome'
launch.logging.launch_config.log_dir = None
assert launch.logging.launch_config.log_dir == fake_ros_home_log_dir
os.environ.pop('ROS_HOME', None)
launch.logging.launch_config.reset()
| 39.243827 | 98 | 0.683366 | 1,790 | 12,715 | 4.627933 | 0.128492 | 0.120835 | 0.135321 | 0.175036 | 0.680106 | 0.636045 | 0.594037 | 0.550217 | 0.511347 | 0.445799 | 0 | 0.00538 | 0.181361 | 12,715 | 323 | 99 | 39.365325 | 0.79047 | 0.125757 | 0 | 0.461538 | 0 | 0 | 0.11094 | 0.005158 | 0 | 0 | 0 | 0 | 0.230769 | 1 | 0.051282 | false | 0 | 0.034188 | 0.004274 | 0.098291 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
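The fallback order that `test_get_logging_directory` exercises can be sketched as a pure function over an environment dict: `$ROS_LOG_DIR` first, then `$ROS_HOME/log`, then `~/.ros/log`, with empty values treated as unset and `~` expanded. This is an illustrative reimplementation, not launch's actual internals.

```python
import os
import pathlib


def resolve_log_dir(env):
    """Sketch of the log-directory resolution order tested above."""
    ros_log_dir = env.get('ROS_LOG_DIR', '')
    if ros_log_dir:  # empty string counts as unset
        return str(pathlib.Path(os.path.expanduser(ros_log_dir)))
    ros_home = env.get('ROS_HOME', '')
    if ros_home:
        return str(pathlib.Path(os.path.expanduser(ros_home)) / 'log')
    return str(pathlib.Path.home() / '.ros' / 'log')
```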
751f0dac6fc587c1fb5bfe7b2c73eb70a7dff82d | 690 | py | Python | code/C10/pkg/mysql.py | DataScienceCoding/Python-Web-Crawler | 7348f20b09fc89263ac4a7109b3890c64b687265 | [
"MIT"
] | 2 | 2020-12-17T06:33:48.000Z | 2020-12-26T01:21:13.000Z | code/C10/pkg/mysql.py | DataScienceCoding/Python-Web-Crawler | 7348f20b09fc89263ac4a7109b3890c64b687265 | [
"MIT"
] | null | null | null | code/C10/pkg/mysql.py | DataScienceCoding/Python-Web-Crawler | 7348f20b09fc89263ac4a7109b3890c64b687265 | [
"MIT"
] | null | null | null | import pymysql
from conf.cfg import Conf
class MysqlCli():
def __init__(self):
conf = Conf()
option = conf.section('mysql')
try:
self._db = pymysql.connect(host=option['host'],
port=int(option['port']),
user=option['username'],
password=option['password'])
self.cursor = self._db.cursor()
except Exception as e:
            print(e)  # Exception has no .reason() method; print the exception itself
# def execute(self, sql, *arg):
# return self._db.execute(sql, arg)
def __del__(self):
print('Mysql will be closed.')
self._db.close()
| 28.75 | 67 | 0.484058 | 70 | 690 | 4.6 | 0.542857 | 0.074534 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.394203 | 690 | 23 | 68 | 30 | 0.770335 | 0.097101 | 0 | 0 | 0 | 0 | 0.080645 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0.058824 | 0.117647 | 0 | 0.294118 | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
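The connect/cursor/close pattern in `MysqlCli` can be demonstrated without a MySQL server. This is an illustrative stand-in using stdlib `sqlite3` with the same shape; `SqliteCli` is a hypothetical name, not part of the project above.

```python
import sqlite3


class SqliteCli:
    """Same connect/cursor/close shape as MysqlCli, backed by sqlite3."""

    def __init__(self, path=':memory:'):
        self._db = sqlite3.connect(path)
        self.cursor = self._db.cursor()

    def execute(self, sql, args=()):
        # Parameterized execution, analogous to the commented-out
        # execute() stub in MysqlCli.
        return self.cursor.execute(sql, args)

    def close(self):
        self._db.close()


cli = SqliteCli()
cli.execute('CREATE TABLE t (x INTEGER)')
cli.execute('INSERT INTO t VALUES (?)', (42,))
row = cli.execute('SELECT x FROM t').fetchone()
cli.close()
```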
75218be058245f88ba545a6dbbbb4d1e406ae82c | 658 | py | Python | HookMandrill2Mongo.py | krisfremen/hookmandrill2mongo | 2533ba8ec3e7d38ecccd0529c1f2e86175233156 | [
"MIT"
] | null | null | null | HookMandrill2Mongo.py | krisfremen/hookmandrill2mongo | 2533ba8ec3e7d38ecccd0529c1f2e86175233156 | [
"MIT"
] | null | null | null | HookMandrill2Mongo.py | krisfremen/hookmandrill2mongo | 2533ba8ec3e7d38ecccd0529c1f2e86175233156 | [
"MIT"
] | null | null | null | #!/usr/bin/python
# -*- coding: utf-8 -*-
from flask import Flask, Response, json, request
from pymongo import MongoClient
application = app = Flask(__name__)
dbhost = "localhost"
dbname = "webhooks"
dbcoll = "mandrill"
@app.route('/', methods=['GET', 'POST'])
def hook():
try:
mongo = MongoClient(host=dbhost)
mandrillhookjson = json.loads(request.form['mandrill_events'])
for mandrillhook in mandrillhookjson:
mongo[dbname][dbcoll].insert({"hook": mandrillhook})
return "OK"
    except Exception as err:
return Response(status=400, response=str(err))
if __name__ == '__main__':
app.debug = True
app.run(host="0.0.0.0") | 25.307692 | 70 | 0.642857 | 77 | 658 | 5.324675 | 0.701299 | 0.014634 | 0.014634 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015267 | 0.203647 | 658 | 26 | 71 | 25.307692 | 0.767176 | 0.057751 | 0 | 0 | 0 | 0 | 0.11147 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.105263 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
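The webhook body handling above hinges on one detail: Mandrill posts a form field `mandrill_events` containing a JSON array, and each element becomes one `{'hook': ...}` document. A minimal sketch of that transformation with stdlib `json`, independent of Flask and MongoDB:

```python
import json


def events_to_docs(form):
    """Decode the 'mandrill_events' form field into one document per event."""
    hooks = json.loads(form['mandrill_events'])
    return [{'hook': h} for h in hooks]


# Simulated POST body (the payload shape is an assumption for illustration)
form = {'mandrill_events': json.dumps([{'event': 'send', 'msg': {'_id': 'abc'}}])}
docs = events_to_docs(form)
```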
7523dddeccce7c26beff51912adb1dd4fceff950 | 11,210 | py | Python | logging_automation.py | aws-samples/aws-centeralized-logging-with-datadog | f1fb778664915b645a0fec5c56a34654ac761233 | [
"MIT-0"
] | 3 | 2020-02-05T03:52:33.000Z | 2021-10-30T01:41:04.000Z | logging_automation.py | aws-samples/aws-centeralized-logging-with-datadog | f1fb778664915b645a0fec5c56a34654ac761233 | [
"MIT-0"
] | null | null | null | logging_automation.py | aws-samples/aws-centeralized-logging-with-datadog | f1fb778664915b645a0fec5c56a34654ac761233 | [
"MIT-0"
] | 6 | 2018-09-06T05:48:07.000Z | 2021-10-30T01:40:56.000Z | # Copyright 2008-2018 Amazon.com, Inc. or its affiliates. All Rights Reserved.
# Licensed under the Apache License, Version 2.0 (the "License"). You may not use this file except in compliance with the License. A copy of the License is located at
# http://aws.amazon.com/apache2.0/
# or in the "license" file accompanying this file. This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
from __future__ import print_function
import boto3
import botocore
import time
import sys
import argparse
import json
import os
import base64
encrypted_token = os.environ['DD_KMS_API_KEY']
ddApiKey = boto3.client('kms').decrypt(CiphertextBlob=base64.b64decode(encrypted_token))['Plaintext']
def lambda_handler(event, context):
access_to_billing = "DENY"
if event['existing_accountid'] is None:
print("Creating new account: " + event['account_name'] + " (" + event['account_email'] + ")")
print("********************************")
credentials = assume_role(event['masteraccount_id'], 'ST-S-Automation', None)
account_id = create_account(event['account_name'], event['account_email'], 'OrganizationAccountAccessRole', access_to_billing, credentials)
print("********************************")
print("Created acount: " + account_id)
print("********************************")
else:
account_id = event['existing_accountid']
print("Updating Shared Security account policy...")
credentials = assume_role(event['securityaccount_id'], 'ST-S-Automation', None)
update_policy(account_id, event['cloudtrail_bucket'], event['datadogcode_bucket'], credentials)
print("********************************")
print("Deploying resources from " + 'Member.yml' + " as " + 'Member' + " in " + 'us-east-1')
mastercredentials = assume_role(event['masteraccount_id'], 'ST-S-Automation', None)
credentials = assume_role(account_id, 'OrganizationAccountAccessRole', mastercredentials)
template = get_template('Member.yml')
stack = deploy_resources(template, 'Member', 'us-east-1', event['cloudtrail_bucket'], event['datadogcode_bucket'], event['securityaccount_id'], ddApiKey, credentials)
print("********************************")
print(stack)
print("********************************")
print("Resources deployed for account " + account_id)
def assume_role(account_id, account_role, credentials):
    if credentials is None:
        sts_client = boto3.client('sts')
    else:
        sts_client = boto3.client('sts', aws_access_key_id=credentials['AccessKeyId'], aws_secret_access_key=credentials['SecretAccessKey'], aws_session_token=credentials['SessionToken'])
    role_arn = 'arn:aws:iam::' + account_id + ':role/' + account_role
    # Retry until the role can be assumed; a newly created account's role
    # can take a short while to become available.
    assuming_role = True
    while assuming_role is True:
        try:
            assuming_role = False
            assumedRoleObject = sts_client.assume_role(RoleArn=role_arn, RoleSessionName="NewRole")
        except botocore.exceptions.ClientError as e:
            assuming_role = True
            print(e)
            time.sleep(10)
    return assumedRoleObject['Credentials']
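The `assume_role` helper above loops until STS succeeds, because the role in a freshly created account is not immediately assumable. The same pattern can be factored into a generic retry helper; the sketch below is illustrative only (the `retry` helper, its parameters, and the `flaky` example are assumptions, not part of the original source):

```python
import time


def retry(fn, exceptions, delay_seconds=10, max_attempts=None):
    """Call fn() until it stops raising one of `exceptions`.

    With max_attempts=None this retries forever, mirroring the
    while-loop in assume_role above.
    """
    attempts = 0
    while True:
        attempts += 1
        try:
            return fn()
        except exceptions:
            if max_attempts is not None and attempts >= max_attempts:
                raise
            time.sleep(delay_seconds)


# Demonstration: a call that fails twice before succeeding.
calls = {'n': 0}


def flaky():
    calls['n'] += 1
    if calls['n'] < 3:
        raise ValueError("not ready yet")
    return "ok"


result = retry(flaky, (ValueError,), delay_seconds=0)
```

A bounded `max_attempts` is usually preferable in a Lambda, where the unbounded loop above can otherwise run until the function times out.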
def create_account(account_name, account_email, account_role, access_to_billing, credentials):
    '''
    Create a new AWS account and add it to an organization
    '''
    client = boto3.client('organizations', aws_access_key_id=credentials['AccessKeyId'], aws_secret_access_key=credentials['SecretAccessKey'], aws_session_token=credentials['SessionToken'])
    try:
        create_account_response = client.create_account(Email=account_email, AccountName=account_name, RoleName=account_role, IamUserAccessToBilling=access_to_billing)
    except botocore.exceptions.ClientError as e:
        print(e)
        sys.exit(1)
    time.sleep(10)
    # Poll until the asynchronous account creation finishes.
    account_status = 'IN_PROGRESS'
    while account_status == 'IN_PROGRESS':
        create_account_status_response = client.describe_create_account_status(CreateAccountRequestId=create_account_response.get('CreateAccountStatus').get('Id'))
        print("Create account status " + str(create_account_status_response))
        account_status = create_account_status_response.get('CreateAccountStatus').get('State')
        if account_status == 'IN_PROGRESS':
            time.sleep(5)  # pause between polls instead of hammering the API
    if account_status == 'SUCCEEDED':
        account_id = create_account_status_response.get('CreateAccountStatus').get('AccountId')
    elif account_status == 'FAILED':
        print("Account creation failed: " + create_account_status_response.get('CreateAccountStatus').get('FailureReason'))
        sys.exit(1)
    root_id = client.list_roots().get('Roots')[0].get('Id')  # unused in this excerpt
    return account_id
def update_policy(account_id, cloudtrail_bucket, datadogcode_bucket, credentials):
    s3 = boto3.client('s3', aws_access_key_id=credentials['AccessKeyId'], aws_secret_access_key=credentials['SecretAccessKey'], aws_session_token=credentials['SessionToken'])
    iam = boto3.client('iam', aws_access_key_id=credentials['AccessKeyId'], aws_secret_access_key=credentials['SecretAccessKey'], aws_session_token=credentials['SessionToken'])

    # Update CloudTrail bucket policy: let the new account write its logs.
    cloudtrail_arn = "arn:aws:s3:::" + cloudtrail_bucket + "/AWSLogs/" + account_id + "/*"
    cloudtrail_response = s3.get_bucket_policy(Bucket=cloudtrail_bucket)
    cloudtrailpolicy = json.loads(cloudtrail_response['Policy'])
    for statement in cloudtrailpolicy['Statement']:
        if statement['Sid'] == 'AWSCloudTrailWrite':
            statement['Resource'].append(cloudtrail_arn)
    s3.put_bucket_policy(Bucket=cloudtrail_bucket, Policy=json.dumps(cloudtrailpolicy))

    # Update Datadog Lambda code bucket policy: grant the new account read access.
    newaccount_arn = "arn:aws:iam::" + account_id + ":root"
    datadog_response = s3.get_bucket_policy(Bucket=datadogcode_bucket)
    datadogcodepolicy = json.loads(datadog_response['Policy'])
    for statement in datadogcodepolicy['Statement']:
        if statement['Sid'] == 'CodeReadAccess':
            statement['Principal']['AWS'].append(newaccount_arn)
    s3.put_bucket_policy(Bucket=datadogcode_bucket, Policy=json.dumps(datadogcodepolicy))

    # Update LoggingLambdaRole inline policy: allow assuming the new account's role.
    account_arn = "arn:aws:iam::" + account_id + ":role/ST-S-Automation"
    assumerole_response = iam.get_role_policy(RoleName='LoggingLambdaRole', PolicyName='AssumeRole')
    assumerole_policy = assumerole_response['PolicyDocument']
    for statement in assumerole_policy['Statement']:
        if statement['Sid'] == 'AWSAssumeRole':
            statement['Resource'].append(account_arn)
    iam.put_role_policy(RoleName='LoggingLambdaRole', PolicyName='AssumeRole', PolicyDocument=json.dumps(assumerole_policy))
    print("Policies successfully updated")
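All three updates in `update_policy` follow the same read-modify-write shape: load the policy document, find the statement by `Sid`, append an ARN to a list, and write the document back. The core step can be isolated as a pure function and exercised without AWS; this is a sketch for illustration (the `append_to_statement` name is an assumption, and like the original it assumes the target field is already a JSON list):

```python
import json


def append_to_statement(policy_json, sid, key, arn):
    """Append `arn` to the list at `key` of every statement whose Sid is `sid`."""
    policy = json.loads(policy_json)
    for statement in policy['Statement']:
        if statement['Sid'] == sid:
            statement[key].append(arn)
    return json.dumps(policy)


# Hypothetical CloudTrail bucket policy with one write statement.
policy = json.dumps({
    'Statement': [{
        'Sid': 'AWSCloudTrailWrite',
        'Resource': ['arn:aws:s3:::example-logs/AWSLogs/111111111111/*'],
    }]
})
updated = append_to_statement(
    policy, 'AWSCloudTrailWrite', 'Resource',
    'arn:aws:s3:::example-logs/AWSLogs/222222222222/*')
```

One caveat the original shares: S3 collapses a single-element `Principal` list to a plain string, so `.append` on `Principal['AWS']` fails when only one account has been granted access; normalizing the value to a list before appending avoids that.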
def get_template(template_file):
    '''
    Read a template file and return the contents
    '''
    print("Reading resources from " + template_file)
    with open(template_file, "r") as f:  # context manager closes the file handle
        cf_template = f.read()
    return cf_template
def deploy_resources(template, stack_name, stack_region, cloudtrail_bucket, datadogcode_bucket, securityaccount_id, datadog_apikey, credentials):
    '''
    Create a CloudFormation stack of resources within the new account
    '''
    # Note: do not print datadog_apikey here; it is a decrypted secret and
    # would end up in CloudWatch Logs.
    datestamp = time.strftime("%d/%m/%Y")
    client = boto3.client('cloudformation',
                          aws_access_key_id=credentials['AccessKeyId'],
                          aws_secret_access_key=credentials['SecretAccessKey'],
                          aws_session_token=credentials['SessionToken'],
                          region_name=stack_region)
    print("Creating stack " + stack_name + " in " + stack_region)
    creating_stack = True
    while creating_stack is True:
        try:
            creating_stack = False
            create_stack_response = client.create_stack(
                StackName=stack_name,
                TemplateBody=template,
                Parameters=[
                    {
                        'ParameterKey': 'cloudtrailbucket',
                        'ParameterValue': cloudtrail_bucket
                    },
                    {
                        'ParameterKey': 'securityaccountid',
                        'ParameterValue': securityaccount_id
                    },
                    {
                        'ParameterKey': 'Datadogbucket',
                        'ParameterValue': datadogcode_bucket
                    },
                    {
                        'ParameterKey': 'DatadogAPIToken',
                        'ParameterValue': datadog_apikey
                    }
                ],
                NotificationARNs=[],
                Capabilities=[
                    'CAPABILITY_NAMED_IAM',
                ],
                OnFailure='ROLLBACK',
                Tags=[
                    {
                        'Key': 'ManagedResource',
                        'Value': 'True'
                    },
                    {
                        'Key': 'DeployDate',
                        'Value': datestamp
                    }
                ]
            )
        except botocore.exceptions.ClientError as e:
            creating_stack = True
            print(e)
            time.sleep(10)
    stack_building = True
    print("********************************")
    print("Stack creation in process...")
    print("********************************")
    print(create_stack_response)
    while stack_building is True:
        # The newest stack event tells us whether creation completed or rolled back.
        event_list = client.describe_stack_events(StackName=stack_name).get("StackEvents")
        stack_event = event_list[0]
        if (stack_event.get('ResourceType') == 'AWS::CloudFormation::Stack' and
                stack_event.get('ResourceStatus') == 'CREATE_COMPLETE'):
            stack_building = False
            print("Stack construction complete.")
        elif (stack_event.get('ResourceType') == 'AWS::CloudFormation::Stack' and
                stack_event.get('ResourceStatus') == 'ROLLBACK_COMPLETE'):
            stack_building = False
            print("Stack construction failed.")
            sys.exit(1)
        else:
            print(stack_event)
            print("********************************")
            print("Stack building . . .")
            print("********************************")
            time.sleep(10)
    stack = client.describe_stacks(StackName=stack_name)
    return stack
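The handler above dereferences a fixed set of keys from its invocation event. A sample test event can make that contract explicit; all values below are placeholders (account IDs, emails, and bucket names are hypothetical, not from the original source):

```python
# Hypothetical invocation payload for lambda_handler; every value is a placeholder.
sample_event = {
    "existing_accountid": None,   # None means "create a new account"
    "account_name": "example-member",
    "account_email": "aws+example@example.com",
    "masteraccount_id": "111111111111",
    "securityaccount_id": "222222222222",
    "cloudtrail_bucket": "example-cloudtrail-logs",
    "datadogcode_bucket": "example-datadog-code",
}

# Keys the handler reads, in order of use.
required_keys = {
    "existing_accountid", "account_name", "account_email",
    "masteraccount_id", "securityaccount_id",
    "cloudtrail_bucket", "datadogcode_bucket",
}
```

Passing an existing twelve-digit account ID as `existing_accountid` instead of `None` skips account creation and only updates policies and redeploys the stack.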
#!/usr/bin/env python
# Source: scripts/tweakstatus.py from the willingc/pc-bot repository (BSD-3-Clause).
"""
Manually tweak a talk's status, leaving a note about why.
"""
import sys
import argparse
import pycon_bot.mongo
from pycon_bot.models import TalkProposal, Note
p = argparse.ArgumentParser()
p.add_argument('talk_id', type=int)
p.add_argument('new_status', choices=[c[0] for c in TalkProposal.STATUSES])
p.add_argument('note')
p.add_argument('--dsn')
args = p.parse_args()
if not pycon_bot.mongo.connect(args.dsn):
    sys.stderr.write("Need to pass --dsn or set env[MONGO_DSN].")
    sys.exit(1)
t = TalkProposal.objects.get(talk_id=args.talk_id)
t.update(
    push__notes=Note(text=args.note),
    set__status=args.new_status,
)
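Invoked from the command line, the script takes a talk id, a new status drawn from `TalkProposal.STATUSES`, and a note, plus an optional `--dsn`. The argument handling can be sketched standalone; the status values below are stand-ins, since the real choices come from `pycon_bot.models.TalkProposal` (an assumption for illustration):

```python
import argparse

# Hypothetical stand-in for TalkProposal.STATUSES; real values live in pycon_bot.models.
STATUS_CHOICES = ['unreviewed', 'accepted', 'rejected']

p = argparse.ArgumentParser()
p.add_argument('talk_id', type=int)
p.add_argument('new_status', choices=STATUS_CHOICES)
p.add_argument('note')
p.add_argument('--dsn')

# Equivalent to: python scripts/tweakstatus.py 42 accepted "Accepted after re-review"
args = p.parse_args(['42', 'accepted', 'Accepted after re-review'])
```

Because `new_status` uses `choices=`, argparse rejects any status not in the list and prints a usage message, which is why the script derives the choices from the model rather than accepting free text.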