hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
67783d21b81dfe68ee03ea18059310e15699d017 | 335 | py | Python | backend/app/app/db/base.py | AndreyKlychnikov/startup-together | 76697246fc361b88d2df90e2f65e41edc9660b83 | [
"MIT"
] | null | null | null | backend/app/app/db/base.py | AndreyKlychnikov/startup-together | 76697246fc361b88d2df90e2f65e41edc9660b83 | [
"MIT"
] | null | null | null | backend/app/app/db/base.py | AndreyKlychnikov/startup-together | 76697246fc361b88d2df90e2f65e41edc9660b83 | [
"MIT"
] | null | null | null | # Import all the models, so that Base has them before being
# imported by Alembic
from app.db.base_class import Base # noqa
from app.models.item import Item # noqa
from app.models.user import User, UserProfile # noqa
from app.models.project import Project # noqa
from app.models.project_membership import ProjectMembership # noqa
| 41.875 | 67 | 0.78806 | 52 | 335 | 5.038462 | 0.480769 | 0.133588 | 0.167939 | 0.259542 | 0.183206 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155224 | 335 | 7 | 68 | 47.857143 | 0.925795 | 0.304478 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
e1d75d05f34adea02db579a8268f45fb1f221ec0 | 124 | py | Python | titan/react_module_pkg/appmodule/resources.py | mnieber/gen | 65f8aa4fb671c4f90d5cbcb1a0e10290647a31d9 | [
"MIT"
] | null | null | null | titan/react_module_pkg/appmodule/resources.py | mnieber/gen | 65f8aa4fb671c4f90d5cbcb1a0e10290647a31d9 | [
"MIT"
] | null | null | null | titan/react_module_pkg/appmodule/resources.py | mnieber/gen | 65f8aa4fb671c4f90d5cbcb1a0e10290647a31d9 | [
"MIT"
] | null | null | null | from dataclasses import dataclass
from titan.react_pkg.module import Module
@dataclass
class AppModule(Module):
pass
| 13.777778 | 41 | 0.798387 | 16 | 124 | 6.125 | 0.6875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153226 | 124 | 8 | 42 | 15.5 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.2 | 0.4 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 5 |
c04f8b3cf2e76f98ca4007bb30577c5edd0802d1 | 140 | py | Python | flappy2.py | benja971/Flappy | 596883cb6fdfa8771c857b4331d172c5e75b133c | [
"MIT"
] | null | null | null | flappy2.py | benja971/Flappy | 596883cb6fdfa8771c857b4331d172c5e75b133c | [
"MIT"
] | null | null | null | flappy2.py | benja971/Flappy | 596883cb6fdfa8771c857b4331d172c5e75b133c | [
"MIT"
] | null | null | null | import pygame, time
from random import *
from pygame.locals import *
from ctypes import windll
windll.shcore.SetProcessDpiAwareness(5) | 23.333333 | 39 | 0.792857 | 18 | 140 | 6.166667 | 0.611111 | 0.18018 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008403 | 0.15 | 140 | 6 | 39 | 23.333333 | 0.92437 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.8 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
fbfc123aa462ebd10d6c7d0fe63a178be4436886 | 63 | py | Python | exercises/CI/introduction_to_ci/.cic/application.py | lvl-up/cic | b35f4e01792fbcbd1667d31c0dd81baff1e54ad9 | [
"Apache-2.0"
] | 1 | 2018-11-16T11:46:13.000Z | 2018-11-16T11:46:13.000Z | exercises/CI/introduction_to_ci/.cic/application.py | lvl-up/cic | b35f4e01792fbcbd1667d31c0dd81baff1e54ad9 | [
"Apache-2.0"
] | 15 | 2018-11-14T11:54:43.000Z | 2019-01-14T11:31:28.000Z | exercises/CI/introduction_to_ci/.cic/application.py | lvl-up/cic | b35f4e01792fbcbd1667d31c0dd81baff1e54ad9 | [
"Apache-2.0"
] | 21 | 2018-11-11T21:34:44.000Z | 2019-04-08T07:52:15.000Z | class Application:
def working(self):
return False | 15.75 | 22 | 0.650794 | 7 | 63 | 5.857143 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 63 | 4 | 23 | 15.75 | 0.911111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.333333 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
220e531f99e945923cc9db1e5d9b673b4778b42d | 1,630 | py | Python | cart_venv/Lib/site-packages/tensorflow_estimator/python/estimator/api/_v2/estimator/experimental/__init__.py | juice1000/Synchronous-vs-Asynchronous-Learning-Tensorflow- | 654be60f7986ac9bb7ce1d080ddee377c3389f93 | [
"MIT"
] | 2 | 2019-08-04T20:28:14.000Z | 2019-10-27T23:26:42.000Z | cart_venv/Lib/site-packages/tensorflow_estimator/python/estimator/api/_v2/estimator/experimental/__init__.py | juice1000/Synchronous-vs-Asynchronous-Learning-Tensorflow- | 654be60f7986ac9bb7ce1d080ddee377c3389f93 | [
"MIT"
] | null | null | null | cart_venv/Lib/site-packages/tensorflow_estimator/python/estimator/api/_v2/estimator/experimental/__init__.py | juice1000/Synchronous-vs-Asynchronous-Learning-Tensorflow- | 654be60f7986ac9bb7ce1d080ddee377c3389f93 | [
"MIT"
] | 1 | 2020-11-04T03:16:29.000Z | 2020-11-04T03:16:29.000Z | # This file is MACHINE GENERATED! Do not edit.
# Generated by: tensorflow/python/tools/api/generator/create_python_api.py script.
"""Public API for tf.estimator.experimental namespace.
"""
from __future__ import print_function as _print_function
import sys as _sys
from tensorflow_estimator.python.estimator.canned.linear import LinearSDCA
from tensorflow_estimator.python.estimator.canned.rnn import RNNClassifier
from tensorflow_estimator.python.estimator.canned.rnn import RNNEstimator
from tensorflow_estimator.python.estimator.early_stopping import make_early_stopping_hook
from tensorflow_estimator.python.estimator.early_stopping import stop_if_higher_hook
from tensorflow_estimator.python.estimator.early_stopping import stop_if_lower_hook
from tensorflow_estimator.python.estimator.early_stopping import stop_if_no_decrease_hook
from tensorflow_estimator.python.estimator.early_stopping import stop_if_no_increase_hook
from tensorflow_estimator.python.estimator.export.export import build_raw_supervised_input_receiver_fn
from tensorflow_estimator.python.estimator.hooks.hooks import InMemoryEvaluatorHook
from tensorflow_estimator.python.estimator.hooks.hooks import make_stop_at_checkpoint_step_hook
from tensorflow_estimator.python.estimator.model_fn import call_logit_fn
del _print_function
from tensorflow.python.util import module_wrapper as _module_wrapper
if not isinstance(_sys.modules[__name__], _module_wrapper.TFModuleWrapper):
_sys.modules[__name__] = _module_wrapper.TFModuleWrapper(
_sys.modules[__name__], "estimator.experimental", public_apis=None, deprecation=False,
has_lite=False)
| 52.580645 | 102 | 0.870552 | 217 | 1,630 | 6.156682 | 0.35023 | 0.136228 | 0.206587 | 0.260479 | 0.575599 | 0.575599 | 0.47979 | 0.47979 | 0.276946 | 0.203593 | 0 | 0 | 0.072393 | 1,630 | 30 | 103 | 54.333333 | 0.883598 | 0.109202 | 0 | 0 | 1 | 0 | 0.015235 | 0.015235 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0.1 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
221382d44d93faf6fa4618404d927d212bb3e6cf | 278 | py | Python | python/love.py | kelvinrandu/pretty-cool-scripts | 5b1f9bf6e1f1474ee28b88c9f43a68cdf2f64cc7 | [
"MIT"
] | null | null | null | python/love.py | kelvinrandu/pretty-cool-scripts | 5b1f9bf6e1f1474ee28b88c9f43a68cdf2f64cc7 | [
"MIT"
] | null | null | null | python/love.py | kelvinrandu/pretty-cool-scripts | 5b1f9bf6e1f1474ee28b88c9f43a68cdf2f64cc7 | [
"MIT"
] | 1 | 2019-10-26T08:45:26.000Z | 2019-10-26T08:45:26.000Z | # if you want to run this script navigate to this file and type python love.py to run this script
print('\n'.join([''.join([('Love'[(x-y) % len('Love')] if ((x*0.05)**2+(y*0.1)**2-1)**3-(x*0.05)**2*(y*0.1)**3 <= 0else' ') for x in range(-30, 30)]) for y in range(30, -30, -1)])) | 139 | 180 | 0.579137 | 62 | 278 | 2.596774 | 0.483871 | 0.062112 | 0.111801 | 0.186335 | 0.099379 | 0.099379 | 0.099379 | 0 | 0 | 0 | 0 | 0.108333 | 0.136691 | 278 | 2 | 180 | 139 | 0.5625 | 0.341727 | 0 | 0 | 0 | 0 | 0.06044 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
222c9779e49fd09c5e750a2794fe4c4b7b71a509 | 184 | py | Python | asana_app/admin.py | maksim-shaidulin/asana | bd23a0a794ce7aafe31e9df1de9c7606f32504f4 | [
"MIT"
] | null | null | null | asana_app/admin.py | maksim-shaidulin/asana | bd23a0a794ce7aafe31e9df1de9c7606f32504f4 | [
"MIT"
] | null | null | null | asana_app/admin.py | maksim-shaidulin/asana | bd23a0a794ce7aafe31e9df1de9c7606f32504f4 | [
"MIT"
] | null | null | null | from django.contrib import admin
from .models import ProjectModel, UserModel, TaskModel
admin.site.register(ProjectModel)
admin.site.register(UserModel)
admin.site.register(TaskModel) | 30.666667 | 54 | 0.842391 | 23 | 184 | 6.73913 | 0.478261 | 0.174194 | 0.329032 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.070652 | 184 | 6 | 55 | 30.666667 | 0.906433 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
22314abf818437d3f79500efd0efab8c14131a4f | 196 | py | Python | client/core/summarization/gensim_impl.py | YourNorth/rezak-summarizator | 3ab2f4bf1044ea9654b4084a39030987e4b8bfe8 | [
"MIT"
] | 3 | 2020-03-28T16:48:10.000Z | 2020-12-01T17:18:55.000Z | client/core/summarization/gensim_impl.py | YourNorth/rezak-summarizator | 3ab2f4bf1044ea9654b4084a39030987e4b8bfe8 | [
"MIT"
] | 31 | 2020-03-20T17:53:08.000Z | 2021-03-10T11:48:11.000Z | client/core/summarization/gensim_impl.py | YourNorth/rezak-summarizator | 3ab2f4bf1044ea9654b4084a39030987e4b8bfe8 | [
"MIT"
] | 1 | 2020-03-20T05:01:16.000Z | 2020-03-20T05:01:16.000Z | from gensim.summarization.summarizer import summarize
from typing import List
def gensim_summary(text: str, options: dict) -> List[str]:
return summarize(text, options["ratio"], None, True)
| 28 | 58 | 0.760204 | 26 | 196 | 5.692308 | 0.692308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132653 | 196 | 6 | 59 | 32.666667 | 0.870588 | 0 | 0 | 0 | 0 | 0 | 0.02551 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.5 | 0.25 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 5 |
2272af611832005be9ac8ac076785ab2861bfe5f | 1,216 | py | Python | company/company_types_api.py | thinkstack-co/ConnectPyse | ded8b426250aee352598f33ad08b7bcc3c6a3017 | [
"MIT"
] | 23 | 2017-01-24T05:44:05.000Z | 2021-11-26T17:08:01.000Z | company/company_types_api.py | thinkstack-co/ConnectPyse | ded8b426250aee352598f33ad08b7bcc3c6a3017 | [
"MIT"
] | 10 | 2017-01-14T21:11:10.000Z | 2019-06-16T21:10:29.000Z | company/company_types_api.py | thinkstack-co/ConnectPyse | ded8b426250aee352598f33ad08b7bcc3c6a3017 | [
"MIT"
] | 16 | 2017-01-24T02:28:19.000Z | 2021-07-13T17:23:22.000Z | from ..cw_controller import CWController
# Class for /company/companies/types
from connectpyse.company import company_type
class CompanyTypeAPI(CWController):
def __init__(self):
self.module_url = 'company'
self.module = 'companies/types'
self._class = company_type.CompanyType
super().__init__() # instance gets passed to parent object
def get_company_types(self):
return super()._get()
def create_company_type(self, a_company_type):
return super()._create(a_company_type)
def get_company_types_count(self):
return super()._get_count()
def get_company_type_by_id(self, company_type_id):
return super()._get_by_id(company_type_id)
def delete_company_type_by_id(self, company_type_id):
super()._delete_by_id(company_type_id)
def replace_company_type(self, company_type_id):
pass
def update_company_type(self, company_type_id, key, value):
return super()._update(company_type_id, key, value)
def merge_company_type(self, a_company_type, target_company_type_id):
# return super()._merge(a_company_type, target_company_type_id)
pass
| 32.864865 | 74 | 0.698191 | 160 | 1,216 | 4.84375 | 0.25625 | 0.298065 | 0.150968 | 0.087742 | 0.406452 | 0.340645 | 0.162581 | 0.082581 | 0 | 0 | 0 | 0 | 0.217105 | 1,216 | 36 | 75 | 33.777778 | 0.814076 | 0.110197 | 0 | 0.083333 | 0 | 0 | 0.021113 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0.083333 | 0.083333 | 0.208333 | 0.708333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | 0 | 5 |
3f08ab7d63b37a03d0925338d6f662c5fe3a29dc | 645 | py | Python | ocrd/cli/bashlib.py | saw-leipzig/pyocrd | a346f472d3dfff1fecc14eda7b8088ab882ee8ca | [
"Apache-2.0"
] | null | null | null | ocrd/cli/bashlib.py | saw-leipzig/pyocrd | a346f472d3dfff1fecc14eda7b8088ab882ee8ca | [
"Apache-2.0"
] | null | null | null | ocrd/cli/bashlib.py | saw-leipzig/pyocrd | a346f472d3dfff1fecc14eda7b8088ab882ee8ca | [
"Apache-2.0"
] | null | null | null | import click
from ocrd.constants import BASHLIB_FILENAME
# ----------------------------------------------------------------------
# ocrd bashlib
# ----------------------------------------------------------------------
@click.group('bashlib')
def bashlib_cli():
"""
Work with bash library
"""
# ----------------------------------------------------------------------
# ocrd bashlib filename
# ----------------------------------------------------------------------
@bashlib_cli.command('filename')
def bashlib_filename():
"""
Dump the bash library filename for sourcing by shell scripts
"""
print(BASHLIB_FILENAME)
| 25.8 | 72 | 0.382946 | 43 | 645 | 5.627907 | 0.534884 | 0.247934 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.122481 | 645 | 24 | 73 | 26.875 | 0.427562 | 0.624806 | 0 | 0 | 0 | 0 | 0.072816 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | true | 0 | 0.285714 | 0 | 0.571429 | 0.142857 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
3f0a7dfad6f0c8be523a83f51df16d0f0f3d38e2 | 162 | py | Python | BasicApp/admin.py | bozcani/borsa-scraper-app | 56c767a9b6d6c9be40046aa03763f13465860f6f | [
"MIT"
] | 3 | 2020-02-06T10:05:29.000Z | 2020-04-18T10:11:37.000Z | BasicApp/admin.py | bozcani/borsa | 56c767a9b6d6c9be40046aa03763f13465860f6f | [
"MIT"
] | 10 | 2020-02-06T08:50:13.000Z | 2020-04-25T12:17:17.000Z | BasicApp/admin.py | bozcani/borsa-scraper-app | 56c767a9b6d6c9be40046aa03763f13465860f6f | [
"MIT"
] | 1 | 2020-02-06T07:40:06.000Z | 2020-02-06T07:40:06.000Z | from django.contrib import admin
from .models import StockMarket, Stock
# Register your models here.
admin.site.register(StockMarket)
admin.site.register(Stock)
| 23.142857 | 38 | 0.814815 | 22 | 162 | 6 | 0.545455 | 0.136364 | 0.257576 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.104938 | 162 | 6 | 39 | 27 | 0.910345 | 0.160494 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
3f261ec538f54cb9bca36cbd14217d86f3b90521 | 65 | py | Python | py_sod_metrics/__init__.py | lartpang/PySODMetrics | 4aa253a59aff71507f92daf2dffe539c5c97ce46 | [
"MIT"
] | 39 | 2020-11-22T03:59:08.000Z | 2022-03-06T05:15:35.000Z | py_sod_metrics/__init__.py | lartpang/PySODMetrics | 4aa253a59aff71507f92daf2dffe539c5c97ce46 | [
"MIT"
] | 2 | 2021-06-26T09:32:44.000Z | 2021-09-27T11:50:54.000Z | py_sod_metrics/__init__.py | lartpang/PySODMetrics | 4aa253a59aff71507f92daf2dffe539c5c97ce46 | [
"MIT"
] | 12 | 2020-11-30T06:48:55.000Z | 2022-02-17T03:51:23.000Z | # -*- coding: utf-8 -*-
from py_sod_metrics.sod_metrics import *
| 21.666667 | 40 | 0.692308 | 10 | 65 | 4.2 | 0.8 | 0.47619 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017857 | 0.138462 | 65 | 2 | 41 | 32.5 | 0.732143 | 0.323077 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
3f2cf4413a83d4ad6d18ba08f65dd82254755e21 | 23 | py | Python | python/testData/resolve/multiFile/emptyModuleNamesake/EmptyModuleNamesake.py | jnthn/intellij-community | 8fa7c8a3ace62400c838e0d5926a7be106aa8557 | [
"Apache-2.0"
] | 2 | 2019-04-28T07:48:50.000Z | 2020-12-11T14:18:08.000Z | python/testData/resolve/multiFile/emptyModuleNamesake/EmptyModuleNamesake.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 173 | 2018-07-05T13:59:39.000Z | 2018-08-09T01:12:03.000Z | python/testData/resolve/multiFile/emptyModuleNamesake/EmptyModuleNamesake.py | Cyril-lamirand/intellij-community | 60ab6c61b82fc761dd68363eca7d9d69663cfa39 | [
"Apache-2.0"
] | 2 | 2020-03-15T08:57:37.000Z | 2020-04-07T04:48:14.000Z | import re
<ref> | 11.5 | 13 | 0.478261 | 3 | 23 | 3.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.434783 | 23 | 2 | 13 | 11.5 | 0.846154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.5 | null | null | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
58ad3c0261d5ca538ca2527944ff877fd7ba9da7 | 10,472 | py | Python | data_collector/get_stock_real_time_indicator_from_xueqiu.py | tangzhuangkun/Investment_Decision_AdvisorV4.0 | 654c29d904c4c59e1fbda0b5d18803c21972b4b1 | [
"MIT"
] | null | null | null | data_collector/get_stock_real_time_indicator_from_xueqiu.py | tangzhuangkun/Investment_Decision_AdvisorV4.0 | 654c29d904c4c59e1fbda0b5d18803c21972b4b1 | [
"MIT"
] | null | null | null | data_collector/get_stock_real_time_indicator_from_xueqiu.py | tangzhuangkun/Investment_Decision_AdvisorV4.0 | 654c29d904c4c59e1fbda0b5d18803c21972b4b1 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
# author: Tang Zhuangkun
from bs4 import BeautifulSoup
import requests
import time
import sys
sys.path.append("..")
import parsers.disguise as disguise
import log.custom_logger as custom_logger
class GetStockRealTimeIndicatorFromXueqiu:
# 获取雪球上股票估值数据
# 1、获取实时的股票滚动市盈率
# 2、获取实时的股票市净率
# 3、获取实时的股票滚动股息率
def __init__(self):
pass
def parse_page_content(self, stock_id, header, proxy, indicator):
# 解析雪球网页信息
# stock_id: 股票代码(2位上市地+6位数字, 如 sz000596)
# page_address,地址
# header,伪装的UA
# proxy,伪装的IP
# indicator, 需要抓取的指标,如 pe_ttm,市盈率TTM;pb,市净率,dr_ttm,滚动股息率 等
# 返回 股票滚动市盈率
# 地址模板
page_address = 'https://xueqiu.com/S/' + stock_id
# 递归算法,处理异常
try:
# 增加连接重试次数,默认10次
requests.adapters.DEFAULT_RETRIES = 10
# 关闭多余的连接:requests使用了urllib3库,默认的http connection是keep-alive的,
# requests设置False关闭
s = requests.session()
s.keep_alive = False
# 忽略警告
requests.packages.urllib3.disable_warnings()
# 得到页面的信息
raw_page = requests.get(page_address, headers=header, proxies=proxy, verify=False, stream=False,
timeout=10).text
# 使用BeautifulSoup解析页面
bs = BeautifulSoup(raw_page, "html.parser")
# 300开头,创业板
if (stock_id[:5] == 'SZ300' or stock_id[:5] == 'sz300'):
if indicator == 'pe_ttm':
# 解析网页信息,获取动态市盈率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_pe_ttm = tr_list[4].find_all('span')[1].get_text()
# 日志记录
# msg = "Collected stock real time " + indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动市盈率
return real_time_pe_ttm
elif indicator == 'pb':
# 解析网页信息,获取市净率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_pb = tr_list[5].find_all('span')[1].get_text()
# 日志记录
# msg = "Collected stock real time " + indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动市盈率
return real_time_pb
elif indicator == 'dr_ttm':
# 解析网页信息,获取滚动股息率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_dr_ttm = tr_list[5].find_all('span')[3].get_text()
# 日志记录
# msg = "Collected stock real time " + indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动股息率
return real_time_dr_ttm
else:
# 日志记录
msg = "Unknown indicator"
custom_logger.CustomLogger().log_writter(msg, lev='warning')
# 返回 空
return -10000
# 688开头,科创板
elif (stock_id[:5] == 'SH688' or stock_id[:5] == 'sh688'):
if indicator == 'pe_ttm':
# 解析网页信息,获取动态市盈率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_pe_ttm = tr_list[4].find_all('span')[1].get_text()
# 日志记录
# msg = "Collected stock real time " + indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动市盈率
return real_time_pe_ttm
elif indicator == 'pb':
# 解析网页信息,获取市净率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_pb = tr_list[5].find_all('span')[1].get_text()
# 日志记录
# msg = "Collected stock real time " + indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动市盈率
return real_time_pb
elif indicator == 'dr_ttm':
# 解析网页信息,获取滚动股息率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_dr_ttm = tr_list[5].find_all('span')[3].get_text()
# 日志记录
# msg = "Collected stock real time "+ indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动股息率
return real_time_dr_ttm
else:
# 日志记录
msg = "Unknown indicator"
custom_logger.CustomLogger().log_writter(msg, lev='warning')
# 返回 空
return -10000
# 沪A,深A,中小板
else:
if indicator == 'pe_ttm':
# 解析网页信息,获取动态市盈率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_pe_ttm = tr_list[2].find_all('span')[3].get_text()
# 日志记录
# msg = "Collected stock real time " + indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动市盈率
return real_time_pe_ttm
elif indicator == 'pb':
# 解析网页信息,获取市净率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_pb = tr_list[3].find_all('span')[3].get_text()
# 日志记录
# msg = "Collected stock real time " + indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动市盈率
return real_time_pb
elif indicator == 'dr_ttm':
# 解析网页信息,获取滚动股息率
real_time_stock_info = bs.find('table', attrs={'class': 'quote-info'})
tr_list = real_time_stock_info.find_all('tr')
real_time_dr_ttm = tr_list[5].find_all('span')[1].get_text()
# 日志记录
# msg = "Collected stock real time "+ indicator + " from " + page_address
# custom_logger.CustomLogger().log_writter(msg, lev='debug')
# 返回 股票滚动股息率
return real_time_dr_ttm
else:
# 日志记录
msg = "Unknown indicator"
custom_logger.CustomLogger().log_writter(msg, lev='warning')
# 返回 空
return -10000
# 如果读取超时,重新在执行一遍解析页面
except requests.exceptions.ReadTimeout:
# 日志记录
msg = "Collected stock real time "+ indicator + " from " + page_address + ' ' + "ReadTimeout"
custom_logger.CustomLogger().log_writter(msg, lev='warning')
# 返回解析页面得到的股票指标
# return self.parse_page_content(stock_id, header, proxy, indicator)
return self.get_single_stock_real_time_indicator(stock_id, indicator)
# 如果连接请求超时,重新在执行一遍解析页面
except requests.exceptions.ConnectTimeout:
# 日志记录
msg = "Collected stock real time "+ indicator + " from " + page_address + ' ' + "ConnectTimeout"
custom_logger.CustomLogger().log_writter(msg, lev='warning')
# 返回解析页面得到的股票指标
# return self.parse_page_content(stock_id, header, proxy, indicator)
return self.get_single_stock_real_time_indicator(stock_id, indicator)
# 如果请求超时,重新在执行一遍解析页面
except requests.exceptions.Timeout:
# 日志记录
msg = "Collected stock real time "+ indicator + " from " + page_address + ' ' + "Timeout"
custom_logger.CustomLogger().log_writter(msg, lev='warning')
# 返回解析页面得到的股票指标
# return self.parse_page_content(stock_id, header, proxy, indicator)
return self.get_single_stock_real_time_indicator(stock_id, indicator)
except Exception as e:
# 日志记录
msg = page_address + str(e)
custom_logger.CustomLogger().log_writter(msg, lev='warning')
# 返回解析页面得到的股票指标
# return self.parse_page_content(stock_id, header, proxy, indicator)
return self.get_single_stock_real_time_indicator(stock_id, indicator)
def get_single_stock_real_time_indicator(self, stock_id, indicator):
# 从雪球网获取实时的股票滚动市盈率pe_ttm
# stock_id: 股票代码(2位上市地+6位数字, 如 sz000596)
# indicator, 需要抓取的指标,如 pe_ttm,市盈率TTM;pb,市净率,dr_ttm,滚动股息率 等
# 返回: 获取的实时的股票滚动市盈率 格式如 32.74
# 伪装,隐藏UA和IP
header, proxy = disguise.Disguise().assemble_header_proxy()
return self.parse_page_content(stock_id, header, proxy, indicator)
if __name__ == '__main__':
time_start = time.time()
go = GetStockRealTimeIndicatorFromXueqiu()
real_time_pe_ttm = go.get_single_stock_real_time_indicator('sh600315', 'pe_ttm')
print(real_time_pe_ttm)
'''
for i in range(1000):
real_time_pe_ttm = go.get_single_stock_real_time_pe_ttm('SH600519')
print(real_time_pe_ttm)
real_time_pe_ttm = go.get_single_stock_real_time_pe_ttm('SZ002505')
print(real_time_pe_ttm)
print()
'''
time_end = time.time()
print('Time Cost: ' + str(time_end - time_start)) | 42.918033 | 109 | 0.539057 | 1,115 | 10,472 | 4.773991 | 0.167713 | 0.093181 | 0.048845 | 0.057486 | 0.748074 | 0.732857 | 0.725343 | 0.714447 | 0.714447 | 0.714447 | 0 | 0.016575 | 0.360485 | 10,472 | 244 | 110 | 42.918033 | 0.778259 | 0.238636 | 0 | 0.59434 | 0 | 0 | 0.07846 | 0 | 0.028302 | 0 | 0 | 0 | 0 | 1 | 0.028302 | false | 0.009434 | 0.056604 | 0 | 0.254717 | 0.018868 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
58f89733efb2272a3958829c48d1a5bab37b6def | 11,682 | py | Python | test/test_markdown_front_matter.py | scop/pymarkdown | 562ba8f7857d99ba09e86e42de5a37ec6d9b2c30 | [
"MIT"
] | null | null | null | test/test_markdown_front_matter.py | scop/pymarkdown | 562ba8f7857d99ba09e86e42de5a37ec6d9b2c30 | [
"MIT"
] | null | null | null | test/test_markdown_front_matter.py | scop/pymarkdown | 562ba8f7857d99ba09e86e42de5a37ec6d9b2c30 | [
"MIT"
] | null | null | null | """
Tests for the optional front-matter processing
"""
import pytest
from .utils import act_and_assert
config_map = {"extensions": {"front-matter": {"enabled": True}}}
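The rules exercised by the tests below (exact `---` fences, `name: value` fields, 4-space continuation lines, abort on blank or malformed lines) can be sketched as a minimal front-matter splitter. This is an illustration of the rules only, not pymarkdown's actual implementation:

```python
import re


def split_front_matter(text):
    """Split ``---``-fenced front matter off a markdown document.

    Illustrative only -- mirrors the rules the tests below exercise,
    not pymarkdown's implementation.
    """
    lines = text.split("\n")
    if not lines or lines[0].rstrip(" ") != "---":
        return None, text  # opening fence must be exactly ---
    fields, current = {}, None
    for i, line in enumerate(lines[1:], start=1):
        if line.rstrip(" ") == "---":  # closing fence
            if not fields:  # must contain at least one field
                return None, text
            return fields, "\n".join(lines[i + 1:])
        if line.startswith("    "):  # 4+ spaces: continuation line
            if current is None:  # continuation without a field is bad
                return None, text
            fields[current] += "\n" + line.strip()
            continue
        match = re.match(r"^ {0,3}(\S[^:]*):\s*(.*)$", line)
        if match is None:  # blank or malformed line aborts the header
            return None, text
        current = match.group(1).lower()
        fields[current] = match.group(2)
    return None, text  # no closing fence


print(split_front_matter("---\ntitle: my document\n---\nbody"))
# -> ({'title': 'my document'}, 'body')
```

A `(None, text)` result means the document is handed to the markdown parser untouched, which is what the "abandoned header" tests assert.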


@pytest.mark.gfm
def test_front_matter_01():
    """
    Any whitespace before the three - characters causes it not to fire.
    fill in layer - test_thematic_breaks_020
    """

    # Arrange
    source_markdown = """ ---
Title: my document
---
"""
    expected_tokens = [
        "[tbreak(1,2):-: :---]",
        "[setext(3,1):-:3::(2,1)]",
        "[text(2,1):Title: my document:]",
        "[end-setext::]",
        "[BLANK(4,1):]",
    ]
    expected_gfm = """<hr />
<h2>Title: my document</h2>"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_02():
    """
    The starting character must be the '-' character, not the other two.
    """

    # Arrange
    source_markdown = """***
Title: my document
***
"""
    expected_tokens = [
        "[tbreak(1,1):*::***]",
        "[para(2,1):]",
        "[text(2,1):Title: my document:]",
        "[end-para:::False]",
        "[tbreak(3,1):*::***]",
        "[BLANK(4,1):]",
    ]
    expected_gfm = """<hr />
<p>Title: my document</p>
<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_03():
    """
    Everything between the start and end is parsed, but not as part of HTML output.
    """

    # Arrange
    source_markdown = """---
Title: my document
---
---
"""
    expected_tokens = [
        "[front-matter(1,1):---:---:['Title: my document']:{'title': 'my document'}]",
        "[tbreak(4,1):-::---]",
        "[BLANK(5,1):]",
    ]
    expected_gfm = """<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_04():
    """
    There must be an opening and closing boundary for it to be eligible as front matter.
    """

    # Arrange
    source_markdown = """---
"""
    expected_tokens = ["[tbreak(1,1):-::---]", "[BLANK(2,1):]"]
    expected_gfm = """<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown,
        expected_gfm,
        expected_tokens,
        show_debug=False,
        config_map=config_map,
    )


@pytest.mark.gfm
def test_front_matter_05():
    """
    There must be an opening and closing boundary for it to be eligible as front matter.
    """

    # Arrange
    source_markdown = """---
test:
"""
    expected_tokens = [
        "[tbreak(1,1):-::---]",
        "[para(2,1):]",
        "[text(2,1):test::]",
        "[end-para:::True]",
        "[BLANK(3,1):]",
    ]
    expected_gfm = """<hr />
<p>test:</p>"""

    # Act & Assert
    act_and_assert(
        source_markdown,
        expected_gfm,
        expected_tokens,
        show_debug=False,
        config_map=config_map,
    )


@pytest.mark.gfm
def test_front_matter_06():
    """
    There must be an opening and closing boundary for it to be eligible as front matter.
    Even if there is just a field name and no value.
    """

    # Arrange
    source_markdown = """---
test:
---
"""
    expected_tokens = [
        "[front-matter(1,1):---:---:['test:']:{'test': ''}]",
        "[BLANK(4,1):]",
    ]
    expected_gfm = """"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_07():
    """
    There must be an opening and closing boundary for it to be eligible as front matter.
    Containing a single line field name and value is normal.
    """

    # Arrange
    source_markdown = """---
test: abc
---
"""
    expected_tokens = [
        "[front-matter(1,1):---:---:['test: abc']:{'test': 'abc'}]",
        "[BLANK(4,1):]",
    ]
    expected_gfm = """"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_08():
    """
    There must be an opening and closing boundary for it to be eligible as front matter.
    In normal mode, a multiline field value is indicated by a second line that is indented
    by at least 4 characters.
    """

    # Arrange
    source_markdown = """---
test: abc
    def
---
"""
    expected_tokens = [
        "[front-matter(1,1):---:---:['test: abc', '    def']:{'test': 'abc\\ndef'}]",
        "[BLANK(5,1):]",
    ]
    expected_gfm = """"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_09():
    """
    If a field name does not start a new line or there isn't 4+ spaces at the start,
    the entire header is abandoned.
    """

    # Arrange
    source_markdown = """---
test: abc
 def
---
"""
    expected_tokens = [
        "[tbreak(1,1):-::---]",
        "[setext(4,1):-:3::(2,1)]",
        "[text(2,1):test: abc\ndef::\n \x02]",
        "[end-setext::]",
        "[BLANK(5,1):]",
    ]
    expected_gfm = """<hr />
<h2>test: abc
def</h2>"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_10():
    """
    A field name can be indented as many as 3 characters, as long as it ends with a ':'.
    """

    # Arrange
    source_markdown = """---
test: abc
   def:
---
"""
    expected_tokens = [
        "[front-matter(1,1):---:---:['test: abc', '   def:']:{'test': 'abc', 'def': ''}]",
        "[BLANK(5,1):]",
    ]
    expected_gfm = """"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_11():
    """
    A front matter element must contain at least one field name.
    """

    # Arrange
    source_markdown = """---
---"""
    expected_tokens = ["[tbreak(1,1):-::---]", "[tbreak(2,1):-::---]"]
    expected_gfm = """<hr />
<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_12():
    """
    A continuation without a field to associate it with is bad.
    """

    # Arrange
    source_markdown = """---
\a\a\a\acontinuation
---""".replace(
        "\a", " "
    )
    expected_tokens = [
        "[tbreak(1,1):-::---]",
        "[icode-block(2,5):    :]",
        "[text(2,5):continuation:]",
        "[end-icode-block:::False]",
        "[tbreak(3,1):-::---]",
    ]
    expected_gfm = """<hr />
<pre><code>continuation
</code></pre>
<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_13():
    """
    If a blank line is encountered before the end marker, the entire header is
    thrown out.
    """

    # Arrange
    source_markdown = """---

Title: my document
---
---
"""
    expected_tokens = [
        "[tbreak(1,1):-::---]",
        "[BLANK(2,1):]",
        "[setext(4,1):-:3::(3,1)]",
        "[text(3,1):Title: my document:]",
        "[end-setext::]",
        "[tbreak(5,1):-::---]",
        "[BLANK(6,1):]",
    ]
    expected_gfm = """<hr />
<h2>Title: my document</h2>
<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_14():
    """
    Any whitespace after the three - characters in the start boundary is acceptable.
    """

    # Arrange
    source_markdown = """---\a\a
Title: my document
---
""".replace(
        "\a", " "
    )
    expected_tokens = [
        "[front-matter(1,1):---  :---:['Title: my document']:{'title': 'my document'}]",
        "[BLANK(4,1):]",
    ]
    expected_gfm = """"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_15():
    """
    Any whitespace after the three - characters in the end boundary is acceptable.
    """

    # Arrange
    source_markdown = """---
Title: my document
---\a\a
""".replace(
        "\a", " "
    )
    expected_tokens = [
        "[front-matter(1,1):---:---  :['Title: my document']:{'title': 'my document'}]",
        "[BLANK(4,1):]",
    ]
    expected_gfm = """"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_16():
    """
    More than three - characters in the boundary is not acceptable.
    """

    # Arrange
    source_markdown = """----
Title: my document
----
"""
    expected_tokens = [
        "[tbreak(1,1):-::----]",
        "[setext(3,1):-:4::(2,1)]",
        "[text(2,1):Title: my document:]",
        "[end-setext::]",
        "[BLANK(4,1):]",
    ]
    expected_gfm = """<hr />\n<h2>Title: my document</h2>"""

    # Act & Assert
    act_and_assert(
        source_markdown,
        expected_gfm,
        expected_tokens,
        config_map=config_map,
        show_debug=True,
    )


@pytest.mark.gfm
def test_front_matter_17():
    """
    This is an extension of test_front_matter_13. If a blank line is encountered
    before the end marker, but after a field name, the entire header is still thrown out.
    """

    # Arrange
    source_markdown = """---
Title: my document

---
---
"""
    expected_tokens = [
        "[tbreak(1,1):-::---]",
        "[para(2,1):]",
        "[text(2,1):Title: my document:]",
        "[end-para:::True]",
        "[BLANK(3,1):]",
        "[tbreak(4,1):-::---]",
        "[tbreak(5,1):-::---]",
        "[BLANK(6,1):]",
    ]
    expected_gfm = """<hr />
<p>Title: my document</p>
<hr />
<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_18():
    """
    This is an extension of test_front_matter_13/17. If a blank line is encountered
    before the end marker, but after a field name and indented by at least 4 spaces,
    the front matter is still valid.
    """

    # Arrange
    source_markdown = """---
Title: my document
/a/a/a/a
---
---
""".replace(
        "/a", " "
    )
    expected_tokens = [
        "[front-matter(1,1):---:---:['Title: my document', '    ']:{'title': 'my document\\n'}]",
        "[tbreak(5,1):-::---]",
        "[BLANK(6,1):]",
    ]
    expected_gfm = """<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )


@pytest.mark.gfm
def test_front_matter_19():
    """
    This is an extension of test_front_matter_18. If a blank line is encountered
    before the end marker, but before a field name and indented by at least 4 spaces,
    the front matter is no longer valid.
    """

    # Arrange
    source_markdown = """---
/a/a/a/a
Title: my document
---
---
""".replace(
        "/a", " "
    )
    expected_tokens = [
        "[tbreak(1,1):-::---]",
        "[BLANK(2,1):    ]",
        "[setext(4,1):-:3::(3,1)]",
        "[text(3,1):Title: my document:]",
        "[end-setext::]",
        "[tbreak(5,1):-::---]",
        "[BLANK(6,1):]",
    ]
    expected_gfm = """<hr />
<h2>Title: my document</h2>
<hr />"""

    # Act & Assert
    act_and_assert(
        source_markdown, expected_gfm, expected_tokens, config_map=config_map
    )
4519b48d4f45c23fea4efd93cd840606c17cfae7 | 292 | py | Python | python/graphscope/experimental/nx/tests/algorithms/forward/test_reciprocity.py | wenyuanyu/GraphScope | a40ccaf70557e608d8b091eb25ab04477f99ce21 | [
"Apache-2.0"
] | 2 | 2020-12-15T08:42:10.000Z | 2022-01-14T09:13:16.000Z | python/graphscope/experimental/nx/tests/algorithms/forward/test_reciprocity.py | wenyuanyu/GraphScope | a40ccaf70557e608d8b091eb25ab04477f99ce21 | [
"Apache-2.0"
] | 1 | 2020-12-22T13:15:40.000Z | 2020-12-22T13:15:40.000Z | python/graphscope/experimental/nx/tests/algorithms/forward/test_reciprocity.py | wenyuanyu/GraphScope | a40ccaf70557e608d8b091eb25ab04477f99ce21 | [
"Apache-2.0"
] | 1 | 2021-11-23T03:40:43.000Z | 2021-11-23T03:40:43.000Z | import networkx.algorithms.tests.test_reciprocity
import pytest

from graphscope.experimental.nx.utils.compat import import_as_graphscope_nx

import_as_graphscope_nx(networkx.algorithms.tests.test_reciprocity,
                        decorators=pytest.mark.usefixtures("graphscope_session"))
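`import_as_graphscope_nx` re-runs networkx's upstream test module against graphscope's NX-compatible API. The general mechanism, copying a module's members into another namespace and optionally decorating its test functions, can be sketched as follows (a simplified illustration, not the actual compat helper):

```python
import importlib
import types


def import_module_members(module_name, target_namespace, decorator=None):
    """Copy a module's public members into ``target_namespace``,
    wrapping plain functions with ``decorator`` when one is given.

    A simplified illustration of the idea behind
    ``import_as_graphscope_nx``, not its actual implementation.
    """
    module = importlib.import_module(module_name)
    for name in dir(module):
        if name.startswith("_"):
            continue  # skip private and dunder members
        member = getattr(module, name)
        if decorator is not None and isinstance(member, types.FunctionType):
            member = decorator(member)
        target_namespace[name] = member


# Demonstration: re-export the stdlib ``string`` module's members
namespace = {}
import_module_members("string", namespace)
print(namespace["ascii_lowercase"])  # abcdefghijklmnopqrstuvwxyz
```

In the real helper, the decorator applies the `graphscope_session` fixture so every borrowed upstream test runs against a live session.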
# --- pava/implementation/natives/com/sun/xml/internal/ws/policy/PolicyMapUtil.py (laffra/pava, MIT) ---
def add_native_methods(clazz):
    def rejectAlternatives__com_sun_xml_internal_ws_policy_PolicyMap__(a0):
        raise NotImplementedError()

    clazz.rejectAlternatives__com_sun_xml_internal_ws_policy_PolicyMap__ = staticmethod(rejectAlternatives__com_sun_xml_internal_ws_policy_PolicyMap__)
# --- astrospam/ham.py (cdeil/sphinx-tutorial, MIT) ---
"""This is the ``ham.py`` docstring.
"""


class Ham(object):
    """This is the ``Ham`` class docstring.
    """
# --- models/cifar10_fitnet_multi.py (PetarV-/X-CNN, MIT) ---
'''
This implements the multilayer variant of the FitNet4 network
'''
from __future__ import print_function
from keras.preprocessing.image import ImageDataGenerator
from keras.models import Model
from keras.layers import Input, Dense, Activation, Flatten, Dropout, merge, MaxoutDense, Lambda
from keras.layers import Convolution2D, MaxPooling2D, ZeroPadding2D
from keras.layers.normalization import BatchNormalization
from keras.optimizers import Adam
from keras.regularizers import l2
from utils.preprocess import get_cifar
import sys
# Theano backend raises a RecursionError otherwise
sys.setrecursionlimit(50000)
batch_size = 128
nb_classes = 10
nb_epoch = 230
data_augmentation = True
# show the summary?
show_summary = True
# save the weights after training?
save_weights = True
weights_file = 'cifar10_fitnet_multi.h5'
# the data, shuffled and split between train and test sets
(X_train, Y_train), (X_test, Y_test) = get_cifar(p=1.0, append_test=False, use_c10=True)
print('X_train shape:', X_train.shape)
print(X_train.shape[0], 'train samples')
print(X_test.shape[0], 'test samples')
X_train = X_train.astype('float32')
X_test = X_test.astype('float32')
inputYUV = Input(shape=(3, 32, 32))
inputNorm = BatchNormalization(axis=1)(inputYUV)
inputY = Lambda(lambda x: x[:,0:1,:,:], output_shape=(1, 32, 32))(inputNorm)
inputU = Lambda(lambda x: x[:,1:2,:,:], output_shape=(1, 32, 32))(inputNorm)
inputV = Lambda(lambda x: x[:,2:3,:,:], output_shape=(1, 32, 32))(inputNorm)
inputY_drop = Dropout(0.2)(inputY)
inputU_drop = Dropout(0.2)(inputU)
inputV_drop = Dropout(0.2)(inputV)
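The three `Lambda` layers above split the YUV input along the channel axis; each slice uses `0:1` rather than `0` so that a singleton channel dimension is kept. The slicing itself is ordinary sequence indexing, shown here on plain nested lists:

```python
def split_yuv(image):
    # image: a (3, H, W)-shaped nested structure; slicing with 0:1
    # keeps the channel dimension, mirroring x[:, 0:1, :, :] above
    return image[0:1], image[1:2], image[2:3]


img = [["Y"], ["U"], ["V"]]  # stand-in for a (3, 1, 1) image
y, u, v = split_yuv(img)
print(y, u, v)  # [['Y']] [['U']] [['V']]
```

Keeping the singleton axis matters because the downstream `Convolution2D` layers expect 4D `(batch, channels, height, width)` tensors.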
h0_conv_Y_a = Convolution2D(24, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(inputY_drop)
h0_conv_Y_b = Convolution2D(24, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(inputY_drop)
h0_conv_Y = merge([h0_conv_Y_a, h0_conv_Y_b], mode='max', concat_axis=1)
h0_conv_Y = BatchNormalization(axis=1)(h0_conv_Y)
h0_conv_U_a = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(inputU_drop)
h0_conv_U_b = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(inputU_drop)
h0_conv_U = merge([h0_conv_U_a, h0_conv_U_b], mode='max', concat_axis=1)
h0_conv_U = BatchNormalization(axis=1)(h0_conv_U)
h0_conv_V_a = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(inputV_drop)
h0_conv_V_b = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(inputV_drop)
h0_conv_V = merge([h0_conv_V_a, h0_conv_V_b], mode='max', concat_axis=1)
h0_conv_V = BatchNormalization(axis=1)(h0_conv_V)
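Every convolutional layer in this model repeats the same maxout pattern: two parallel convolutions whose feature maps are combined with an element-wise max before batch normalization. The max itself, what `merge([a, b], mode='max')` computes, is simply:

```python
def elementwise_max(a, b):
    # Element-wise max of two equally shaped feature maps: the
    # operation performed by merge([a, b], mode='max') above
    return [[max(x, y) for x, y in zip(row_a, row_b)]
            for row_a, row_b in zip(a, b)]


a = [[1.0, -2.0], [3.0, 0.5]]
b = [[0.0, 4.0], [2.0, 0.5]]
print(elementwise_max(a, b))  # [[1.0, 4.0], [3.0, 0.5]]
```

The maxout pair acts as a learnable piecewise-linear activation, which is why no separate `Activation` layer follows these convolutions.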
h1_conv_Y_a = Convolution2D(24, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h0_conv_Y)
h1_conv_Y_b = Convolution2D(24, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h0_conv_Y)
h1_conv_Y = merge([h1_conv_Y_a, h1_conv_Y_b], mode='max', concat_axis=1)
h1_conv_Y = BatchNormalization(axis=1)(h1_conv_Y)
h1_conv_U_a = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h0_conv_U)
h1_conv_U_b = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h0_conv_U)
h1_conv_U = merge([h1_conv_U_a, h1_conv_U_b], mode='max', concat_axis=1)
h1_conv_U = BatchNormalization(axis=1)(h1_conv_U)
h1_conv_V_a = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h0_conv_V)
h1_conv_V_b = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h0_conv_V)
h1_conv_V = merge([h1_conv_V_a, h1_conv_V_b], mode='max', concat_axis=1)
h1_conv_V = BatchNormalization(axis=1)(h1_conv_V)
h2_conv_Y_a = Convolution2D(24, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h1_conv_Y)
h2_conv_Y_b = Convolution2D(24, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h1_conv_Y)
h2_conv_Y = merge([h2_conv_Y_a, h2_conv_Y_b], mode='max', concat_axis=1)
h2_conv_Y = BatchNormalization(axis=1)(h2_conv_Y)
h2_conv_U_a = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h1_conv_U)
h2_conv_U_b = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h1_conv_U)
h2_conv_U = merge([h2_conv_U_a, h2_conv_U_b], mode='max', concat_axis=1)
h2_conv_U = BatchNormalization(axis=1)(h2_conv_U)
h2_conv_V_a = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h1_conv_V)
h2_conv_V_b = Convolution2D(12, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h1_conv_V)
h2_conv_V = merge([h2_conv_V_a, h2_conv_V_b], mode='max', concat_axis=1)
h2_conv_V = BatchNormalization(axis=1)(h2_conv_V)
h3_conv_Y_a = Convolution2D(36, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h2_conv_Y)
h3_conv_Y_b = Convolution2D(36, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h2_conv_Y)
h3_conv_Y = merge([h3_conv_Y_a, h3_conv_Y_b], mode='max', concat_axis=1)
h3_conv_Y = BatchNormalization(axis=1)(h3_conv_Y)
h3_conv_U_a = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h2_conv_U)
h3_conv_U_b = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h2_conv_U)
h3_conv_U = merge([h3_conv_U_a, h3_conv_U_b], mode='max', concat_axis=1)
h3_conv_U = BatchNormalization(axis=1)(h3_conv_U)
h3_conv_V_a = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h2_conv_V)
h3_conv_V_b = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h2_conv_V)
h3_conv_V = merge([h3_conv_V_a, h3_conv_V_b], mode='max', concat_axis=1)
h3_conv_V = BatchNormalization(axis=1)(h3_conv_V)
h4_conv_Y_a = Convolution2D(36, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h3_conv_Y)
h4_conv_Y_b = Convolution2D(36, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h3_conv_Y)
h4_conv_Y = merge([h4_conv_Y_a, h4_conv_Y_b], mode='max', concat_axis=1)
h4_conv_Y = BatchNormalization(axis=1)(h4_conv_Y)
h4_conv_U_a = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h3_conv_U)
h4_conv_U_b = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h3_conv_U)
h4_conv_U = merge([h4_conv_U_a, h4_conv_U_b], mode='max', concat_axis=1)
h4_conv_U = BatchNormalization(axis=1)(h4_conv_U)
h4_conv_V_a = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h3_conv_V)
h4_conv_V_b = Convolution2D(18, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h3_conv_V)
h4_conv_V = merge([h4_conv_V_a, h4_conv_V_b], mode='max', concat_axis=1)
h4_conv_V = BatchNormalization(axis=1)(h4_conv_V)
poolY = MaxPooling2D(pool_size=(2, 2))(h4_conv_Y)
poolU = MaxPooling2D(pool_size=(2, 2))(h4_conv_U)
poolV = MaxPooling2D(pool_size=(2, 2))(h4_conv_V)
poolY = Dropout(0.2)(poolY)
poolU = Dropout(0.2)(poolU)
poolV = Dropout(0.2)(poolV)
# Inline connections
Y_to_Y_a = Convolution2D(36, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_Y_b = Convolution2D(36, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_Y = merge([Y_to_Y_a, Y_to_Y_b], mode='max', concat_axis=1)
Y_to_Y = BatchNormalization(axis=1)(Y_to_Y)
U_to_U_a = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_U_b = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_U = merge([U_to_U_a, U_to_U_b], mode='max', concat_axis=1)
U_to_U = BatchNormalization(axis=1)(U_to_U)
V_to_V_a = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_V_b = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_V = merge([V_to_V_a, V_to_V_b], mode='max', concat_axis=1)
V_to_V = BatchNormalization(axis=1)(V_to_V)
# Cross connections: Y <-> U, Y <-> V
Y_to_UV_a = Convolution2D(12, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_UV_b = Convolution2D(12, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_UV = merge([Y_to_UV_a, Y_to_UV_b], mode='max', concat_axis=1)
Y_to_UV = BatchNormalization(axis=1)(Y_to_UV)
U_to_Y_a = Convolution2D(12, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_Y_b = Convolution2D(12, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_Y = merge([U_to_Y_a, U_to_Y_b], mode='max', concat_axis=1)
U_to_Y = BatchNormalization(axis=1)(U_to_Y)
V_to_Y_a = Convolution2D(12, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_Y_b = Convolution2D(12, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_Y = merge([V_to_Y_a, V_to_Y_b], mode='max', concat_axis=1)
V_to_Y = BatchNormalization(axis=1)(V_to_Y)
Ymap = merge([Y_to_Y, U_to_Y, V_to_Y], mode='concat', concat_axis=1)
Umap = merge([U_to_U, Y_to_UV], mode='concat', concat_axis=1)
Vmap = merge([V_to_V, Y_to_UV], mode='concat', concat_axis=1)
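The concatenations above determine the input depth of the next block: each stream receives its own inline map plus the maps crossed in from the other streams. The channel arithmetic for this first cross-connection, with widths taken from the layer definitions above:

```python
# Maxout output widths of the 1x1 connection layers defined above
Y_inline, U_inline, V_inline = 36, 18, 18
Y_to_UV, U_to_Y, V_to_Y = 12, 12, 12

Ymap_channels = Y_inline + U_to_Y + V_to_Y  # 36 + 12 + 12 = 60
Umap_channels = U_inline + Y_to_UV          # 18 + 12 = 30
Vmap_channels = V_inline + Y_to_UV          # 18 + 12 = 30
print(Ymap_channels, Umap_channels, Vmap_channels)  # 60 30 30
```

These totals line up with the 60/30/30 filter counts of the `h5` convolutions that consume `Ymap`, `Umap` and `Vmap`.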
h5_conv_Y_a = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Ymap)
h5_conv_Y_b = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Ymap)
h5_conv_Y = merge([h5_conv_Y_a, h5_conv_Y_b], mode='max', concat_axis=1)
h5_conv_Y = BatchNormalization(axis=1)(h5_conv_Y)
h5_conv_U_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Umap)
h5_conv_U_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Umap)
h5_conv_U = merge([h5_conv_U_a, h5_conv_U_b], mode='max', concat_axis=1)
h5_conv_U = BatchNormalization(axis=1)(h5_conv_U)
h5_conv_V_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Vmap)
h5_conv_V_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Vmap)
h5_conv_V = merge([h5_conv_V_a, h5_conv_V_b], mode='max', concat_axis=1)
h5_conv_V = BatchNormalization(axis=1)(h5_conv_V)
h6_conv_Y_a = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h5_conv_Y)
h6_conv_Y_b = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h5_conv_Y)
h6_conv_Y = merge([h6_conv_Y_a, h6_conv_Y_b], mode='max', concat_axis=1)
h6_conv_Y = BatchNormalization(axis=1)(h6_conv_Y)
h6_conv_U_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h5_conv_U)
h6_conv_U_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h5_conv_U)
h6_conv_U = merge([h6_conv_U_a, h6_conv_U_b], mode='max', concat_axis=1)
h6_conv_U = BatchNormalization(axis=1)(h6_conv_U)
h6_conv_V_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h5_conv_V)
h6_conv_V_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h5_conv_V)
h6_conv_V = merge([h6_conv_V_a, h6_conv_V_b], mode='max', concat_axis=1)
h6_conv_V = BatchNormalization(axis=1)(h6_conv_V)
h7_conv_Y_a = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h6_conv_Y)
h7_conv_Y_b = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h6_conv_Y)
h7_conv_Y = merge([h7_conv_Y_a, h7_conv_Y_b], mode='max', concat_axis=1)
h7_conv_Y = BatchNormalization(axis=1)(h7_conv_Y)
h7_conv_U_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h6_conv_U)
h7_conv_U_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h6_conv_U)
h7_conv_U = merge([h7_conv_U_a, h7_conv_U_b], mode='max', concat_axis=1)
h7_conv_U = BatchNormalization(axis=1)(h7_conv_U)
h7_conv_V_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h6_conv_V)
h7_conv_V_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h6_conv_V)
h7_conv_V = merge([h7_conv_V_a, h7_conv_V_b], mode='max', concat_axis=1)
h7_conv_V = BatchNormalization(axis=1)(h7_conv_V)
h8_conv_Y_a = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h7_conv_Y)
h8_conv_Y_b = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h7_conv_Y)
h8_conv_Y = merge([h8_conv_Y_a, h8_conv_Y_b], mode='max', concat_axis=1)
h8_conv_Y = BatchNormalization(axis=1)(h8_conv_Y)
h8_conv_U_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h7_conv_U)
h8_conv_U_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h7_conv_U)
h8_conv_U = merge([h8_conv_U_a, h8_conv_U_b], mode='max', concat_axis=1)
h8_conv_U = BatchNormalization(axis=1)(h8_conv_U)
h8_conv_V_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h7_conv_V)
h8_conv_V_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h7_conv_V)
h8_conv_V = merge([h8_conv_V_a, h8_conv_V_b], mode='max', concat_axis=1)
h8_conv_V = BatchNormalization(axis=1)(h8_conv_V)
h9_conv_Y_a = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h8_conv_Y)
h9_conv_Y_b = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h8_conv_Y)
h9_conv_Y = merge([h9_conv_Y_a, h9_conv_Y_b], mode='max', concat_axis=1)
h9_conv_Y = BatchNormalization(axis=1)(h9_conv_Y)
h9_conv_U_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h8_conv_U)
h9_conv_U_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h8_conv_U)
h9_conv_U = merge([h9_conv_U_a, h9_conv_U_b], mode='max', concat_axis=1)
h9_conv_U = BatchNormalization(axis=1)(h9_conv_U)
h9_conv_V_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h8_conv_V)
h9_conv_V_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h8_conv_V)
h9_conv_V = merge([h9_conv_V_a, h9_conv_V_b], mode='max', concat_axis=1)
h9_conv_V = BatchNormalization(axis=1)(h9_conv_V)
h10_conv_Y_a = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h9_conv_Y)
h10_conv_Y_b = Convolution2D(60, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h9_conv_Y)
h10_conv_Y = merge([h10_conv_Y_a, h10_conv_Y_b], mode='max', concat_axis=1)
h10_conv_Y = BatchNormalization(axis=1)(h10_conv_Y)
h10_conv_U_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h9_conv_U)
h10_conv_U_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h9_conv_U)
h10_conv_U = merge([h10_conv_U_a, h10_conv_U_b], mode='max', concat_axis=1)
h10_conv_U = BatchNormalization(axis=1)(h10_conv_U)
h10_conv_V_a = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h9_conv_V)
h10_conv_V_b = Convolution2D(30, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h9_conv_V)
h10_conv_V = merge([h10_conv_V_a, h10_conv_V_b], mode='max', concat_axis=1)
h10_conv_V = BatchNormalization(axis=1)(h10_conv_V)
poolY = MaxPooling2D(pool_size=(2, 2))(h10_conv_Y)
poolU = MaxPooling2D(pool_size=(2, 2))(h10_conv_U)
poolV = MaxPooling2D(pool_size=(2, 2))(h10_conv_V)
poolY = Dropout(0.2)(poolY)
poolU = Dropout(0.2)(poolU)
poolV = Dropout(0.2)(poolV)
# Inline connections
Y_to_Y_a = Convolution2D(60, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_Y_b = Convolution2D(60, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_Y = merge([Y_to_Y_a, Y_to_Y_b], mode='max', concat_axis=1)
Y_to_Y = BatchNormalization(axis=1)(Y_to_Y)
U_to_U_a = Convolution2D(30, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_U_b = Convolution2D(30, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_U = merge([U_to_U_a, U_to_U_b], mode='max', concat_axis=1)
U_to_U = BatchNormalization(axis=1)(U_to_U)
V_to_V_a = Convolution2D(30, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_V_b = Convolution2D(30, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_V = merge([V_to_V_a, V_to_V_b], mode='max', concat_axis=1)
V_to_V = BatchNormalization(axis=1)(V_to_V)
# Cross connections: Y <-> U, Y <-> V
Y_to_UV_a = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_UV_b = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolY)
Y_to_UV = merge([Y_to_UV_a, Y_to_UV_b], mode='max', concat_axis=1)
Y_to_UV = BatchNormalization(axis=1)(Y_to_UV)
U_to_Y_a = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_Y_b = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolU)
U_to_Y = merge([U_to_Y_a, U_to_Y_b], mode='max', concat_axis=1)
U_to_Y = BatchNormalization(axis=1)(U_to_Y)
V_to_Y_a = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_Y_b = Convolution2D(18, 1, 1, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(poolV)
V_to_Y = merge([V_to_Y_a, V_to_Y_b], mode='max', concat_axis=1)
V_to_Y = BatchNormalization(axis=1)(V_to_Y)
Ymap = merge([Y_to_Y, U_to_Y, V_to_Y], mode='concat', concat_axis=1)
Umap = merge([U_to_U, Y_to_UV], mode='concat', concat_axis=1)
Vmap = merge([V_to_V, Y_to_UV], mode='concat', concat_axis=1)
h11_conv_Y_a = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Ymap)
h11_conv_Y_b = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Ymap)
h11_conv_Y = merge([h11_conv_Y_a, h11_conv_Y_b], mode='max', concat_axis=1)
h11_conv_Y = BatchNormalization(axis=1)(h11_conv_Y)
h11_conv_U_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Umap)
h11_conv_U_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Umap)
h11_conv_U = merge([h11_conv_U_a, h11_conv_U_b], mode='max', concat_axis=1)
h11_conv_U = BatchNormalization(axis=1)(h11_conv_U)
h11_conv_V_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Vmap)
h11_conv_V_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(Vmap)
h11_conv_V = merge([h11_conv_V_a, h11_conv_V_b], mode='max', concat_axis=1)
h11_conv_V = BatchNormalization(axis=1)(h11_conv_V)
h12_conv_Y_a = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h11_conv_Y)
h12_conv_Y_b = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h11_conv_Y)
h12_conv_Y = merge([h12_conv_Y_a, h12_conv_Y_b], mode='max', concat_axis=1)
h12_conv_Y = BatchNormalization(axis=1)(h12_conv_Y)
h12_conv_U_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h11_conv_U)
h12_conv_U_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h11_conv_U)
h12_conv_U = merge([h12_conv_U_a, h12_conv_U_b], mode='max', concat_axis=1)
h12_conv_U = BatchNormalization(axis=1)(h12_conv_U)
h12_conv_V_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h11_conv_V)
h12_conv_V_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h11_conv_V)
h12_conv_V = merge([h12_conv_V_a, h12_conv_V_b], mode='max', concat_axis=1)
h12_conv_V = BatchNormalization(axis=1)(h12_conv_V)
h13_conv_Y_a = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h12_conv_Y)
h13_conv_Y_b = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h12_conv_Y)
h13_conv_Y = merge([h13_conv_Y_a, h13_conv_Y_b], mode='max', concat_axis=1)
h13_conv_Y = BatchNormalization(axis=1)(h13_conv_Y)
h13_conv_U_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h12_conv_U)
h13_conv_U_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h12_conv_U)
h13_conv_U = merge([h13_conv_U_a, h13_conv_U_b], mode='max', concat_axis=1)
h13_conv_U = BatchNormalization(axis=1)(h13_conv_U)
h13_conv_V_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h12_conv_V)
h13_conv_V_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h12_conv_V)
h13_conv_V = merge([h13_conv_V_a, h13_conv_V_b], mode='max', concat_axis=1)
h13_conv_V = BatchNormalization(axis=1)(h13_conv_V)
h14_conv_Y_a = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h13_conv_Y)
h14_conv_Y_b = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h13_conv_Y)
h14_conv_Y = merge([h14_conv_Y_a, h14_conv_Y_b], mode='max', concat_axis=1)
h14_conv_Y = BatchNormalization(axis=1)(h14_conv_Y)
h14_conv_U_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h13_conv_U)
h14_conv_U_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h13_conv_U)
h14_conv_U = merge([h14_conv_U_a, h14_conv_U_b], mode='max', concat_axis=1)
h14_conv_U = BatchNormalization(axis=1)(h14_conv_U)
h14_conv_V_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h13_conv_V)
h14_conv_V_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h13_conv_V)
h14_conv_V = merge([h14_conv_V_a, h14_conv_V_b], mode='max', concat_axis=1)
h14_conv_V = BatchNormalization(axis=1)(h14_conv_V)
h15_conv_Y_a = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h14_conv_Y)
h15_conv_Y_b = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h14_conv_Y)
h15_conv_Y = merge([h15_conv_Y_a, h15_conv_Y_b], mode='max', concat_axis=1)
h15_conv_Y = BatchNormalization(axis=1)(h15_conv_Y)
h15_conv_U_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h14_conv_U)
h15_conv_U_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h14_conv_U)
h15_conv_U = merge([h15_conv_U_a, h15_conv_U_b], mode='max', concat_axis=1)
h15_conv_U = BatchNormalization(axis=1)(h15_conv_U)
h15_conv_V_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h14_conv_V)
h15_conv_V_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h14_conv_V)
h15_conv_V = merge([h15_conv_V_a, h15_conv_V_b], mode='max', concat_axis=1)
h15_conv_V = BatchNormalization(axis=1)(h15_conv_V)
h16_conv_Y_a = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h15_conv_Y)
h16_conv_Y_b = Convolution2D(96, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h15_conv_Y)
h16_conv_Y = merge([h16_conv_Y_a, h16_conv_Y_b], mode='max', concat_axis=1)
h16_conv_Y = BatchNormalization(axis=1)(h16_conv_Y)
h16_conv_U_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h15_conv_U)
h16_conv_U_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h15_conv_U)
h16_conv_U = merge([h16_conv_U_a, h16_conv_U_b], mode='max', concat_axis=1)
h16_conv_U = BatchNormalization(axis=1)(h16_conv_U)
h16_conv_V_a = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h15_conv_V)
h16_conv_V_b = Convolution2D(48, 3, 3, border_mode='same', init='glorot_uniform', W_regularizer=l2(0.0005))(h15_conv_V)
h16_conv_V = merge([h16_conv_V_a, h16_conv_V_b], mode='max', concat_axis=1)
h16_conv_V = BatchNormalization(axis=1)(h16_conv_V)
poolY = MaxPooling2D(pool_size=(8, 8))(h16_conv_Y)
poolU = MaxPooling2D(pool_size=(8, 8))(h16_conv_U)
poolV = MaxPooling2D(pool_size=(8, 8))(h16_conv_V)
poolY = Dropout(0.2)(poolY)
poolU = Dropout(0.2)(poolU)
poolV = Dropout(0.2)(poolV)
concat_map = merge([poolY, poolU, poolV], mode='concat', concat_axis=1)
h16 = Flatten()(concat_map)
h17 = MaxoutDense(500, nb_feature=5, init='glorot_uniform', W_regularizer=l2(0.0005))(h16)
h17 = BatchNormalization(axis=1)(h17)
h17_drop = Dropout(0.2)(h17)
out = Dense(nb_classes, activation='softmax', init='glorot_uniform', W_regularizer=l2(0.0005))(h17_drop)
model = Model(input=inputYUV, output=out)
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])

if show_summary:
    print(model.summary())

if not data_augmentation:
    print('Not using data augmentation.')
    model.fit(X_train, Y_train,
              batch_size=batch_size,
              nb_epoch=nb_epoch,
              validation_data=(X_test, Y_test),
              shuffle=True,
              verbose=2)
else:
    print('Using real-time data augmentation.')
    # this will do preprocessing and realtime data augmentation
    datagen = ImageDataGenerator(
        featurewise_center=False,  # set input mean to 0 over the dataset
        samplewise_center=False,  # set each sample mean to 0
        featurewise_std_normalization=False,  # divide inputs by std of the dataset
        samplewise_std_normalization=False,  # divide each input by its std
        zca_whitening=False,  # apply ZCA whitening
        rotation_range=0,  # randomly rotate images in the range (degrees, 0 to 180)
        width_shift_range=0.1,  # randomly shift images horizontally (fraction of total width)
        height_shift_range=0.1,  # randomly shift images vertically (fraction of total height)
        horizontal_flip=True,  # randomly flip images
        vertical_flip=False)  # randomly flip images

    # compute quantities required for featurewise normalization
    # (std, mean, and principal components if ZCA whitening is applied)
    datagen.fit(X_train)

    # fit the model on the batches generated by datagen.flow()
    model.fit_generator(datagen.flow(X_train, Y_train,
                                     batch_size=batch_size),
                        samples_per_epoch=X_train.shape[0],
                        nb_epoch=nb_epoch,
                        validation_data=(X_test, Y_test),
                        verbose=2)

if save_weights:
    model.save_weights(weights_file)
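Every stage of the model above repeats one micro-pattern: two parallel convolutions whose feature maps are combined element-wise with `merge(..., mode='max')` (a maxout unit), then batch-normalized. The Keras 1.x layers hide the arithmetic, so here is a minimal pure-Python sketch of what a single maxout pair computes, with hypothetical scalar weights and no Keras dependency:

```python
def maxout_pair(x, w_a, b_a, w_b, b_b):
    """Element-wise max of two affine branches: max(w_a*x + b_a, w_b*x + b_b)."""
    branch_a = [w_a * xi + b_a for xi in x]
    branch_b = [w_b * xi + b_b for xi in x]
    return [max(a, b) for a, b in zip(branch_a, branch_b)]


# Two branches with opposing slopes recover the absolute-value function,
# one of the piecewise-linear shapes a maxout unit can learn.
print(maxout_pair([-2.0, 0.0, 3.0], 1.0, 0.0, -1.0, 0.0))  # → [2.0, 0.0, 3.0]
```

In the model the same idea is applied per spatial position and per feature map; splitting the channels into two branches and taking the max is what lets the network learn its own activation shape instead of using a fixed ReLU.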
# --- gym_softrobot/envs/simple_control/__init__.py (nmnaughton/gym-softrobot, MIT) ---
from gym_softrobot.envs.simple_control.inertial_pull import InertialPullEnv
# --- tests/unit/hardware/TestRotary.py (rakhimov/rtk, BSD-3-Clause) ---
#!/usr/bin/env python -O
"""
This is the test class for testing Rotary Switch module algorithms and
models.
"""
# -*- coding: utf-8 -*-
#
# tests.unit.TestRotary.py is part of The RTK Project
#
# All rights reserved.
import sys
from os.path import dirname
sys.path.insert(0, dirname(dirname(dirname(__file__))) + "/rtk", )
import unittest
from nose.plugins.attrib import attr
from hardware.component.switch.Rotary import Rotary
__author__ = 'Andrew Rowland'
__email__ = 'andrew.rowland@reliaqual.com'
__organization__ = 'ReliaQual Associates, LLC'
__copyright__ = 'Copyright 2015 Andrew "Weibullguy" Rowland'
class TestRotaryModel(unittest.TestCase):
"""
Class for testing the Rotary Switch data model class.
"""
def setUp(self):
"""
Setup the test fixture for the Rotary Switch class.
"""
self.DUT = Rotary()
@attr(all=True, unit=True)
def test_create(self):
"""
(TestRotary) __init__ should return a Rotary Switch data model
"""
self.assertTrue(isinstance(self.DUT, Rotary))
# Verify Hardware class was properly initialized.
self.assertEqual(self.DUT.revision_id, None)
self.assertEqual(self.DUT.category_id, 0)
# Verify the Switch class was properly initialized.
self.assertEqual(self.DUT.category, 7)
self.assertEqual(self.DUT.quality, 0)
self.assertEqual(self.DUT.q_override, 0.0)
self.assertEqual(self.DUT.piE, 0.0)
# Verify the Rotary Switch class was properly initialized.
self.assertEqual(self.DUT.subcategory, 69)
self.assertEqual(self.DUT.construction, 0)
self.assertEqual(self.DUT.load_type, 0)
self.assertEqual(self.DUT.n_contacts, 0)
self.assertEqual(self.DUT.cycles_per_hour, 0.0)
self.assertEqual(self.DUT.piCYC, 0.0)
self.assertEqual(self.DUT.piL, 0.0)
@attr(all=True, unit=True)
def test_set_attributes(self):
"""
(TestRotary) set_attributes should return a 0 error code on success
"""
_values = (0, 32, 'Alt Part #', 'Attachments', 'CAGE Code',
'Comp Ref Des', 0.0, 0.0, 0.0, 'Description', 100.0, 0,
0, 'Figure #', 50.0, 'LCN', 1, 0, 10.0, 'Name', 'NSN', 0,
'Page #', 0, 0, 'Part #', 1, 'Ref Des', 1.0, 0,
'Remarks', 0.0, 'Spec #', 0, 30.0, 30.0, 0.0, 2014,
1.0, 155.0, -25.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
0.0, 1.0,
0.0, 1.0, 1.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
1, 0.0, '', 0.0, 0.0, 0.0, 1, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0,
0, 0, 1, 0.0,
0, 0, 0.0, 30.0, 0.0, 358.0,
1.0, 125.0, 0.01, 2.0, 1.0, 1.0, 1.5, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3, 2, 2, 4)
(_error_code,
_error_msg) = self.DUT.set_attributes(_values)
self.assertEqual(_error_code, 0)
self.assertEqual(self.DUT.quality, 3)
self.assertEqual(self.DUT.construction, 2)
self.assertEqual(self.DUT.load_type, 2)
self.assertEqual(self.DUT.n_contacts, 4)
self.assertEqual(self.DUT.cycles_per_hour, 2.0)
self.assertEqual(self.DUT.piCYC, 1.0)
self.assertEqual(self.DUT.piL, 1.0)
@attr(all=True, unit=True)
def test_set_attributes_missing_index(self):
"""
(TestRotary) set_attributes should return a 40 error code when too few items are passed
"""
_values = (0, 32, 'Alt Part #', 'Attachments', 'CAGE Code',
'Comp Ref Des', 0.0, 0.0, 0.0, 'Description', 100.0, 0,
0, 'Figure #', 50.0, 'LCN', 1, 0, 10.0, 'Name', 'NSN', 0,
'Page #', 0, 0, 'Part #', 1, 'Ref Des', 1.0, 0,
'Remarks', 0.0, 'Spec #', 0, 30.0, 30.0, 0.0, 2014,
1.0, 155.0, -25.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
0.0, 1.0,
0.0, 1.0, 1.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
1, 0.0, '', 0.0, 0.0, 0.0, 1, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0,
0, 0, 1, 0.0,
0, 0, 0.0, 30.0, 0.0, 358.0,
1.0, 0.01, 2.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3)
(_error_code,
_error_msg) = self.DUT.set_attributes(_values)
self.assertEqual(_error_code, 40)
@attr(all=True, unit=True)
def test_set_attributes_type_error(self):
"""
(TestRotary) set_attributes should return a 10 error code when the wrong type is passed
"""
_values = (0, 32, 'Alt Part #', 'Attachments', 'CAGE Code',
'Comp Ref Des', 0.0, 0.0, 0.0, 'Description', 100.0, 0,
0, 'Figure #', 50.0, 'LCN', 1, 0, 10.0, 'Name', 'NSN', 0,
'Page #', 0, 0, 'Part #', 1, 'Ref Des', 1.0, 0,
'Remarks', 0.0, 'Spec #', 0, 30.0, 30.0, 0.0, 2014,
1.0, 155.0, -25.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
0.0, 1.0,
0.0, 1.0, 1.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
1, 0.0, '', 0.0, 0.0, 0.0, 1, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0,
0, 0, 1, 0.0,
0, 0, 0.0, 30.0, 0.0, 358.0,
1.0, 0.0, 0.01, 2.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3, '')
(_error_code,
_error_msg) = self.DUT.set_attributes(_values)
self.assertEqual(_error_code, 10)
@attr(all=True, unit=True)
def test_get_attributes(self):
"""
(TestRotary) get_attributes should return a tuple of attribute values
"""
_values = (None, None, '', '', '', '', 0.0, 0.0, 0.0, '', 100.0, 0, 0,
'', 50.0, '', 1, 0, 10.0, '', '', 0, '', 0, 0, '', 1, '',
1.0, 0, '', 0.0, '', 0, 30.0, 30.0, 0.0, 2014,
1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 0.0, 1.0,
0.0, 1.0, 1.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1,
0.0, {}, 0.0, 0.0, 0.0, 1, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0,
0, 0,
0.0, 30.0, 0.0, 30.0,
0, 0.0, 0.0, 0.0, '', 0, 0, 0, 0.0, 0.0, 0.0)
self.assertEqual(self.DUT.get_attributes(), _values)
@attr(all=True, unit=True)
def test_attribute_sanity(self):
"""
(TestRotary) get_attributes(set_attributes(values)) == values
"""
_in_values = (0, 32, 'Alt Part #', 'Attachments', 'CAGE Code',
'Comp Ref Des', 0.0, 0.0, 0.0, 'Description', 100.0, 0,
0, 'Figure #', 50.0, 'LCN', 1, 0, 10.0, 'Name', 'NSN', 0,
'Page #', 0, 0, 'Part #', 1, 'Ref Des', 1.0, 0,
'Remarks', 0.0, 'Spec #', 0, 30.0, 30.0, 0.0, 2014,
1.0, 155.0, -25.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
0.0, 1.0,
0.0, 1.0, 1.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
1, 0.0, '', 0.0, 0.0, 0.0, 1, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0,
0, 0, 1, 0.0,
0, 0, 0.0, 30.0, 0.0, 358.0,
1.0, 125.0, 0.01, 2.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 3, 5, 1, 8)
_out_values = (0, 32, 'Alt Part #', 'Attachments', 'CAGE Code',
'Comp Ref Des', 0.0, 0.0, 0.0, 'Description', 100.0, 0,
0, 'Figure #', 50.0, 'LCN', 1, 0, 10.0, 'Name', 'NSN',
0, 'Page #', 0, 0, 'Part #', 1, 'Ref Des', 1.0, 0,
'Remarks', 0.0, 'Spec #', 0, 30.0, 30.0, 0.0, 2014,
1.0, 155.0, -25.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0,
0.0, 1.0,
0.0, 1.0, 1.0, 0.0, 0.0, 0, 0.0, 0.0, 0.0, 0.0, 0.0,
0.0, 1, 0.0, '', 0.0, 0.0, 0.0, 1, 0.0, 0.0, 0.0, 0.0,
0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 0.0,
0.0, 0,
0, 0, 0.0, 30.0, 0.0, 358.0, 3, 1.0, 125.0, 0.01, '',
5, 1, 8, 2.0, 1.0, 1.0)
self.DUT.set_attributes(_in_values)
_result = self.DUT.get_attributes()
self.assertEqual(_result, _out_values)
@attr(all=True, unit=True)
def test_calculate_217_count(self):
"""
(TestRotary) calculate_part should return False on success when calculating MIL-HDBK-217F parts count results
"""
self.DUT.environment_active = 5
self.DUT.hazard_rate_type = 1
self.DUT.quality = 1
self.assertFalse(self.DUT.calculate_part())
self.assertEqual(self.DUT.hazard_rate_model['equation'],
'lambdab * piQ')
self.assertEqual(self.DUT.hazard_rate_model['lambdab'], 9.5)
self.assertEqual(self.DUT.hazard_rate_model['piQ'], 1.0)
self.assertAlmostEqual(self.DUT.hazard_rate_active, 9.5E-06)
@attr(all=True, unit=True)
def test_calculate_217_stress_resistive(self):
"""
(TestRotary) calculate_part should return False on success when calculating MIL-HDBK-217F stress results with a resistive load
"""
self.DUT.environment_active = 2
self.DUT.hazard_rate_type = 2
self.DUT.quality = 1
self.DUT.construction = 1
self.DUT.load_type = 1
self.DUT.n_contacts = 8
self.DUT.cycles_per_hour = 2
self.DUT.operating_current = 0.023
self.DUT.rated_current = 0.05
self.assertFalse(self.DUT.calculate_part())
self.assertEqual(self.DUT.hazard_rate_model['equation'],
'lambdab * piCYC * piL * piE')
self.assertAlmostEqual(self.DUT.hazard_rate_model['lambdab'], 0.00694)
self.assertEqual(self.DUT.hazard_rate_model['piCYC'], 2.0)
self.assertAlmostEqual(self.DUT.hazard_rate_model['piL'], 1.3918378)
self.assertEqual(self.DUT.hazard_rate_model['piE'], 3.0)
self.assertAlmostEqual(self.DUT.hazard_rate_active, 5.7956124E-08)
@attr(all=True, unit=True)
def test_calculate_217_stress_inductive(self):
"""
(TestRotary) calculate_part should return False on success when calculating MIL-HDBK-217F stress results with an inductive load
"""
self.DUT.environment_active = 2
self.DUT.hazard_rate_type = 2
self.DUT.quality = 2
self.DUT.construction = 2
self.DUT.load_type = 2
self.DUT.n_contacts = 8
self.DUT.cycles_per_hour = 2
self.DUT.operating_current = 0.023
self.DUT.rated_current = 0.05
self.assertFalse(self.DUT.calculate_part())
self.assertEqual(self.DUT.hazard_rate_model['equation'],
'lambdab * piCYC * piL * piE')
self.assertAlmostEqual(self.DUT.hazard_rate_model['lambdab'], 0.00694)
self.assertEqual(self.DUT.hazard_rate_model['piCYC'], 2.0)
self.assertAlmostEqual(self.DUT.hazard_rate_model['piL'], 3.7527916)
self.assertEqual(self.DUT.hazard_rate_model['piE'], 3.0)
self.assertAlmostEqual(self.DUT.hazard_rate_active, 1.5626624E-07)
@attr(all=True, unit=True)
def test_calculate_217_stress_lamp(self):
"""
(TestRotary) calculate_part should return False on success when calculating MIL-HDBK-217F stress results with a lamp load
"""
self.DUT.environment_active = 2
self.DUT.hazard_rate_type = 2
self.DUT.quality = 2
self.DUT.construction = 1
self.DUT.load_type = 3
self.DUT.n_contacts = 8
self.DUT.cycles_per_hour = 0.8
self.DUT.operating_current = 0.023
self.DUT.rated_current = 0.05
self.assertFalse(self.DUT.calculate_part())
self.assertEqual(self.DUT.hazard_rate_model['equation'],
'lambdab * piCYC * piL * piE')
self.assertAlmostEqual(self.DUT.hazard_rate_model['lambdab'], 0.58)
self.assertEqual(self.DUT.hazard_rate_model['piCYC'], 1.0)
self.assertAlmostEqual(self.DUT.hazard_rate_model['piL'], 198.3434254)
self.assertEqual(self.DUT.hazard_rate_model['piE'], 3.0)
self.assertAlmostEqual(self.DUT.hazard_rate_active, 0.0003451176)
@attr(all=True, unit=False)
def test_calculate_217_stress_overflow(self):
"""
(TestRotary) calculate_part should return True when an OverflowError is raised when calculating MIL-HDBK-217F stress results
"""
self.DUT.hazard_rate_type = 2
self.DUT.operating_voltage = 1.25
self.DUT.acvapplied = 0.025
self.DUT.rated_voltage = 3.3
self.DUT.capacitance = 0.0000027
self.DUT.effective_resistance = 0.5
self.DUT.reference_temperature = 0.000000000000001
self.assertTrue(self.DUT.calculate_part())
@attr(all=True, unit=False)
def test_calculate_217_stress_zero_division(self):
"""
(TestRotary) calculate_part should return True when a ZeroDivisionError is raised when calculating MIL-HDBK-217F stress results
"""
self.DUT.hazard_rate_type = 2
self.DUT.operating_voltage = 1.25
self.DUT.acvapplied = 0.025
self.DUT.rated_voltage = 3.3
self.DUT.capacitance = 0.0000033
self.DUT.effective_resistance = 0.5
self.DUT.reference_temperature = 0.0
self.assertTrue(self.DUT.calculate_part())
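The tests above mix `assertEqual` with `assertAlmostEqual` for floating-point hazard rates. `assertAlmostEqual` compares after rounding the difference to 7 decimal places by default, which is why expected values such as 1.3918378 are written with seven fractional digits. A small self-contained illustration with plain `unittest`, unrelated to the RTK classes:

```python
import unittest


class TestAlmostEqual(unittest.TestCase):
    def test_default_places(self):
        # round(1.00000004 - 1.0, 7) == 0.0, so the values compare almost equal.
        self.assertAlmostEqual(1.00000004, 1.0)
        # round(0.0001, 7) != 0.0, so these are NOT almost equal.
        self.assertNotAlmostEqual(1.0001, 1.0)


suite = unittest.TestLoader().loadTestsFromTestCase(TestAlmostEqual)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # → True
```

Exact equality (`assertEqual`) stays safe for values like `piE = 3.0` that are assigned, not computed, while derived quantities go through `assertAlmostEqual`.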
# --- src/test/selenium-robot/resources/RobotUtils.py (Aleks161-lab/pet-clinic, Apache-2.0) ---
class RobotUtils():
    def convertFloatToString(self, myFloat):
        return str(myFloat)

    def getDynamicXpath(self, xpath, var1):
        # pdb.Pdb(stdout=sys.__stdout__).set_trace()
        return xpath % var1
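`getDynamicXpath` fills an XPath template via old-style `%` substitution, so callers pass a locator containing `%s` plus the value to substitute. A standalone usage sketch (the table id `owners` is a made-up example, not from the pet-clinic suite):

```python
class RobotUtils():
    def getDynamicXpath(self, xpath, var1):
        # Substitute var1 into the '%s' placeholder of the XPath template.
        return xpath % var1


utils = RobotUtils()
locator = utils.getDynamicXpath("//table[@id='%s']/tbody/tr", "owners")
print(locator)  # → //table[@id='owners']/tbody/tr
```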
# --- template/__main__.py (jwalsh/codejam-jwalsh, MIT) ---
print('running...')
# --- coursera_python_specialization/helloworld.py (missulmer/Pythonstudy, CC0-1.0) ---
print("Look! A Python! It's so smart it can tell letters and numbers apart.")
# --- website/migrations/0002_auto_20170104_1555.py (CrowdcoinSA/crowdcoin-platform, MIT) ---
# -*- coding: utf-8 -*-
# Generated by Django 1.9.4 on 2017-01-04 13:55
from __future__ import unicode_literals
from django.db import migrations
class Migration(migrations.Migration):
    dependencies = [
        ('website', '0001_initial'),
    ]

    operations = [
        migrations.DeleteModel(
            name='EmailTransaction',
        ),
        migrations.RemoveField(
            model_name='fundstransaction',
            name='network',
        ),
        migrations.RemoveField(
            model_name='fundstransaction',
            name='profile',
        ),
        migrations.RemoveField(
            model_name='fundstransaction',
            name='transaction_type',
        ),
        migrations.RemoveField(
            model_name='marketproduct',
            name='profile',
        ),
        migrations.DeleteModel(
            name='NewsletterSubscription',
        ),
        migrations.RemoveField(
            model_name='prepaidlead',
            name='profile',
        ),
        migrations.RemoveField(
            model_name='prepaidlead',
            name='supervisor',
        ),
        migrations.RemoveField(
            model_name='prepaidlead',
            name='transaction_type',
        ),
        migrations.RemoveField(
            model_name='referral',
            name='profile',
        ),
        migrations.RemoveField(
            model_name='review',
            name='profile',
        ),
        migrations.RemoveField(
            model_name='ussdsession',
            name='profile',
        ),
        migrations.RemoveField(
            model_name='voicekeyword',
            name='profile',
        ),
        migrations.RemoveField(
            model_name='userprofile',
            name='balance',
        ),
        migrations.RemoveField(
            model_name='userprofile',
            name='date_of_birth',
        ),
        migrations.RemoveField(
            model_name='userprofile',
            name='device_tag',
        ),
        migrations.RemoveField(
            model_name='userprofile',
            name='from_quasar',
        ),
        migrations.RemoveField(
            model_name='userprofile',
            name='gender',
        ),
        migrations.RemoveField(
            model_name='userprofile',
            name='msisdn',
        ),
        migrations.RemoveField(
            model_name='userprofile',
            name='thisisme_verified',
        ),
        migrations.DeleteModel(
            name='FundsTransaction',
        ),
        migrations.DeleteModel(
            name='MarketProduct',
        ),
        migrations.DeleteModel(
            name='PrepaidLead',
        ),
        migrations.DeleteModel(
            name='Referral',
        ),
        migrations.DeleteModel(
            name='Review',
        ),
        migrations.DeleteModel(
            name='TransactionType',
        ),
        migrations.DeleteModel(
            name='UssdSession',
        ),
        migrations.DeleteModel(
            name='VoiceKeyword',
        ),
    ]
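Django applies a migration's `operations` list in order, each operation transforming the tracked project state. The toy classes below illustrate that declarative apply-in-order pattern; they are a deliberate simplification with a made-up state dictionary, not Django's real migration API:

```python
class RemoveField:
    """Drop one field from a model's field list."""
    def __init__(self, model_name, name):
        self.model_name, self.name = model_name, name

    def apply(self, state):
        state[self.model_name].remove(self.name)


class DeleteModel:
    """Drop an entire model from the state."""
    def __init__(self, name):
        self.name = name

    def apply(self, state):
        del state[self.name.lower()]


# Toy "project state": model name -> list of fields.
state = {'userprofile': ['balance', 'msisdn'], 'review': ['profile']}
operations = [RemoveField('userprofile', 'balance'), DeleteModel('Review')]
for op in operations:
    op.apply(state)
print(state)  # → {'userprofile': ['msisdn']}
```

Because each operation only sees the state left by its predecessors, ordering matters: the real migration above removes `review.profile` before `DeleteModel(name='Review')` for the same reason.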
# --- chapter14_OOP/2019/foo.py (motazsaad/WDMM1405, Apache-2.0) ---
def sayHello():
    return 'Hello World'


if __name__ == '__main__':
    print('welcome to python')
# --- problem/10000~19999/15881/15881.py3.py (njw1204/BOJ-AC, MIT) ---
input();print(input().count('pPAp'))
bed500ba7457f9877539f75710e763b5bfced447 | 154 | py | Python | handler.py | oslokommune/okdata-maskinporten-api | f323f9da977a5b43c18a002ebe8f11b09f20cda8 | [
"MIT"
] | null | null | null | handler.py | oslokommune/okdata-maskinporten-api | f323f9da977a5b43c18a002ebe8f11b09f20cda8 | [
"MIT"
] | 3 | 2021-08-31T12:11:44.000Z | 2021-09-17T12:09:45.000Z | handler.py | oslokommune/okdata-permission-api | e12739794fb48437c1b39e1bffc9d632e76d2449 | [
"MIT"
] | null | null | null | import os
from mangum import Mangum
from app import app
root_path = os.environ["ROOT_PATH"]
handler = Mangum(app=app, api_gateway_base_path=root_path)
| 17.111111 | 58 | 0.792208 | 26 | 154 | 4.461538 | 0.461538 | 0.206897 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12987 | 154 | 8 | 59 | 19.25 | 0.865672 | 0 | 0 | 0 | 0 | 0 | 0.058442 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.6 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
83080f13a1f27e1c1bd76994d78f06a741290631 | 14,601 | py | Python | osisoft/pidevclub/piwebapi/models/__init__.py | inselbuch/pwap2 | 4ded0a62b241d9354f39ce87f3411fe9708317e3 | [
"Apache-2.0"
] | 3 | 2019-05-16T15:44:09.000Z | 2020-11-25T22:28:31.000Z | osisoft/pidevclub/piwebapi/models/__init__.py | inselbuch/pwap2 | 4ded0a62b241d9354f39ce87f3411fe9708317e3 | [
"Apache-2.0"
] | null | null | null | osisoft/pidevclub/piwebapi/models/__init__.py | inselbuch/pwap2 | 4ded0a62b241d9354f39ce87f3411fe9708317e3 | [
"Apache-2.0"
] | 8 | 2019-03-15T10:20:57.000Z | 2021-05-20T13:06:37.000Z | # coding: utf-8
"""
Copyright 2018 OSIsoft, LLC
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
<http://www.apache.org/licenses/LICENSE-2.0>
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
from osisoft.pidevclub.piwebapi.models.pi_ambiguous import PIAmbiguous
from osisoft.pidevclub.piwebapi.models.pi_analysis import PIAnalysis
from osisoft.pidevclub.piwebapi.models.pi_analysis_category import PIAnalysisCategory
from osisoft.pidevclub.piwebapi.models.pi_analysis_category_links import PIAnalysisCategoryLinks
from osisoft.pidevclub.piwebapi.models.pi_analysis_links import PIAnalysisLinks
from osisoft.pidevclub.piwebapi.models.pi_analysis_rule import PIAnalysisRule
from osisoft.pidevclub.piwebapi.models.pi_analysis_rule_links import PIAnalysisRuleLinks
from osisoft.pidevclub.piwebapi.models.pi_analysis_rule_plug_in import PIAnalysisRulePlugIn
from osisoft.pidevclub.piwebapi.models.pi_analysis_rule_plug_in_links import PIAnalysisRulePlugInLinks
from osisoft.pidevclub.piwebapi.models.pi_analysis_template import PIAnalysisTemplate
from osisoft.pidevclub.piwebapi.models.pi_analysis_template_links import PIAnalysisTemplateLinks
from osisoft.pidevclub.piwebapi.models.pi_annotation import PIAnnotation
from osisoft.pidevclub.piwebapi.models.pi_annotation_links import PIAnnotationLinks
from osisoft.pidevclub.piwebapi.models.pi_asset_database import PIAssetDatabase
from osisoft.pidevclub.piwebapi.models.pi_asset_database_links import PIAssetDatabaseLinks
from osisoft.pidevclub.piwebapi.models.pi_asset_server import PIAssetServer
from osisoft.pidevclub.piwebapi.models.pi_asset_server_links import PIAssetServerLinks
from osisoft.pidevclub.piwebapi.models.pi_attribute import PIAttribute
from osisoft.pidevclub.piwebapi.models.pi_attribute_category import PIAttributeCategory
from osisoft.pidevclub.piwebapi.models.pi_attribute_category_links import PIAttributeCategoryLinks
from osisoft.pidevclub.piwebapi.models.pi_attribute_links import PIAttributeLinks
from osisoft.pidevclub.piwebapi.models.pi_attribute_template import PIAttributeTemplate
from osisoft.pidevclub.piwebapi.models.pi_attribute_template_links import PIAttributeTemplateLinks
from osisoft.pidevclub.piwebapi.models.pi_attribute_trait import PIAttributeTrait
from osisoft.pidevclub.piwebapi.models.pi_attribute_trait_links import PIAttributeTraitLinks
from osisoft.pidevclub.piwebapi.models.pi_cache_instance import PICacheInstance
from osisoft.pidevclub.piwebapi.models.pi_channel_instance import PIChannelInstance
from osisoft.pidevclub.piwebapi.models.pi_data_pipe_event import PIDataPipeEvent
from osisoft.pidevclub.piwebapi.models.pi_data_server import PIDataServer
from osisoft.pidevclub.piwebapi.models.pi_data_server_license import PIDataServerLicense
from osisoft.pidevclub.piwebapi.models.pi_data_server_license_links import PIDataServerLicenseLinks
from osisoft.pidevclub.piwebapi.models.pi_data_server_links import PIDataServerLinks
from osisoft.pidevclub.piwebapi.models.pi_element import PIElement
from osisoft.pidevclub.piwebapi.models.pi_element_category import PIElementCategory
from osisoft.pidevclub.piwebapi.models.pi_element_category_links import PIElementCategoryLinks
from osisoft.pidevclub.piwebapi.models.pi_element_links import PIElementLinks
from osisoft.pidevclub.piwebapi.models.pi_element_template import PIElementTemplate
from osisoft.pidevclub.piwebapi.models.pi_element_template_links import PIElementTemplateLinks
from osisoft.pidevclub.piwebapi.models.pi_enumeration_set import PIEnumerationSet
from osisoft.pidevclub.piwebapi.models.pi_enumeration_set_links import PIEnumerationSetLinks
from osisoft.pidevclub.piwebapi.models.pi_enumeration_value import PIEnumerationValue
from osisoft.pidevclub.piwebapi.models.pi_enumeration_value_links import PIEnumerationValueLinks
from osisoft.pidevclub.piwebapi.models.pi_errors import PIErrors
from osisoft.pidevclub.piwebapi.models.pi_event_frame import PIEventFrame
from osisoft.pidevclub.piwebapi.models.pi_event_frame_links import PIEventFrameLinks
from osisoft.pidevclub.piwebapi.models.pi_extended_timed_value import PIExtendedTimedValue
from osisoft.pidevclub.piwebapi.models.pi_extended_timed_values import PIExtendedTimedValues
from osisoft.pidevclub.piwebapi.models.pi_item_attribute import PIItemAttribute
from osisoft.pidevclub.piwebapi.models.pi_item_element import PIItemElement
from osisoft.pidevclub.piwebapi.models.pi_item_event_frame import PIItemEventFrame
from osisoft.pidevclub.piwebapi.models.pi_item_point import PIItemPoint
from osisoft.pidevclub.piwebapi.models.pi_items_analysis import PIItemsAnalysis
from osisoft.pidevclub.piwebapi.models.pi_items_analysis_category import PIItemsAnalysisCategory
from osisoft.pidevclub.piwebapi.models.pi_items_analysis_rule import PIItemsAnalysisRule
from osisoft.pidevclub.piwebapi.models.pi_items_analysis_rule_plug_in import PIItemsAnalysisRulePlugIn
from osisoft.pidevclub.piwebapi.models.pi_items_analysis_template import PIItemsAnalysisTemplate
from osisoft.pidevclub.piwebapi.models.pi_items_annotation import PIItemsAnnotation
from osisoft.pidevclub.piwebapi.models.pi_items_asset_database import PIItemsAssetDatabase
from osisoft.pidevclub.piwebapi.models.pi_items_asset_server import PIItemsAssetServer
from osisoft.pidevclub.piwebapi.models.pi_items_attribute import PIItemsAttribute
from osisoft.pidevclub.piwebapi.models.pi_items_attribute_category import PIItemsAttributeCategory
from osisoft.pidevclub.piwebapi.models.pi_items_attribute_template import PIItemsAttributeTemplate
from osisoft.pidevclub.piwebapi.models.pi_items_attribute_trait import PIItemsAttributeTrait
from osisoft.pidevclub.piwebapi.models.pi_items_cache_instance import PIItemsCacheInstance
from osisoft.pidevclub.piwebapi.models.pi_items_channel_instance import PIItemsChannelInstance
from osisoft.pidevclub.piwebapi.models.pi_items_data_server import PIItemsDataServer
from osisoft.pidevclub.piwebapi.models.pi_items_element import PIItemsElement
from osisoft.pidevclub.piwebapi.models.pi_items_element_category import PIItemsElementCategory
from osisoft.pidevclub.piwebapi.models.pi_items_element_template import PIItemsElementTemplate
from osisoft.pidevclub.piwebapi.models.pi_items_enumeration_set import PIItemsEnumerationSet
from osisoft.pidevclub.piwebapi.models.pi_items_enumeration_value import PIItemsEnumerationValue
from osisoft.pidevclub.piwebapi.models.pi_items_event_frame import PIItemsEventFrame
from osisoft.pidevclub.piwebapi.models.pi_items_item_attribute import PIItemsItemAttribute
from osisoft.pidevclub.piwebapi.models.pi_items_item_element import PIItemsItemElement
from osisoft.pidevclub.piwebapi.models.pi_items_item_event_frame import PIItemsItemEventFrame
from osisoft.pidevclub.piwebapi.models.pi_items_item_point import PIItemsItemPoint
from osisoft.pidevclub.piwebapi.models.pi_items_items_substatus import PIItemsItemsSubstatus
from osisoft.pidevclub.piwebapi.models.pi_items_notification_contact_template import PIItemsNotificationContactTemplate
from osisoft.pidevclub.piwebapi.models.pi_items_notification_rule import PIItemsNotificationRule
from osisoft.pidevclub.piwebapi.models.pi_items_notification_rule_subscriber import PIItemsNotificationRuleSubscriber
from osisoft.pidevclub.piwebapi.models.pi_items_notification_rule_template import PIItemsNotificationRuleTemplate
from osisoft.pidevclub.piwebapi.models.pi_items_point import PIItemsPoint
from osisoft.pidevclub.piwebapi.models.pi_items_point_attribute import PIItemsPointAttribute
from osisoft.pidevclub.piwebapi.models.pi_items_security_entry import PIItemsSecurityEntry
from osisoft.pidevclub.piwebapi.models.pi_items_security_identity import PIItemsSecurityIdentity
from osisoft.pidevclub.piwebapi.models.pi_items_security_mapping import PIItemsSecurityMapping
from osisoft.pidevclub.piwebapi.models.pi_items_security_rights import PIItemsSecurityRights
from osisoft.pidevclub.piwebapi.models.pi_items_stream_summaries import PIItemsStreamSummaries
from osisoft.pidevclub.piwebapi.models.pi_items_stream_updates_register import PIItemsStreamUpdatesRegister
from osisoft.pidevclub.piwebapi.models.pi_items_stream_updates_retrieve import PIItemsStreamUpdatesRetrieve
from osisoft.pidevclub.piwebapi.models.pi_items_stream_value import PIItemsStreamValue
from osisoft.pidevclub.piwebapi.models.pi_items_stream_values import PIItemsStreamValues
from osisoft.pidevclub.piwebapi.models.pi_itemsstring import PIItemsstring
from osisoft.pidevclub.piwebapi.models.pi_items_substatus import PIItemsSubstatus
from osisoft.pidevclub.piwebapi.models.pi_items_summary_value import PIItemsSummaryValue
from osisoft.pidevclub.piwebapi.models.pi_items_table import PIItemsTable
from osisoft.pidevclub.piwebapi.models.pi_items_table_category import PIItemsTableCategory
from osisoft.pidevclub.piwebapi.models.pi_items_time_rule_plug_in import PIItemsTimeRulePlugIn
from osisoft.pidevclub.piwebapi.models.pi_items_unit_class import PIItemsUnitClass
from osisoft.pidevclub.piwebapi.models.pi_landing import PILanding
from osisoft.pidevclub.piwebapi.models.pi_landing_links import PILandingLinks
from osisoft.pidevclub.piwebapi.models.pi_media_metadata import PIMediaMetadata
from osisoft.pidevclub.piwebapi.models.pi_media_metadata_links import PIMediaMetadataLinks
from osisoft.pidevclub.piwebapi.models.pi_notification_contact_template import PINotificationContactTemplate
from osisoft.pidevclub.piwebapi.models.pi_notification_contact_template_links import PINotificationContactTemplateLinks
from osisoft.pidevclub.piwebapi.models.pi_notification_rule import PINotificationRule
from osisoft.pidevclub.piwebapi.models.pi_notification_rule_subscriber import PINotificationRuleSubscriber
from osisoft.pidevclub.piwebapi.models.pi_notification_rule_template import PINotificationRuleTemplate
from osisoft.pidevclub.piwebapi.models.pi_pagination_links import PIPaginationLinks
from osisoft.pidevclub.piwebapi.models.pi_point import PIPoint
from osisoft.pidevclub.piwebapi.models.pi_point_attribute import PIPointAttribute
from osisoft.pidevclub.piwebapi.models.pi_point_attribute_links import PIPointAttributeLinks
from osisoft.pidevclub.piwebapi.models.pi_point_links import PIPointLinks
from osisoft.pidevclub.piwebapi.models.pi_property_error import PIPropertyError
from osisoft.pidevclub.piwebapi.models.pi_request import PIRequest
from osisoft.pidevclub.piwebapi.models.pi_request_template import PIRequestTemplate
from osisoft.pidevclub.piwebapi.models.pi_response import PIResponse
from osisoft.pidevclub.piwebapi.models.pi_search_by_attribute import PISearchByAttribute
from osisoft.pidevclub.piwebapi.models.pi_security import PISecurity
from osisoft.pidevclub.piwebapi.models.pi_security_entry import PISecurityEntry
from osisoft.pidevclub.piwebapi.models.pi_security_entry_links import PISecurityEntryLinks
from osisoft.pidevclub.piwebapi.models.pi_security_identity import PISecurityIdentity
from osisoft.pidevclub.piwebapi.models.pi_security_identity_links import PISecurityIdentityLinks
from osisoft.pidevclub.piwebapi.models.pi_security_mapping import PISecurityMapping
from osisoft.pidevclub.piwebapi.models.pi_security_mapping_links import PISecurityMappingLinks
from osisoft.pidevclub.piwebapi.models.pi_security_rights import PISecurityRights
from osisoft.pidevclub.piwebapi.models.pi_security_rights_links import PISecurityRightsLinks
from osisoft.pidevclub.piwebapi.models.pi_stream_annotation import PIStreamAnnotation
from osisoft.pidevclub.piwebapi.models.pi_stream_summaries import PIStreamSummaries
from osisoft.pidevclub.piwebapi.models.pi_stream_summaries_links import PIStreamSummariesLinks
from osisoft.pidevclub.piwebapi.models.pi_stream_updates_register import PIStreamUpdatesRegister
from osisoft.pidevclub.piwebapi.models.pi_stream_updates_retrieve import PIStreamUpdatesRetrieve
from osisoft.pidevclub.piwebapi.models.pi_stream_value import PIStreamValue
from osisoft.pidevclub.piwebapi.models.pi_stream_value_links import PIStreamValueLinks
from osisoft.pidevclub.piwebapi.models.pi_stream_values import PIStreamValues
from osisoft.pidevclub.piwebapi.models.pi_stream_values_links import PIStreamValuesLinks
from osisoft.pidevclub.piwebapi.models.pi_substatus import PISubstatus
from osisoft.pidevclub.piwebapi.models.pi_summary_value import PISummaryValue
from osisoft.pidevclub.piwebapi.models.pi_system_landing import PISystemLanding
from osisoft.pidevclub.piwebapi.models.pi_system_landing_links import PISystemLandingLinks
from osisoft.pidevclub.piwebapi.models.pi_system_status import PISystemStatus
from osisoft.pidevclub.piwebapi.models.pi_table import PITable
from osisoft.pidevclub.piwebapi.models.pi_table_category import PITableCategory
from osisoft.pidevclub.piwebapi.models.pi_table_category_links import PITableCategoryLinks
from osisoft.pidevclub.piwebapi.models.pi_table_data import PITableData
from osisoft.pidevclub.piwebapi.models.pi_table_links import PITableLinks
from osisoft.pidevclub.piwebapi.models.pi_timed_value import PITimedValue
from osisoft.pidevclub.piwebapi.models.pi_timed_values import PITimedValues
from osisoft.pidevclub.piwebapi.models.pi_time_rule import PITimeRule
from osisoft.pidevclub.piwebapi.models.pi_time_rule_links import PITimeRuleLinks
from osisoft.pidevclub.piwebapi.models.pi_time_rule_plug_in import PITimeRulePlugIn
from osisoft.pidevclub.piwebapi.models.pi_time_rule_plug_in_links import PITimeRulePlugInLinks
from osisoft.pidevclub.piwebapi.models.pi_unit import PIUnit
from osisoft.pidevclub.piwebapi.models.pi_unit_class import PIUnitClass
from osisoft.pidevclub.piwebapi.models.pi_unit_class_links import PIUnitClassLinks
from osisoft.pidevclub.piwebapi.models.pi_unit_links import PIUnitLinks
from osisoft.pidevclub.piwebapi.models.pi_user_info import PIUserInfo
from osisoft.pidevclub.piwebapi.models.pi_value import PIValue
from osisoft.pidevclub.piwebapi.models.pi_value_query import PIValueQuery
from osisoft.pidevclub.piwebapi.models.pi_version import PIVersion
from osisoft.pidevclub.piwebapi.models.pi_web_exception import PIWebException
| 82.02809 | 120 | 0.890076 | 1,788 | 14,601 | 7.040828 | 0.167785 | 0.140678 | 0.255779 | 0.35809 | 0.601398 | 0.596155 | 0.55612 | 0.41989 | 0.065533 | 0.016522 | 0 | 0.000658 | 0.063352 | 14,601 | 177 | 121 | 82.491525 | 0.91986 | 0.038764 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
83287062d7ff1aa87d2ee14314f69dfe559415ee | 83 | py | Python | src/features/__init__.py | kaletise/kirill-marketcapov | df094e922d8da608a71ca1a5fe8100a0af748702 | [
"MIT"
] | 1 | 2021-04-26T16:33:30.000Z | 2021-04-26T16:33:30.000Z | src/features/__init__.py | kaletise/kirill-marketcapov | df094e922d8da608a71ca1a5fe8100a0af748702 | [
"MIT"
] | null | null | null | src/features/__init__.py | kaletise/kirill-marketcapov | df094e922d8da608a71ca1a5fe8100a0af748702 | [
"MIT"
] | 2 | 2021-04-26T16:45:13.000Z | 2021-04-26T21:06:34.000Z | from . import friends
from . import news
from . import online
from . import status
| 16.6 | 21 | 0.759036 | 12 | 83 | 5.25 | 0.5 | 0.634921 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192771 | 83 | 4 | 22 | 20.75 | 0.940299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
83386fd4d70879c9c39827fcb7317c2ff9f4dc70 | 402 | py | Python | 01-Regular-Expression/example_search.py | littlecarson/web-scraping | 422ae1f5317cb25349f715e806c5138e86a35688 | [
"MIT"
] | null | null | null | 01-Regular-Expression/example_search.py | littlecarson/web-scraping | 422ae1f5317cb25349f715e806c5138e86a35688 | [
"MIT"
] | null | null | null | 01-Regular-Expression/example_search.py | littlecarson/web-scraping | 422ae1f5317cb25349f715e806c5138e86a35688 | [
"MIT"
] | null | null | null | import re
example_text = '我是kingname, 我的微博账号是:kingname, 密码是:12345678, QQ账号是:99999, 密码是:890abcd, 银行卡账号是:000001, 密码是:654321, Github账号是:99999@qq.com, 密码是:7777love8888, 请记住他们。'
user_password = re.search('账号是:(.*?), 密码是:(.*?),', example_text)
print(user_password)
print(user_password.group())
print(user_password.group(1))
print(user_password.group(2))
# print(user_password.group(3))  # this would raise an error, because there is no group(3)
| 36.545455 | 162 | 0.753731 | 57 | 402 | 5.175439 | 0.54386 | 0.244068 | 0.288136 | 0.298305 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120321 | 0.069652 | 402 | 10 | 163 | 40.2 | 0.668449 | 0.119403 | 0 | 0 | 0 | 0.142857 | 0.472934 | 0.065527 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.714286 | 0.142857 | 0 | 0.142857 | 0.571429 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 5 |
83622479bd3f0d5dd8443124ed4cda9992d211f8 | 190 | py | Python | python36/simple_func.py | sharedcloud/sharedcloud-examples | 22ede1ecce1aa5ccec9db5614b682acce1bec778 | [
"MIT"
] | null | null | null | python36/simple_func.py | sharedcloud/sharedcloud-examples | 22ede1ecce1aa5ccec9db5614b682acce1bec778 | [
"MIT"
] | null | null | null | python36/simple_func.py | sharedcloud/sharedcloud-examples | 22ede1ecce1aa5ccec9db5614b682acce1bec778 | [
"MIT"
] | null | null | null | import platform
import time
def handler(event):
print('Python version: {}'.format(platform.python_version()))
return 'Hello friend, you are passing {} arguments'.format(len(event))
| 27.142857 | 74 | 0.726316 | 24 | 190 | 5.708333 | 0.75 | 0.189781 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.136842 | 190 | 6 | 75 | 31.666667 | 0.835366 | 0 | 0 | 0 | 0 | 0 | 0.315789 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0.2 | 0.4 | 0 | 0.8 | 0.2 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 5 |
36441c2b54504abfc965df45f5ef1541adb243cf | 439 | py | Python | exercises/triangle/example.py | wonhyeongseo/python | ccd399510a58ad42d03420e43de67893f55dd411 | [
"MIT"
] | 2 | 2018-11-24T01:00:38.000Z | 2019-02-05T07:32:44.000Z | exercises/triangle/example.py | toroad/python | ce085c81a82ae5fb460fe166323dbbaa5a2588c5 | [
"MIT"
] | null | null | null | exercises/triangle/example.py | toroad/python | ce085c81a82ae5fb460fe166323dbbaa5a2588c5 | [
"MIT"
] | null | null | null | def valid(sides):
return (
sum(sorted(sides)[:2]) >= sorted(sides)[2] and
all(s > 0 for s in sides)
)
def is_equilateral(sides):
return valid(sides) and all(sides[0] == s for s in sides)
def is_isosceles(sides):
return (
valid(sides) and
any(s1 == s2 for s1, s2 in zip(sorted(sides), sorted(sides)[1:]))
)
def is_scalene(sides):
return valid(sides) and not is_isosceles(sides)
| 20.904762 | 73 | 0.603645 | 68 | 439 | 3.838235 | 0.338235 | 0.153257 | 0.183908 | 0.241379 | 0.398467 | 0.122605 | 0 | 0 | 0 | 0 | 0 | 0.027607 | 0.257403 | 439 | 20 | 74 | 21.95 | 0.773006 | 0 | 0 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0 | 0.285714 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
3659616fdd3c64f1cdcdf61b022fd4c1ac0636fd | 106 | py | Python | run.py | luoweis/xskAdmin | d92d0297102140ce5241a0c9f3a80f520bee96a8 | [
"MIT"
] | null | null | null | run.py | luoweis/xskAdmin | d92d0297102140ce5241a0c9f3a80f520bee96a8 | [
"MIT"
] | null | null | null | run.py | luoweis/xskAdmin | d92d0297102140ce5241a0c9f3a80f520bee96a8 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding=utf-8 -*-
from app import app
# Print the URL map of our app, which is handy for debugging
print(app.url_map) | 17.666667 | 22 | 0.698113 | 17 | 106 | 4.294118 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010753 | 0.122642 | 106 | 6 | 23 | 17.666667 | 0.774194 | 0.575472 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 5 |
3677a27ee06ad645500ba654779cba35590aee0b | 94 | py | Python | acrocart/__init__.py | jnez71/AcroCart | 01df2a68e308536021be5f7e4889c79b4cc08b0f | [
"MIT"
] | 11 | 2018-02-08T20:24:46.000Z | 2022-01-20T11:56:50.000Z | acrocart/__init__.py | jnez71/AcroCart | 01df2a68e308536021be5f7e4889c79b4cc08b0f | [
"MIT"
] | 1 | 2018-08-07T18:06:34.000Z | 2018-08-08T22:15:50.000Z | acrocart/__init__.py | jnez71/AcroCart | 01df2a68e308536021be5f7e4889c79b4cc08b0f | [
"MIT"
] | 5 | 2018-04-02T13:36:52.000Z | 2021-01-25T13:21:01.000Z | from dynamics import Dynamics
from simulator import Simulator
from optimizer import Optimizer
| 23.5 | 31 | 0.87234 | 12 | 94 | 6.833333 | 0.416667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12766 | 94 | 3 | 32 | 31.333333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
3687c21736bbbf2664649322a402f4d97da7f6e0 | 222 | py | Python | foundation/teryt/tests/factories.py | pilnujemy/foundation-manager | 1f1d6afcbb408c87a171bcbe3f9e58570eb478b6 | [
"BSD-3-Clause"
] | 1 | 2016-01-04T06:30:24.000Z | 2016-01-04T06:30:24.000Z | foundation/teryt/tests/factories.py | pilnujemy/foundation-manager | 1f1d6afcbb408c87a171bcbe3f9e58570eb478b6 | [
"BSD-3-Clause"
] | 36 | 2015-11-27T14:17:34.000Z | 2016-07-14T10:23:52.000Z | foundation/teryt/tests/factories.py | pilnujemy/foundation-manager | 1f1d6afcbb408c87a171bcbe3f9e58570eb478b6 | [
"BSD-3-Clause"
] | 1 | 2016-05-14T01:11:28.000Z | 2016-05-14T01:11:28.000Z | from __future__ import absolute_import
from .. import models
from teryt_tree.factories import JednostkaAdministracyjnaFactory
class JSTFactory(JednostkaAdministracyjnaFactory):
class Meta:
model = models.JST
| 24.666667 | 64 | 0.810811 | 22 | 222 | 7.909091 | 0.636364 | 0.413793 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148649 | 222 | 8 | 65 | 27.75 | 0.920635 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.833333 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
36a999319a44bc75ab6039384cb80b204a613201 | 188 | py | Python | cloudcafe/identity/v3/common/tokens/__init__.py | rcbops-qa/cloudcafe | d937f85496aadafbb94a330b9adb8ea18bee79ba | [
"Apache-2.0"
] | null | null | null | cloudcafe/identity/v3/common/tokens/__init__.py | rcbops-qa/cloudcafe | d937f85496aadafbb94a330b9adb8ea18bee79ba | [
"Apache-2.0"
] | null | null | null | cloudcafe/identity/v3/common/tokens/__init__.py | rcbops-qa/cloudcafe | d937f85496aadafbb94a330b9adb8ea18bee79ba | [
"Apache-2.0"
] | 1 | 2020-04-13T17:44:28.000Z | 2020-04-13T17:44:28.000Z | from cloudcafe.identity.v3.common.tokens.behavior import TokensBehavior
from cloudcafe.identity.v3.common.tokens.client import TokensClient
client = TokensClient
behavior = TokensBehavior
| 37.6 | 71 | 0.861702 | 22 | 188 | 7.363636 | 0.5 | 0.160494 | 0.259259 | 0.283951 | 0.432099 | 0.432099 | 0 | 0 | 0 | 0 | 0 | 0.011494 | 0.074468 | 188 | 4 | 72 | 47 | 0.91954 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
36e04cbe0254815b061f52471bc3ee4ab8e3cea9 | 73 | py | Python | points/__init__.py | noobshack/wizworm-cogs | 85de6ad4f45a6c6927180049a16c97ee51d64df8 | [
"MIT"
] | 1 | 2020-11-08T23:51:47.000Z | 2020-11-08T23:51:47.000Z | points/__init__.py | noobshack/wizworm-cogs | 85de6ad4f45a6c6927180049a16c97ee51d64df8 | [
"MIT"
] | null | null | null | points/__init__.py | noobshack/wizworm-cogs | 85de6ad4f45a6c6927180049a16c97ee51d64df8 | [
"MIT"
] | null | null | null | from .points import Points
def setup(bot):
bot.add_cog(Points(bot))
| 14.6 | 28 | 0.712329 | 12 | 73 | 4.166667 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164384 | 73 | 4 | 29 | 18.25 | 0.819672 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
7ff6299c7b043968ab929e02c363d306256d78d6 | 162 | py | Python | Python_LEVEL_TWO/two.py | chernyshov-dev/WebDev | fcd06f9d2f1b0bff189e2d977c6882550dad27f5 | [
"MIT"
] | 4 | 2021-08-29T15:22:00.000Z | 2021-11-15T11:13:31.000Z | Python_LEVEL_TWO/two.py | chernyshov-dev/WebDev | fcd06f9d2f1b0bff189e2d977c6882550dad27f5 | [
"MIT"
] | 14 | 2021-08-12T16:05:17.000Z | 2021-11-21T20:59:13.000Z | Python_LEVEL_TWO/two.py | chernyshov-dev/WebDev | fcd06f9d2f1b0bff189e2d977c6882550dad27f5 | [
"MIT"
] | null | null | null | import one
print("TOP LEVEL TWO.PY")
one.func()
if __name__ == '__main__':
print("two.py is being run directly")
else:
print("two.py has been imported")
| 18 | 41 | 0.67284 | 26 | 162 | 3.884615 | 0.730769 | 0.148515 | 0.19802 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.179012 | 162 | 8 | 42 | 20.25 | 0.759399 | 0 | 0 | 0 | 0 | 0 | 0.469136 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.285714 | 0 | 0.285714 | 0.428571 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
3d0aedbbbfa473e4d831c32151bac7d9c52a66bb | 155 | py | Python | src/easy_colors.py | farhadzaidi/easy_colors | 1f08489feaa5fa37de58c81fb5e0340a260b6a8c | [
"MIT"
] | null | null | null | src/easy_colors.py | farhadzaidi/easy_colors | 1f08489feaa5fa37de58c81fb5e0340a260b6a8c | [
"MIT"
] | null | null | null | src/easy_colors.py | farhadzaidi/easy_colors | 1f08489feaa5fa37de58c81fb5e0340a260b6a8c | [
"MIT"
] | null | null | null |
class Colors:
INFO = '\033[96m'
WARNING = '\033[93m'
SUCCESS = '\033[92m'
ERROR = '\033[91m'
BOLD = '\033[1m'
UNDERLINE = '\033[4m'
END = '\033[0m' | 17.222222 | 22 | 0.574194 | 23 | 155 | 3.869565 | 0.73913 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.256 | 0.193548 | 155 | 9 | 23 | 17.222222 | 0.456 | 0 | 0 | 0 | 0 | 0 | 0.341935 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
3d317ff206a604a3580ac4ebaf52a0fd7d71df72 | 8,813 | py | Python | tests/test_list_parity.py | imranariffin/hashed-list | b002eb034b2f11f2518afb3fad8c75f1d047e824 | [
"MIT"
] | null | null | null | tests/test_list_parity.py | imranariffin/hashed-list | b002eb034b2f11f2518afb3fad8c75f1d047e824 | [
"MIT"
] | 3 | 2021-10-16T00:09:24.000Z | 2021-11-28T18:30:56.000Z | tests/test_list_parity.py | imranariffin/hashed-list | b002eb034b2f11f2518afb3fad8c75f1d047e824 | [
"MIT"
] | null | null | null | from typing import List
import unittest
from hashed_list import HashedList
class TestListParity(unittest.TestCase):
"""Assert that HashedArray is in feature parity with builtin list"""
def test_index(self):
harray = HashedList(list(range(10)))
index = harray.index(9)
self.assertEqual(index, 9)
def test_index__with_start(self):
ls: List[str] = ["10", "3", "5"]
harray: HashedList[str] = HashedList(ls)
self.assertEqual(harray.index("10", start=0), ls.index("10", 0))
self.assertEqual(harray.index("3", start=1), ls.index("3", 1))
self.assertEqual(harray.index("5", start=2), ls.index("5", 2))
with self.assertRaises(ValueError) as context_harray:
harray.index("10", start=1)
with self.assertRaises(ValueError) as context_ls:
ls.index("10", 1)
self.assertEqual(context_harray.exception.args, context_ls.exception.args)
def test_index__with_start_and_end(self):
ls: List[str] = ["10", "3", "5"]
harray: HashedList[str] = HashedList(ls)
self.assertEqual(harray.index("10", start=0, end=3), ls.index("10", 0, 3))
self.assertEqual(harray.index("3", start=1, end=3), ls.index("3", 1, 3))
self.assertEqual(harray.index("5", start=2, end=3), ls.index("5", 2, 3))
with self.assertRaises(ValueError) as context_harray:
harray.index("5", start=0, end=1)
with self.assertRaises(ValueError) as context_ls:
ls.index("5", 0, 1)
self.assertEqual(context_harray.exception.args, context_ls.exception.args)
def test_empty(self):
harray = HashedList([])
try:
ls = []
ls.index(1)
except Exception as e_list:
try:
harray.index(1)
except Exception as e_harray:
self.assertEqual(e_list.__class__, e_harray.__class__)
self.assertEqual(e_list.args, e_harray.args)
def test_lookup(self):
ls = [1, 10, 22, -1]
harray = HashedList(ls)
self.assertEqual(ls[0], harray[0])
self.assertEqual(ls[2], harray[2])
self.assertEqual(ls[-1], harray[-1])
self.assertEqual(ls[2:3], harray[2:3])
def test_setitem(self):
ls = [1, 10, 22, -1]
harray = HashedList(ls)
ls[0] = 2
harray[0] = 2
self.assertEqual(ls[0], harray[0])
ls = [1, 10, 22, -1]
harray = HashedList(ls)
ls[2] = 99
harray[2] = 99
self.assertEqual(ls[2], harray[2])
ls = [1, 10, 22, -1]
harray = HashedList(ls)
ls[-1] = -111
harray[-1] = -111
self.assertEqual(ls[-1], harray[-1])
ls = [1, 10, 22, -1]
harray = HashedList(ls)
ls[1:3] = [3, 9]
harray[1:3] = [3, 9]
self.assertEqual(ls[1:3], harray[1:3])
    def test_append(self):
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.append("Z")
        harray.append("Z")
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("Z"), harray.index("Z"))
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("a"), harray.index("a"))

    def test_extend(self):
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.extend(["X", "Y", "Z"])
        harray.extend(["X", "Y", "Z"])
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("Z"), harray.index("Z"))
        self.assertEqual(ls.index("Y"), harray.index("Y"))
        self.assertEqual(ls.index("X"), harray.index("X"))
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("a"), harray.index("a"))

    def test_insert(self):
        # Insert at the beginning
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.insert(0, "Z")
        harray.insert(0, "Z")
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("Z"), harray.index("Z"))
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("a"), harray.index("a"))
        # Insert into the middle
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.insert(1, "Z")
        harray.insert(1, "Z")
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("Z"), harray.index("Z"))
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("a"), harray.index("a"))
        # Insert at the end
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.insert(3, "Z")
        harray.insert(3, "Z")
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("Z"), harray.index("Z"))
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("a"), harray.index("a"))

    def test_remove(self):
        # Remove from the beginning
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.remove("a")
        harray.remove("a")
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        # Remove from the middle
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.remove("b")
        harray.remove("b")
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("a"), harray.index("a"))
        # Remove from the end
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.remove("c")
        harray.remove("c")
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("a"), harray.index("a"))
    def test_pop(self):
        # Pop from the beginning
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        value_ls = ls.pop(0)
        value_harray = harray.pop(0)
        self.assertEqual(ls, harray)
        self.assertEqual(value_ls, value_harray)
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        # Pop from the middle
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        value_ls = ls.pop(1)
        value_harray = harray.pop(1)
        self.assertEqual(ls, harray)
        self.assertEqual(value_ls, value_harray)
        self.assertEqual(ls.index("c"), harray.index("c"))
        self.assertEqual(ls.index("a"), harray.index("a"))
        # Pop from the end
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        value_ls = ls.pop()
        value_harray = harray.pop()
        self.assertEqual(ls, harray)
        self.assertEqual(value_ls, value_harray)
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("a"), harray.index("a"))

    def test_clear(self):
        ls = ["a", "b", "c"]
        harray = HashedList(ls)
        ls.clear()
        harray.clear()
        self.assertEqual(ls, harray)
        with self.assertRaises(ValueError) as context_ls:
            ls.index("a")
        with self.assertRaises(ValueError) as context_harray:
            harray.index("a")
        self.assertEqual(context_harray.exception.args, context_ls.exception.args)

    def test_count(self):
        ls = ["c", "b"]
        harray = HashedList(ls)
        self.assertEqual(ls.count("c"), harray.count("c"))
        self.assertEqual(ls.count("b"), harray.count("b"))

    def test_sort(self):
        ls = ["c", "b", "1"]
        harray = HashedList(ls)
        ls.sort()
        harray.sort()
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("1"), harray.index("1"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("c"), harray.index("c"))

    def test_reverse(self):
        ls = ["b", "c", "1"]
        harray = HashedList(["b", "c", "1"])
        ls.reverse()
        harray.reverse()
        self.assertEqual(ls, harray)
        self.assertEqual(ls.index("1"), harray.index("1"))
        self.assertEqual(ls.index("b"), harray.index("b"))
        self.assertEqual(ls.index("c"), harray.index("c"))

    def test_contains(self):
        ls = [1, 2, 3]
        harray = HashedList([1, 2, 3])
        self.assertEqual(1 in ls, 1 in harray)
        self.assertEqual(2 in ls, 2 in harray)
        self.assertEqual(3 in ls, 3 in harray)
        self.assertEqual(999 in ls, 999 in harray)
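The tests above pin down the `HashedList` contract: plain `list` semantics, plus an `index()` that accepts `start`/`end` keyword arguments and stays consistent under mutation. The real implementation is not shown in this file; the following is only a minimal sketch of one way such a structure could work, keeping a value-to-positions map so `index()` can avoid a linear scan:

```python
from collections import defaultdict


class HashedListSketch(list):
    """Illustrative sketch only, not the implementation under test."""

    def __init__(self, iterable=()):
        super().__init__(iterable)
        self._positions = defaultdict(list)
        for i, value in enumerate(self):
            self._positions[value].append(i)

    def index(self, value, start=0, end=None):
        end = len(self) if end is None else end
        for i in self._positions.get(value, ()):
            if start <= i < end:
                return i
        # Mirror list.index()'s ValueError so exception args can match.
        raise ValueError(f"{value!r} is not in list")

    def append(self, value):
        super().append(value)
        self._positions[value].append(len(self) - 1)

    # A full implementation would also rebuild or adjust _positions in
    # __setitem__, insert, remove, pop, sort, reverse, etc.


hl = HashedListSketch(["a", "b", "c"])
hl.append("a")
print(hl.index("a", start=1))  # 3
```

The trade-off this sketch illustrates: lookups become dictionary probes, but every mutating operation must keep the position map in sync, which is exactly what the mutation tests above exercise.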
# File: busy_beaver/exceptions.py | Repo: ZaxR/busy-beaver | License: MIT
class BusyBeaverException(Exception):
    pass


class UnexpectedStatusCode(BusyBeaverException):
    pass
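Because `UnexpectedStatusCode` subclasses `BusyBeaverException`, callers can catch the base class to handle every project-specific error uniformly. A self-contained illustration (the classes are redeclared here, and `check_status` is a hypothetical helper, so the snippet runs on its own):

```python
class BusyBeaverException(Exception):
    pass


class UnexpectedStatusCode(BusyBeaverException):
    pass


def check_status(code):
    # Hypothetical helper: raise the specific subclass on bad status codes.
    if code >= 400:
        raise UnexpectedStatusCode(f"got HTTP {code}")
    return code


try:
    check_status(500)
except BusyBeaverException as exc:  # the base class catches the subclass too
    message = str(exc)

print(message)  # got HTTP 500
```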
# File: code/begin/setting.py | Repo: redxyb/Flask | License: MIT
'''
Author: xyb
Date: 2020-08-09 09:45:47
LastEditTime: 2020-08-09 09:51:56
Configuration file for the Flask application
'''
SECRET_KEY = 'SDSFSFSDFSD'
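Flask picks up settings like the `SECRET_KEY` above via `app.config.from_object()` or `from_pyfile()`, which copy only the module's uppercase attributes into the config. A stdlib-only sketch of that convention, using a hypothetical stand-in for a settings module (so the snippet runs without Flask installed):

```python
import types

# Hypothetical stand-in for a settings module like begin/setting.py.
settings = types.ModuleType("settings")
settings.SECRET_KEY = "SDSFSFSDFSD"
settings._internal = "ignored"  # lowercase names are not configuration

# Mimics Flask's from_object() behaviour: only UPPERCASE attributes count.
config = {key: getattr(settings, key) for key in dir(settings) if key.isupper()}

print(config)  # {'SECRET_KEY': 'SDSFSFSDFSD'}
```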
# File: uagen/uagen.py | Repo: ga1008/uagen | License: MIT
import random
from argparse import ArgumentParser, RawTextHelpFormatter


def base_platform():
    # ----------- win platform ------------
    NT_version = ['6.0', '6.1', '6.2', '6.3', '10.0']
    bit_type = ['86', '64']
    windows_pf = 'Windows NT {NT_version}; Win{bit_type}; x{bit_type};'.format(
        NT_version=random.choice(NT_version), bit_type=random.choice(bit_type))
    # ----------- linux platform ------------
    linux_system = ['CentOS', 'Ubuntu', 'Redhat', 'Debian', 'OpenSUSE', 'Fedora', 'RHEL']
    linux_pf = "X11; {linux_system}; Linux x86_64;".format(linux_system=random.choice(linux_system))
    # ----------- mac platform ------------
    mac_os_versions = ['10.{}'.format(x) for x in range(6, 16)]
    mac_pf = "Macintosh; Intel Mac OS X {mac_os_version};".format(
        mac_os_version=random.choice(mac_os_versions))
    platforms = [windows_pf, linux_pf, mac_pf]
    return random.choice(platforms)

def firefox_ua():
    gecko_versions = ['52.2.0esr', '53.0b2', '51.0b5', '52.0.2esr', '58.0b9', '70.0b12', '69.0b4', '63.0.3', '62.0b14', '69.0b5', '70.0', '60.0b15', '60.0', '54.0b2', '70.0b8', '62.0b6', '58.0b5', '58.0b12', '66.0', '66.0.2', '58.0b11', '52.0b7', '52.0b2', '63.0b11', '72.0b9', '60.4.0esr', '52.0.2', '69.0b7', '58.0b3', '69.0b6', '62.0b8', '68.0.1', '63.0b6', '57.0b7', '52.0b1', '52.5.2esr', '62.0b17', '61.0', '60.7.0esr', '58.0b14', '70.0b7', '66.0b9', '61.0b10', '55.0b1', '56.0b2', '59.0b13', '59.0b6', '65.0b7', '68.0.2', '67.0', '66.0b10', '54.0b3', '64.0b13', '60.0b7', '55.0b7', '51.0b7', '50.0b1', '67.0b17', '67.0.2', '71.0b7', '64.0b10', '65.0b6', '69.0b14', '72.0b1', '54.0b6', '67.0.3', '57.0b3', '52.1.1esr', '72.0b8', '64.0.2', '71.0b12', '62.0b3', '56.0b8', '69.0.3', '72.0b5', '50.0b3', '67.0b3', '59.0.3', '62.0b11', '52.0', '60.0b8', '60.3.0esr', '60.0b12', '63.0b14', '66.0b7', '59.0b14', '53.0.3', '55.0b12', '68.0b8', '72.0b3', '70.0b9', '67.0.1', '66.0b14', '63.0b12', '51.0b9', '57.0b14', '60.6.3esr', '60.8.0esr', '62.0b7', '50.1.0', '53.0b5', '66.0b12', '68.0b14', '50.0', '68.0b4', '72.0b2', '60.0b13', '56.0b12', '54.0b7', '64.0b8', '52.0b6', '69.0b16', '63.0b8', '70.0b11', '52.7.3esr', '54.0b11', '52.0b3', '52.1.2esr', '52.4.0esr', '55.0b8', '68.0b3', '58.0b15', '64.0b9', '50.0b7', '61.0b9', '65.0.2', '68.0esr', '58.0b10', '63.0.1', '57.0.3', '60.2.2esr', '60.0.1', '65.0b4', '63.0b5', '71.0b3', '53.0b1', '70.0b13', '68.1.0esr', '62.0b15', '61.0b8', '71.0b4', '54.0b9', '59.0b9', '65.0b11', '57.0b4', '63.0', '58.0b8', '57.0b12', '52.9.0esr', '68.0b5', '66.0b3', '70.0b10', '60.9.0esr', '57.0b9', '53.0.2', '68.3.0esr', '67.0b14', '52.1.0esr', '53.0b3', '57.0b8', '67.0.4', '53.0b8', '58.0b6', '52.7.4esr', '57.0b10', '51.0', '57.0b11', '52.0b5', '55.0b6', '51.0.1', '62.0.3', '70.0b14', '60.1.0esr', '60.7.1esr', '53.0', '56.0', '66.0.3', '67.0b6', '60.0b16', '64.0b14', '62.0', '58.0.1', '66.0.4', '63.0b4', '67.0b9', '57.0.4', '57.0.1', '66.0b6', '53.0b10',
        '60.6.1esr', '72.0b4', '58.0b4', '69.0b10', '64.0b5', '55.0b11', '59.0b8', '62.0b10', '62.0b5', '59.0b3', '52.8.0esr', '55.0.1', '55.0b9', '53.0b4', '55.0', '66.0b8', '50.0b8', '55.0.2', '52.3.0esr', '60.0b3', '67.0b8', '55.0b4', '51.0b13', '51.0b14', '55.0b13', '59.0', '67.0b4', '64.0b6', '56.0b11', '60.5.1esr', '63.0b10', '62.0b13', '69.0b8', '69.0b11', '61.0b3', '51.0b2', '57.0b6', '68.0b11', '51.0b8', '66.0.1', '56.0b1', '60.2.1esr', '54.0b4', '52.0.1esr', '63.0b3', '60.0b4', '63.0b13', '61.0b5', '52.0b9', '59.0b12', '59.0.2', '71.0b8', '56.0b9', '52.6.0esr', '57.0.2', '71.0b5', '60.0b14', '67.0b7', '52.0b8', '72.0b7', '56.0b6', '52.0.1', '67.0b13', '54.0.1', '52.8.1esr', '60.0b11', '67.0b12', '58.0b13', '68.0.1esr', '56.0b5', '66.0b4', '51.0b11', '57.0', '56.0b3', '57.0b13', '66.0b5', '67.0b19', '67.0b5', '54.0b8', '71.0b6', '71.0b11', '64.0b7', '69.0', '52.7.0esr', '65.0b5', '50.0.1', '70.0b6', '56.0.2', '62.0.2', '71.0b9', '58.0b7', '55.0b3', '57.0b5', '62.0b9', '70.0.1', '56.0b7', '60.0b10', '60.7.2esr', '65.0.1', '65.0b3', '52.0esr', '52.4.1esr', '54.0b13', '66.0b13', '50.0b9', '61.0b12', '59.0b10', '54.0b1', '58.0b16', '66.0.5', '59.0b7', '59.0b5', '61.0.1', '61.0.2', '51.0b3', '65.0b8', '67.0b15', '50.0.2', '50.0b4', '69.0b12', '69.0b3', '61.0b6', '51.0b1', '59.0b4', '61.0b11', '70.0b5', '72.0b6', '50.0b6', '50.0b11', '69.0.1', '69.0b9', '71.0b10', '69.0.2', '59.0b11', '64.0b11', '52.5.0esr', '58.0.2', '68.0b12', '68.0b7', '58.0', '61.0b4', '63.0b7', '55.0b5', '60.2.0esr', '51.0b10', '65.0b12', '61.0b13', '67.0b16', '63.0b9', '68.2.0esr', '53.0b9', '62.0b20', '53.0b7', '60.0b6', '68.0b9', '60.0.2', '66.0b11', '50.0b10', '68.0b13', '68.0b10', '54.0b12', '60.0.2esr', '60.6.2esr', '52.2.1esr', '55.0b2', '62.0b19', '64.0b3', '52.7.1esr', '68.0', '67.0b18', '64.0', '61.0b7', '56.0b4', '64.0b12', '53.0b6', '62.0b18', '56.0.1', '51.0b12', '60.6.0esr', '60.0b9', '64.0b4', '65.0b9', '56.0b10', '55.0b10', '62.0b4', '65.0', '60.5.0esr', '60.5.2esr', '61.0b14',
        '50.0b5', '54.0b5', '62.0b16', '60.0b5', '68.0b6', '62.0b12', '51.0b6', '60.0esr', '68.0.2esr', '52.0b4', '60.0.1esr', '70.0b3', '54.0', '59.0.1', '51.0b4', '71.0', '55.0.3', '69.0b13', '67.0b10', '52.5.3esr', '70.0b4', '67.0b11', '54.0b10', '52.7.2esr', '69.0b15', '65.0b10', '50.0b2']
    f_ua = "Mozilla/5.0 ({platform} rv:{geckoversion}) Gecko/20100101 Firefox/{geckoversion}".format(
        platform=base_platform(), geckoversion=random.choice(gecko_versions))
    return f_ua

def chrome_ua():
    apple_versions = ['534.51.22', '537.50', '537.36', '534.44', '534.46', '534.52.7', '534.53.11', '534.54.16 ', '534.55.3', '534.57.2', '534.59.8', '535.11', '535.20', '535.24', '535.6', '535.7', '535.8', '536.11', '536.2', '536.3', '536.29.13', '536.26', '537.36']
    chrome_versions = ['69.0.3493.3', '77.0.3865.90', '69.0.3497.42', '76.0.3809.71', '74.0.3729.169', '68.0.3440.84', '74.0.3724.8', '73.0.3683.10', '67.0.3386.1', '66.0.3359.170', '65.0.3325.146', '72.0.3626.64', '68.0.3440.15', '73.0.3683.20', '64.0.3282.168', '72.0.3626.96', '69.0.3464.0', '73.0.3683.103', '77.0.3865.120', '80.0.3970.5', '78.0.3904.9', '64.0.3282.71', '76.0.3806.1', '70.0.3510.2', '72.0.3610.2', '75.0.3770.90', '66.0.3343.3', '64.0.3282.140', '65.0.3311.3', '70.0.3528.4', '73.0.3683.86', '72.0.3626.119', '80.0.3964.0', '75.0.3770.142', '78.0.3904.87', '67.0.3396.10', '75.0.3753.4', '72.0.3595.2', '79.0.3945.79', '76.0.3809.80', '68.0.3440.106', '79.0.3945.36', '75.0.3770.66', '66.0.3359.181', '67.0.3393.4', '68.0.3440.68', '65.0.3322.3', '80.0.3987.7', '78.0.3895.5', '66.0.3359.139', '76.0.3788.1', '70.0.3538.77', '69.0.3497.32', '69.0.3486.0', '68.0.3423.2', '66.0.3359.45', '78.0.3904.97', '65.0.3325.181', '79.0.3945.88', '77.0.3865.42', '68.0.3440.59', '67.0.3396.62', '72.0.3626.17', '76.0.3809.87', '73.0.3683.75', '69.0.3497.57', '74.0.3717.0', '73.0.3683.56', '70.0.3521.2', '68.0.3440.42', '79.0.3945.29', '80.0.3987.16', '71.0.3554.0', '75.0.3745.4', '77.0.3865.70', '75.0.3770.80', '65.0.3325.18', '78.0.3904.44', '74.0.3702.0', '76.0.3809.62', '70.0.3538.110', '67.0.3396.40', '71.0.3573.0', '79.0.3945.45', '76.0.3809.46', '67.0.3396.56', '74.0.3729.75', '75.0.3770.8', '67.0.3381.1', '72.0.3622.0', '66.0.3346.8', '67.0.3396.30', '77.0.3854.3', '65.0.3315.3', '66.0.3359.66', '77.0.3865.56', '69.0.3497.72', '77.0.3831.6', '76.0.3809.132', '69.0.3497.81', '73.0.3639.1', '73.0.3679.0', '79.0.3945.70', '76.0.3809.36', '66.0.3350.0', '68.0.3440.25', '65.0.3325.162', '76.0.3809.25', '64.0.3282.113', '77.0.3865.19', '74.0.3710.0', '68.0.3438.3', '77.0.3865.65', '70.0.3534.4', '69.0.3497.100', '78.0.3904.108', '71.0.3559.6', '67.0.3377.1', '74.0.3729.157', '66.0.3359.33', '78.0.3902.4', '69.0.3497.92', '78.0.3904.34', '75.0.3770.52', '65.0.3325.88',
        '72.0.3590.0', '72.0.3626.14', '67.0.3396.99', '76.0.3809.12', '64.0.3282.85', '75.0.3770.38', '77.0.3860.5', '70.0.3514.0', '81.0.4000.3', '69.0.3452.0', '68.0.3418.2', '67.0.3371.0', '68.0.3440.75', '78.0.3904.70', '74.0.3729.61', '71.0.3578.30', '64.0.3282.167', '71.0.3578.53', '71.0.3578.10', '69.0.3497.12', '77.0.3865.10', '75.0.3770.87', '67.0.3396.18', '73.0.3664.3', '68.0.3440.7', '79.0.3945.16', '78.0.3904.21', '66.0.3359.117', '76.0.3800.0', '67.0.3396.48', '78.0.3887.7', '74.0.3729.131', '65.0.3325.51', '79.0.3945.56', '64.0.3282.119', '72.0.3626.7', '71.0.3578.62', '73.0.3683.39', '77.0.3865.35', '72.0.3626.53', '79.0.3928.4', '75.0.3759.4', '72.0.3626.28', '71.0.3551.3', '77.0.3833.0', '71.0.3578.98', '70.0.3538.54', '78.0.3904.50', '73.0.3683.27', '70.0.3538.102', '72.0.3626.121', '70.0.3538.9', '79.0.3921.0', '71.0.3578.80', '68.0.3440.33', '65.0.3325.73', '70.0.3538.22', '72.0.3626.109', '68.0.3440.17', '73.0.3642.0', '75.0.3770.15', '75.0.3766.2', '71.0.3578.20', '67.0.3396.87', '64.0.3282.99', '75.0.3770.27', '78.0.3880.4', '68.0.3432.3', '71.0.3569.0', '80.0.3983.2', '74.0.3729.91', '69.0.3497.23', '75.0.3770.100', '68.0.3409.2', '63.0.3239.132', '69.0.3472.3', '70.0.3538.67', '67.0.3396.79', '64.0.3282.186', '76.0.3809.21', '72.0.3626.81', '78.0.3904.17', '77.0.3865.75', '70.0.3538.16', '78.0.3876.0', '70.0.3538.45', '66.0.3359.81', '66.0.3355.0', '76.0.3809.100', '77.0.3824.6', '74.0.3729.40', '74.0.3729.108', '71.0.3578.44', '65.0.3325.32', '79.0.3941.4', '70.0.3538.35', '66.0.3359.106']
    ch_ua = "Mozilla/5.0 ({platform}) AppleWebKit/{ap_version} (KHTML, like Gecko) Chrome/{chrome_v} Safari/{ap_version}".format(
        platform=base_platform(), ap_version=random.choice(apple_versions),
        chrome_v=random.choice(chrome_versions)
    )
    return ch_ua

def opera_ua():
    opr_versions = ['55.0.2994.59', '53.0.2907.110', '63.0.3368.54789', '56.0.3051.52', '45.0.2552.892', '60.0.3255.170', '62.0.3331.81', '63.0.3368.43', '62.0.3331.116', '46.0.2597.61', '60.0.3255.151', '56.0.3051.116', '62.0.3331.117', '63.0.3368.57388', '42.0.2393.85', '45.0.2552.635', '54.0.2952.60', '44.0.2510.1159', '44.0.2510.1449', '49.0.2725.56', '60.0.3255.109', '42.0.2393.137', '54.0.2952.71', '62.0.3331.119', '49.0.2725.34', '62.0.3331.66', '43.0.2442.1165', '65.0.3467.42', '53.0.2907.99', '64.0.3417.61', '60.0.3255.50962', '51.0.2830.26', '64.0.3417.92', '62.0.3331.105', '62.0.3331.52', '48.0.2685.39', '57.0.3098.106', '56.0.3051.36', '63.0.3368.55362', '50.0.2762.67', '63.0.3368.94', '62.0.3331.132', '58.0.3135.132', '65.0.3467.72', '60.0.3255.50747', '55.0.2994.56', '52.0.2871.99', '57.0.3098.116', '49.0.2725.39', '58.0.3135.47', '60.0.3255.27', '62.0.3331.101', '63.0.3368.107', '48.0.2685.52', '58.0.3135.127', '62.0.3331.96', '55.0.2994.61', '63.0.3368.75', '65.0.3467.78', '60.0.3255.141', '47.0.2631.39', '49.0.2725.47', '46.0.2597.57', '40.0.2308.54', '53.0.2907.88', '63.0.3368.56078', '53.0.2907.57', '64.0.3417.119', '53.0.2907.37', '56.0.3051.31', '54.0.2952.41', '62.0.3331.99', '44.0.2510.857', '63.0.3368.54979', '56.0.3051.104', '41.0.2353.56', '48.0.2685.35', '53.0.2907.106', '58.0.3135.79', '60.0.3255.51199', '43.0.2442.1144', '47.0.2631.71', '62.0.3331.43', '58.0.3135.65', '45.0.2552.884', '58.0.3135.117', '50.0.2762.45', '46.0.2597.26', '45.0.2552.881', '42.0.2393.351', '46.0.2597.46', '40.0.2308.90', '64.0.3417.54', '43.0.2442.806', '64.0.3417.70', '52.0.2871.97', '51.0.2830.34', '63.0.3368.57756', '52.0.2871.64', '44.0.2510.1218', '56.0.3051.35', '63.0.3368.66', '64.0.3417.73', '49.0.2725.43', '58.0.3135.68', '62.0.3331.18', '64.0.3417.47', '63.0.3368.35', '45.0.2552.812', '56.0.3051.99', '54.0.2952.51', '65.0.3467.48', '45.0.2552.869', '48.0.2685.50', '62.0.3331.72', '63.0.3368.53', '47.0.2631.48', '53.0.2907.68', '52.0.2871.37',
        '46.0.2597.32', '58.0.3135.107', '58.0.3135.90', '60.0.3255.160', '42.0.2393.94', '60.0.3255.56', '42.0.2393.517', '60.0.3255.84', '63.0.3368.55666', '48.0.2685.32', '47.0.2631.83', '51.0.2830.62', '41.0.2353.69', '45.0.2552.888', '56.0.3051.43', '57.0.3098.110', '65.0.3467.38', '54.0.2952.54', '58.0.3135.53', '52.0.2871.30', '40.0.2308.81', '57.0.3098.91', '57.0.3098.102', '60.0.3255.116', '40.0.2308.75', '64.0.3417.83', '43.0.2442.991', '63.0.3368.71', '41.0.2353.46', '55.0.2994.44', '46.0.2597.39', '65.0.3467.69', '60.0.3255.59', '63.0.3368.88', '49.0.2725.64', '60.0.3255.125', '63.0.3368.56786', '55.0.2994.37', '47.0.2631.55', '58.0.3135.118', '50.0.2762.58', '64.0.3417.126', '65.0.3467.62', '51.0.2830.55', '60.0.3255.95', '54.0.2952.46', '54.0.2952.64', '57.0.3098.76', '51.0.2830.40', '52.0.2871.40', '56.0.3051.102', '47.0.2631.80', '60.0.3255.83', '45.0.2552.898', '60.0.3255.70', '40.0.2308.62']
    op_ua = chrome_ua() + ' OPR/' + random.choice(opr_versions)
    return op_ua

def random_ua():
    ua_func = random.choice([
        firefox_ua,
        chrome_ua,
        opera_ua
    ])
    return ua_func()

def uagen():
    dp = ' This is a random UA generator tool; if you are not sure how to use it, see README.md.\n' \
         ' https://github.com/ga1008/uagen'
    da = ""
    parser = ArgumentParser(description=dp, formatter_class=RawTextHelpFormatter, add_help=True)
    parser.add_argument("-m", "--mode", type=str, dest="mode", default='random',
                        help=f'{da}Generation mode: random/firefox/chrome/opera (default: random)')
    args = parser.parse_args()
    mode = args.mode
    mode_dic = {
        "random": random_ua,
        "firefox": firefox_ua,
        "chrome": chrome_ua,
        "opera": opera_ua
    }
    print(mode_dic.get(mode, random_ua)())

if __name__ == '__main__':
    # print(opera())
    print(random_ua())
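Each generator above composes a platform token and browser-specific version fields into one UA string, and `opera_ua` simply suffixes a Chrome UA with an `OPR/<version>` token. A self-contained sketch of that composition pattern, using made-up, trimmed-down version pools (not the full lists from the module):

```python
import random
import re

# Hypothetical, trimmed-down pools; the real module uses much longer lists.
PLATFORMS = ["Windows NT 10.0; Win64; x64;", "X11; Ubuntu; Linux x86_64;"]
CHROME_VERSIONS = ["74.0.3729.169", "80.0.3987.16"]
OPR_VERSIONS = ["65.0.3467.42", "60.0.3255.170"]


def chrome_like_ua():
    ap = "537.36"
    return ("Mozilla/5.0 ({platform}) AppleWebKit/{ap} (KHTML, like Gecko) "
            "Chrome/{chrome} Safari/{ap}").format(
        platform=random.choice(PLATFORMS), ap=ap,
        chrome=random.choice(CHROME_VERSIONS))


def opera_like_ua():
    # Opera UAs are Chrome UAs with an extra OPR/<version> token appended.
    return chrome_like_ua() + " OPR/" + random.choice(OPR_VERSIONS)


ua = opera_like_ua()
assert re.search(r"Chrome/[\d.]+ Safari/[\d.]+ OPR/[\d.]+$", ua)
```

The design choice worth noting: reusing the Chrome generator keeps the Opera UA structurally consistent with real Opera headers, which advertise Chrome compatibility.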
# File: runserver.py | Repo: dedeco/cnddh-denuncias | License: Apache-2.0
from cnddh import app
from cnddh.config import DEBUG
app.run(debug=DEBUG)
# File: jupyterlab2pymolpysnips/Selection/hideSelection.py | Repo: MooersLab/pymolpysnips | License: MIT
"""
cmd.do('indicate none')
"""
cmd.do('indicate none')
# Description: Turn off magenta squares on the current selection.
# Source: placeHolder
# File: authors/apps/articles/migrations/0002_auto_20190228_0656.py | Repo: andela/ah-fulldeck | License: BSD-3-Clause
# Generated by Django 2.1.5 on 2019-02-28 06:56
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
    initial = True

    dependencies = [
        ('contenttypes', '0002_remove_content_type_name'),
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
        ('articles', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='reportarticle',
            name='user',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL, to_field='email'),
        ),
        migrations.AddField(
            model_name='likedislike',
            name='content_type',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='contenttypes.ContentType'),
        ),
        migrations.AddField(
            model_name='likedislike',
            name='user',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL, verbose_name='user'),
        ),
        migrations.AddField(
            model_name='historicalcomment',
            name='article',
            field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='articles.Article'),
        ),
        migrations.AddField(
            model_name='historicalcomment',
            name='author',
            field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AddField(
            model_name='historicalcomment',
            name='history_user',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='+', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AddField(
            model_name='historicalcomment',
            name='parent',
            field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='articles.Comment'),
        ),
        migrations.AddField(
            model_name='favoritearticle',
            name='article',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='articles.Article'),
        ),
        migrations.AddField(
            model_name='favoritearticle',
            name='user',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL),
        ),
        migrations.AddField(
            model_name='comment',
            name='article',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, related_name='comments', to='articles.Article'),
        ),
        migrations.AddField(
            model_name='comment',
            name='author',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='comments', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AddField(
            model_name='comment',
            name='parent',
            field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='threads', to='articles.Comment'),
        ),
        migrations.AddField(
            model_name='articleratings',
            name='article',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='articleratings', to='articles.Article'),
        ),
        migrations.AddField(
            model_name='articleratings',
            name='rated_by',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL),
        ),
        migrations.AddField(
            model_name='article',
            name='author',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='article', to=settings.AUTH_USER_MODEL),
        ),
        migrations.AddField(
            model_name='article',
            name='likedislike',
            field=models.ManyToManyField(related_name='articles', to='articles.LikeDislike'),
        ),
        migrations.AddField(
            model_name='article',
            name='tags',
            field=models.ManyToManyField(related_name='articles', to='articles.Tag'),
        ),
    ]
# File: gitops_server/workers/status_updater/__init__.py | Repo: uptick/gitops | License: BSD-2-Clause
from .worker import DeploymentStatusWorker  # NOQA
# File: backend/simulation/mechanisms.py | Repo: eric-yoo/UDC-2019-Hackathon | License: MIT
def incentive_linear(diff, coef):
    earn = diff * coef
    return earn


def incentive_power(diff, coef, power):
    earn = (diff ** power) * coef
    return earn


def incentive_exp(diff, coef, exp):
    earn = (exp ** diff) * coef
    return earn


def custom(diff, coef, exp):
    earn = (exp ** (diff + 2)) * coef
    return earn
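For a concrete feel of how the linear, power, and exponential incentive curves diverge, here is a quick comparison at one set of illustrative inputs (the formulas mirror the functions above, inlined so the snippet stands alone):

```python
diff, coef, power, exp = 4, 0.5, 2, 2

linear = diff * coef                 # incentive_linear: 4 * 0.5
quadratic = (diff ** power) * coef   # incentive_power:  4**2 * 0.5
exponential = (exp ** diff) * coef   # incentive_exp:    2**4 * 0.5

print(linear, quadratic, exponential)  # 2.0 8.0 8.0
```

At small `diff` the curves are close, but the exponential mechanism overtakes the others quickly as `diff` grows, which is the usual reason to pick one shape over another for an earnings schedule.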
# File: src/data/datasets/__init__.py | Repo: ndeutschmann/face_gans | License: MIT
from .celebA_dataloader import create_celebA_dataloader
from .image_target_folder_dataset import ImageTargetFoldersDataset
from .image_folder_target_file_dataset import ImageFoldersTargetFileDataset
from .indexable_dataset import MultiHDF5TablesDataset
# File: backend/text_processing/__init__.py | Repo: malyvsen/lawyer-to-human | License: MIT
from text_processing.document import Document
from text_processing.analysis import analysis
__all__ = ['document', 'sentence', 'word', 'lemma_count']
43f036a2a18dddfbe00c17753c79ecb914d795f6 | 36 | py | Python | src/sensu_go/clients/__init__.py | sensu/sensu-go-python | b225559646f0b6dcc20056c83e243a64c7d40b5c | [
"MIT"
] | 3 | 2021-04-11T16:07:00.000Z | 2021-07-12T13:20:33.000Z | src/sensu_go/resources/__init__.py | xlab-steampunk/sensu-go-python | 8dbd7e554047341bb61327d80da3e41172924831 | [
"MIT"
] | 8 | 2021-02-24T00:58:58.000Z | 2021-07-14T07:41:06.000Z | src/sensu_go/resources/__init__.py | xlab-steampunk/sensu-go-python | 8dbd7e554047341bb61327d80da3e41172924831 | [
"MIT"
] | 2 | 2021-01-26T20:52:37.000Z | 2021-12-28T14:01:10.000Z | # Copyright (c) 2020 XLAB Steampunk
| 18 | 35 | 0.75 | 5 | 36 | 5.4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133333 | 0.166667 | 36 | 1 | 36 | 36 | 0.766667 | 0.916667 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
a100f1a3d6ab7f2f6bcce48aa3949c43ff00b294 | 172 | py | Python | utils/models/__init__.py | PyFlux/PyFlux-Django-Backend | b3be6002cbf82e9e51ff75f46a42fcf90cdc4925 | [
"Apache-2.0"
] | null | null | null | utils/models/__init__.py | PyFlux/PyFlux-Django-Backend | b3be6002cbf82e9e51ff75f46a42fcf90cdc4925 | [
"Apache-2.0"
] | 13 | 2019-12-04T23:06:12.000Z | 2022-03-11T23:48:46.000Z | utils/models/__init__.py | PyFlux/PyFlux-Django-Backend | b3be6002cbf82e9e51ff75f46a42fcf90cdc4925 | [
"Apache-2.0"
] | 1 | 2019-06-25T08:49:54.000Z | 2019-06-25T08:49:54.000Z | from .Taskmanagement import TaskManagement
from .Feedback import Feedback
from .Template import Template
from .Category import Category
from .Subcategory import SubCategory | 34.4 | 42 | 0.860465 | 20 | 172 | 7.4 | 0.35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110465 | 172 | 5 | 43 | 34.4 | 0.96732 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
a13a3f2eb3952c6750a2393e7460aa816732842b | 128 | py | Python | Calculate-distance/measurements/admin.py | A-kriti/Amazing-Python-Scripts | ebf607fe39e6d9e61f30ec3439fc8d6ab1f736b9 | [
"MIT"
] | 930 | 2020-09-05T22:07:28.000Z | 2022-03-30T07:56:18.000Z | Calculate-distance/measurements/admin.py | maheshdbabar9340/Amazing-Python-Scripts | e2272048cbe49b4bda5072bbdd8479739bb6c18d | [
"MIT"
] | 893 | 2020-09-04T07:57:24.000Z | 2022-02-08T02:12:26.000Z | Calculate-distance/measurements/admin.py | maheshdbabar9340/Amazing-Python-Scripts | e2272048cbe49b4bda5072bbdd8479739bb6c18d | [
"MIT"
] | 497 | 2020-09-05T08:16:24.000Z | 2022-03-31T00:55:57.000Z | from django.contrib import admin
from .models import Measurement
# Register your models here.
admin.site.register(Measurement)
| 21.333333 | 32 | 0.820313 | 17 | 128 | 6.176471 | 0.647059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117188 | 128 | 5 | 33 | 25.6 | 0.929204 | 0.203125 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
a14b7f2d231b513febcbc5a483309f8ce51f28f1 | 642 | py | Python | code/generate_grammar/hollo1996/NounPhraseGenerator/Fourlang.py | Hollo1996/semantic_parsing_with_IRTGs | 49a445507bfbf53cc8c2509a01ac531829b6e369 | [
"MIT"
] | null | null | null | code/generate_grammar/hollo1996/NounPhraseGenerator/Fourlang.py | Hollo1996/semantic_parsing_with_IRTGs | 49a445507bfbf53cc8c2509a01ac531829b6e369 | [
"MIT"
] | null | null | null | code/generate_grammar/hollo1996/NounPhraseGenerator/Fourlang.py | Hollo1996/semantic_parsing_with_IRTGs | 49a445507bfbf53cc8c2509a01ac531829b6e369 | [
"MIT"
] | null | null | null | from enum import Enum
# This Enum stores the types of the IRTGs the program can generate.
class idGenerator:
    id = -1

    @classmethod
    def auto(cls):
        cls.id += 1
        return cls.id


class Fourlang(Enum):
    ZeroTo = idGenerator.auto()
    UnderTo = idGenerator.auto()
    Zero = idGenerator.auto()
    OneTo_ZeroBack = idGenerator.auto()
    TwoTo = idGenerator.auto()
    OneBack__TwoTo = idGenerator.auto()
    OneBack_at_TwoTo = idGenerator.auto()
    OneBack_has_TwoTo = idGenerator.auto()
    ZeroCompound = idGenerator.auto()
    ZeroFlat = idGenerator.auto()
    _None = idGenerator.auto()
| 25.68 | 90 | 0.676012 | 80 | 642 | 5.325 | 0.5 | 0.387324 | 0.187793 | 0.190141 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00404 | 0.228972 | 642 | 24 | 91 | 26.75 | 0.856566 | 0.138629 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0 | 0.052632 | 0 | 0.894737 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
a17f905649711350b098a890a9be6d77adfc63c9 | 195 | py | Python | apps/orgnization/views_restful.py | My-captain/iMooc | 0dbcd45e48cb1d6bb56cd86b35ee744bdec2b683 | [
"MIT"
] | 3 | 2018-08-13T04:28:56.000Z | 2020-12-31T11:40:00.000Z | apps/orgnization/views_restful.py | My-captain/iMooc | 0dbcd45e48cb1d6bb56cd86b35ee744bdec2b683 | [
"MIT"
] | null | null | null | apps/orgnization/views_restful.py | My-captain/iMooc | 0dbcd45e48cb1d6bb56cd86b35ee744bdec2b683 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# ==========================================
# @Time : 2018/3/26 下午9:39
# @Author : Mr.Robot
# @File : views_restful.py
# ========================================== | 32.5 | 44 | 0.297436 | 16 | 195 | 3.5625 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.065868 | 0.14359 | 195 | 6 | 45 | 32.5 | 0.275449 | 0.938462 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
a18f2b9592345820342d627f207d0b7aa954acc4 | 82 | py | Python | enthought/chaco/tooltip.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 3 | 2016-12-09T06:05:18.000Z | 2018-03-01T13:00:29.000Z | enthought/chaco/tooltip.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 1 | 2020-12-02T00:51:32.000Z | 2020-12-02T08:48:55.000Z | enthought/chaco/tooltip.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | null | null | null | # proxy module
from __future__ import absolute_import
from chaco.tooltip import *
| 20.5 | 38 | 0.829268 | 11 | 82 | 5.727273 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.134146 | 82 | 3 | 39 | 27.333333 | 0.887324 | 0.146341 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
a1927f37ee4fcfb618fb287cf50ad279d39cb7b9 | 90 | py | Python | __init__.py | m2lines/subgrid | 3de5d14c5525a62529d43cbafccda716c74e32df | [
"MIT"
] | 1 | 2021-11-03T01:27:16.000Z | 2021-11-03T01:27:16.000Z | __init__.py | m2lines/subgrid | 3de5d14c5525a62529d43cbafccda716c74e32df | [
"MIT"
] | null | null | null | __init__.py | m2lines/subgrid | 3de5d14c5525a62529d43cbafccda716c74e32df | [
"MIT"
] | 1 | 2021-06-24T15:58:32.000Z | 2021-06-24T15:58:32.000Z | # -*- coding: utf-8 -*-
"""
Created on Mon Nov 25 18:28:27 2019
@author: Arthur
"""
pass | 11.25 | 35 | 0.588889 | 15 | 90 | 3.533333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180556 | 0.2 | 90 | 8 | 36 | 11.25 | 0.555556 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
a1ad0891a80638b8aba2b700d404b2ea2309e68e | 869 | py | Python | tests/test_provider_mumoshu_helmfile.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 507 | 2017-07-26T02:58:38.000Z | 2022-01-21T12:35:13.000Z | tests/test_provider_mumoshu_helmfile.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 135 | 2017-07-20T12:01:59.000Z | 2021-10-04T22:25:40.000Z | tests/test_provider_mumoshu_helmfile.py | mjuenema/python-terrascript | 6d8bb0273a14bfeb8ff8e950fe36f97f7c6e7b1d | [
"BSD-2-Clause"
] | 81 | 2018-02-20T17:55:28.000Z | 2022-01-31T07:08:40.000Z | # tests/test_provider_mumoshu_helmfile.py
# Automatically generated by tools/makecode.py (24-Sep-2021 15:18:32 UTC)
def test_provider_import():
    import terrascript.provider.mumoshu.helmfile


def test_resource_import():
    from terrascript.resource.mumoshu.helmfile import helmfile_embedding_example
    from terrascript.resource.mumoshu.helmfile import helmfile_release
    from terrascript.resource.mumoshu.helmfile import helmfile_release_set
# TODO: Shortcut imports without namespace for official and supported providers.
# TODO: This has to be moved into a required_providers block.
# def test_version_source():
#
# import terrascript.provider.mumoshu.helmfile
#
# t = terrascript.provider.mumoshu.helmfile.helmfile()
# s = str(t)
#
# assert 'https://github.com/mumoshu/terraform-provider-helmfile' in s
# assert '0.14.1' in s
| 29.965517 | 80 | 0.767549 | 113 | 869 | 5.769912 | 0.530973 | 0.161043 | 0.141104 | 0.156442 | 0.383436 | 0.260736 | 0.260736 | 0.180982 | 0 | 0 | 0 | 0.021563 | 0.146145 | 869 | 28 | 81 | 31.035714 | 0.857143 | 0.576525 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 0 | 1 | 0.333333 | true | 0 | 1 | 0 | 1.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
a1c01c7c883911644ed0b0ecd3f304188c4d5a81 | 13,703 | py | Python | rdr_service/lib_fhir/fhirclient_3_0_0/models/device_tests.py | all-of-us/raw-data-repository | d28ad957557587b03ff9c63d55dd55e0508f91d8 | [
"BSD-3-Clause"
] | 39 | 2017-10-13T19:16:27.000Z | 2021-09-24T16:58:21.000Z | rdr_service/lib_fhir/fhirclient_3_0_0/models/device_tests.py | all-of-us/raw-data-repository | d28ad957557587b03ff9c63d55dd55e0508f91d8 | [
"BSD-3-Clause"
] | 312 | 2017-09-08T15:42:13.000Z | 2022-03-23T18:21:40.000Z | rdr_service/lib_fhir/fhirclient_3_0_0/models/device_tests.py | all-of-us/raw-data-repository | d28ad957557587b03ff9c63d55dd55e0508f91d8 | [
"BSD-3-Clause"
] | 19 | 2017-09-15T13:58:00.000Z | 2022-02-07T18:33:20.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
#
# Generated from FHIR 3.0.0.11832 on 2017-03-22.
# 2017, SMART Health IT.
import io
import json
import os
import unittest
from . import device
from .fhirdate import FHIRDate
class DeviceTests(unittest.TestCase):
    def instantiate_from(self, filename):
        datadir = os.environ.get('FHIR_UNITTEST_DATADIR') or ''
        with io.open(os.path.join(datadir, filename), 'r', encoding='utf-8') as handle:
            js = json.load(handle)
            self.assertEqual("Device", js["resourceType"])
        return device.Device(js)

    def testDevice1(self):
        inst = self.instantiate_from("device-example-f001-feedingtube.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice1(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice1(inst2)

    def implDevice1(self, inst):
        self.assertEqual(inst.expirationDate.date, FHIRDate("2020-08-08").date)
        self.assertEqual(inst.expirationDate.as_json(), "2020-08-08")
        self.assertEqual(inst.id, "f001")
        self.assertEqual(inst.identifier[0].system, "http:/goodhealthhospital/identifier/devices")
        self.assertEqual(inst.identifier[0].value, "12345")
        self.assertEqual(inst.manufactureDate.date, FHIRDate("2015-08-08").date)
        self.assertEqual(inst.manufactureDate.as_json(), "2015-08-08")
        self.assertEqual(inst.status, "active")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.type.coding[0].code, "25062003")
        self.assertEqual(inst.type.coding[0].display, "Feeding tube, device")
        self.assertEqual(inst.type.coding[0].system, "http://snomed.info/sct")

    def testDevice2(self):
        inst = self.instantiate_from("device-example-ihe-pcd.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice2(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice2(inst2)

    def implDevice2(self, inst):
        self.assertEqual(inst.id, "ihe-pcd")
        self.assertEqual(inst.identifier[0].type.coding[0].code, "SNO")
        self.assertEqual(inst.identifier[0].type.coding[0].system, "http://hl7.org/fhir/identifier-type")
        self.assertEqual(inst.identifier[0].type.text, "Serial Number")
        self.assertEqual(inst.identifier[0].value, "AMID-123-456")
        self.assertEqual(inst.lotNumber, "12345")
        self.assertEqual(inst.manufacturer, "Acme Devices, Inc")
        self.assertEqual(inst.model, "A.1.1")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.type.text, "Vital Signs Monitor")

    def testDevice3(self):
        inst = self.instantiate_from("device-example-pacemaker.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice3(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice3(inst2)

    def implDevice3(self, inst):
        self.assertEqual(inst.contact[0].system, "phone")
        self.assertEqual(inst.contact[0].value, "ext 4352")
        self.assertEqual(inst.id, "example-pacemaker")
        self.assertEqual(inst.identifier[0].system, "http://acme.com/devices/pacemakers/octane/serial")
        self.assertEqual(inst.identifier[0].value, "1234-5678-90AB-CDEF")
        self.assertEqual(inst.lotNumber, "1234-5678")
        self.assertEqual(inst.manufacturer, "Acme Devices, Inc")
        self.assertEqual(inst.model, "PM/Octane 2014")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.type.coding[0].code, "octane2014")
        self.assertEqual(inst.type.coding[0].display, "Performance pace maker for high octane patients")
        self.assertEqual(inst.type.coding[0].system, "http://acme.com/devices")

    def testDevice4(self):
        inst = self.instantiate_from("device-example-software.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice4(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice4(inst2)

    def implDevice4(self, inst):
        self.assertEqual(inst.contact[0].system, "url")
        self.assertEqual(inst.contact[0].value, "http://acme.com")
        self.assertEqual(inst.id, "software")
        self.assertEqual(inst.identifier[0].system, "http://acme.com/ehr/client-ids")
        self.assertEqual(inst.identifier[0].value, "goodhealth")
        self.assertEqual(inst.manufacturer, "Acme Devices, Inc")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.type.text, "EHR")
        self.assertEqual(inst.url, "http://acme.com/goodhealth/ehr/")
        self.assertEqual(inst.version, "10.23-23423")

    def testDevice5(self):
        inst = self.instantiate_from("device-example-udi1.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice5(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice5(inst2)

    def implDevice5(self, inst):
        self.assertEqual(inst.expirationDate.date, FHIRDate("2014-11-20").date)
        self.assertEqual(inst.expirationDate.as_json(), "2014-11-20")
        self.assertEqual(inst.id, "example-udi1")
        self.assertEqual(inst.identifier[0].system, "http://acme.com/devices/pacemakers/octane/serial")
        self.assertEqual(inst.identifier[0].value, "1234-5678-90AB-CDEF")
        self.assertEqual(inst.identifier[1].type.coding[0].code, "SNO")
        self.assertEqual(inst.identifier[1].type.coding[0].system, "http://hl7.org/fhir/identifier-type")
        self.assertEqual(inst.identifier[1].value, "10987654d321")
        self.assertEqual(inst.lotNumber, "7654321D")
        self.assertEqual(inst.manufacturer, "Acme Devices, Inc")
        self.assertEqual(inst.model, "PM/Octane 2014")
        self.assertEqual(inst.safety[0].coding[0].code, "mr-unsafe")
        self.assertEqual(inst.safety[0].coding[0].display, "MR Unsafe")
        self.assertEqual(inst.safety[0].coding[0].system, "urn:oid:2.16.840.1.113883.3.26.1.1")
        self.assertEqual(inst.safety[0].text, "MR Unsafe")
        self.assertEqual(inst.status, "active")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.type.coding[0].code, "468063009")
        self.assertEqual(inst.type.coding[0].display, "Coated femoral stem prosthesis, modular")
        self.assertEqual(inst.type.coding[0].system, "http://snomed.info/sct")
        self.assertEqual(inst.type.text, "Coated femoral stem prosthesis, modular")
        self.assertEqual(inst.udi.carrierAIDC, "XWQyMDExMDg1NzY3NDAwMjAxNzE3MTQxMTIwMTBOWUZVTDAx4oaUMjExOTI4MzfihpQ3MTNBMUIyQzNENEU1RjZH")
        self.assertEqual(inst.udi.carrierHRF, "{01}00844588003288{17}141120{10}7654321D{21}10987654d321")
        self.assertEqual(inst.udi.deviceIdentifier, "00844588003288")
        self.assertEqual(inst.udi.entryType, "barcode")
        self.assertEqual(inst.udi.issuer, "http://hl7.org/fhir/NamingSystem/gs1")
        self.assertEqual(inst.udi.jurisdiction, "http://hl7.org/fhir/NamingSystem/fda-udi")
        self.assertEqual(inst.udi.name, "FHIR® Hip System")

    def testDevice6(self):
        inst = self.instantiate_from("device-example-udi2.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice6(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice6(inst2)

    def implDevice6(self, inst):
        self.assertEqual(inst.expirationDate.date, FHIRDate("2014-02-01").date)
        self.assertEqual(inst.expirationDate.as_json(), "2014-02-01")
        self.assertEqual(inst.extension[0].url, "http://hl7.org/fhir/StructureDefinition/device-din")
        self.assertEqual(inst.extension[0].valueIdentifier.system, "http://hl7.org/fhir/NamingSystem/iccbba-din")
        self.assertEqual(inst.extension[0].valueIdentifier.value, "A99971312345600")
        self.assertEqual(inst.id, "example-udi2")
        self.assertEqual(inst.manufactureDate.date, FHIRDate("2013-02-01").date)
        self.assertEqual(inst.manufactureDate.as_json(), "2013-02-01")
        self.assertEqual(inst.manufacturer, "Acme Devices, Inc")
        self.assertEqual(inst.status, "inactive")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.udi.deviceIdentifier, "A9999XYZ100T0474")
        self.assertEqual(inst.udi.issuer, "http://hl7.org/fhir/NamingSystem/iccbba-other")
        self.assertEqual(inst.udi.jurisdiction, "http://hl7.org/fhir/NamingSystem/fda-udi")
        self.assertEqual(inst.udi.name, "Bone,Putty Demineralized")

    def testDevice7(self):
        inst = self.instantiate_from("device-example-udi3.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice7(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice7(inst2)

    def implDevice7(self, inst):
        self.assertEqual(inst.expirationDate.date, FHIRDate("2020-02-02").date)
        self.assertEqual(inst.expirationDate.as_json(), "2020-02-02")
        self.assertEqual(inst.id, "example-udi3")
        self.assertEqual(inst.identifier[0].type.coding[0].code, "SNO")
        self.assertEqual(inst.identifier[0].type.coding[0].system, "http://hl7.org/fhir/identifier-type")
        self.assertEqual(inst.identifier[0].value, "XYZ456789012345678")
        self.assertEqual(inst.lotNumber, "LOT123456789012345")
        self.assertEqual(inst.manufactureDate.date, FHIRDate("2013-02-02").date)
        self.assertEqual(inst.manufactureDate.as_json(), "2013-02-02")
        self.assertEqual(inst.manufacturer, "GlobalMed, Inc")
        self.assertEqual(inst.model, "Ultra Implantable")
        self.assertEqual(inst.status, "inactive")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.udi.carrierHRF, "+H123PARTNO1234567890120/$$420020216LOT123456789012345/SXYZ456789012345678/16D20130202C")
        self.assertEqual(inst.udi.entryType, "manual")
        self.assertEqual(inst.udi.issuer, "http://hl7.org/fhir/NamingSystem/hibcc")
        self.assertEqual(inst.udi.jurisdiction, "http://hl7.org/fhir/NamingSystem/fda-udi")
        self.assertEqual(inst.udi.name, "FHIR® Ulltra Implantable")

    def testDevice8(self):
        inst = self.instantiate_from("device-example-udi4.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice8(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice8(inst2)

    def implDevice8(self, inst):
        self.assertEqual(inst.id, "example-udi4")
        self.assertEqual(inst.lotNumber, "RZ12345678")
        self.assertEqual(inst.manufacturer, "GlobalMed, Inc")
        self.assertEqual(inst.status, "inactive")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.udi.carrierHRF, "=)1TE123456A&)RZ12345678")
        self.assertEqual(inst.udi.issuer, "http://hl7.org/fhir/NamingSystem/iccbba-blood")
        self.assertEqual(inst.udi.jurisdiction, "http://hl7.org/fhir/NamingSystem/fda-udi")

    def testDevice9(self):
        inst = self.instantiate_from("device-example.json")
        self.assertIsNotNone(inst, "Must have instantiated a Device instance")
        self.implDevice9(inst)
        js = inst.as_json()
        self.assertEqual("Device", js["resourceType"])
        inst2 = device.Device(js)
        self.implDevice9(inst2)

    def implDevice9(self, inst):
        self.assertEqual(inst.contact[0].system, "phone")
        self.assertEqual(inst.contact[0].value, "ext 4352")
        self.assertEqual(inst.id, "example")
        self.assertEqual(inst.identifier[0].system, "http://goodcare.org/devices/id")
        self.assertEqual(inst.identifier[0].value, "345675")
        self.assertEqual(inst.identifier[1].type.coding[0].code, "SNO")
        self.assertEqual(inst.identifier[1].type.coding[0].system, "http://hl7.org/fhir/identifier-type")
        self.assertEqual(inst.identifier[1].type.text, "Serial Number")
        self.assertEqual(inst.identifier[1].value, "AMID-342135-8464")
        self.assertEqual(inst.lotNumber, "43453424")
        self.assertEqual(inst.manufacturer, "Acme Devices, Inc")
        self.assertEqual(inst.model, "AB 45-J")
        self.assertEqual(inst.note[0].text, "QA Checked")
        self.assertEqual(inst.note[0].time.date, FHIRDate("2015-06-28T14:03:32+10:00").date)
        self.assertEqual(inst.note[0].time.as_json(), "2015-06-28T14:03:32+10:00")
        self.assertEqual(inst.status, "active")
        self.assertEqual(inst.text.status, "generated")
        self.assertEqual(inst.type.coding[0].code, "86184003")
        self.assertEqual(inst.type.coding[0].display, "Electrocardiographic monitor and recorder")
        self.assertEqual(inst.type.coding[0].system, "http://snomed.info/sct")
        self.assertEqual(inst.type.text, "ECG")
| 51.322097 | 138 | 0.672189 | 1,621 | 13,703 | 5.665638 | 0.154226 | 0.235192 | 0.277221 | 0.075784 | 0.771015 | 0.725828 | 0.683907 | 0.599194 | 0.538436 | 0.47757 | 0 | 0.061319 | 0.181201 | 13,703 | 266 | 139 | 51.515038 | 0.757041 | 0.008319 | 0 | 0.372807 | 1 | 0.004386 | 0.237871 | 0.045572 | 0 | 0 | 0 | 0 | 0.671053 | 1 | 0.083333 | false | 0 | 0.026316 | 0 | 0.118421 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
a1dcf8d15a549e7ad8a6b76c8c2f4bf5e2d2e3b9 | 134 | py | Python | run.py | superbool/web-serial | ea9bcbeb010126a4901196e26b90150ac28adfc8 | [
"Apache-2.0"
] | 1 | 2021-06-05T05:12:18.000Z | 2021-06-05T05:12:18.000Z | run.py | superbool/web-serial | ea9bcbeb010126a4901196e26b90150ac28adfc8 | [
"Apache-2.0"
] | null | null | null | run.py | superbool/web-serial | ea9bcbeb010126a4901196e26b90150ac28adfc8 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
from app import app
from app import socketio
if __name__ == '__main__':
    socketio.run(app, host='0.0.0.0')
| 16.75 | 37 | 0.686567 | 23 | 134 | 3.652174 | 0.608696 | 0.071429 | 0.309524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035714 | 0.164179 | 134 | 7 | 38 | 19.142857 | 0.714286 | 0.149254 | 0 | 0 | 0 | 0 | 0.132743 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
a1ec6ed45185e12b252ac2793a83dc7f589e1acf | 269 | py | Python | meetup/admin.py | BradleyKirton/gpug-graphql | ebcd53e48773801d3908d11bcc4c039a259dc04b | [
"MIT"
] | null | null | null | meetup/admin.py | BradleyKirton/gpug-graphql | ebcd53e48773801d3908d11bcc4c039a259dc04b | [
"MIT"
] | 3 | 2020-06-05T18:19:29.000Z | 2021-06-10T20:23:21.000Z | meetup/admin.py | BradleyKirton/gpug-graphql | ebcd53e48773801d3908d11bcc4c039a259dc04b | [
"MIT"
] | null | null | null | import meetup.models
from django.contrib import admin
# Register the models on the admin site
admin.site.register(meetup.models.Meetup)
admin.site.register(meetup.models.Event)
admin.site.register(meetup.models.Attendee)
admin.site.register(meetup.models.Attendance) | 26.9 | 45 | 0.825279 | 39 | 269 | 5.692308 | 0.358974 | 0.27027 | 0.306306 | 0.414414 | 0.522523 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074349 | 269 | 10 | 45 | 26.9 | 0.891566 | 0.137546 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
a1f8b97c35132eb912e204653aff8bc5f1a50ae9 | 124 | py | Python | User/admin.py | LukaszHoszowski/Django_ProEstate | 36c5cc25842f4e5afebd9ff6eaa83c9457fb7a3a | [
"MIT"
] | 1 | 2022-02-15T13:36:29.000Z | 2022-02-15T13:36:29.000Z | User/admin.py | LukaszHoszowski/Django_ProEstate | 36c5cc25842f4e5afebd9ff6eaa83c9457fb7a3a | [
"MIT"
] | null | null | null | User/admin.py | LukaszHoszowski/Django_ProEstate | 36c5cc25842f4e5afebd9ff6eaa83c9457fb7a3a | [
"MIT"
] | null | null | null | from django.contrib import admin
from User.models import Profile
# Register your models here.
admin.site.register(Profile) | 20.666667 | 32 | 0.814516 | 18 | 124 | 5.611111 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120968 | 124 | 6 | 33 | 20.666667 | 0.926606 | 0.209677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
62bb0f615f354969b1d5f9034eb52d4792683ff0 | 357 | py | Python | tests/test_simple_process.py | jaden-git/simple-utils | 8526ea9a8357665722627ab2c44da527c2884006 | [
"Apache-2.0"
] | 1 | 2021-03-08T09:54:05.000Z | 2021-03-08T09:54:05.000Z | tests/test_simple_process.py | jaden-git/simple-utils | 8526ea9a8357665722627ab2c44da527c2884006 | [
"Apache-2.0"
] | null | null | null | tests/test_simple_process.py | jaden-git/simple-utils | 8526ea9a8357665722627ab2c44da527c2884006 | [
"Apache-2.0"
] | null | null | null | import simple_utils
def test_retry():
    # success case
    assert simple_utils.process.retry(lambda x: x + 1, argv={'x': 2}) == 3

    # failure case
    try:
        simple_utils.process.retry(lambda x: x + y, argv={'x': 5}, waiting_time=1, limit=1)
    except simple_utils.process.TimeoutError:
        pass
    else:
        raise Exception('process - retry test failed')
| 23.8 | 88 | 0.627451 | 53 | 357 | 4.113208 | 0.584906 | 0.201835 | 0.247706 | 0.211009 | 0.284404 | 0.284404 | 0.284404 | 0 | 0 | 0 | 0 | 0.02214 | 0.240896 | 357 | 14 | 89 | 25.5 | 0.782288 | 0.036415 | 0 | 0 | 0 | 0 | 0.070588 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.111111 | true | 0.111111 | 0.111111 | 0 | 0.222222 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
62ce881c42d4a1d954c4a4ca0b6af0a9a5e8cba3 | 78 | py | Python | __init__.py | OdinTech3/StatusBarSymbols | bed5f51dd0bac6c5b8196e11465def1950ad454b | [
"MIT"
] | 1 | 2019-07-16T18:43:47.000Z | 2019-07-16T18:43:47.000Z | __init__.py | OdinTech3/StatusBarSymbols | bed5f51dd0bac6c5b8196e11465def1950ad454b | [
"MIT"
] | null | null | null | __init__.py | OdinTech3/StatusBarSymbols | bed5f51dd0bac6c5b8196e11465def1950ad454b | [
"MIT"
] | null | null | null | from .symbol import StatusSymbol, PythonSyntax, MagicPythonSyntax # noqa F401 | 78 | 78 | 0.833333 | 8 | 78 | 8.125 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.043478 | 0.115385 | 78 | 1 | 78 | 78 | 0.898551 | 0.115385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
1a2aadeac48ec0a44d92789cab684a0af70225bf | 614 | py | Python | mariadb_kernel/maria_magics/supported_magics.py | robertbindar/mariadb_kernel-1 | 22a58dd0e4d2fbb687f0ed02415bd5278d3ed046 | [
"BSD-3-Clause"
] | null | null | null | mariadb_kernel/maria_magics/supported_magics.py | robertbindar/mariadb_kernel-1 | 22a58dd0e4d2fbb687f0ed02415bd5278d3ed046 | [
"BSD-3-Clause"
] | null | null | null | mariadb_kernel/maria_magics/supported_magics.py | robertbindar/mariadb_kernel-1 | 22a58dd0e4d2fbb687f0ed02415bd5278d3ed046 | [
"BSD-3-Clause"
] | null | null | null | """ Maintains a list of magic commands supported by the kernel """
# Copyright (c) MariaDB Foundation.
# Distributed under the terms of the Modified BSD License.
from mariadb_kernel.maria_magics.line import Line
from mariadb_kernel.maria_magics.df import DF
from mariadb_kernel.maria_magics.lsmagic import LSMagic
from mariadb_kernel.maria_magics.maria_magic import MariaMagic
from mariadb_kernel.maria_magics.bar import Bar
from mariadb_kernel.maria_magics.pie import Pie
def get():
return {
"line": Line,
"bar": Bar,
"pie": Pie,
"df": DF,
"lsmagic": LSMagic
}
| 27.909091 | 66 | 0.729642 | 85 | 614 | 5.117647 | 0.388235 | 0.151724 | 0.234483 | 0.303448 | 0.386207 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.192182 | 614 | 21 | 67 | 29.238095 | 0.877016 | 0.245928 | 0 | 0 | 0 | 0 | 0.04185 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | true | 0 | 0.428571 | 0.071429 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
a7f2bd738134f2bd5446f605cbe8b336e751831a | 260 | py | Python | Padrao Strategy/interface.py | David-Marcoss/POO1-PYTHON | b0a49fb95e8b87fdfe2fc40eca547bf25f93c3c4 | [
"MIT"
] | null | null | null | Padrao Strategy/interface.py | David-Marcoss/POO1-PYTHON | b0a49fb95e8b87fdfe2fc40eca547bf25f93c3c4 | [
"MIT"
] | null | null | null | Padrao Strategy/interface.py | David-Marcoss/POO1-PYTHON | b0a49fb95e8b87fdfe2fc40eca547bf25f93c3c4 | [
"MIT"
] | null | null | null | import abc
class Icampeao(abc.ABC):  # strategy class
    @abc.abstractmethod
    def habilidade(self):
        pass

    @abc.abstractmethod
    def nivel_de_luta(self):
        pass

    @abc.abstractmethod
    def estilo_de_combate(self):
        pass
| 15.294118 | 44 | 0.646154 | 30 | 260 | 5.466667 | 0.533333 | 0.310976 | 0.365854 | 0.304878 | 0.341463 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.276923 | 260 | 16 | 45 | 16.25 | 0.87234 | 0.065385 | 0 | 0.545455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0.272727 | 0.090909 | 0 | 0.454545 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
c503e00031f10f4ae53150f913d30f969b84a9af | 124 | py | Python | osCam/core/admin.py | ChicoState/open-source-security-camera | ad4875ffdc165d51625056ea52b046493fa6261e | [
"MIT"
] | 6 | 2022-02-25T01:20:26.000Z | 2022-03-30T19:29:46.000Z | osCam/core/admin.py | ChicoState/open-source-security-camera | ad4875ffdc165d51625056ea52b046493fa6261e | [
"MIT"
] | 35 | 2022-02-19T20:49:21.000Z | 2022-03-30T05:45:21.000Z | osCam/core/admin.py | ChicoState/open-source-security-camera | ad4875ffdc165d51625056ea52b046493fa6261e | [
"MIT"
] | 5 | 2022-02-21T04:00:39.000Z | 2022-03-30T19:33:07.000Z | from django.contrib import admin
from .models import Recording
# Register your models here.
admin.site.register(Recording)
| 20.666667 | 32 | 0.814516 | 17 | 124 | 5.941176 | 0.647059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120968 | 124 | 5 | 33 | 24.8 | 0.926606 | 0.209677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
c529ccd3322fe9bfd3aa8f0870ecff478c6d976b | 334 | py | Python | setuptools/_importlib.py | gentlegiantJGC/setuptools | 6b0b410572b11e0be9a62ff478616abfddc8633a | [
"MIT"
] | 1 | 2022-02-15T17:56:45.000Z | 2022-02-15T17:56:45.000Z | setuptools/_importlib.py | gentlegiantJGC/setuptools | 6b0b410572b11e0be9a62ff478616abfddc8633a | [
"MIT"
] | null | null | null | setuptools/_importlib.py | gentlegiantJGC/setuptools | 6b0b410572b11e0be9a62ff478616abfddc8633a | [
"MIT"
] | null | null | null | import sys
if sys.version_info < (3, 10):
from setuptools.extern import importlib_metadata as metadata
else:
import importlib.metadata as metadata # noqa: F401
if sys.version_info < (3, 9):
from setuptools.extern import importlib_resources as resources
else:
import importlib.resources as resources # noqa: F401
| 23.857143 | 66 | 0.748503 | 46 | 334 | 5.347826 | 0.391304 | 0.243902 | 0.097561 | 0.130081 | 0.853659 | 0 | 0 | 0 | 0 | 0 | 0 | 0.040441 | 0.185629 | 334 | 13 | 67 | 25.692308 | 0.863971 | 0.062874 | 0 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.555556 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
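The `_importlib.py` shim above gates its imports on `sys.version_info`: new enough interpreters get the stdlib `importlib.metadata`/`importlib.resources`, older ones fall back to setuptools' vendored backports. The same pattern can be sketched generically; the `importlib_metadata` fallback branch assumes the third-party backport is installed and is shown only for illustration:

```python
import sys

# Prefer the stdlib module when the interpreter ships it (3.8+),
# mirroring the version gate used in the setuptools shim above.
if sys.version_info >= (3, 8):
    from importlib import metadata
else:
    import importlib_metadata as metadata  # noqa: F401  (backport, assumed installed)
```

Either branch binds the same name to an object with the same API, so downstream code can call `metadata.version(...)` without caring which module it got.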
c52aec2e05d3ba4f924950b6b940f5064974a39b | 153 | py | Python | python/8Kyu/Enumerable Magic #30 - Split that Array.py | athasv/Codewars-data | 5e106466e709fd776f23585ad9f652d0d65b48d3 | [
"MIT"
] | null | null | null | python/8Kyu/Enumerable Magic #30 - Split that Array.py | athasv/Codewars-data | 5e106466e709fd776f23585ad9f652d0d65b48d3 | [
"MIT"
] | null | null | null | python/8Kyu/Enumerable Magic #30 - Split that Array.py | athasv/Codewars-data | 5e106466e709fd776f23585ad9f652d0d65b48d3 | [
"MIT"
] | null | null | null | def partition(arr, classifier_method):
return [i for i in arr if classifier_method(i) == True], [i for i in arr if classifier_method(i) == False] | 76.5 | 114 | 0.699346 | 26 | 153 | 4 | 0.461538 | 0.461538 | 0.096154 | 0.134615 | 0.557692 | 0.557692 | 0.557692 | 0.557692 | 0.557692 | 0 | 0 | 0 | 0.189542 | 153 | 2 | 114 | 76.5 | 0.83871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 5 |
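The one-liner above walks `arr` twice and compares the classifier's result to `True`/`False` explicitly. Assuming the classifier returns plain booleans (as Codewars kata predicates do), a single-pass variant avoids both the second traversal and the redundant comparisons:

```python
def partition(arr, classifier_method):
    """Split arr into (truthy, falsy) lists in a single pass."""
    matches, rest = [], []
    for item in arr:
        (matches if classifier_method(item) else rest).append(item)
    return matches, rest

evens, odds = partition([1, 2, 3, 4, 5], lambda n: n % 2 == 0)
```

Note the subtle difference: the original `== True` test would also drop truthy non-boolean returns (e.g. `2`), whereas this version treats any truthy result as a match.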
c52c22f7701abac71edaf38e4b7849f7f686467c | 217 | py | Python | flower/views/error.py | shintaii/flower | fdeb135ddb3718404c0f1e9cca73fc45181f611a | [
"BSD-3-Clause"
] | 4,474 | 2015-01-01T18:34:36.000Z | 2022-03-29T06:02:38.000Z | flower/views/error.py | shintaii/flower | fdeb135ddb3718404c0f1e9cca73fc45181f611a | [
"BSD-3-Clause"
] | 835 | 2015-01-06T21:29:48.000Z | 2022-03-31T04:35:10.000Z | flower/views/error.py | shintaii/flower | fdeb135ddb3718404c0f1e9cca73fc45181f611a | [
"BSD-3-Clause"
] | 980 | 2015-01-02T21:41:28.000Z | 2022-03-31T08:30:52.000Z | import tornado.web
from ..views import BaseHandler
class NotFoundErrorHandler(BaseHandler):
def get(self):
raise tornado.web.HTTPError(404)
def post(self):
raise tornado.web.HTTPError(404)
| 18.083333 | 40 | 0.705069 | 26 | 217 | 5.884615 | 0.576923 | 0.196078 | 0.20915 | 0.248366 | 0.405229 | 0.405229 | 0 | 0 | 0 | 0 | 0 | 0.034682 | 0.202765 | 217 | 11 | 41 | 19.727273 | 0.849711 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.285714 | 0 | 0.714286 | 0 | 1 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
c5616be39f565679aceabdeedcc8b43804b08fba | 1,689 | py | Python | tkbeacon/models/__init__.py | NCATS-Tangerine/tkbeacon-python-client | a5a88405b6eb5c998b66f6741b589ba72839fd51 | [
"MIT"
] | null | null | null | tkbeacon/models/__init__.py | NCATS-Tangerine/tkbeacon-python-client | a5a88405b6eb5c998b66f6741b589ba72839fd51 | [
"MIT"
] | null | null | null | tkbeacon/models/__init__.py | NCATS-Tangerine/tkbeacon-python-client | a5a88405b6eb5c998b66f6741b589ba72839fd51 | [
"MIT"
] | null | null | null | # coding: utf-8
# flake8: noqa
"""
Translator Knowledge Beacon API
This is the Translator Knowledge Beacon web service application programming interface (API). # noqa: E501
The version of the OpenAPI document: 1.3.0
Contact: richard@starinformatics.com
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
# import models into model package
from tkbeacon.models.beacon_concept import BeaconConcept
from tkbeacon.models.beacon_concept_category import BeaconConceptCategory
from tkbeacon.models.beacon_concept_detail import BeaconConceptDetail
from tkbeacon.models.beacon_concept_with_details import BeaconConceptWithDetails
from tkbeacon.models.beacon_knowledge_map_object import BeaconKnowledgeMapObject
from tkbeacon.models.beacon_knowledge_map_predicate import BeaconKnowledgeMapPredicate
from tkbeacon.models.beacon_knowledge_map_statement import BeaconKnowledgeMapStatement
from tkbeacon.models.beacon_knowledge_map_subject import BeaconKnowledgeMapSubject
from tkbeacon.models.beacon_predicate import BeaconPredicate
from tkbeacon.models.beacon_statement import BeaconStatement
from tkbeacon.models.beacon_statement_annotation import BeaconStatementAnnotation
from tkbeacon.models.beacon_statement_citation import BeaconStatementCitation
from tkbeacon.models.beacon_statement_object import BeaconStatementObject
from tkbeacon.models.beacon_statement_predicate import BeaconStatementPredicate
from tkbeacon.models.beacon_statement_subject import BeaconStatementSubject
from tkbeacon.models.beacon_statement_with_details import BeaconStatementWithDetails
from tkbeacon.models.exact_match_response import ExactMatchResponse
| 48.257143 | 111 | 0.87685 | 191 | 1,689 | 7.534031 | 0.39267 | 0.141765 | 0.212648 | 0.266852 | 0.346769 | 0.100069 | 0 | 0 | 0 | 0 | 0 | 0.005181 | 0.08585 | 1,689 | 34 | 112 | 49.676471 | 0.926813 | 0.193606 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
3def634b8d6292f17830d4a74d786d6aa89484aa | 44 | py | Python | new1.py | luket4/cs3240-labdemo | c59a92ae6808ec3e0de923e8a99784f8c8f18783 | [
"MIT"
] | null | null | null | new1.py | luket4/cs3240-labdemo | c59a92ae6808ec3e0de923e8a99784f8c8f18783 | [
"MIT"
] | null | null | null | new1.py | luket4/cs3240-labdemo | c59a92ae6808ec3e0de923e8a99784f8c8f18783 | [
"MIT"
] | null | null | null | from helper import *
greeting("my name is..")
| 14.666667 | 22 | 0.727273 | 7 | 44 | 4.571429 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.159091 | 44 | 2 | 23 | 22 | 0.864865 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.5 | null | null | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
9a81b36d90394514756e9a0f8de17caa44c6deeb | 108 | py | Python | django_facebook/exceptions.py | duointeractive/Django-facebook | 07bf5b30d721c72bce58b4c89c48515412f080d0 | [
"BSD-3-Clause"
] | 24 | 2016-08-06T18:10:54.000Z | 2022-03-04T11:47:39.000Z | django_facebook/exceptions.py | sirmmo/Django-facebook | 07bf5b30d721c72bce58b4c89c48515412f080d0 | [
"BSD-3-Clause"
] | 1 | 2017-03-28T02:36:50.000Z | 2017-03-28T07:18:57.000Z | django_facebook/exceptions.py | sirmmo/Django-facebook | 07bf5b30d721c72bce58b4c89c48515412f080d0 | [
"BSD-3-Clause"
] | 13 | 2017-03-28T02:35:32.000Z | 2022-02-21T23:36:15.000Z |
class FacebookException(Exception):
pass
class IncompleteProfileError(FacebookException):
pass
| 10.8 | 48 | 0.768519 | 8 | 108 | 10.375 | 0.625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.175926 | 108 | 9 | 49 | 12 | 0.932584 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.5 | 0 | 0 | 0.5 | 0 | 1 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
9a8e0b6f139b8b8e5133d1d0b9a63a6494432a71 | 47,876 | py | Python | SDMtoolbox_2_10_1to3/Scripts/Rarefy.py | GO-Loc-GO/MCM-ICM | d69cfc42c9204425c4e4dd39df461588cce5482d | [
"MIT"
] | null | null | null | SDMtoolbox_2_10_1to3/Scripts/Rarefy.py | GO-Loc-GO/MCM-ICM | d69cfc42c9204425c4e4dd39df461588cce5482d | [
"MIT"
] | null | null | null | SDMtoolbox_2_10_1to3/Scripts/Rarefy.py | GO-Loc-GO/MCM-ICM | d69cfc42c9204425c4e4dd39df461588cce5482d | [
"MIT"
] | null | null | null | # Description:Rarefy
# Requirements: Spatial Analyst
# Author: Jason Brown
# 4/03/2014 v2
# Import system modules
import arcpy, sys, string, os, csv, glob, arcgisscripting
gp = arcgisscripting.create()
arcpy.env.overwriteOutput = True
def str2bool(pstr):
"""Convert ESRI boolean string to Python boolean type"""
return pstr == 'true'
#set workspace NOTE: all files must be in this directory
infile = sys.argv[1]
speciesField=sys.argv[2]
latitudeField=sys.argv[3]
longitudField=sys.argv[4]
outFolder = sys.argv[5]
outName=sys.argv[6]
resolution=sys.argv[7]
inProject = sys.argv[8]
inputBoolean = str2bool(sys.argv[9])
inHetero = sys.argv[10]
nClasses = sys.argv[11]
ClassType= sys.argv[12]
maxDist=sys.argv[13]
minDist=sys.argv[14]
arcpy.CreateFolder_management(outFolder,"TEMP_ilUli")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUlii")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUliii")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUliiii")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUliiiii")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUlvi")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUlvii")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUlviii")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUlviiii")
arcpy.CreateFolder_management(outFolder,"TEMP_ilUlxx")
outFolderTemp= outFolder + "/TEMP_ilUli"
outFolderTemp2= outFolder + "/TEMP_ilUlii"
outFolderTemp3= outFolder + "/TEMP_ilUliii"
outFolderTemp4= outFolder + "/TEMP_ilUliiii"
outFolderTemp5= outFolder + "/TEMP_ilUliiiii"
outFolderTemp6 = outFolder + "/TEMP_ilUlvi"
outFolderTemp7 = outFolder + "/TEMP_ilUlvii"
outFolderTemp8 = outFolder + "/TEMP_ilUlviii"
outFolderTemp9 = outFolder + "/TEMP_ilUlviiii"
outFolderTemp10 = outFolder + "/TEMP_ilUlxx"
TempShp1 = outFolderTemp + "/temp1.shp"
TempShp2 = outFolderTemp + "/temp2.shp"
#infile = sys.argv[1]
inSHP = str(infile).replace("\\","/")
SHPName1 = os.path.split(inSHP)[1]
SHPName = str(SHPName1)[:-4]
transformationOut ='#'
project1="PROJCS['Africa_Equidistant_Conic',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',25.0],PARAMETER['Standard_Parallel_1',20.0],PARAMETER['Standard_Parallel_2',-23.0],PARAMETER['Latitude_Of_Origin',0.0],UNIT['Meter',1.0]]"
project2="PROJCS['Asia_North_Equidistant_Conic',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',95.0],PARAMETER['Standard_Parallel_1',15.0],PARAMETER['Standard_Parallel_2',65.0],PARAMETER['Latitude_Of_Origin',30.0],UNIT['Meter',1.0]]"
project3="PROJCS['Asia_South_Equidistant_Conic',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',125.0],PARAMETER['Standard_Parallel_1',7.0],PARAMETER['Standard_Parallel_2',-32.0],PARAMETER['Latitude_Of_Origin',-15.0],UNIT['Meter',1.0]]"
project4="PROJCS['Europe_Equidistant_Conic',GEOGCS['GCS_European_1950',DATUM['D_European_1950',SPHEROID['International_1924',6378388.0,297.0]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',10.0],PARAMETER['Standard_Parallel_1',43.0],PARAMETER['Standard_Parallel_2',62.0],PARAMETER['Latitude_Of_Origin',30.0],UNIT['Meter',1.0]]"
transformation4="ED_1950_To_WGS_1984_NTv2_Catalonia"
project5="PROJCS['North_America_Equidistant_Conic',GEOGCS['GCS_North_American_1983',DATUM['D_North_American_1983',SPHEROID['GRS_1980',6378137.0,298.257222101]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',-96.0],PARAMETER['Standard_Parallel_1',20.0],PARAMETER['Standard_Parallel_2',60.0],PARAMETER['Latitude_Of_Origin',40.0],UNIT['Meter',1.0]]"
transformation5="WGS_1984_(ITRF00)_To_NAD_1983"
project6="PROJCS['USA_Contiguous_Equidistant_Conic',GEOGCS['GCS_North_American_1983',DATUM['D_North_American_1983',SPHEROID['GRS_1980',6378137.0,298.257222101]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',-96.0],PARAMETER['Standard_Parallel_1',33.0],PARAMETER['Standard_Parallel_2',45.0],PARAMETER['Latitude_Of_Origin',39.0],UNIT['Meter',1.0]]"
transformation6="WGS_1984_(ITRF00)_To_NAD_1983"
project7="PROJCS['South_America_Equidistant_Conic',GEOGCS['GCS_South_American_1969',DATUM['D_South_American_1969',SPHEROID['GRS_1967_Truncated',6378160.0,298.25]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',-60.0],PARAMETER['Standard_Parallel_1',-5.0],PARAMETER['Standard_Parallel_2',-42.0],PARAMETER['Latitude_Of_Origin',-32.0],UNIT['Meter',1.0]]"
transformation7="SAD_1969_To_WGS_1984_15"
project8="PROJCS['World_Azimuthal_Equidistant',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Azimuthal_Equidistant'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',0.0],PARAMETER['Latitude_Of_Origin',0.0],UNIT['Meter',1.0]]"
project9="PROJCS['World_Equidistant_Conic',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Equidistant_Conic'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',0.0],PARAMETER['Standard_Parallel_1',60.0],PARAMETER['Standard_Parallel_2',60.0],PARAMETER['Latitude_Of_Origin',0.0],UNIT['Meter',1.0]]"
project10="PROJCS['World_Plate_Carree',GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Plate_Carree'],PARAMETER['False_Easting',0.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',0.0],UNIT['Meter',1.0]]"
if inProject == "World: Azimuthal Equidistant":
ProjectOut=project8
if inProject == "World: Equidistant Conic":
ProjectOut=project9
if inProject == "World: Plate Carree":
ProjectOut=project10
if inProject == "Continent: Africa Equidistant Conic":
ProjectOut=project1
if inProject == "Continent: Asia North Equidistant Conic":
ProjectOut=project2
if inProject == "Continent: Asia South Equidistant Conic":
ProjectOut=project3
if inProject == "Continent: Europe Equidistant Conic":
ProjectOut=project4
transformationOut = transformation4
if inProject == "Continent: North America Equidistant Conic":
ProjectOut=project5
transformationOut = transformation5
if inProject == "Continent: USA Contiguous Equidistant Conic":
ProjectOut=project6
transformationOut = transformation6
if inProject == "Continent: South America Equidistant Conic":
ProjectOut=project7
transformationOut = transformation7
#step1 copy in file, then project to equidistant projection
outPoints=outFolder+"/"+outName+"_temp.shp"
outPoints2=outFolder+"/"+outName+"_temp2.shp"
outPointsF=outFolder+"/"+outName+"_spatially_rarified_locs.shp"
gp.AddMessage(" ")
gp.AddMessage("***Projecting "+infile+" to an Equidistant Projection")
gp.AddMessage(" ")
#arcpy.CopyFeatures_management(infile,outPoints)
arcpy.Project_management(infile,outPoints,ProjectOut,transformationOut,"#")
latitudeField=sys.argv[3]
longitudField=sys.argv[4]
#step2 delete spatial duplications
#count all points in input
totPnts1= arcpy.GetCount_management(outPoints)
totPntsN1= int(str(totPnts1))
gp.addmessage(str(totPntsN1)+ " points input")
gp.AddMessage(" ")
#setting up inputs
speciesField1=str(speciesField)
latitudeField1=str(latitudeField)
longitudField1=str(longitudField)
groupSLL=str(speciesField1)+";"+str(latitudeField1)+";"+str(longitudField1)
gp.AddMessage("***Step 1:Remove spatially redundant occurrence localities for each species")
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(outPoints,groupSLL,"#","0")
#count number of duplicate points
totPnts2= arcpy.GetCount_management(outPoints)
totPntsN2= int(str(totPnts2))
diffnumVal2= int(totPntsN1-totPntsN2)
gp.addmessage(str(diffnumVal2) + " duplicates removed (of "+ str(totPntsN1)+" points)")
gp.AddMessage(" ")
gp.addmessage(str(totPntsN2) + " points remain")
gp.AddMessage(" ")
nClassesIN=(int(nClasses))
if nClassesIN > 5:
    gp.AddMessage("Too many classes, reduce to between 2-5")
    del gp
    sys.exit(1)  # abort: the tool supports at most 5 heterogeneity classes
#step3 spatial rarefy---SINGLE CLASS
if inputBoolean== False:
groupSP="Shape;"+str(speciesField)
resolutionS=str(resolution)
gp.AddMessage("***Step 2: Spatially rarefying occurrence data at "+resolutionS)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(outPoints,groupSP,resolutionS,"0")
#count number of duplicate points
totPnts3= arcpy.GetCount_management(outPoints)
totPntsN3= int(str(totPnts3))
diffnumVal3= int(totPntsN2-totPntsN3)
gp.addmessage(str(diffnumVal3) + " spatially autocorrelated points removed (of "+ str(totPntsN2)+" points)")
gp.AddMessage(" ")
gp.addmessage("Final dataset includes "+str(totPntsN3)+" unique occurrence points")
gp.AddMessage(" ")
    ##project back into WGS 1984
gp.AddMessage("Projecting shapefile back to original WGS 1984 geographic projection")
gp.AddMessage(" ")
arcpy.Project_management(outPoints,outPointsF,"GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]]","#","#")
arcpy.Delete_management(outPoints)
    gp.AddMessage("FINISHED SUCCESSFULLY")
#step3 spatial rarefy---GRADUATED CLASSES:5CLASSES
if inputBoolean== True and nClassesIN == 5:
emptyShape=outFolderTemp+"/empty.shp"
arcpy.Copy_management(outPoints,emptyShape)
arcpy.DeleteRows_management(emptyShape)
arcpy.gp.Slice_sa(inHetero,outFolder+"/spatial_groups.tif",nClassesIN,ClassType,"1")
arcpy.gp.ExtractValuesToPoints_sa(outPoints,outFolder+"/spatial_groups.tif",outFolderTemp+"/OutPtsEall.shp","NONE","VALUE_ONLY")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP", "[RASTERVALU]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","RASTERVALU")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "RASTERVALU","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp", "RASTERVALU", "[TEMP]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","TEMP")
###CALCULATE_RAREFY_DISTANCES
arcpy.MakeFeatureLayer_management(outFolderTemp+"/OutPtsEall.shp", "OutPtsEall_View")
arcpy.SelectLayerByAttribute_management("OutPtsEall_View","NEW_SELECTION", '"RASTERVALU" =-9999')
arcpy.CalculateField_management("OutPtsEall_View", field="RASTERVALU", expression="1", expression_type="VB", code_block="")
def get_num(x):
return float(''.join(ele for ele in x if ele.isdigit() or ele == '.'))
def get_text(x):
return str(''.join(ele for ele in x if ele.isalpha() or ele == ' '))
inMax = get_num(maxDist)
inMin = get_num(minDist)
inUnits = get_text(minDist)
inMaxV = float(inMax)
inMinV = float(inMin)
interVal=(inMaxV-inMinV)/(nClassesIN-1)
Dcm1=inMinV+interVal
Dcm2=Dcm1+interVal
Dcm3=Dcm2+interVal
Dcm1T=str(Dcm1)+inUnits
Dcm2T=str(Dcm2)+inUnits
Dcm3T=str(Dcm3)+inUnits
####TEXTOUT
groupSP="Shape;"+str(speciesField)
resolutionS=str(resolution)
gp.AddMessage("***Step 2. Phase 1: Spatially rarefying all occurrence data at "+minDist)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(outFolderTemp+"/OutPtsEall.shp",groupSP,minDist,"0")
#count number of duplicate points
totPnts3= arcpy.GetCount_management(outFolderTemp+"/OutPtsEall.shp")
totPntsN3= int(str(totPnts3))
diffnumVal3= int(totPntsN2-totPntsN3)
gp.addmessage(str(diffnumVal3) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE1
from Scripts import Rarefy1####VERSION1--SPLIT BASED ON RASTER VALUE!!! EXPORT TO TEMPFOLD2
E0A= outFolderTemp2+"/0.shp"
E1A= outFolderTemp2+"/1.shp"
E2A= outFolderTemp2+"/2.shp"
E3A= outFolderTemp2+"/3.shp"
E4A= outFolderTemp2+"/4.shp"
E5A= outFolderTemp2+"/5.shp"
IN_EA=str(E1A+";"+E2A+";"+E3A+";"+E4A)
EM1_4= outFolderTemp2+"/Merge1.shp"
#count number of duplicate points
if arcpy.Exists(E0A):
MISdataPN=arcpy.GetCount_management(E0A)
MISdataPA=str(MISdataPN)
MISdataP=int(MISdataPA)
else:
MISdataP=0
arcpy.Copy_management(emptyShape,E0A)
if arcpy.Exists(E1A):
V=1
else:
arcpy.Copy_management(emptyShape,E1A)
if arcpy.Exists(E2A):
V=1
else:
arcpy.Copy_management(emptyShape,E2A)
if arcpy.Exists(E3A):
V=1
else:
arcpy.Copy_management(emptyShape,E3A)
if arcpy.Exists(E4A):
V=1
else:
arcpy.Copy_management(emptyShape,E4A)
if arcpy.Exists(E5A):
group5Pnts=arcpy.GetCount_management(E5A)
group5PntsNA=str(group5Pnts)
group5PntsN=int(group5PntsNA)
else:
group5PntsN=0
arcpy.Copy_management(emptyShape,E5A)
arcpy.Merge_management(IN_EA,EM1_4,"#")
####TEXTOUT_PHASE2
gp.AddMessage("***Step 2. Phase 2: Spatially rarefying occurrence data in groups 1-4 at "+Dcm1T)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(EM1_4,groupSP,Dcm1T,"0")
totPnts4= arcpy.GetCount_management(EM1_4)
totPntsN4= int(str(totPnts4))
diffnumVal4= int(totPntsN3-totPntsN4-group5PntsN-MISdataP)
gp.addmessage(str(diffnumVal4) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE2
from Scripts import Rarefy2
E1B= outFolderTemp3+"/1.shp"
E2B= outFolderTemp3+"/2.shp"
E3B= outFolderTemp3+"/3.shp"
E4B= outFolderTemp3+"/4.shp"
EM1_3= outFolderTemp3+"/Merge2.shp"
IN_EB=str(E1B+";"+E2B+";"+E3B)
#count number of duplicate points
if arcpy.Exists(E1B):
V=1
else:
arcpy.Copy_management(emptyShape,E1B)
if arcpy.Exists(E2B):
V=1
else:
arcpy.Copy_management(emptyShape,E2B)
if arcpy.Exists(E3B):
V=1
else:
arcpy.Copy_management(emptyShape,E3B)
if arcpy.Exists(E4B):
group4Pnts=arcpy.GetCount_management(E4B)
group4PntsNA=str(group4Pnts)
group4PntsN=int(group4PntsNA)
else:
group4PntsN=0
arcpy.Copy_management(emptyShape,E4B)
arcpy.Merge_management(IN_EB,EM1_3,"#")
####TEXTOUT_PHASE2
gp.AddMessage("***Step 2. Phase 3: Spatially rarefying occurrence data in groups 1-3 at "+Dcm2T)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(EM1_3,groupSP,Dcm2T,"0")
totPnts5= arcpy.GetCount_management(EM1_3)
totPntsN5= int(str(totPnts5))
diffnumVal5= int(totPntsN4-totPntsN5-group4PntsN)
gp.addmessage(str(diffnumVal5) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE4
from Scripts import Rarefy3####VERSION1--SPLIT BASED ON RASTER VALUE!!! EXPORT TO TEMPFOLD4
E1C= outFolderTemp4+"/1.shp"
E2C= outFolderTemp4+"/2.shp"
E3C= outFolderTemp4+"/3.shp"
EM1_2= outFolderTemp4+"/Merge3.shp"
IN_EC=str(E1C+";"+E2C)
#count number of duplicate points
if arcpy.Exists(E1C):
V=1
else:
arcpy.Copy_management(emptyShape,E1C)
if arcpy.Exists(E2C):
V=1
else:
arcpy.Copy_management(emptyShape,E2C)
if arcpy.Exists(E3C):
group5Pnts=arcpy.GetCount_management(E3C)
group5PntsNA=str(group5Pnts)
group5PntsN=int(group5PntsNA)
else:
group5PntsN=0
arcpy.Copy_management(emptyShape,E3C)
arcpy.Merge_management(IN_EC,EM1_2,"#")
####TEXTOUT_PHASE2
gp.AddMessage("***Step 2. Phase 4: Spatially rarefying occurrence data in groups 1-2 at "+Dcm3T)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(EM1_2,groupSP,Dcm3T,"0")
totPnts6= arcpy.GetCount_management(EM1_2)
totPntsN6= int(str(totPnts6))
diffnumVal6= int(totPntsN5-totPntsN6-group5PntsN)
gp.addmessage(str(diffnumVal6) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE5
from Scripts import Rarefy4####VERSION1--SPLIT BASED ON RASTER VALUE!!! EXPORT TO TEMPFOLD5
E1D= outFolderTemp5+"/1.shp"
E2D= outFolderTemp5+"/2.shp"
####TEXTOUT_PHASE2
gp.AddMessage("***Step 2. Phase 5: Spatially rarefying occurrence data in group 1 at "+maxDist)
gp.AddMessage(" ")
if arcpy.Exists(E1D):
totPnts7a=arcpy.GetCount_management(E1D)
totPntsN7aT=str(totPnts7a)
totPntsN7a=int(totPntsN7aT)
else:
totPntsN7a=0
arcpy.Copy_management(emptyShape,E1D)
if arcpy.Exists(E2D):
v=1
else:
arcpy.Copy_management(emptyShape,E2D)
arcpy.DeleteIdentical_management(E1D,groupSP,maxDist,"0")
#count number of duplicate points
totPntsN7b=0
if arcpy.Exists(E1D):
totPnts7b=arcpy.GetCount_management(E1D)
totPntsN7bT=str(totPnts7b)
totPntsN7b=int(totPntsN7bT)
diffnumVal7= int(totPntsN7a-totPntsN7b)
gp.addmessage(str(diffnumVal7) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
###MERGE_ALL_DATSETS
IN_ED=str(E5A+";"+E4B+";"+E3C+";"+E2D+";"+E1D)
arcpy.Merge_management(IN_ED,outPoints2,"#")
totPnts8= arcpy.GetCount_management(outPoints2)
totPntsN8= int(str(totPnts8))
##project back into WGS1984
gp.AddMessage("Projecting shapefile back to original WGS 1984 geographic projection")
gp.AddMessage(" ")
arcpy.Project_management(outPoints2,outPointsF,"GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]]","#","#")
arcpy.Delete_management(outPoints)
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
    gp.AddMessage("FINISHED SUCCESSFULLY")
gp.addmessage("Starting number of localities: "+str(totPntsN1))
gp.addmessage("Spatially redundant occurrence localities removed: "+str(diffnumVal2))
gp.addmessage("The following spatially autocorrelated localities")
gp.addmessage("were hierarchically removed, as follows:")
gp.addmessage("Phase 1: "+str(diffnumVal3) + " points removed from groups 1-5 at "+ minDist)
gp.addmessage("Phase 2: "+str(diffnumVal4) + " points removed from groups 1-4 at "+ Dcm1T)
gp.addmessage("Phase 3: "+str(diffnumVal5) + " points removed from groups 1-3 at "+ Dcm2T)
gp.addmessage("Phase 4: "+str(diffnumVal6) + " points removed from groups 1-2 at "+ Dcm3T)
gp.addmessage("Phase 5: "+str(diffnumVal7) + " points removed from group 1 at "+ maxDist)
gp.AddMessage(" ")
gp.addmessage("Final dataset includes "+str(totPntsN8)+" unique occurrence points")
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
if MISdataP != 0:
#output csv
gp.AddMessage("WARNING!: "+str(MISdataPN)+" localities fell outside of "+str(inHetero)+ " and were not included in spatial rarefying")
gp.AddMessage("****************************************")
gp.AddMessage("See file:"+str(outFolder) + "/" + str(outName) + "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv for localities")
CSVFile = outFolder + "/" + outName+ "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv"
    # exclude geometry and ID fields from the CSV export
    # (<> is a Python 2 operator; deleting from a list while enumerating it skips entries)
    fieldnames = [f.name for f in arcpy.ListFields(E0A)
                  if f.type != 'Geometry' and f.name not in ('Shape', 'FID', 'OBJECTID')]
with open(CSVFile, 'w') as f:
f.write(','.join(fieldnames)+'\n') #csv headers
with arcpy.da.SearchCursor(E0A, fieldnames) as cursor:
for row in cursor:
f.write(','.join([str(r) for r in row])+'\n')
#step3 spatial rarefy---GRADUATED CLASSES:4CLASSES
if inputBoolean== True and nClassesIN == 4:
emptyShape=outFolderTemp+"/empty.shp"
arcpy.Copy_management(outPoints,emptyShape)
arcpy.DeleteRows_management(emptyShape)
arcpy.gp.Slice_sa(inHetero,outFolder+"/spatial_groups.tif",nClassesIN,ClassType,"1")
arcpy.gp.ExtractValuesToPoints_sa(outPoints,outFolder+"/spatial_groups.tif",outFolderTemp+"/OutPtsEall.shp","NONE","VALUE_ONLY")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP", "[RASTERVALU]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","RASTERVALU")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "RASTERVALU","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp", "RASTERVALU", "[TEMP]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","TEMP")
arcpy.MakeFeatureLayer_management(outFolderTemp+"/OutPtsEall.shp", "OutPtsEall_View")
arcpy.SelectLayerByAttribute_management("OutPtsEall_View","NEW_SELECTION", '"RASTERVALU" =-9999')
arcpy.CalculateField_management("OutPtsEall_View", field="RASTERVALU", expression="1", expression_type="VB", code_block="")
###CALCULATE_RAREFY_DISTANCES
def get_num(x):
return float(''.join(ele for ele in x if ele.isdigit() or ele == '.'))
def get_text(x):
return str(''.join(ele for ele in x if ele.isalpha() or ele == ' '))
inMax = get_num(maxDist)
inMin = get_num(minDist)
inUnits = get_text(minDist)
inMaxV = float(inMax)
inMinV = float(inMin)
interVal=(inMaxV-inMinV)/(nClassesIN-1)
Dcm1=inMinV+interVal
Dcm2=Dcm1+interVal
Dcm1T=str(Dcm1)+inUnits
Dcm2T=str(Dcm2)+inUnits
####TEXTOUT
groupSP="Shape;"+str(speciesField)
resolutionS=str(resolution)
gp.AddMessage("***Step 2. Phase 1: Spatially rarefying all occurrence data at "+minDist)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(outFolderTemp+"/OutPtsEall.shp",groupSP,minDist,"0")
#count number of duplicate points
totPnts3= arcpy.GetCount_management(outFolderTemp+"/OutPtsEall.shp")
totPntsN3= int(str(totPnts3))
diffnumVal3= int(totPntsN2-totPntsN3)
gp.addmessage(str(diffnumVal3) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE1
from Scripts import Rarefy1####VERSION1--SPLIT BASED ON RASTER VALUE!!! EXPORT TO TEMPFOLD2
E0A= outFolderTemp2+"/0.shp"
E1A= outFolderTemp2+"/1.shp"
E2A= outFolderTemp2+"/2.shp"
E3A= outFolderTemp2+"/3.shp"
E4A= outFolderTemp2+"/4.shp"
IN_EA=str(E1A+";"+E2A+";"+E3A)
EM1_3= outFolderTemp2+"/Merge1.shp"
#count number of duplicate points
if arcpy.Exists(E0A):
    MISdataPN=arcpy.GetCount_management(E0A)
    MISdataP=int(str(MISdataPN))
else:
    MISdataP=0
    arcpy.Copy_management(emptyShape,E0A)
if not arcpy.Exists(E1A):
    arcpy.Copy_management(emptyShape,E1A)
if not arcpy.Exists(E2A):
    arcpy.Copy_management(emptyShape,E2A)
if not arcpy.Exists(E3A):
    arcpy.Copy_management(emptyShape,E3A)
if arcpy.Exists(E4A):
    group5Pnts=arcpy.GetCount_management(E4A)
    group5PntsN=int(str(group5Pnts))
else:
    group5PntsN=0
    arcpy.Copy_management(emptyShape,E4A)
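The repeated Exists/Copy checks above guard against heterogeneity classes that produced no points by substituting an empty placeholder shapefile. A hedged sketch of one way to collapse that pattern follows; `exists` and `copy` are injectable stand-ins for `arcpy.Exists` and `arcpy.Copy_management` (assumptions, so the sketch runs without an ArcGIS session), and `ensure_shapefile` is a hypothetical helper name.

```python
def ensure_shapefile(path, empty_shape, exists, copy):
    """Create an empty placeholder shapefile when a class produced no
    points; return True if the shapefile already existed."""
    if exists(path):
        return True
    copy(empty_shape, path)
    return False
```

Calling it once per class shapefile replaces each dummy `V=1` branch with a single line.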
arcpy.Merge_management(IN_EA,EM1_3,"#")
####TEXTOUT_PHASE2
gp.AddMessage("***Step 2. Phase 2: Spatially rarefying occurrence data in groups 1-3 at "+Dcm1T)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(EM1_3,groupSP,Dcm1T,"0")
totPnts4= arcpy.GetCount_management(EM1_3)
totPntsN4= int(str(totPnts4))
diffnumVal4= int(totPntsN3-totPntsN4-group5PntsN-MISdataP)
gp.addmessage(str(diffnumVal4) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE2
from Scripts import Rarefy2
E1B= outFolderTemp3+"/1.shp"
E2B= outFolderTemp3+"/2.shp"
E3B= outFolderTemp3+"/3.shp"
EM1_2= outFolderTemp3+"/Merge2.shp"
IN_EB=str(E1B+";"+E2B)
#count number of duplicate points
if not arcpy.Exists(E1B):
    arcpy.Copy_management(emptyShape,E1B)
if not arcpy.Exists(E2B):
    arcpy.Copy_management(emptyShape,E2B)
if arcpy.Exists(E3B):
    group4Pnts=arcpy.GetCount_management(E3B)
    group4PntsN=int(str(group4Pnts))
else:
    group4PntsN=0
    arcpy.Copy_management(emptyShape,E3B)
arcpy.Merge_management(IN_EB,EM1_2,"#")
####TEXTOUT_PHASE3
gp.AddMessage("***Step 2. Phase 3: Spatially rarefying occurrence data in groups 1-2 at "+Dcm2T)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(EM1_2,groupSP,Dcm2T,"0")
totPnts5= arcpy.GetCount_management(EM1_2)
totPntsN5= int(str(totPnts5))
diffnumVal5= int(totPntsN4-totPntsN5-group4PntsN)
gp.addmessage(str(diffnumVal5) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE4
from Scripts import Rarefy3####VERSION1--SPLIT BASED ON RASTER VALUE!!! EXPORT TO TEMPFOLD4
E1C= outFolderTemp4+"/1.shp"
E2C= outFolderTemp4+"/2.shp"
#count number of duplicate points
if not arcpy.Exists(E1C):
    arcpy.Copy_management(emptyShape,E1C)
if arcpy.Exists(E2C):
    group5Pnts=arcpy.GetCount_management(E2C)
    group5PntsN=int(str(group5Pnts))
else:
    group5PntsN=0
    arcpy.Copy_management(emptyShape,E2C)
####TEXTOUT_PHASE4
gp.AddMessage("***Step 2. Phase 4: Spatially rarefying occurrence data in group 1 at "+str(maxDist))
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(E1C,groupSP,maxDist,"0")
totPnts6= arcpy.GetCount_management(E1C)
totPntsN6= int(str(totPnts6))
diffnumVal6= int(totPntsN5-totPntsN6-group5PntsN)
gp.addmessage(str(diffnumVal6) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
###MERGE_ALL_DATASETS
IN_ED=str(E4A+";"+E3B+";"+E2C+";"+E1C)
arcpy.Merge_management(IN_ED,outPoints2,"#")
totPnts8= arcpy.GetCount_management(outPoints2)
totPntsN8= int(str(totPnts8))
##project back into WGS1984
gp.AddMessage("Projecting shapefile back to original WGS 1984 geographic projection")
gp.AddMessage(" ")
arcpy.Project_management(outPoints2,outPointsF,"GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]]","#","#")
arcpy.Delete_management(outPoints)
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
gp.AddMessage("FINISHED SUCCESSFULLY")
gp.addmessage("Starting number of localities: "+str(totPntsN1))
gp.addmessage("Spatially redundant occurrence localities removed: "+str(diffnumVal2))
gp.addmessage("The following spatially autocorrelated localities")
gp.addmessage("were hierarchically removed, as follows:")
gp.addmessage("Phase 1: "+str(diffnumVal3) + " points removed from groups 1-4 at "+ minDist)
gp.addmessage("Phase 2: "+str(diffnumVal4) + " points removed from groups 1-3 at "+ Dcm1T)
gp.addmessage("Phase 3: "+str(diffnumVal5) + " points removed from groups 1-2 at "+ Dcm2T)
gp.addmessage("Phase 4: "+str(diffnumVal6) + " points removed from group 1 at "+ str(maxDist))
gp.AddMessage(" ")
gp.addmessage("Final dataset includes "+str(totPntsN8)+" unique occurrence points")
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
if MISdataP != 0:
    #output csv
    gp.AddMessage("WARNING!: "+str(MISdataPN)+" localities fell outside of "+str(inHetero)+ " and were not included in spatial rarefying")
    gp.AddMessage("****************************************")
    gp.AddMessage("See file: "+str(outFolder) + "/" + str(outName) + "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv for localities")
    CSVFile = outFolder + "/" + outName + "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv"
    fieldnames = [f.name for f in arcpy.ListFields(E0A) if f.type != 'Geometry']
    # keep attribute fields only; deleting entries while iterating would skip fields
    fieldnames = [f for f in fieldnames if f not in ('Shape', 'FID', 'OBJECTID')]
    with open(CSVFile, 'w') as f:
        f.write(','.join(fieldnames)+'\n') #csv headers
        with arcpy.da.SearchCursor(E0A, fieldnames) as cursor:
            for row in cursor:
                f.write(','.join([str(r) for r in row])+'\n')
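The attribute-to-CSV export pattern used above (skip geometry and ID bookkeeping fields, then write a header plus one line per feature) can be sketched with the standard csv module. This is a hedged, arcpy-free sketch: the tuples stand in for rows that `arcpy.da.SearchCursor` would yield, and `filter_fields`/`export_rows` are hypothetical helper names.

```python
import csv, io

SKIP_FIELDS = {'Shape', 'FID', 'OBJECTID'}

def filter_fields(fieldnames):
    # indices of the attribute columns worth exporting
    return [i for i, name in enumerate(fieldnames) if name not in SKIP_FIELDS]

def export_rows(handle, fieldnames, rows):
    # write a header plus one CSV line per feature row
    keep = filter_fields(fieldnames)
    writer = csv.writer(handle)
    writer.writerow([fieldnames[i] for i in keep])
    for row in rows:
        writer.writerow([row[i] for i in keep])

buf = io.StringIO()
export_rows(buf, ['FID', 'Shape', 'SPECIES', 'RASTERVALU'],
            [(0, None, 'A. foo', 3), (1, None, 'A. bar', 1)])
```

Using `csv.writer` also handles quoting of field values that contain commas, which the manual `','.join` approach does not.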
#################################################
#step3 spatial rarefy---GRADUATED CLASSES:3CLASSES
if inputBoolean== True and nClassesIN == 3:
emptyShape=outFolderTemp+"/empty.shp"
arcpy.Copy_management(outPoints,emptyShape)
arcpy.DeleteRows_management(emptyShape)
arcpy.gp.Slice_sa(inHetero,outFolder+"/spatial_groups.tif",nClassesIN,ClassType,"1")
arcpy.gp.ExtractValuesToPoints_sa(outPoints,outFolder+"/spatial_groups.tif",outFolderTemp+"/OutPtsEall.shp","NONE","VALUE_ONLY")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP", "[RASTERVALU]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","RASTERVALU")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "RASTERVALU","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp", "RASTERVALU", "[TEMP]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","TEMP")
arcpy.MakeFeatureLayer_management(outFolderTemp+"/OutPtsEall.shp", "OutPtsEall_View")
arcpy.SelectLayerByAttribute_management("OutPtsEall_View","NEW_SELECTION", '"RASTERVALU" =-9999')
arcpy.CalculateField_management("OutPtsEall_View", field="RASTERVALU", expression="1", expression_type="VB", code_block="")
###CALCULATE_RAREFY_DISTANCES
def get_num(x):
    return float(''.join(ele for ele in x if ele.isdigit() or ele == '.'))
def get_text(x):
    return str(''.join(ele for ele in x if ele.isalpha() or ele == ' '))
inMax = get_num(maxDist)
inMin = get_num(minDist)
inUnits = get_text(minDist)
inMaxV = float(inMax)
inMinV = float(inMin)
interVal=(inMaxV-inMinV)/(nClassesIN-1)
Dcm1=inMinV+interVal
Dcm1T=str(Dcm1)+inUnits
####TEXTOUT
groupSP="Shape;"+str(speciesField)
resolutionS=str(resolution)
gp.AddMessage("***Step 2. Phase 1: Spatially rarefying all occurrence data at "+minDist)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(outFolderTemp+"/OutPtsEall.shp",groupSP,minDist,"0")
#count number of duplicate points
totPnts3= arcpy.GetCount_management(outFolderTemp+"/OutPtsEall.shp")
totPntsN3= int(str(totPnts3))
diffnumVal3= int(totPntsN2-totPntsN3)
gp.addmessage(str(diffnumVal3) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE1
from Scripts import Rarefy1####VERSION1--SPLIT BASED ON RASTER VALUE!!! EXPORT TO TEMPFOLD2
E0A= outFolderTemp2+"/0.shp"
E1A= outFolderTemp2+"/1.shp"
E2A= outFolderTemp2+"/2.shp"
E3A= outFolderTemp2+"/3.shp"
IN_EA=str(E1A+";"+E2A)
EM1_2= outFolderTemp2+"/Merge1.shp"
#count number of duplicate points
if arcpy.Exists(E0A):
    MISdataPN=arcpy.GetCount_management(E0A)
    MISdataP=int(str(MISdataPN))
else:
    MISdataP=0
    arcpy.Copy_management(emptyShape,E0A)
if not arcpy.Exists(E1A):
    arcpy.Copy_management(emptyShape,E1A)
if not arcpy.Exists(E2A):
    arcpy.Copy_management(emptyShape,E2A)
if arcpy.Exists(E3A):
    group5Pnts=arcpy.GetCount_management(E3A)
    group5PntsN=int(str(group5Pnts))
else:
    group5PntsN=0
    arcpy.Copy_management(emptyShape,E3A)
arcpy.Merge_management(IN_EA,EM1_2,"#")
####TEXTOUT_PHASE2
gp.AddMessage("***Step 2. Phase 2: Spatially rarefying occurrence data in groups 1-2 at "+Dcm1T)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(EM1_2,groupSP,Dcm1T,"0")
totPnts4= arcpy.GetCount_management(EM1_2)
totPntsN4= int(str(totPnts4))
diffnumVal4= int(totPntsN3-totPntsN4-group5PntsN-MISdataP)
gp.addmessage(str(diffnumVal4) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE2
from Scripts import Rarefy2
E1B= outFolderTemp3+"/1.shp"
E2B= outFolderTemp3+"/2.shp"
EM1_2= outFolderTemp3+"/Merge2.shp"
#count number of duplicate points
if not arcpy.Exists(E1B):
    arcpy.Copy_management(emptyShape,E1B)
if arcpy.Exists(E2B):
    group4Pnts=arcpy.GetCount_management(E2B)
    group4PntsN=int(str(group4Pnts))
else:
    group4PntsN=0
    arcpy.Copy_management(emptyShape,E2B)
####TEXTOUT_PHASE3
gp.AddMessage("***Step 2. Phase 3: Spatially rarefying occurrence data in group 1 at "+str(maxDist))
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(E1B,groupSP,maxDist,"0")
totPnts5= arcpy.GetCount_management(E1B)
totPntsN5= int(str(totPnts5))
diffnumVal5= int(totPntsN4-totPntsN5-group4PntsN)
gp.addmessage(str(diffnumVal5) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
###MERGE_ALL_DATASETS
IN_ED=str(E3A+";"+E2B+";"+E1B)
arcpy.Merge_management(IN_ED,outPoints2,"#")
totPnts8= arcpy.GetCount_management(outPoints2)
totPntsN8= int(str(totPnts8))
##project back into WGS1984
gp.AddMessage("Projecting shapefile back to original WGS 1984 geographic projection")
gp.AddMessage(" ")
arcpy.Project_management(outPoints2,outPointsF,"GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]]","#","#")
arcpy.Delete_management(outPoints)
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
gp.AddMessage("FINISHED SUCCESSFULLY")
gp.addmessage("Starting number of localities: "+str(totPntsN1))
gp.addmessage("Spatially redundant occurrence localities removed: "+str(diffnumVal2))
gp.addmessage("The following spatially autocorrelated localities")
gp.addmessage("were hierarchically removed, as follows:")
gp.addmessage("Phase 1: "+str(diffnumVal3) + " points removed from groups 1-3 at "+ minDist)
gp.addmessage("Phase 2: "+str(diffnumVal4) + " points removed from groups 1-2 at "+ Dcm1T)
gp.addmessage("Phase 3: "+str(diffnumVal5) + " points removed from group 1 at "+ str(maxDist))
gp.AddMessage(" ")
gp.addmessage("Final dataset includes "+str(totPntsN8)+" unique occurrence points")
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
if MISdataP != 0:
    #output csv
    gp.AddMessage("WARNING!: "+str(MISdataPN)+" localities fell outside of "+str(inHetero)+ " and were not included in spatial rarefying")
    gp.AddMessage("****************************************")
    gp.AddMessage("See file: "+str(outFolder) + "/" + str(outName) + "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv for localities")
    CSVFile = outFolder + "/" + outName + "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv"
    fieldnames = [f.name for f in arcpy.ListFields(E0A) if f.type != 'Geometry']
    # keep attribute fields only; deleting entries while iterating would skip fields
    fieldnames = [f for f in fieldnames if f not in ('Shape', 'FID', 'OBJECTID')]
    with open(CSVFile, 'w') as f:
        f.write(','.join(fieldnames)+'\n') #csv headers
        with arcpy.da.SearchCursor(E0A, fieldnames) as cursor:
            for row in cursor:
                f.write(','.join([str(r) for r in row])+'\n')
##############################################
#step3 spatial rarefy---GRADUATED CLASSES:2CLASSES
if inputBoolean== True and nClassesIN == 2:
emptyShape=outFolderTemp+"/empty.shp"
arcpy.Copy_management(outPoints,emptyShape)
arcpy.DeleteRows_management(emptyShape)
arcpy.gp.Slice_sa(inHetero,outFolder+"/spatial_groups.tif",nClassesIN,ClassType,"1")
arcpy.gp.ExtractValuesToPoints_sa(outPoints,outFolder+"/spatial_groups.tif",outFolderTemp+"/OutPtsEall.shp","NONE","VALUE_ONLY")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp", "TEMP", "[RASTERVALU]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","RASTERVALU")
arcpy.AddField_management(outFolderTemp+"/OutPtsEall.shp", "RASTERVALU","SHORT","1","","","","")
arcpy.CalculateField_management(outFolderTemp+"/OutPtsEall.shp","RASTERVALU", "[TEMP]", "VB","#")
arcpy.DeleteField_management(outFolderTemp+"/OutPtsEall.shp","TEMP")
arcpy.MakeFeatureLayer_management(outFolderTemp+"/OutPtsEall.shp", "OutPtsEall_View")
arcpy.SelectLayerByAttribute_management("OutPtsEall_View","NEW_SELECTION", '"RASTERVALU" =-9999')
arcpy.CalculateField_management("OutPtsEall_View", field="RASTERVALU", expression="1", expression_type="VB", code_block="")
def get_num(x):
    return float(''.join(ele for ele in x if ele.isdigit() or ele == '.'))
def get_text(x):
    return str(''.join(ele for ele in x if ele.isalpha() or ele == ' '))
inMax = get_num(maxDist)
inMin = get_num(minDist)
inUnits = get_text(minDist)
inMaxV = float(inMax)
inMinV = float(inMin)
####TEXTOUT
groupSP="Shape;"+str(speciesField)
resolutionS=str(resolution)
gp.AddMessage("***Step 2. Phase 1: Spatially rarefying all occurrence data at "+minDist)
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(outFolderTemp+"/OutPtsEall.shp",groupSP,minDist,"0")
#count number of duplicate points
totPnts3= arcpy.GetCount_management(outFolderTemp+"/OutPtsEall.shp")
totPntsN3= int(str(totPnts3))
diffnumVal3= int(totPntsN2-totPntsN3)
gp.addmessage(str(diffnumVal3) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
#SPLIT_BY_CLASS_THE_RAREFY_CLASSES_HIERARCHICALLY_PHASE1
from Scripts import Rarefy1####VERSION1--SPLIT BASED ON RASTER VALUE!!! EXPORT TO TEMPFOLD2
E0A= outFolderTemp2+"/0.shp"
E1A= outFolderTemp2+"/1.shp"
E2A= outFolderTemp2+"/2.shp"
EM1_2= outFolderTemp2+"/Merge1.shp"
#count number of duplicate points
if arcpy.Exists(E0A):
    MISdataPN=arcpy.GetCount_management(E0A)
    MISdataP=int(str(MISdataPN))
else:
    MISdataP=0
    arcpy.Copy_management(emptyShape,E0A)
if not arcpy.Exists(E1A):
    arcpy.Copy_management(emptyShape,E1A)
if arcpy.Exists(E2A):
    group5Pnts=arcpy.GetCount_management(E2A)
    group5PntsN=int(str(group5Pnts))
else:
    group5PntsN=0
    arcpy.Copy_management(emptyShape,E2A)
####TEXTOUT_PHASE2
gp.AddMessage("***Step 2. Phase 2: Spatially rarefying occurrence data in group 1 at "+str(maxDist))
gp.AddMessage(" ")
arcpy.DeleteIdentical_management(E1A,groupSP,maxDist,"0")
totPnts4= arcpy.GetCount_management(E1A)
totPntsN4= int(str(totPnts4))
diffnumVal4= int(totPntsN3-totPntsN4-group5PntsN-MISdataP)
gp.addmessage(str(diffnumVal4) + " spatially autocorrelated points removed")
gp.AddMessage(" ")
###MERGE_ALL_DATASETS
IN_ED=str(E2A+";"+E1A)
arcpy.Merge_management(IN_ED,outPoints2,"#")
totPnts8= arcpy.GetCount_management(outPoints2)
totPntsN8= int(str(totPnts8))
##project back into WGS1984
gp.AddMessage("Projecting shapefile back to original WGS 1984 geographic projection")
gp.AddMessage(" ")
arcpy.Project_management(outPoints2,outPointsF,"GEOGCS['GCS_WGS_1984',DATUM['D_WGS_1984',SPHEROID['WGS_1984',6378137.0,298.257223563]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]]","#","#")
arcpy.Delete_management(outPoints)
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
gp.AddMessage("FINISHED SUCCESSFULLY")
gp.addmessage("Starting number of localities: "+str(totPntsN1))
gp.addmessage("Spatially redundant occurrence localities removed: "+str(diffnumVal2))
gp.addmessage("The following spatially autocorrelated localities")
gp.addmessage("were hierarchically removed, as follows:")
gp.addmessage("Phase 1: "+str(diffnumVal3) + " points removed from groups 1-2 at "+ minDist)
gp.addmessage("Phase 2: "+str(diffnumVal4) + " points removed from group 1 at "+str(maxDist))
gp.AddMessage(" ")
gp.addmessage("Final dataset includes "+str(totPntsN8)+" unique occurrence points")
gp.AddMessage("****************************************")
gp.AddMessage("****************************************")
if MISdataP != 0:
    #output csv
    gp.AddMessage("WARNING!: "+str(MISdataPN)+" localities fell outside of "+str(inHetero)+ " and were not included in spatial rarefying")
    gp.AddMessage("****************************************")
    gp.AddMessage("See file: "+str(outFolder) + "/" + str(outName) + "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv for localities")
    CSVFile = outFolder + "/" + outName + "_EXCLUDED_PTS_OUTSIDE_OF_HETEROGENEITY_RASTER.csv"
    fieldnames = [f.name for f in arcpy.ListFields(E0A) if f.type != 'Geometry']
    # keep attribute fields only; deleting entries while iterating would skip fields
    fieldnames = [f for f in fieldnames if f not in ('Shape', 'FID', 'OBJECTID')]
    with open(CSVFile, 'w') as f:
        f.write(','.join(fieldnames)+'\n') #csv headers
        with arcpy.da.SearchCursor(E0A, fieldnames) as cursor:
            for row in cursor:
                f.write(','.join([str(r) for r in row])+'\n')
###Add new points to map
mxd = arcpy.mapping.MapDocument("CURRENT")
df = arcpy.mapping.ListDataFrames(mxd)[0]
addLayer = arcpy.mapping.Layer(outPointsF)
arcpy.mapping.AddLayer(df, addLayer, "AUTO_ARRANGE")
#output csv
CSVFile = outFolder + "/" + outName+ "_rarefied_points.csv"
fieldnames = [f.name for f in arcpy.ListFields(outPointsF) if f.type != 'Geometry']
# keep attribute fields only; deleting entries while iterating would skip fields
fieldnames = [f for f in fieldnames if f not in ('Shape', 'FID', 'OBJECTID')]
with open(CSVFile, 'w') as f:
    f.write(','.join(fieldnames)+'\n') #csv headers
    with arcpy.da.SearchCursor(outPointsF, fieldnames) as cursor:
        for row in cursor:
            f.write(','.join([str(r) for r in row])+'\n')
import datetime
#create table of inputs into SDMtoolbox
#change outFolder if needed, out file if needed, and inputs names
#NOTE: If the script contains a "glob" input, write the first occurrence of the glob as: + str(inFolder+"/*.asc")+ "\n" +
newLine = ""
file = open(outFolder+"/"+outName+".SDMtoolboxInputs", "w")
file.write(newLine)
file.close()
addlineH=str(datetime.datetime.now())+ "\n" + "Input Parameters \n" + "Input Folder: " + str(infile) + "\n" + "Species Field: " +str(speciesField)+ " \n" + "Long: "+ str(longitudField)+ " \n" + "Lat: "+ str(latitudeField)+ " \n" + "Output Folder: " + str(outFolder) + "\n" + "Output Name: " +str(outName)+ " \n" + "resolution: "+ str(resolution)+ " \n" + "Projection: "+ str(inProject)+ " \n" + "Boolean: " + str(inputBoolean) + "\n" + "inHetero: " +str(inHetero)+ " \n" + "Classes: "+ str(nClasses)+ " \n" + "Class Types: "+ str(ClassType)+ " \n" + "Max Distance: "+ str(maxDist)+ " \n" + "Min Distance: "+ str(minDist)
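The long concatenation above builds the run-parameter log by hand. A hedged sketch of assembling the same kind of log from labeled pairs, which makes dropped newlines and missing colons hard to introduce; `format_inputs` is a hypothetical helper name, not part of the tool.

```python
# Build a "Label: value" line per parameter and join with newlines.
def format_inputs(pairs):
    return "\n".join("{}: {}".format(label, value) for label, value in pairs)
```

The resulting string can be written to the `.SDMtoolboxInputs` file in one call.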
filedataZ=""
###special for 'GRID' function
#Rasters=gp.listrasters("", "ALL")
#raster= Rasters.next
#while raster:
# print raster
# newLineR =str(raster)+ ", "+filedataZ
# filedataZ=newLineR
# raster = Rasters.next()
#END of special for 'GRID' function
file = open(outFolder+"/"+outName+".SDMtoolboxInputs", "r")
filedata = file.read()
file.close()
#DELETE +"\nRasters input: "
newLine =addlineH+"\nRasters input: "+filedataZ+filedata
file = open(outFolder+"/"+outName+".SDMtoolboxInputs", "w")
file.write(newLine)
file.close()
gp.AddMessage("*******************************************")
gp.AddMessage("Table of inputs was output: "+outFolder+"/"+outName+".SDMtoolboxInputs")
#####step1_change_output_name_appropriately. For example: outFolder+"/ExtractByMask.SDMtoolboxInputs". If fixed name leave within ""; if calling user input: "/"+outName+".SDMtoolboxInputs" (CHANGING gridName to name input)
###change 'outFolder' to match outFolder syntax in focal script
####step3_change inputs to match script Line:9 (from top of python script)
###step4 paste at bottom before file deletes
gp.AddMessage("Cleaning up workspace")
if arcpy.Exists(outFolderTemp+"/OutPtsEall.shp"):
    arcpy.Delete_management(outFolderTemp+"/OutPtsEall.shp")
if arcpy.Exists(outFolderTemp+"/empty.shp"):
    arcpy.Delete_management(outFolderTemp+"/empty.shp")
if arcpy.Exists(outFolderTemp2+"/1.shp"):
    arcpy.Delete_management(outFolderTemp2+"/1.shp")
if arcpy.Exists(outFolderTemp2+"/2.shp"):
    arcpy.Delete_management(outFolderTemp2+"/2.shp")
if arcpy.Exists(outFolderTemp2+"/3.shp"):
    arcpy.Delete_management(outFolderTemp2+"/3.shp")
if arcpy.Exists(outFolderTemp2+"/4.shp"):
    arcpy.Delete_management(outFolderTemp2+"/4.shp")
if arcpy.Exists(outFolderTemp2+"/5.shp"):
    arcpy.Delete_management(outFolderTemp2+"/5.shp")
if arcpy.Exists(outFolderTemp2+"/0.shp"):
    arcpy.Delete_management(outFolderTemp2+"/0.shp")
if arcpy.Exists(outFolderTemp2+"/Merge1.shp"):
    arcpy.Delete_management(outFolderTemp2+"/Merge1.shp")
if arcpy.Exists(outFolderTemp3+"/1.shp"):
    arcpy.Delete_management(outFolderTemp3+"/1.shp")
if arcpy.Exists(outFolderTemp3+"/2.shp"):
    arcpy.Delete_management(outFolderTemp3+"/2.shp")
if arcpy.Exists(outFolderTemp3+"/3.shp"):
    arcpy.Delete_management(outFolderTemp3+"/3.shp")
if arcpy.Exists(outFolderTemp3+"/4.shp"):
    arcpy.Delete_management(outFolderTemp3+"/4.shp")
if arcpy.Exists(outFolderTemp3+"/Merge2.shp"):
    arcpy.Delete_management(outFolderTemp3+"/Merge2.shp")
if arcpy.Exists(outFolderTemp4+"/Merge3.shp"):
    arcpy.Delete_management(outFolderTemp4+"/Merge3.shp")
if arcpy.Exists(outFolderTemp4+"/1.shp"):
    arcpy.Delete_management(outFolderTemp4+"/1.shp")
if arcpy.Exists(outFolderTemp4+"/2.shp"):
    arcpy.Delete_management(outFolderTemp4+"/2.shp")
if arcpy.Exists(outFolderTemp4+"/3.shp"):
    arcpy.Delete_management(outFolderTemp4+"/3.shp")
if arcpy.Exists(outPoints2):
    arcpy.Delete_management(outPoints2)
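The scratch-file cleanup above repeats one Exists/Delete pair per file. A hedged sketch of a loop-based equivalent follows; `exists` and `delete` are injectable stand-ins for `arcpy.Exists` and `arcpy.Delete_management` (assumptions, so the sketch runs without an ArcGIS session), and `clean_scratch` is a hypothetical helper name.

```python
def clean_scratch(folders, names, exists, delete):
    """Delete every scratch shapefile that was actually created and
    return the list of paths removed."""
    removed = []
    for folder in folders:
        for name in names:
            path = folder + "/" + name
            if exists(path):
                delete(path)
                removed.append(path)
    return removed
```

One call with the temp folders and candidate shapefile names replaces the whole if-chain.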
arcpy.Delete_management(outFolderTemp)
arcpy.Delete_management(outFolderTemp2)
arcpy.Delete_management(outFolderTemp3)
arcpy.Delete_management(outFolderTemp4)
arcpy.Delete_management(outFolderTemp5)
arcpy.Delete_management(outFolderTemp6)
arcpy.Delete_management(outFolderTemp7)
arcpy.Delete_management(outFolderTemp8)
arcpy.Delete_management(outFolderTemp9)
arcpy.Delete_management(outFolderTemp10)
gp.AddMessage("Finished successfully")
#del gp
# File: jsparagus/emit/__init__.py (repo: codehag/jsparagus)
"""Emit code and parser tables in Python and Rust."""
__all__ = ['write_python_parser', 'write_rust_parser']
from .python import write_python_parser
from .rust import write_rust_parser
# File: app/public/forms/__init__.py (repo: Atheloses/Flask-Bones)
from .user import UserForm, RegisterUserForm, EditUserForm
from .group import GroupForm, RegisterGroupForm, EditGroupForm
from .firma import FirmaForm, RegisterFirmaForm, EditFirmaForm
from .login import LoginForm
# File: web_project/User/admin.py (repo: nosy0411/Object_Oriented_Programming)
from django.contrib import admin
from .models import Profile, Eval, HUser, Line
# Register your models here.
admin.site.register(Profile)
admin.site.register(Eval)
admin.site.register(HUser)
admin.site.register(Line)
# File: tests/location/test_location_location.py (repo: questionlp/wwdtm)
# -*- coding: utf-8 -*-
# vim: set noai syntax=python ts=4 sw=4:
#
# Copyright (c) 2018-2021 Linh Pham
# wwdtm is released under the terms of the Apache License 2.0
"""Testing for object: :py:class:`wwdtm.location.Location`
"""
import json
from typing import Any, Dict
import pytest
from wwdtm.location import Location

@pytest.mark.skip
def get_connect_dict() -> Dict[str, Any]:
    """Read in database connection settings and return values as a
    dictionary.

    :return: A dictionary containing database connection settings
        for use by mysql.connector
    """
    with open("config.json", "r") as config_file:
        config_dict = json.load(config_file)
        if "database" in config_dict:
            return config_dict["database"]

def test_location_retrieve_all():
    """Testing for :py:meth:`wwdtm.location.Location.retrieve_all`
    """
    location = Location(connect_dict=get_connect_dict())
    locations = location.retrieve_all()
    assert locations, "No locations could be retrieved"
    assert "id" in locations[0], "'id' was not returned for the first list item"

def test_location_retrieve_all_details():
    """Testing for :py:meth:`wwdtm.location.Location.retrieve_all_details`
    """
    location = Location(connect_dict=get_connect_dict())
    locations = location.retrieve_all_details()
    assert locations, "No locations could be retrieved"
    assert "id" in locations[0], "'id' was not returned for first list item"
    assert "recordings" in locations[0], ("'recordings' was not returned for "
                                          "the first list item")

def test_location_retrieve_all_ids():
    """Testing for :py:meth:`wwdtm.location.Location.retrieve_all_ids`
    """
    location = Location(connect_dict=get_connect_dict())
    ids = location.retrieve_all_ids()
    assert ids, "No location IDs could be retrieved"

def test_location_retrieve_all_slugs():
    """Testing for :py:meth:`wwdtm.location.Location.retrieve_all_slugs`
    """
    location = Location(connect_dict=get_connect_dict())
    slugs = location.retrieve_all_slugs()
    assert slugs, "No location slug strings could be retrieved"

@pytest.mark.parametrize("location_id", [95])
def test_location_retrieve_by_id(location_id: int):
    """Testing for :py:meth:`wwdtm.location.Location.retrieve_by_id`

    :param location_id: Location ID to test retrieving location
        information
    """
    location = Location(connect_dict=get_connect_dict())
    info = location.retrieve_by_id(location_id)
    assert info, f"Location ID {location_id} not found"
    assert "venue" in info, f"'venue' was not returned for ID {location_id}"

@pytest.mark.parametrize("location_id", [95])
def test_location_retrieve_details_by_id(location_id: int):
    """Testing for :py:meth:`wwdtm.location.location.retrieve_details_by_id`

    :param location_id: Location ID to test retrieving location details
    """
    location = Location(connect_dict=get_connect_dict())
    info = location.retrieve_details_by_id(location_id)
    assert info, f"Location ID {location_id} not found"
    assert "venue" in info, f"'venue' was not returned for ID {location_id}"
    assert "recordings" in info, f"'recordings' was not returned for ID {location_id}"

@pytest.mark.parametrize("location_slug", ["the-chicago-theatre-chicago-il"])
def test_location_retrieve_by_slug(location_slug: str):
    """Testing for :py:meth:`wwdtm.location.Location.retrieve_by_slug`

    :param location_slug: Location slug string to test retrieving
        location information
    """
    location = Location(connect_dict=get_connect_dict())
    info = location.retrieve_by_slug(location_slug)
    assert info, f"Location slug {location_slug} not found"
    assert "venue" in info, f"'venue' was not returned for slug {location_slug}"

@pytest.mark.parametrize("location_slug", ["the-chicago-theatre-chicago-il"])
def test_location_retrieve_details_by_slug(location_slug: str):
    """Testing for :py:meth:`wwdtm.location.Location.retrieve_details_by_slug`

    :param location_slug: Location slug string to test retrieving
        location details
    """
    location = Location(connect_dict=get_connect_dict())
    info = location.retrieve_details_by_slug(location_slug)
    assert info, f"Location slug {location_slug} not found"
    assert "venue" in info, f"'venue' was not returned for slug {location_slug}"
    assert "recordings" in info, f"'recordings' was not returned for slug {location_slug}"
# File: Components/_x__components.py (repo: JetStarBlues/Nand-2-Tetris)
# The various components of the Hack computer
from ._0__globalConstants import *
if PERFORMANCE_MODE:
    from ._1__elementary_performance import *
    from ._2__arithmetic_performance import *
    from ._3__clock_performance import *
    from ._5__memory_performance import *
    from ._6__counter_performance import *
    from ._7__cpu_performance import *
    from ._9__inputOutput_performance import *
else:
    from ._1__elementary import *
    from ._2__arithmetic import *
    # from ._3__clock import *
    from ._3__clock_performance import *
    # from ._4__flipFlops import *
    from ._4__flipFlops_performance import *
    from ._5__memory import *
    from ._6__counter import *
    from ._7__cpu import *
    from ._9__inputOutput import *
from ._8__computer import *
# ==== desc/geometry/surface.py | MothVine/DESC | MIT ====

import numpy as np
import warnings
from desc.backend import jnp, sign, put
from desc.boundary_conditions import LCFSConstraint, PoincareConstraint
from desc.utils import copy_coeffs
from desc.grid import Grid, LinearGrid
from desc.basis import DoubleFourierSeries, ZernikePolynomial
from desc.transform import Transform
from .core import Surface
from .utils import xyz2rpz_vec, rpz2xyz_vec, xyz2rpz, rpz2xyz
__all__ = ["FourierRZToroidalSurface", "ZernikeRZToroidalSection"]
class FourierRZToroidalSurface(Surface):
"""Toroidal surface represented by Fourier series in poloidal and toroidal angles.
Parameters
----------
R_lmn, Z_lmn : array-like, shape(k,)
Fourier coefficients for R and Z in cylindrical coordinates
modes_R : array-like, shape(k,2)
poloidal and toroidal mode numbers [m,n] for R_n.
modes_Z : array-like, shape(k,2)
mode numbers associated with Z_n, defaults to modes_R
NFP : int
number of field periods
    sym : bool or "auto"
        whether to enforce stellarator symmetry. Default is "auto", which
        enforces symmetry if the given modes are symmetric. If True,
        non-symmetric modes will be truncated.
rho : float (0,1)
flux surface label for the toroidal surface
grid : Grid
default grid for computation
name : str
name for this surface
"""
_io_attrs_ = Surface._io_attrs_ + [
"_R_lmn",
"_Z_lmn",
"_R_basis",
"_Z_basis",
"_R_transform",
"_Z_transform",
"rho",
"_NFP",
]
def __init__(
self,
R_lmn=None,
Z_lmn=None,
modes_R=None,
modes_Z=None,
NFP=1,
sym="auto",
rho=1,
grid=None,
name="",
):
if R_lmn is None:
R_lmn = np.array([10, 1])
modes_R = np.array([[0, 0], [1, 0]])
if Z_lmn is None:
Z_lmn = np.array([0, 1])
modes_Z = np.array([[0, 0], [-1, 0]])
if modes_Z is None:
modes_Z = modes_R
R_lmn, Z_lmn, modes_R, modes_Z = map(
np.asarray, (R_lmn, Z_lmn, modes_R, modes_Z)
)
assert issubclass(modes_R.dtype.type, np.integer)
assert issubclass(modes_Z.dtype.type, np.integer)
MR = np.max(abs(modes_R[:, 0]))
NR = np.max(abs(modes_R[:, 1]))
MZ = np.max(abs(modes_Z[:, 0]))
NZ = np.max(abs(modes_Z[:, 1]))
self._L = 0
self._M = max(MR, MZ)
self._N = max(NR, NZ)
if sym == "auto":
if np.all(
R_lmn[np.where(sign(modes_R[:, 0]) != sign(modes_R[:, 1]))] == 0
) and np.all(
Z_lmn[np.where(sign(modes_Z[:, 0]) == sign(modes_Z[:, 1]))] == 0
):
sym = True
else:
sym = False
self._R_basis = DoubleFourierSeries(
M=MR, N=NR, NFP=NFP, sym="cos" if sym else False
)
self._Z_basis = DoubleFourierSeries(
M=MZ, N=NZ, NFP=NFP, sym="sin" if sym else False
)
self._R_lmn = copy_coeffs(R_lmn, modes_R, self.R_basis.modes[:, 1:])
self._Z_lmn = copy_coeffs(Z_lmn, modes_Z, self.Z_basis.modes[:, 1:])
self._NFP = NFP
self._sym = sym
self.rho = rho
if grid is None:
grid = LinearGrid(
rho=self.rho,
M=4 * self.M + 1,
N=4 * self.N + 1,
endpoint=True,
)
self._grid = grid
self._R_transform, self._Z_transform = self._get_transforms(grid)
self.name = name
@property
def NFP(self):
"""Number of (toroidal) field periods (int)."""
return self._NFP
@property
def R_basis(self):
"""Spectral basis for R double Fourier series."""
return self._R_basis
@property
def Z_basis(self):
"""Spectral basis for Z double Fourier series."""
return self._Z_basis
@property
def grid(self):
"""Grid for computation."""
return self._grid
@grid.setter
def grid(self, new):
if isinstance(new, Grid):
self._grid = new
elif isinstance(new, (np.ndarray, jnp.ndarray)):
self._grid = Grid(new, sort=False)
else:
raise TypeError(
f"grid should be a Grid or subclass, or ndarray, got {type(new)}"
)
self._R_transform.grid = self.grid
self._Z_transform.grid = self.grid
def change_resolution(self, *args, **kwargs):
        """Change the maximum poloidal and toroidal resolution."""
assert ((len(args) in [2, 3]) and len(kwargs) == 0) or (
len(args) == 0
), "change_resolution should be called with 2 or 3 positional arguments or only keyword arguments"
L = kwargs.get("L", None)
M = kwargs.get("M", None)
N = kwargs.get("N", None)
if L is not None:
warnings.warn(
"FourierRZToroidalSurface does not have a radial resolution, ignoring L"
)
if len(args) == 2:
M, N = args
elif len(args) == 3:
L, M, N = args
if (N != self.N) or (M != self.M):
R_modes_old = self.R_basis.modes
Z_modes_old = self.Z_basis.modes
self.R_basis.change_resolution(M=M, N=N)
self.Z_basis.change_resolution(M=M, N=N)
self._R_transform, self._Z_transform = self._get_transforms(self.grid)
self.R_lmn = copy_coeffs(self.R_lmn, R_modes_old, self.R_basis.modes)
self.Z_lmn = copy_coeffs(self.Z_lmn, Z_modes_old, self.Z_basis.modes)
self._M = M
self._N = N
@property
def R_lmn(self):
"""Spectral coefficients for R."""
return self._R_lmn
@R_lmn.setter
def R_lmn(self, new):
if len(new) == self.R_basis.num_modes:
self._R_lmn = jnp.asarray(new)
else:
raise ValueError(
f"R_lmn should have the same size as the basis, got {len(new)} for basis with {self.R_basis.num_modes} modes"
)
@property
def Z_lmn(self):
"""Spectral coefficients for Z."""
return self._Z_lmn
@Z_lmn.setter
def Z_lmn(self, new):
if len(new) == self.Z_basis.num_modes:
self._Z_lmn = jnp.asarray(new)
else:
raise ValueError(
f"Z_lmn should have the same size as the basis, got {len(new)} for basis with {self.Z_basis.num_modes} modes"
)
def get_coeffs(self, m, n=0):
"""Get Fourier coefficients for given mode number(s)."""
n = np.atleast_1d(n).astype(int)
m = np.atleast_1d(m).astype(int)
m, n = np.broadcast_arrays(m, n)
R = np.zeros_like(m).astype(float)
Z = np.zeros_like(m).astype(float)
mn = np.array([m, n]).T
idxR = np.where(
(mn[:, np.newaxis, :] == self.R_basis.modes[np.newaxis, :, 1:]).all(axis=-1)
)
idxZ = np.where(
(mn[:, np.newaxis, :] == self.Z_basis.modes[np.newaxis, :, 1:]).all(axis=-1)
)
R[idxR[0]] = self.R_lmn[idxR[1]]
Z[idxZ[0]] = self.Z_lmn[idxZ[1]]
return R, Z
def set_coeffs(self, m, n=0, R=None, Z=None):
"""Set specific Fourier coefficients."""
m, n, R, Z = (
np.atleast_1d(m),
np.atleast_1d(n),
np.atleast_1d(R),
np.atleast_1d(Z),
)
m, n, R, Z = np.broadcast_arrays(m, n, R, Z)
for mm, nn, RR, ZZ in zip(m, n, R, Z):
if RR is not None:
idxR = self.R_basis.get_idx(0, mm, nn)
self.R_lmn = put(self.R_lmn, idxR, RR)
if ZZ is not None:
idxZ = self.Z_basis.get_idx(0, mm, nn)
self.Z_lmn = put(self.Z_lmn, idxZ, ZZ)
def _get_transforms(self, grid=None):
if grid is None:
return self._R_transform, self._Z_transform
if not isinstance(grid, Grid):
if np.isscalar(grid):
grid = LinearGrid(rho=1, M=grid, N=grid, NFP=self.NFP)
elif len(grid) == 2:
grid = LinearGrid(rho=1, M=grid[0], N=grid[1], NFP=self.NFP)
elif grid.shape[1] == 2:
grid = np.pad(grid, ((0, 0), (1, 0)), constant_values=self.rho)
grid = Grid(grid, sort=False)
else:
grid = Grid(grid, sort=False)
R_transform = Transform(
grid,
self.R_basis,
derivs=np.array(
[[0, 0, 0], [0, 1, 0], [0, 2, 0], [0, 0, 1], [0, 0, 2], [0, 1, 1]]
),
)
Z_transform = Transform(
grid,
self.Z_basis,
derivs=np.array(
[[0, 0, 0], [0, 1, 0], [0, 2, 0], [0, 0, 1], [0, 0, 2], [0, 1, 1]]
),
)
return R_transform, Z_transform
def compute_curvature(self, params=None, grid=None):
"""Compute gaussian and mean curvature."""
raise NotImplementedError()
def compute_coordinates(
self, R_lmn=None, Z_lmn=None, grid=None, dt=0, dz=0, basis="rpz"
):
"""Compute values using specified coefficients.
Parameters
----------
R_lmn, Z_lmn: array-like
fourier coefficients for R, Z. Defaults to self.R_lmn, self.Z_lmn
grid : Grid or array-like
toroidal coordinates to compute at. Defaults to self.grid
If an integer, assumes that many linearly spaced points in (0,2pi)
dt, dz: int
derivative order to compute in theta, zeta
basis : {"rpz", "xyz"}
coordinate system for returned points
Returns
-------
values : ndarray, shape(k,3)
R, phi, Z or x, y, z coordinates of the surface at points specified in grid
"""
assert basis.lower() in ["rpz", "xyz"]
if R_lmn is None:
R_lmn = self.R_lmn
if Z_lmn is None:
Z_lmn = self.Z_lmn
R_transform, Z_transform = self._get_transforms(grid)
if dz == 0:
R = R_transform.transform(R_lmn, dt=dt, dz=0)
Z = Z_transform.transform(Z_lmn, dt=dt, dz=0)
phi = R_transform.grid.nodes[:, 2] * (dt == 0)
coords = jnp.stack([R, phi, Z], axis=1)
elif dz == 1:
R0 = R_transform.transform(R_lmn, dt=dt, dz=0)
dR = R_transform.transform(R_lmn, dt=dt, dz=1)
dZ = Z_transform.transform(Z_lmn, dt=dt, dz=1)
dphi = R0 * (dt == 0)
coords = jnp.stack([dR, dphi, dZ], axis=1)
elif dz == 2:
R0 = R_transform.transform(R_lmn, dt=dt, dz=0)
dR = R_transform.transform(R_lmn, dt=dt, dz=1)
d2R = R_transform.transform(R_lmn, dt=dt, dz=2)
d2Z = Z_transform.transform(Z_lmn, dt=dt, dz=2)
R = d2R - R0
Z = d2Z
            # 2nd derivative wrt phi = 0
phi = 2 * dR * (dt == 0)
coords = jnp.stack([R, phi, Z], axis=1)
elif dz == 3:
R0 = R_transform.transform(R_lmn, dt=dt, dz=0)
dR = R_transform.transform(R_lmn, dt=dt, dz=1)
d2R = R_transform.transform(R_lmn, dt=dt, dz=2)
d3R = R_transform.transform(R_lmn, dt=dt, dz=3)
d3Z = Z_transform.transform(Z_lmn, dt=dt, dz=3)
R = d3R - 3 * dR
Z = d3Z
phi = (3 * d2R - R0) * (dt == 0)
coords = jnp.stack([R, phi, Z], axis=1)
else:
raise NotImplementedError(
"Derivatives higher than 3 have not been implemented in cylindrical coordinates"
)
if basis.lower() == "xyz":
if (dt > 0) or (dz > 0):
coords = rpz2xyz_vec(coords, phi=R_transform.grid.nodes[:, 2])
else:
coords = rpz2xyz(coords)
return coords
def compute_normal(self, R_lmn=None, Z_lmn=None, grid=None, basis="rpz"):
"""Compute normal vector to surface on default grid.
Parameters
----------
R_lmn, Z_lmn: array-like
fourier coefficients for R, Z. Defaults to self.R_lmn, self.Z_lmn
grid : Grid or array-like
toroidal coordinates to compute at. Defaults to self.grid
If an integer, assumes that many linearly spaced points in (0,2pi)
basis : {"rpz", "xyz"}
basis vectors to use for normal vector representation
Returns
-------
N : ndarray, shape(k,3)
normal vector to surface in specified coordinates
"""
assert basis.lower() in ["rpz", "xyz"]
if R_lmn is None:
R_lmn = self.R_lmn
if Z_lmn is None:
Z_lmn = self.Z_lmn
R_transform, Z_transform = self._get_transforms(grid)
r_t = self.compute_coordinates(R_lmn, Z_lmn, grid, dt=1)
r_z = self.compute_coordinates(R_lmn, Z_lmn, grid, dz=1)
N = jnp.cross(r_t, r_z, axis=1)
N = N / jnp.linalg.norm(N, axis=1)[:, jnp.newaxis]
if basis.lower() == "xyz":
phi = R_transform.grid.nodes[:, 2]
N = rpz2xyz_vec(N, phi=phi)
return N
def compute_surface_area(self, R_lmn=None, Z_lmn=None, grid=None):
"""Compute surface area via quadrature.
Parameters
----------
R_lmn, Z_lmn: array-like
fourier coefficients for R, Z. Defaults to self.R_lmn, self.Z_lmn
grid : Grid or array-like
toroidal coordinates to compute at. Defaults to self.grid
If an integer, assumes that many linearly spaced points in (0,2pi)
Returns
-------
area : float
surface area
"""
if R_lmn is None:
R_lmn = self.R_lmn
if Z_lmn is None:
Z_lmn = self.Z_lmn
R_transform, Z_transform = self._get_transforms(grid)
r_t = self.compute_coordinates(R_lmn, Z_lmn, grid, dt=1)
r_z = self.compute_coordinates(R_lmn, Z_lmn, grid, dz=1)
N = jnp.cross(r_t, r_z, axis=1)
return jnp.sum(R_transform.grid.weights * jnp.linalg.norm(N, axis=1))
def get_constraint(self, R_basis, Z_basis, L_basis):
        """Get the linear constraint to enforce this surface as a boundary condition."""
return LCFSConstraint(
R_basis,
Z_basis,
L_basis,
self.R_basis,
self.Z_basis,
self.R_lmn,
self.Z_lmn,
)
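`FourierRZToroidalSurface.compute_surface_area` integrates `|r_theta x r_zeta|` against the grid quadrature weights. A numpy-only sketch of that same quadrature (not calling DESC itself) for the class's default surface, `R = R0 + r*cos(theta)`, `Z = r*sin(theta)` with `R0 = 10`, `r = 1`, using analytic tangent vectors and a uniform periodic grid:

```python
import numpy as np


def torus_surface_area(R0=10.0, r=1.0, n=64):
    # uniform periodic grids in theta (poloidal) and zeta (toroidal)
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    zeta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    T, Z = np.meshgrid(theta, zeta, indexing="ij")
    R = R0 + r * np.cos(T)
    # tangent vectors of the position r(theta, zeta) in Cartesian coordinates
    r_t = np.stack(
        [-r * np.sin(T) * np.cos(Z), -r * np.sin(T) * np.sin(Z), r * np.cos(T)],
        axis=-1,
    )
    r_z = np.stack([-R * np.sin(Z), R * np.cos(Z), np.zeros_like(R)], axis=-1)
    # surface element |r_t x r_z| summed with uniform quadrature weights
    N = np.cross(r_t, r_z)
    dA = (2 * np.pi / n) ** 2
    return float(np.sum(np.linalg.norm(N, axis=-1)) * dA)
```

Here `|r_t x r_z| = r * R`, so the integral reduces to the textbook torus area `4 * pi**2 * R0 * r`; the uniform sum over a periodic trigonometric integrand recovers it essentially exactly, which is the same reason DESC uses a `LinearGrid` of uniformly spaced angles with quadrature weights.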
class ZernikeRZToroidalSection(Surface):
"""A toroidal cross section represented by a Zernike polynomial in R,Z.
Parameters
----------
R_lmn, Z_lmn : array-like, shape(k,)
zernike coefficients
modes_R : array-like, shape(k,2)
radial and poloidal mode numbers [l,m] for R_lmn
modes_Z : array-like, shape(k,2)
radial and poloidal mode numbers [l,m] for Z_lmn. If None defaults to modes_R.
    sym : bool or "auto"
        whether to enforce stellarator symmetry. Default is "auto", which
        enforces symmetry if the given modes are symmetric. If True,
        non-symmetric modes will be truncated.
spectral_indexing : {``'ansi'``, ``'fringe'``}
Indexing method, default value = ``'fringe'``
For L=0, all methods are equivalent and give a "chevron" shaped
basis (only the outer edge of the zernike pyramid of width M).
For L>0, the indexing scheme defines order of the basis functions:
``'ansi'``: ANSI indexing fills in the pyramid with triangles of
        decreasing size, ending in a triangle shape. For L == M,
the traditional ANSI pyramid indexing is recovered. For L>M, adds rows
to the bottom of the pyramid, increasing L while keeping M constant,
giving a "house" shape
``'fringe'``: Fringe indexing fills in the pyramid with chevrons of
decreasing size, ending in a diamond shape for L=2*M where
the traditional fringe/U of Arizona indexing is recovered.
For L > 2*M, adds chevrons to the bottom, making a hexagonal diamond
zeta : float (0,2pi)
toroidal angle for the section.
grid : Grid
default grid for computation
name : str
name for this surface
"""
_io_attrs_ = Surface._io_attrs_ + [
"_R_lmn",
"_Z_lmn",
"_R_basis",
"_Z_basis",
"_R_transform",
"_Z_transform",
"zeta",
"_spectral_indexing",
]
def __init__(
self,
R_lmn=None,
Z_lmn=None,
modes_R=None,
modes_Z=None,
spectral_indexing="fringe",
sym="auto",
zeta=0,
grid=None,
name="",
):
if R_lmn is None:
R_lmn = np.array([10, 1])
modes_R = np.array([[0, 0], [1, 1]])
if Z_lmn is None:
Z_lmn = np.array([0, 1])
modes_Z = np.array([[0, 0], [1, -1]])
if modes_Z is None:
modes_Z = modes_R
R_lmn, Z_lmn, modes_R, modes_Z = map(
np.asarray, (R_lmn, Z_lmn, modes_R, modes_Z)
)
assert issubclass(modes_R.dtype.type, np.integer)
assert issubclass(modes_Z.dtype.type, np.integer)
LR = np.max(abs(modes_R[:, 0]))
MR = np.max(abs(modes_R[:, 1]))
LZ = np.max(abs(modes_Z[:, 0]))
MZ = np.max(abs(modes_Z[:, 1]))
self._L = max(LR, LZ)
self._M = max(MR, MZ)
self._N = 0
if sym == "auto":
if np.all(
R_lmn[np.where(sign(modes_R[:, 0]) != sign(modes_R[:, 1]))] == 0
) and np.all(
Z_lmn[np.where(sign(modes_Z[:, 0]) == sign(modes_Z[:, 1]))] == 0
):
sym = True
else:
sym = False
self._R_basis = ZernikePolynomial(
L=max(LR, MR),
M=max(LR, MR),
spectral_indexing=spectral_indexing,
sym="cos" if sym else False,
)
self._Z_basis = ZernikePolynomial(
L=max(LZ, MZ),
M=max(LZ, MZ),
spectral_indexing=spectral_indexing,
sym="sin" if sym else False,
)
self._R_lmn = copy_coeffs(R_lmn, modes_R, self.R_basis.modes[:, :2])
self._Z_lmn = copy_coeffs(Z_lmn, modes_Z, self.Z_basis.modes[:, :2])
self._sym = sym
self._spectral_indexing = spectral_indexing
self.zeta = zeta
if grid is None:
grid = LinearGrid(
L=2 * self.L, M=4 * self.M + 1, zeta=self.zeta, endpoint=True
)
self._grid = grid
self._R_transform, self._Z_transform = self._get_transforms(grid)
self.name = name
@property
def spectral_indexing(self):
return self._spectral_indexing
@property
def R_basis(self):
"""Spectral basis for R Zernike polynomial."""
return self._R_basis
@property
def Z_basis(self):
"""Spectral basis for Z Zernike polynomial."""
return self._Z_basis
@property
def grid(self):
"""Grid for computation."""
return self._grid
@grid.setter
def grid(self, new):
if isinstance(new, Grid):
self._grid = new
elif isinstance(new, (np.ndarray, jnp.ndarray)):
self._grid = Grid(new, sort=False)
else:
raise TypeError(
f"grid should be a Grid or subclass, or ndarray, got {type(new)}"
)
self._R_transform.grid = self.grid
self._Z_transform.grid = self.grid
def change_resolution(self, *args, **kwargs):
        """Change the maximum radial and poloidal resolution."""
assert ((len(args) in [2, 3]) and len(kwargs) == 0) or (
len(args) == 0
), "change_resolution should be called with 2 or 3 positional arguments or only keyword arguments"
L = kwargs.get("L", None)
M = kwargs.get("M", None)
N = kwargs.get("N", None)
if N is not None:
warnings.warn(
"ZernikeRZToroidalSection does not have a toroidal resolution, ignoring N"
)
if len(args) == 2:
L, M = args
elif len(args) == 3:
L, M, N = args
if (L != self.L) or (M != self.M):
R_modes_old = self.R_basis.modes
Z_modes_old = self.Z_basis.modes
self.R_basis.change_resolution(L=L, M=M)
self.Z_basis.change_resolution(L=L, M=M)
self._R_transform, self._Z_transform = self._get_transforms(self.grid)
self.R_lmn = copy_coeffs(self.R_lmn, R_modes_old, self.R_basis.modes)
self.Z_lmn = copy_coeffs(self.Z_lmn, Z_modes_old, self.Z_basis.modes)
self._L = L
self._M = M
@property
def R_lmn(self):
"""Spectral coefficients for R."""
return self._R_lmn
@R_lmn.setter
def R_lmn(self, new):
if len(new) == self.R_basis.num_modes:
self._R_lmn = jnp.asarray(new)
else:
raise ValueError(
f"R_lmn should have the same size as the basis, got {len(new)} for basis with {self.R_basis.num_modes} modes"
)
@property
def Z_lmn(self):
"""Spectral coefficients for Z."""
return self._Z_lmn
@Z_lmn.setter
def Z_lmn(self, new):
if len(new) == self.Z_basis.num_modes:
self._Z_lmn = jnp.asarray(new)
else:
raise ValueError(
f"Z_lmn should have the same size as the basis, got {len(new)} for basis with {self.Z_basis.num_modes} modes"
)
def get_coeffs(self, l, m=0):
"""Get Zernike coefficients for given mode number(s)."""
l = np.atleast_1d(l).astype(int)
m = np.atleast_1d(m).astype(int)
l, m = np.broadcast_arrays(l, m)
R = np.zeros_like(m).astype(float)
Z = np.zeros_like(m).astype(float)
lm = np.array([l, m]).T
idxR = np.where(
(lm[:, np.newaxis, :] == self.R_basis.modes[np.newaxis, :, :2]).all(axis=-1)
)
idxZ = np.where(
(lm[:, np.newaxis, :] == self.Z_basis.modes[np.newaxis, :, :2]).all(axis=-1)
)
R[idxR[0]] = self.R_lmn[idxR[1]]
Z[idxZ[0]] = self.Z_lmn[idxZ[1]]
return R, Z
def set_coeffs(self, l, m=0, R=None, Z=None):
"""Set specific Zernike coefficients."""
l, m, R, Z = (
np.atleast_1d(l),
np.atleast_1d(m),
np.atleast_1d(R),
np.atleast_1d(Z),
)
l, m, R, Z = np.broadcast_arrays(l, m, R, Z)
for ll, mm, RR, ZZ in zip(l, m, R, Z):
if RR is not None:
idxR = self.R_basis.get_idx(ll, mm, 0)
self.R_lmn = put(self.R_lmn, idxR, RR)
if ZZ is not None:
idxZ = self.Z_basis.get_idx(ll, mm, 0)
self.Z_lmn = put(self.Z_lmn, idxZ, ZZ)
def _get_transforms(self, grid=None):
if grid is None:
return self._R_transform, self._Z_transform
if not isinstance(grid, Grid):
if np.isscalar(grid):
grid = LinearGrid(L=grid, M=grid, zeta=0, NFP=1)
elif len(grid) == 2:
grid = LinearGrid(L=grid[0], M=grid[1], zeta=0, NFP=1)
elif grid.shape[1] == 2:
grid = np.pad(grid, ((0, 0), (0, 1)), constant_values=self.zeta)
grid = Grid(grid, sort=False)
else:
grid = Grid(grid, sort=False)
R_transform = Transform(
grid,
self.R_basis,
derivs=np.array(
[[0, 0, 0], [1, 0, 0], [2, 0, 0], [0, 1, 0], [0, 2, 0], [1, 1, 0]]
),
)
Z_transform = Transform(
grid,
self.Z_basis,
derivs=np.array(
[[0, 0, 0], [1, 0, 0], [2, 0, 0], [0, 1, 0], [0, 2, 0], [1, 1, 0]]
),
)
return R_transform, Z_transform
def compute_curvature(self, params=None, grid=None):
"""Compute gaussian and mean curvature."""
raise NotImplementedError()
def compute_coordinates(
self, R_lmn=None, Z_lmn=None, grid=None, dr=0, dt=0, basis="rpz"
):
"""Compute values using specified coefficients.
Parameters
----------
R_lmn, Z_lmn: array-like
zernike coefficients for R, Z. Defaults to self.R_lmn, self.Z_lmn
grid : Grid or array-like
toroidal coordinates to compute at. Defaults to self.grid
If an integer, assumes that many linearly spaced points in (0,1)x(0,2pi)
dr, dt: int
derivative order to compute in rho, theta
basis : {"rpz", "xyz"}
coordinate system for returned points
Returns
-------
values : ndarray, shape(k,3)
R, phi, Z or x, y, z coordinates of the surface at points specified in grid
"""
assert basis.lower() in ["rpz", "xyz"]
if R_lmn is None:
R_lmn = self.R_lmn
if Z_lmn is None:
Z_lmn = self.Z_lmn
R_transform, Z_transform = self._get_transforms(grid)
R = R_transform.transform(R_lmn, dr=dr, dt=dt)
Z = Z_transform.transform(Z_lmn, dr=dr, dt=dt)
phi = R_transform.grid.nodes[:, 2] * (dr == 0) * (dt == 0)
coords = jnp.stack([R, phi, Z], axis=1)
if basis.lower() == "xyz":
if (dt > 0) or (dr > 0):
coords = rpz2xyz_vec(coords, phi=R_transform.grid.nodes[:, 2])
else:
coords = rpz2xyz(coords)
return coords
def compute_normal(self, R_lmn=None, Z_lmn=None, grid=None, basis="rpz"):
"""Compute normal vector to surface on default grid.
Parameters
----------
R_lmn, Z_lmn: array-like
zernike coefficients for R, Z. Defaults to self.R_lmn, self.Z_lmn
grid : Grid or array-like
toroidal coordinates to compute at. Defaults to self.grid
If an integer, assumes that many linearly spaced points in (0,1)x(0,2pi)
basis : {"rpz", "xyz"}
basis vectors to use for normal vector representation
Returns
-------
N : ndarray, shape(k,3)
normal vector to surface in specified coordinates
"""
assert basis.lower() in ["rpz", "xyz"]
R_transform, Z_transform = self._get_transforms(grid)
phi = R_transform.grid.nodes[:, -1]
# normal vector is a constant 1*phihat
N = jnp.array([jnp.zeros_like(phi), jnp.ones_like(phi), jnp.zeros_like(phi)]).T
if basis.lower() == "xyz":
N = rpz2xyz_vec(N, phi=phi)
return N
def compute_surface_area(self, R_lmn=None, Z_lmn=None, grid=None):
"""Compute surface area via quadrature.
Parameters
----------
R_lmn, Z_lmn: array-like
zernike coefficients for R, Z. Defaults to self.R_lmn, self.Z_lmn
grid : Grid or array-like
toroidal coordinates to compute at. Defaults to self.grid
If an integer, assumes that many linearly spaced points in (0,1)x(0,2pi)
Returns
-------
area : float
surface area
"""
if R_lmn is None:
R_lmn = self.R_lmn
if Z_lmn is None:
Z_lmn = self.Z_lmn
R_transform, Z_transform = self._get_transforms(grid)
r_r = self.compute_coordinates(R_lmn, Z_lmn, grid, dr=1)
r_t = self.compute_coordinates(R_lmn, Z_lmn, grid, dt=1)
N = jnp.cross(r_r, r_t, axis=1)
return jnp.sum(R_transform.grid.weights * jnp.linalg.norm(N, axis=1)) / (
2 * np.pi
)
def get_constraint(self, R_basis, Z_basis, L_basis):
        """Get the linear constraint to enforce this surface as a boundary condition."""
return PoincareConstraint(
R_basis,
Z_basis,
L_basis,
self.R_basis,
self.Z_basis,
self.R_lmn,
self.Z_lmn,
)
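`ZernikeRZToroidalSection.compute_surface_area` divides the quadrature by `2*pi` because the toroidal direction is trivial for a cross-section: what remains is the Jacobian `|dR/drho * dZ/dtheta - dR/dtheta * dZ/drho|` integrated over the unit disc. A numpy-only sketch (not calling DESC itself) for the class's default section, `R = 10 + rho*cos(theta)`, `Z = rho*sin(theta)`:

```python
import numpy as np


def section_area(n=200):
    # midpoint rule in rho, uniform periodic grid in theta
    rho = (np.arange(n) + 0.5) / n
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    Rg, Tg = np.meshgrid(rho, theta, indexing="ij")
    # analytic partial derivatives of (R, Z) wrt (rho, theta)
    dR_drho, dR_dtheta = np.cos(Tg), -Rg * np.sin(Tg)
    dZ_drho, dZ_dtheta = np.sin(Tg), Rg * np.cos(Tg)
    # Jacobian determinant of the (rho, theta) -> (R, Z) map; here it is just rho
    jac = np.abs(dR_drho * dZ_dtheta - dR_dtheta * dZ_drho)
    return float(jac.sum() * (1.0 / n) * (2 * np.pi / n))
```

For this default section the map covers the unit disc once, so the area comes out to `pi`, independent of the major-radius offset of 10; the offset shifts `R` but does not change the Jacobian.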
# ==== award/admin.py | EmmanuelMuchiri/Awards | MIT ====

from django.contrib import admin
from .models import *
# Register your models here.
admin.site.register(Profile)
admin.site.register(Project)
admin.site.register(Rating)
# ==== hwrt/wsgi.py | MartinThoma/hwrt | MIT ====

"""
Use this for web servers.
gunicorn hwrt.wsgi:app
"""
# First party modules
from hwrt.serve import app # noqa
| 13.333333 | 34 | 0.683333 | 18 | 120 | 4.555556 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216667 | 120 | 8 | 35 | 15 | 0.87234 | 0.658333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
490892f74a19c96a1127f8da9edc1852d71b4d07 | 64 | py | Python | src/common/pytorch/layer/torchtools/lr_scheduler/__init__.py | wu-uw/OpenCompetition | 9aa9d7a50ada1deb653d295dd8a7fe46321b9094 | [
"Apache-2.0"
] | 15 | 2019-12-22T14:26:47.000Z | 2020-11-02T10:57:37.000Z | src/common/pytorch/layer/torchtools/lr_scheduler/__init__.py | GT-JLU/OpenCompetition | 5262fc5fa7efd7b483c1dc09cb7747dd75e37175 | [
"Apache-2.0"
] | 2 | 2020-02-03T07:10:11.000Z | 2020-02-11T16:38:56.000Z | src/common/pytorch/layer/torchtools/lr_scheduler/__init__.py | GT-JLU/OpenCompetition | 5262fc5fa7efd7b483c1dc09cb7747dd75e37175 | [
"Apache-2.0"
] | 12 | 2020-01-06T14:16:52.000Z | 2020-05-23T14:12:30.000Z | from .delayed import DelayerScheduler, DelayedCosineAnnealingLR
# ==== TWLight/users/migrations/0052_auto_20200312_1628.py | nicole331/TWLight | MIT ====

# -*- coding: utf-8 -*-
# Generated by Django 1.11.29 on 2020-03-12 16:28
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [("users", "0051_userprofile_proxy_notification_sent")]
operations = [
migrations.AddField(
model_name="editor",
name="ignore_wp_blocks",
field=models.BooleanField(
default=False,
help_text="Ignore the 'not currently blocked' criterion for access?",
),
),
migrations.AddField(
model_name="editor",
name="wp_account_old_enough",
field=models.BooleanField(
default=False,
editable=False,
help_text="At their last login, did this user meet the account age criterion in the terms of use?",
),
),
migrations.AddField(
model_name="editor",
name="wp_bundle_eligible",
field=models.BooleanField(
default=False,
editable=False,
help_text="At their last login, did this user meet the criteria for access to the library card bundle?",
),
),
migrations.AddField(
model_name="editor",
name="wp_editcount_prev",
field=models.IntegerField(
blank=True,
default=0,
editable=False,
help_text="Previous Wikipedia edit count",
null=True,
),
),
migrations.AddField(
model_name="editor",
name="wp_editcount_prev_updated",
field=models.DateTimeField(
blank=True,
default=None,
editable=False,
help_text="When the previous editcount was last updated from Wikipedia",
null=True,
),
),
migrations.AddField(
model_name="editor",
name="wp_editcount_recent",
field=models.IntegerField(
blank=True,
default=0,
editable=False,
help_text="Recent Wikipedia edit count",
null=True,
),
),
migrations.AddField(
model_name="editor",
name="wp_editcount_updated",
field=models.DateTimeField(
blank=True,
default=None,
editable=False,
help_text="When the editcount was updated from Wikipedia",
null=True,
),
),
migrations.AddField(
model_name="editor",
name="wp_enough_edits",
field=models.BooleanField(
default=False,
editable=False,
help_text="At their last login, did this user meet the total editcount criterion in the terms of use?",
),
),
migrations.AddField(
model_name="editor",
name="wp_enough_recent_edits",
field=models.BooleanField(
default=False,
editable=False,
help_text="At their last login, did this user meet the recent editcount criterion in the terms of use?",
),
),
migrations.AddField(
model_name="editor",
name="wp_not_blocked",
field=models.BooleanField(
default=False,
editable=False,
help_text="At their last login, did this user meet the 'not currently blocked' criterion in the terms of use?",
),
),
migrations.AlterField(
model_name="editor",
name="wp_valid",
field=models.BooleanField(
default=False,
editable=False,
help_text="At their last login, did this user meet the criteria in the terms of use?",
),
),
]
# ==== custom_components/tasmota_irhvac/__init__.py | shlomifgm/HomeAssistant | MIT ====
] | null | null | null | """The Tasmota Irhvac climate component."""
# ==== rotatescreen/__init__.py | TheBrokenEstate/rotate-screen | MIT ====

from .display import *
| 11.5 | 22 | 0.73913 | 3 | 23 | 5.666667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 23 | 1 | 23 | 23 | 0.894737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
495be864f2db614c3fbecd116f17ca62196d64de | 61 | py | Python | realtime_hand_3d/segmentation/dataset/__init__.py | NeelayS/realtime_hand | 219c772b9b7df60c390edac7da23f9cdddebca4d | [
"MIT"
] | null | null | null | realtime_hand_3d/segmentation/dataset/__init__.py | NeelayS/realtime_hand | 219c772b9b7df60c390edac7da23f9cdddebca4d | [
"MIT"
] | null | null | null | realtime_hand_3d/segmentation/dataset/__init__.py | NeelayS/realtime_hand | 219c772b9b7df60c390edac7da23f9cdddebca4d | [
"MIT"
] | null | null | null | from .ego2hands import Ego2HandsDataset
from .utils import *
| 20.333333 | 39 | 0.819672 | 7 | 61 | 7.142857 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037736 | 0.131148 | 61 | 2 | 40 | 30.5 | 0.90566 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
b8e2f4c39e69f34482e2b50f87a97834cd37bd79 | 360 | py | Python | App/Apis/limitdata.py | BrandonZhangGithub/flask | d439a95e28c231b690bdc8cd533b4b70e092bdfa | [
"BSD-3-Clause"
] | null | null | null | App/Apis/limitdata.py | BrandonZhangGithub/flask | d439a95e28c231b690bdc8cd533b4b70e092bdfa | [
"BSD-3-Clause"
] | null | null | null | App/Apis/limitdata.py | BrandonZhangGithub/flask | d439a95e28c231b690bdc8cd533b4b70e092bdfa | [
"BSD-3-Clause"
] | null | null | null | from flask_restful import Resource
from App.models import User,Order
from App import db
from flask import jsonify
#RESTful必须使用类的方式创建接口
#RESTful独立设置路由
class Ldata(Resource):
def get(self):
return jsonify({'code':200,'content':'get接口'})
def post(self):
return jsonify({'code': 200, 'content': 'post接口'})
#flask-bootstrap 前端框架 css js
# | 22.5 | 58 | 0.708333 | 47 | 360 | 5.404255 | 0.617021 | 0.070866 | 0.133858 | 0.165354 | 0.244094 | 0.244094 | 0 | 0 | 0 | 0 | 0 | 0.020202 | 0.175 | 360 | 16 | 59 | 22.5 | 0.835017 | 0.166667 | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.444444 | 0.222222 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 5 |
b8f3e2b120629be3033b83e29db2ced7530828f0 | 382 | py | Python | object-oriented-programming/src/oop-mro.py | giserh/book-python | ebd4e70cea1dd56986aa8efbae3629ba3f1ba087 | [
"MIT"
] | 1 | 2019-01-02T15:04:08.000Z | 2019-01-02T15:04:08.000Z | object-oriented-programming/src/oop-mro.py | giserh/book-python | ebd4e70cea1dd56986aa8efbae3629ba3f1ba087 | [
"MIT"
] | null | null | null | object-oriented-programming/src/oop-mro.py | giserh/book-python | ebd4e70cea1dd56986aa8efbae3629ba3f1ba087 | [
"MIT"
] | null | null | null | from pprint import pprint
class A:
def wyswietl(self):
print('a')
class B:
def wyswietl(self):
print('b')
class C:
def wyswietl(self):
print('c')
class D(A, B, C):
pass
d = D().wyswietl() # a
pprint(D.__mro__)
# (<class '__main__.D'>,
# <class '__main__.A'>,
# <class '__main__.B'>,
# <class '__main__.C'>,
# <class 'object'>) | 12.733333 | 25 | 0.544503 | 52 | 382 | 3.615385 | 0.307692 | 0.191489 | 0.239362 | 0.319149 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.26178 | 382 | 30 | 26 | 12.733333 | 0.666667 | 0.293194 | 0 | 0.214286 | 0 | 0 | 0.011364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0.071429 | 0.071429 | 0 | 0.571429 | 0.357143 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
7750a10be6b762292a35df739ae8b941f7866153 | 73 | py | Python | Exercicios/001 Deixando tudo pronto.py | isachlopes/PythonCursoEmVideo | b3814c408b13ff404bc73c8cac1367f66cd83792 | [
"MIT"
] | 1 | 2020-04-12T20:55:21.000Z | 2020-04-12T20:55:21.000Z | Exercicios/001 Deixando tudo pronto.py | isachlopes/PythonCursoEmVideo | b3814c408b13ff404bc73c8cac1367f66cd83792 | [
"MIT"
] | null | null | null | Exercicios/001 Deixando tudo pronto.py | isachlopes/PythonCursoEmVideo | b3814c408b13ff404bc73c8cac1367f66cd83792 | [
"MIT"
] | null | null | null | #crie um programa que escreva "Olá Mundo!" na tela.
print('Olá, Mundo!')
| 24.333333 | 51 | 0.69863 | 12 | 73 | 4.25 | 0.833333 | 0.313725 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150685 | 73 | 2 | 52 | 36.5 | 0.822581 | 0.684932 | 0 | 0 | 0 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
91f7e161560bea8e0a98f98da4253e50777e365c | 82 | py | Python | app/model.py | nhockcuncon77/underdog-devs-ds-a | 72bbdb1b717f307207e8360a0c4cf115d908625c | [
"MIT"
] | null | null | null | app/model.py | nhockcuncon77/underdog-devs-ds-a | 72bbdb1b717f307207e8360a0c4cf115d908625c | [
"MIT"
] | null | null | null | app/model.py | nhockcuncon77/underdog-devs-ds-a | 72bbdb1b717f307207e8360a0c4cf115d908625c | [
"MIT"
] | null | null | null | """
Labs DS Machine Learning Engineer Role
- Machine Learning Model Interface
"""
| 16.4 | 38 | 0.756098 | 10 | 82 | 6.2 | 0.8 | 0.483871 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158537 | 82 | 4 | 39 | 20.5 | 0.898551 | 0.890244 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
6279172ea30cc722a35e1351e7f45523afbc4cf8 | 90 | py | Python | discum/__init__.py | piotrostr/disco_py | 0b693d801785d1f54d46faeed7e98373ef2fb890 | [
"MIT"
] | null | null | null | discum/__init__.py | piotrostr/disco_py | 0b693d801785d1f54d46faeed7e98373ef2fb890 | [
"MIT"
] | null | null | null | discum/__init__.py | piotrostr/disco_py | 0b693d801785d1f54d46faeed7e98373ef2fb890 | [
"MIT"
] | null | null | null | from .__version__ import __version__
from .discum import Client
from .plug import Plug
| 22.5 | 37 | 0.8 | 12 | 90 | 5.333333 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 90 | 3 | 38 | 30 | 0.853333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
654e9ac46ca79244f38a7808933adb223a17f5b5 | 147,325 | py | Python | com/vmware/appliance_client.py | adammillerio/vsphere-automation-sdk-python | c07e1be98615201139b26c28db3aa584c4254b66 | [
"MIT"
] | null | null | null | com/vmware/appliance_client.py | adammillerio/vsphere-automation-sdk-python | c07e1be98615201139b26c28db3aa584c4254b66 | [
"MIT"
] | null | null | null | com/vmware/appliance_client.py | adammillerio/vsphere-automation-sdk-python | c07e1be98615201139b26c28db3aa584c4254b66 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
#---------------------------------------------------------------------------
# Copyright 2020 VMware, Inc. All rights reserved.
# AUTO GENERATED FILE -- DO NOT MODIFY!
#
# vAPI stub file for package com.vmware.appliance.
#---------------------------------------------------------------------------
"""
The ``com.vmware.appliance_client`` module provides classes for managing
vCenter Appliance configuration. The module is available starting in vSphere
6.7.
"""
__author__ = 'VMware, Inc.'
__docformat__ = 'restructuredtext en'
import sys
from com.vmware.cis_client import Tasks
from vmware.vapi.stdlib.client.task import Task
from vmware.vapi.bindings import type
from vmware.vapi.bindings.converter import TypeConverter
from vmware.vapi.bindings.enum import Enum
from vmware.vapi.bindings.error import VapiError
from vmware.vapi.bindings.struct import VapiStruct
from vmware.vapi.bindings.stub import (
ApiInterfaceStub, StubFactoryBase, VapiInterface)
from vmware.vapi.bindings.common import raise_core_exception
from vmware.vapi.data.validator import (UnionValidator, HasFieldsOfValidator)
from vmware.vapi.exception import CoreException
from vmware.vapi.lib.constants import TaskType
from vmware.vapi.lib.rest import OperationRestMetadata
class Notification(VapiStruct):
"""
The ``Notification`` class describes a notification that can be reported by
the appliance task. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
id=None,
time=None,
message=None,
resolution=None,
):
"""
:type id: :class:`str`
:param id: The notification id. This attribute was added in vSphere API 6.7.
:type time: :class:`datetime.datetime` or ``None``
:param time: The time the notification was raised/found. This attribute was
added in vSphere API 6.7.
Only if the time information is available.
:type message: :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param message: The notification message. This attribute was added in vSphere API
6.7.
:type resolution: :class:`com.vmware.vapi.std_client.LocalizableMessage` or ``None``
:param resolution: The resolution message, if any. This attribute was added in vSphere
API 6.7.
Only :class:`set` for warnings and errors.
"""
self.id = id
self.time = time
self.message = message
self.resolution = resolution
VapiStruct.__init__(self)
Notification._set_binding_type(type.StructType(
'com.vmware.appliance.notification', {
'id': type.StringType(),
'time': type.OptionalType(type.DateTimeType()),
'message': type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage'),
'resolution': type.OptionalType(type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage')),
},
Notification,
False,
None))
class Notifications(VapiStruct):
"""
The ``Notifications`` class contains info/warning/error messages that can
be reported by the appliance task. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
info=None,
warnings=None,
errors=None,
):
"""
:type info: :class:`list` of :class:`Notification` or ``None``
:param info: Info notification messages reported. This attribute was added in
vSphere API 6.7.
Only :class:`set` if an info was reported by the appliance task.
:type warnings: :class:`list` of :class:`Notification` or ``None``
:param warnings: Warning notification messages reported. This attribute was added in
vSphere API 6.7.
Only :class:`set` if an warning was reported by the appliance task.
:type errors: :class:`list` of :class:`Notification` or ``None``
:param errors: Error notification messages reported. This attribute was added in
vSphere API 6.7.
Only :class:`set` if an error was reported by the appliance task.
"""
self.info = info
self.warnings = warnings
self.errors = errors
VapiStruct.__init__(self)
Notifications._set_binding_type(type.StructType(
'com.vmware.appliance.notifications', {
'info': type.OptionalType(type.ListType(type.ReferenceType(__name__, 'Notification'))),
'warnings': type.OptionalType(type.ListType(type.ReferenceType(__name__, 'Notification'))),
'errors': type.OptionalType(type.ListType(type.ReferenceType(__name__, 'Notification'))),
},
Notifications,
False,
None))
class SubtaskInfo(VapiStruct):
"""
The ``SubtaskInfo`` class contains information about one of the subtasks
that makes up an appliance task. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'status',
{
'RUNNING' : [('progress', True), ('result', False), ('start_time', True)],
'BLOCKED' : [('progress', True), ('result', False), ('start_time', True)],
'SUCCEEDED' : [('progress', True), ('result', False), ('start_time', True), ('end_time', True)],
'FAILED' : [('progress', True), ('result', False), ('error', False), ('start_time', True), ('end_time', True)],
'PENDING' : [],
}
),
]
def __init__(self,
progress=None,
result=None,
description=None,
service=None,
operation=None,
parent=None,
target=None,
status=None,
cancelable=None,
error=None,
start_time=None,
end_time=None,
user=None,
):
"""
:type progress: :class:`com.vmware.cis.task_client.Progress`
:param progress: Progress of the operation. This attribute was added in vSphere API
6.7.
This attribute is optional and it is only relevant when the value
of ``#status`` is one of
:attr:`com.vmware.cis.task_client.Status.RUNNING`,
:attr:`com.vmware.cis.task_client.Status.BLOCKED`,
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`, or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type result: :class:`Notifications` or ``None``
:param result: Result of the operation. If an operation reports partial results
before it completes, this attribute could be :class:`set` before
the operation status has the value
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`. The value
could change as the operation progresses. This attribute was added
in vSphere API 6.7.
This attribute will be None if result is not available at the
current step of the operation.
:type description: :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param description: Description of the operation associated with the task.
:type service: :class:`str`
:param service: Identifier of the service containing the operation.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.service``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.service``.
:type operation: :class:`str`
:param operation: Identifier of the operation associated with the task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.operation``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.operation``.
:type parent: :class:`str` or ``None``
:param parent: Parent of the current task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.cis.task``. When methods return a value of this class
as a return value, the attribute will be an identifier for the
resource type: ``com.vmware.cis.task``.
This attribute will be None if the task has no parent.
:type target: :class:`com.vmware.vapi.std_client.DynamicID` or ``None``
:param target: Identifier of the target created by the operation or an existing
one the operation performed on.
This attribute will be None if the operation has no target or
multiple targets.
:type status: :class:`com.vmware.cis.task_client.Status`
:param status: Status of the operation associated with the task.
:type cancelable: :class:`bool`
:param cancelable: Flag to indicate whether or not the operation can be cancelled. The
value may change as the operation progresses.
:type error: :class:`Exception` or ``None``
:param error: Description of the error if the operation status is "FAILED".
If None the description of why the operation failed will be
included in the result of the operation (see
:attr:`com.vmware.cis.task_client.Info.result`).
:type start_time: :class:`datetime.datetime`
:param start_time: Time when the operation is started.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.RUNNING`,
:attr:`com.vmware.cis.task_client.Status.BLOCKED`,
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`, or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type end_time: :class:`datetime.datetime`
:param end_time: Time when the operation is completed.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED` or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type user: :class:`str` or ``None``
:param user: Name of the user who performed the operation.
This attribute will be None if the operation is performed by the
system.
"""
self.progress = progress
self.result = result
self.description = description
self.service = service
self.operation = operation
self.parent = parent
self.target = target
self.status = status
self.cancelable = cancelable
self.error = error
self.start_time = start_time
self.end_time = end_time
self.user = user
VapiStruct.__init__(self)
SubtaskInfo._set_binding_type(type.StructType(
'com.vmware.appliance.subtask_info', {
'progress': type.OptionalType(type.ReferenceType('com.vmware.cis.task_client', 'Progress')),
'result': type.OptionalType(type.ReferenceType(__name__, 'Notifications')),
'description': type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage'),
'service': type.IdType(resource_types='com.vmware.vapi.service'),
'operation': type.IdType(resource_types='com.vmware.vapi.operation'),
'parent': type.OptionalType(type.IdType()),
'target': type.OptionalType(type.ReferenceType('com.vmware.vapi.std_client', 'DynamicID')),
'status': type.ReferenceType('com.vmware.cis.task_client', 'Status'),
'cancelable': type.BooleanType(),
'error': type.OptionalType(type.AnyErrorType()),
'start_time': type.OptionalType(type.DateTimeType()),
'end_time': type.OptionalType(type.DateTimeType()),
'user': type.OptionalType(type.StringType()),
},
SubtaskInfo,
False,
None))
class TaskInfo(VapiStruct):
"""
The ``TaskInfo`` class contains information about an appliance task and the
subtasks of which it consists. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'status',
{
'RUNNING' : [('progress', True), ('start_time', True)],
'BLOCKED' : [('progress', True), ('start_time', True)],
'SUCCEEDED' : [('progress', True), ('start_time', True), ('end_time', True)],
'FAILED' : [('progress', True), ('error', False), ('start_time', True), ('end_time', True)],
'PENDING' : [],
}
),
]
def __init__(self,
progress=None,
subtask_order=None,
subtasks=None,
description=None,
service=None,
operation=None,
parent=None,
target=None,
status=None,
cancelable=None,
error=None,
start_time=None,
end_time=None,
user=None,
):
"""
:type progress: :class:`com.vmware.cis.task_client.Progress`
:param progress: Progress of the task. This attribute was added in vSphere API 6.7.
This attribute is optional and it is only relevant when the value
of ``#status`` is one of
:attr:`com.vmware.cis.task_client.Status.RUNNING`,
:attr:`com.vmware.cis.task_client.Status.BLOCKED`,
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`, or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type subtask_order: :class:`list` of :class:`str`
:param subtask_order: List of tasks that make up this appliance task in the order they
are being run. This attribute was added in vSphere API 6.7.
:type subtasks: :class:`dict` of :class:`str` and :class:`SubtaskInfo`
:param subtasks: Information about the subtasks that this appliance task consists
of. This attribute was added in vSphere API 6.7.
:type description: :class:`com.vmware.vapi.std_client.LocalizableMessage`
:param description: Description of the operation associated with the task.
:type service: :class:`str`
:param service: Identifier of the service containing the operation.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.service``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.service``.
:type operation: :class:`str`
:param operation: Identifier of the operation associated with the task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.vapi.operation``. When methods return a value of this
class as a return value, the attribute will be an identifier for
the resource type: ``com.vmware.vapi.operation``.
:type parent: :class:`str` or ``None``
:param parent: Parent of the current task.
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.cis.task``. When methods return a value of this class
as a return value, the attribute will be an identifier for the
resource type: ``com.vmware.cis.task``.
This attribute will be None if the task has no parent.
:type target: :class:`com.vmware.vapi.std_client.DynamicID` or ``None``
:param target: Identifier of the target created by the operation or an existing
one the operation performed on.
This attribute will be None if the operation has no target or
multiple targets.
:type status: :class:`com.vmware.cis.task_client.Status`
:param status: Status of the operation associated with the task.
:type cancelable: :class:`bool`
:param cancelable: Flag to indicate whether or not the operation can be cancelled. The
value may change as the operation progresses.
:type error: :class:`Exception` or ``None``
:param error: Description of the error if the operation status is "FAILED".
If None the description of why the operation failed will be
included in the result of the operation (see
:attr:`com.vmware.cis.task_client.Info.result`).
:type start_time: :class:`datetime.datetime`
:param start_time: Time when the operation is started.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.RUNNING`,
:attr:`com.vmware.cis.task_client.Status.BLOCKED`,
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED`, or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type end_time: :class:`datetime.datetime`
:param end_time: Time when the operation is completed.
This attribute is optional and it is only relevant when the value
of ``status`` is one of
:attr:`com.vmware.cis.task_client.Status.SUCCEEDED` or
:attr:`com.vmware.cis.task_client.Status.FAILED`.
:type user: :class:`str` or ``None``
:param user: Name of the user who performed the operation.
This attribute will be None if the operation is performed by the
system.
"""
self.progress = progress
self.subtask_order = subtask_order
self.subtasks = subtasks
self.description = description
self.service = service
self.operation = operation
self.parent = parent
self.target = target
self.status = status
self.cancelable = cancelable
self.error = error
self.start_time = start_time
self.end_time = end_time
self.user = user
VapiStruct.__init__(self)
TaskInfo._set_binding_type(type.StructType(
'com.vmware.appliance.task_info', {
'progress': type.OptionalType(type.ReferenceType('com.vmware.cis.task_client', 'Progress')),
'subtask_order': type.ListType(type.StringType()),
'subtasks': type.MapType(type.StringType(), type.ReferenceType(__name__, 'SubtaskInfo')),
'description': type.ReferenceType('com.vmware.vapi.std_client', 'LocalizableMessage'),
'service': type.IdType(resource_types='com.vmware.vapi.service'),
'operation': type.IdType(resource_types='com.vmware.vapi.operation'),
'parent': type.OptionalType(type.IdType()),
'target': type.OptionalType(type.ReferenceType('com.vmware.vapi.std_client', 'DynamicID')),
'status': type.ReferenceType('com.vmware.cis.task_client', 'Status'),
'cancelable': type.BooleanType(),
'error': type.OptionalType(type.AnyErrorType()),
'start_time': type.OptionalType(type.DateTimeType()),
'end_time': type.OptionalType(type.DateTimeType()),
'user': type.OptionalType(type.StringType()),
},
TaskInfo,
False,
None))
class Health(VapiInterface):
"""
The ``Health`` class provides methods to retrieve the appliance health
information. This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.health'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _HealthStub)
self._VAPI_OPERATION_IDS = {}
def messages(self,
item,
):
"""
Get health messages. This method was added in vSphere API 6.7.
:type item: :class:`str`
:param item: ID of the data item
The parameter must be an identifier for the resource type:
``com.vmware.appliance.health``.
:rtype: :class:`list` of :class:`Notification`
:return: List of the health messages
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
Unknown health item
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('messages',
{
'item': item,
})
class LocalAccounts(VapiInterface):
"""
The ``LocalAccounts`` class provides methods to manage local user account.
This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.local_accounts'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _LocalAccountsStub)
self._VAPI_OPERATION_IDS = {}
class Info(VapiStruct):
"""
The ``LocalAccounts.Info`` class defines the local account properties. This
class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
fullname=None,
email=None,
roles=None,
enabled=None,
has_password=None,
last_password_change=None,
password_expires_at=None,
inactive_at=None,
min_days_between_password_change=None,
max_days_between_password_change=None,
warn_days_before_password_expiration=None,
):
"""
:type fullname: :class:`str` or ``None``
:param fullname: Full name of the user. This attribute was added in vSphere API 6.7.
If None, the value was never set.
:type email: :class:`str` or ``None``
:param email: Email address of the local account. This attribute was added in
vSphere API 6.7.
If None, the value was never set.
:type roles: :class:`list` of :class:`str`
:param roles: User roles. This attribute was added in vSphere API 6.7.
When clients pass a value of this class as a parameter, the
attribute must contain identifiers for the resource type:
``com.vmware.appliance.roles``. When methods return a value of this
class as a return value, the attribute will contain identifiers for
the resource type: ``com.vmware.appliance.roles``.
:type enabled: :class:`bool`
:param enabled: Flag indicating if the account is enabled. This attribute was added
in vSphere API 6.7.
:type has_password: :class:`bool`
:param has_password: Is the user password set. This attribute was added in vSphere API
6.7.
:type last_password_change: :class:`datetime.datetime` or ``None``
:param last_password_change: Date and time password was changed. This attribute was added in
vSphere API 6.7.
If None, the password was never set.
:type password_expires_at: :class:`datetime.datetime` or ``None``
:param password_expires_at: Date when the account's password will expire. This attribute was
added in vSphere API 6.7.
If None, the password never expires.
:type inactive_at: :class:`datetime.datetime` or ``None``
:param inactive_at: Date and time account will be locked after password expiration.
This attribute was added in vSphere API 6.7.
If None, account will not be locked.
:type min_days_between_password_change: :class:`long` or ``None``
:param min_days_between_password_change: Minimum number of days between password change. This attribute was
added in vSphere API 6.7.
If None, the password can be changed at any time.
:type max_days_between_password_change: :class:`long` or ``None``
:param max_days_between_password_change: Maximum number of days between password change. This attribute was
added in vSphere API 6.7.
If None, password never expires.
:type warn_days_before_password_expiration: :class:`long` or ``None``
:param warn_days_before_password_expiration: Number of days of warning before password expires. This attribute
was added in vSphere API 6.7.
If None, a user is never warned.
"""
self.fullname = fullname
self.email = email
self.roles = roles
self.enabled = enabled
self.has_password = has_password
self.last_password_change = last_password_change
self.password_expires_at = password_expires_at
self.inactive_at = inactive_at
self.min_days_between_password_change = min_days_between_password_change
self.max_days_between_password_change = max_days_between_password_change
self.warn_days_before_password_expiration = warn_days_before_password_expiration
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.appliance.local_accounts.info', {
'fullname': type.OptionalType(type.StringType()),
'email': type.OptionalType(type.StringType()),
'roles': type.ListType(type.IdType()),
'enabled': type.BooleanType(),
'has_password': type.BooleanType(),
'last_password_change': type.OptionalType(type.DateTimeType()),
'password_expires_at': type.OptionalType(type.DateTimeType()),
'inactive_at': type.OptionalType(type.DateTimeType()),
'min_days_between_password_change': type.OptionalType(type.IntegerType()),
'max_days_between_password_change': type.OptionalType(type.IntegerType()),
'warn_days_before_password_expiration': type.OptionalType(type.IntegerType()),
},
Info,
False,
None))
class Config(VapiStruct):
"""
The ``LocalAccounts.Config`` class defines the information required for the
account. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
password=None,
old_password=None,
full_name=None,
email=None,
roles=None,
enabled=None,
password_expires=None,
password_expires_at=None,
inactive_after_password_expiration=None,
days_after_password_expiration=None,
min_days_between_password_change=None,
max_days_between_password_change=None,
warn_days_before_password_expiration=None,
):
"""
:type password: :class:`str`
:param password: Password. This attribute was added in vSphere API 6.7.
:type old_password: :class:`str` or ``None``
:param old_password: Old password of the user (required in case of a password change,
not required if a superAdmin user changes the password of another
user). This attribute was added in vSphere API 6.7.
If None, user may not have password set.
:type full_name: :class:`str` or ``None``
:param full_name: Full name of the user. This attribute was added in vSphere API 6.7.
If None, user will have no fullname.
:type email: :class:`str` or ``None``
:param email: Email address of the local account. This attribute was added in
vSphere API 6.7.
If None, user will have no email.
:type roles: :class:`list` of :class:`str`
:param roles: User roles. This attribute was added in vSphere API 6.7.
When clients pass a value of this class as a parameter, the
attribute must contain identifiers for the resource type:
``com.vmware.appliance.roles``. When methods return a value of this
class as a return value, the attribute will contain identifiers for
the resource type: ``com.vmware.appliance.roles``.
:type enabled: :class:`bool` or ``None``
:param enabled: Flag indicating if the account is enabled. This attribute was added
in vSphere API 6.7.
If None, defaults to True
:type password_expires: :class:`bool` or ``None``
:param password_expires: Flag indicating if the account password expires. This attribute was
added in vSphere API 6.7.
If None, defaults to True.
:type password_expires_at: :class:`datetime.datetime` or ``None``
:param password_expires_at: Date when the account's password will expire. This attribute was
added in vSphere API 6.7.
If None, will be taken from system defaults (see
local-accounts/policy).
:type inactive_after_password_expiration: :class:`bool` or ``None``
:param inactive_after_password_expiration: Flag indicating if the account will be locked after password
expiration. This attribute was added in vSphere API 6.7.
If None, defaults to True.
:type days_after_password_expiration: :class:`long` or ``None``
:param days_after_password_expiration: Number of days after password expiration before the account will be
locked. This attribute was added in vSphere API 6.7.
If None, will be taken from system defaults (see
local-accounts/policy).
:type min_days_between_password_change: :class:`long` or ``None``
:param min_days_between_password_change: Minimum number of days between password change. This attribute was
added in vSphere API 6.7.
If None, will be taken from system defaults (see
local-accounts/policy).
:type max_days_between_password_change: :class:`long` or ``None``
:param max_days_between_password_change: Maximum number of days between password change. This attribute was
added in vSphere API 6.7.
If None, will be taken from system defaults (see
local-accounts/policy).
:type warn_days_before_password_expiration: :class:`long` or ``None``
:param warn_days_before_password_expiration: Number of days of warning before password expires. This attribute
was added in vSphere API 6.7.
If None, will be taken from system defaults (see
local-accounts/policy).
"""
self.password = password
self.old_password = old_password
self.full_name = full_name
self.email = email
self.roles = roles
self.enabled = enabled
self.password_expires = password_expires
self.password_expires_at = password_expires_at
self.inactive_after_password_expiration = inactive_after_password_expiration
self.days_after_password_expiration = days_after_password_expiration
self.min_days_between_password_change = min_days_between_password_change
self.max_days_between_password_change = max_days_between_password_change
self.warn_days_before_password_expiration = warn_days_before_password_expiration
VapiStruct.__init__(self)
Config._set_binding_type(type.StructType(
'com.vmware.appliance.local_accounts.config', {
'password': type.SecretType(),
'old_password': type.OptionalType(type.SecretType()),
'full_name': type.OptionalType(type.StringType()),
'email': type.OptionalType(type.StringType()),
'roles': type.ListType(type.IdType()),
'enabled': type.OptionalType(type.BooleanType()),
'password_expires': type.OptionalType(type.BooleanType()),
'password_expires_at': type.OptionalType(type.DateTimeType()),
'inactive_after_password_expiration': type.OptionalType(type.BooleanType()),
'days_after_password_expiration': type.OptionalType(type.IntegerType()),
'min_days_between_password_change': type.OptionalType(type.IntegerType()),
'max_days_between_password_change': type.OptionalType(type.IntegerType()),
'warn_days_before_password_expiration': type.OptionalType(type.IntegerType()),
},
Config,
False,
None))
class UpdateConfig(VapiStruct):
"""
The ``LocalAccounts.UpdateConfig`` class defines the fields that might be
updated. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
password=None,
old_password=None,
full_name=None,
email=None,
roles=None,
enabled=None,
password_expires=None,
password_expires_at=None,
inactive_after_password_expiration=None,
days_after_password_expiration=None,
min_days_between_password_change=None,
max_days_between_password_change=None,
warn_days_before_password_expiration=None,
):
"""
:type password: :class:`str` or ``None``
:param password: Password. This attribute was added in vSphere API 6.7.
If None, value will not be changed
:type old_password: :class:`str` or ``None``
:param old_password: Old password of the user (required in case of a password change;
not required if a superAdmin user changes the password of another
user). This attribute was added in vSphere API 6.7.
If None, user may not have a password set.
:type full_name: :class:`str` or ``None``
:param full_name: Full name of the user. This attribute was added in vSphere API 6.7.
If None, value will not be changed
:type email: :class:`str` or ``None``
:param email: Email address of the local account. This attribute was added in
vSphere API 6.7.
If None, value will not be changed
:type roles: :class:`list` of :class:`str` or ``None``
:param roles: User roles. This attribute was added in vSphere API 6.7.
When clients pass a value of this class as a parameter, the
attribute must contain identifiers for the resource type:
``com.vmware.appliance.roles``. When methods return a value of this
class as a return value, the attribute will contain identifiers for
the resource type: ``com.vmware.appliance.roles``.
If None, value will not be changed
:type enabled: :class:`bool` or ``None``
:param enabled: Flag indicating if the account is enabled. This attribute was added
in vSphere API 6.7.
If None, value will not be changed
:type password_expires: :class:`bool` or ``None``
:param password_expires: Flag indicating if the account password expires. This attribute was
added in vSphere API 6.7.
If None, value will not be changed
:type password_expires_at: :class:`datetime.datetime` or ``None``
:param password_expires_at: Date when the account's password will expire. This attribute was
added in vSphere API 6.7.
If None, value will not be changed
:type inactive_after_password_expiration: :class:`bool` or ``None``
:param inactive_after_password_expiration: Flag indicating if the account will be locked after password
expiration. This attribute was added in vSphere API 6.7.
If None, value will not be changed
:type days_after_password_expiration: :class:`long` or ``None``
:param days_after_password_expiration: Number of days after password expiration before the account will be
locked. This attribute was added in vSphere API 6.7.
If None, value will not be changed
:type min_days_between_password_change: :class:`long` or ``None``
:param min_days_between_password_change: Minimum number of days between password change. This attribute was
added in vSphere API 6.7.
If None, value will not be changed
:type max_days_between_password_change: :class:`long` or ``None``
:param max_days_between_password_change: Maximum number of days between password change. This attribute was
added in vSphere API 6.7.
If None, value will not be changed
:type warn_days_before_password_expiration: :class:`long` or ``None``
:param warn_days_before_password_expiration: Number of days of warning before password expires. This attribute
was added in vSphere API 6.7.
If None, value will not be changed
"""
self.password = password
self.old_password = old_password
self.full_name = full_name
self.email = email
self.roles = roles
self.enabled = enabled
self.password_expires = password_expires
self.password_expires_at = password_expires_at
self.inactive_after_password_expiration = inactive_after_password_expiration
self.days_after_password_expiration = days_after_password_expiration
self.min_days_between_password_change = min_days_between_password_change
self.max_days_between_password_change = max_days_between_password_change
self.warn_days_before_password_expiration = warn_days_before_password_expiration
VapiStruct.__init__(self)
UpdateConfig._set_binding_type(type.StructType(
'com.vmware.appliance.local_accounts.update_config', {
'password': type.OptionalType(type.SecretType()),
'old_password': type.OptionalType(type.SecretType()),
'full_name': type.OptionalType(type.StringType()),
'email': type.OptionalType(type.StringType()),
'roles': type.OptionalType(type.ListType(type.IdType())),
'enabled': type.OptionalType(type.BooleanType()),
'password_expires': type.OptionalType(type.BooleanType()),
'password_expires_at': type.OptionalType(type.DateTimeType()),
'inactive_after_password_expiration': type.OptionalType(type.BooleanType()),
'days_after_password_expiration': type.OptionalType(type.IntegerType()),
'min_days_between_password_change': type.OptionalType(type.IntegerType()),
'max_days_between_password_change': type.OptionalType(type.IntegerType()),
'warn_days_before_password_expiration': type.OptionalType(type.IntegerType()),
},
UpdateConfig,
False,
None))
def get(self,
username,
):
"""
Get the local user account information. This method was added in
vSphere API 6.7.
:type username: :class:`str`
:param username: User login name
:rtype: :class:`LocalAccounts.Info`
:return: Local user account information
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
If the account is not found
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('get',
{
'username': username,
})
def list(self):
"""
Get a list of the local user accounts. This method was added in vSphere
API 6.7.
:rtype: :class:`list` of :class:`str`
:return: List of identifiers
The return value will contain identifiers for the resource type:
``com.vmware.appliance.local_accounts``.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('list', None)
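# Example (sketch): enumerating local accounts. ``stub_config`` is an assumed,
# already-authenticated vmware.vapi.bindings.stub.StubConfiguration:
#
#     accounts = LocalAccounts(stub_config)
#     for username in accounts.list():
#         info = accounts.get(username)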
def create(self,
username,
config,
):
"""
Create a new local user account. This method was added in vSphere API
6.7.
:type username: :class:`str`
:param username: User login name
The parameter must be an identifier for the resource type:
``com.vmware.appliance.local_accounts``.
:type config: :class:`LocalAccounts.Config`
:param config: User configuration
:raise: :class:`com.vmware.vapi.std.errors_client.AlreadyExists`
If an account already exists
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
If a username is invalid (username is validated against
[a-zA-Z0-9][a-zA-Z0-9\-\.\\\\@]\*[a-zA-Z0-9] pattern)
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('create',
{
'username': username,
'config': config,
})
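# Example (sketch): creating an account. ``password`` and ``roles`` are the
# only required Config fields; the role identifier shown is hypothetical and
# ``stub_config`` is an assumed authenticated stub configuration:
#
#     accounts = LocalAccounts(stub_config)
#     accounts.create('jdoe', LocalAccounts.Config(
#         password='...',
#         roles=['operator'],
#     ))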
def set(self,
username,
config,
):
"""
Set local user account properties. This method was added in vSphere API
6.7.
:type username: :class:`str`
:param username: User login name
The parameter must be an identifier for the resource type:
``com.vmware.appliance.local_accounts``.
:type config: :class:`LocalAccounts.Config`
:param config: User configuration
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
If the account is not found
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('set',
{
'username': username,
'config': config,
})
def update(self,
username,
config,
):
"""
Update selected fields in local user account properties. This method
was added in vSphere API 6.7.
:type username: :class:`str`
:param username: User login name
The parameter must be an identifier for the resource type:
``com.vmware.appliance.local_accounts``.
:type config: :class:`LocalAccounts.UpdateConfig`
:param config: User configuration
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
If the account is not found
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('update',
{
'username': username,
'config': config,
})
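# Example (sketch): a partial update. Unlike set(), only the attributes that
# are not None on UpdateConfig are changed; ``accounts`` is an assumed
# LocalAccounts instance:
#
#     accounts.update('jdoe', LocalAccounts.UpdateConfig(
#         email='jdoe@example.com',
#         max_days_between_password_change=90,
#     ))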
def delete(self,
username,
):
"""
Delete a local user account. This method was added in vSphere API 6.7.
:type username: :class:`str`
:param username: User login name
The parameter must be an identifier for the resource type:
``com.vmware.appliance.local_accounts``.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
If the account is not found
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('delete',
{
'username': username,
})
class Monitoring(VapiInterface):
"""
The ``Monitoring`` class provides methods to get and list monitoring data for
a requested item.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.monitoring'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _MonitoringStub)
self._VAPI_OPERATION_IDS = {}
class FunctionType(Enum):
"""
The ``Monitoring.FunctionType`` class defines the aggregation function.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
COUNT = None
"""
Aggregation takes count per period (sum)
"""
MAX = None
"""
Aggregation takes maximums per period
"""
AVG = None
"""
Aggregation takes average per period
"""
MIN = None
"""
Aggregation takes minimums per period
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`FunctionType` instance.
"""
Enum.__init__(string)
FunctionType._set_values([
FunctionType('COUNT'),
FunctionType('MAX'),
FunctionType('AVG'),
FunctionType('MIN'),
])
FunctionType._set_binding_type(type.EnumType(
'com.vmware.appliance.monitoring.function_type',
FunctionType))
class IntervalType(Enum):
"""
The ``Monitoring.IntervalType`` class defines the interval between values, in
hours and minutes, over which aggregation is applied.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
MINUTES30 = None
"""
Thirty minutes interval between values. One week is 336 values.
"""
HOURS2 = None
"""
Two hours interval between values. One month has 360 values.
"""
MINUTES5 = None
"""
Five minutes interval between values (finest). One day would have 288
values, one week is 2016.
"""
DAY1 = None
"""
24 hours interval between values. One year has 365 values.
"""
HOURS6 = None
"""
Six hour interval between values. One quarter is 360 values.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`IntervalType` instance.
"""
Enum.__init__(string)
IntervalType._set_values([
IntervalType('MINUTES30'),
IntervalType('HOURS2'),
IntervalType('MINUTES5'),
IntervalType('DAY1'),
IntervalType('HOURS6'),
])
IntervalType._set_binding_type(type.EnumType(
'com.vmware.appliance.monitoring.interval_type',
IntervalType))
class MonitoredItemData(VapiStruct):
"""
The ``Monitoring.MonitoredItemData`` class is a structure representing
monitored item data.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
name=None,
interval=None,
function=None,
start_time=None,
end_time=None,
data=None,
):
"""
:type name: :class:`str`
:param name: Monitored item ID, e.g. CPU, MEMORY, STORAGE_TOTAL
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.appliance.monitoring``. When methods return a value of
this class as a return value, the attribute will be an identifier
for the resource type: ``com.vmware.appliance.monitoring``.
:type interval: :class:`Monitoring.IntervalType`
:param interval: Interval between values, in hours or minutes
:type function: :class:`Monitoring.FunctionType`
:param function: Aggregation function
:type start_time: :class:`datetime.datetime`
:param start_time: Start time in UTC
:type end_time: :class:`datetime.datetime`
:param end_time: End time in UTC
:type data: :class:`list` of :class:`str`
:param data: list of values
"""
self.name = name
self.interval = interval
self.function = function
self.start_time = start_time
self.end_time = end_time
self.data = data
VapiStruct.__init__(self)
MonitoredItemData._set_binding_type(type.StructType(
'com.vmware.appliance.monitoring.monitored_item_data', {
'name': type.IdType(resource_types='com.vmware.appliance.monitoring'),
'interval': type.ReferenceType(__name__, 'Monitoring.IntervalType'),
'function': type.ReferenceType(__name__, 'Monitoring.FunctionType'),
'start_time': type.DateTimeType(),
'end_time': type.DateTimeType(),
'data': type.ListType(type.StringType()),
},
MonitoredItemData,
False,
None))
class MonitoredItemDataRequest(VapiStruct):
"""
The ``Monitoring.MonitoredItemDataRequest`` class is a structure representing
a request for monitored item data.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
names=None,
interval=None,
function=None,
start_time=None,
end_time=None,
):
"""
:type names: :class:`list` of :class:`str`
:param names: Monitored item IDs, e.g. CPU, MEMORY
When clients pass a value of this class as a parameter, the
attribute must contain identifiers for the resource type:
``com.vmware.appliance.monitoring``. When methods return a value of
this class as a return value, the attribute will contain
identifiers for the resource type:
``com.vmware.appliance.monitoring``.
:type interval: :class:`Monitoring.IntervalType`
:param interval: Interval between values, in hours or minutes
:type function: :class:`Monitoring.FunctionType`
:param function: Aggregation function
:type start_time: :class:`datetime.datetime`
:param start_time: Start time in UTC
:type end_time: :class:`datetime.datetime`
:param end_time: End time in UTC
"""
self.names = names
self.interval = interval
self.function = function
self.start_time = start_time
self.end_time = end_time
VapiStruct.__init__(self)
MonitoredItemDataRequest._set_binding_type(type.StructType(
'com.vmware.appliance.monitoring.monitored_item_data_request', {
'names': type.ListType(type.IdType()),
'interval': type.ReferenceType(__name__, 'Monitoring.IntervalType'),
'function': type.ReferenceType(__name__, 'Monitoring.FunctionType'),
'start_time': type.DateTimeType(),
'end_time': type.DateTimeType(),
},
MonitoredItemDataRequest,
False,
None))
class MonitoredItem(VapiStruct):
"""
The ``Monitoring.MonitoredItem`` class is a structure describing a monitored
item.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
id=None,
name=None,
units=None,
category=None,
instance=None,
description=None,
):
"""
:type id: :class:`str`
:param id: Monitored item ID, e.g. CPU, MEMORY
When clients pass a value of this class as a parameter, the
attribute must be an identifier for the resource type:
``com.vmware.appliance.monitoring``. When methods return a value of
this class as a return value, the attribute will be an identifier
for the resource type: ``com.vmware.appliance.monitoring``.
:type name: :class:`str`
:param name: Monitored item name, e.g. "Network write speed"
:type units: :class:`str`
:param units: Y-axis label, e.g. "Mbps", "%"
:type category: :class:`str`
:param category: Category, e.g. network, storage, etc.
:type instance: :class:`str`
:param instance: Instance name, e.g. eth0
:type description: :class:`str`
:param description: Monitored item description, e.g.
com.vmware.applmgmt.mon.descr.net.rx.packetRate.eth0
"""
self.id = id
self.name = name
self.units = units
self.category = category
self.instance = instance
self.description = description
VapiStruct.__init__(self)
MonitoredItem._set_binding_type(type.StructType(
'com.vmware.appliance.monitoring.monitored_item', {
'id': type.IdType(resource_types='com.vmware.appliance.monitoring'),
'name': type.StringType(),
'units': type.StringType(),
'category': type.StringType(),
'instance': type.StringType(),
'description': type.StringType(),
},
MonitoredItem,
False,
None))
def query(self,
item,
):
"""
Get monitoring data.
:type item: :class:`Monitoring.MonitoredItemDataRequest`
:param item: MonitoredItemDataRequest structure
:rtype: :class:`list` of :class:`Monitoring.MonitoredItemData`
:return: List of MonitoredItemData structures
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('query',
{
'item': item,
})
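# Example (sketch): querying one hour of 5-minute averaged samples. The item
# ID 'cpu.util' is hypothetical (discover real IDs with list()); ``stub_config``
# is an assumed authenticated stub configuration:
#
#     from datetime import datetime, timedelta
#     monitoring = Monitoring(stub_config)
#     request = Monitoring.MonitoredItemDataRequest(
#         names=['cpu.util'],
#         interval=Monitoring.IntervalType.MINUTES5,
#         function=Monitoring.FunctionType.AVG,
#         start_time=datetime.utcnow() - timedelta(hours=1),
#         end_time=datetime.utcnow(),
#     )
#     for item in monitoring.query(request):
#         print(item.name, item.data)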
def list(self):
"""
Get the list of monitored items.
:rtype: :class:`list` of :class:`Monitoring.MonitoredItem`
:return: List of monitored items
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('list', None)
def get(self,
stat_id,
):
"""
Get monitored item info.
:type stat_id: :class:`str`
:param stat_id: Statistic item ID
The parameter must be an identifier for the resource type:
``com.vmware.appliance.monitoring``.
:rtype: :class:`Monitoring.MonitoredItem`
:return: MonitoredItem structure
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('get',
{
'stat_id': stat_id,
})
class Networking(VapiInterface):
"""
The ``Networking`` class provides methods to get and update the network
configuration. This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.networking'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _NetworkingStub)
self._VAPI_OPERATION_IDS = {}
self._VAPI_OPERATION_IDS.update({'change_task': 'change$task'})
class DNSInfo(VapiStruct):
"""
The ``Networking.DNSInfo`` class contains information about the DNS
configuration of a virtual appliance. This class was added in vSphere API
6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
mode=None,
hostname=None,
servers=None,
):
"""
:type mode: :class:`Networking.DNSInfo.DNSMode`
:param mode: DNS mode. This attribute was added in vSphere API 6.7.
:type hostname: :class:`str`
:param hostname: Hostname. This attribute was added in vSphere API 6.7.
:type servers: :class:`list` of :class:`str`
:param servers: Servers. This attribute was added in vSphere API 6.7.
"""
self.mode = mode
self.hostname = hostname
self.servers = servers
VapiStruct.__init__(self)
class DNSMode(Enum):
"""
The ``Networking.DNSInfo.DNSMode`` class describes the source of DNS
servers. This enumeration was added in vSphere API 6.7.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
DHCP = None
"""
The DNS server addresses are obtained from a DHCP server. This class
attribute was added in vSphere API 6.7.
"""
STATIC = None
"""
The DNS server addresses are specified explicitly. This class attribute
was added in vSphere API 6.7.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`DNSMode` instance.
"""
Enum.__init__(string)
DNSMode._set_values([
DNSMode('DHCP'),
DNSMode('STATIC'),
])
DNSMode._set_binding_type(type.EnumType(
'com.vmware.appliance.networking.DNS_info.DNS_mode',
DNSMode))
DNSInfo._set_binding_type(type.StructType(
'com.vmware.appliance.networking.DNS_info', {
'mode': type.ReferenceType(__name__, 'Networking.DNSInfo.DNSMode'),
'hostname': type.StringType(),
'servers': type.ListType(type.StringType()),
},
DNSInfo,
False,
None))
class Info(VapiStruct):
"""
The ``Networking.Info`` class contains information about the network
configuration of a virtual appliance. This class was added in vSphere API
6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
dns=None,
interfaces=None,
):
"""
:type dns: :class:`Networking.DNSInfo`
:param dns: DNS configuration. This attribute was added in vSphere API 6.7.
:type interfaces: :class:`dict` of :class:`str` and :class:`com.vmware.appliance.networking_client.Interfaces.InterfaceInfo`
:param interfaces: Interface configuration as a key-value map where key is a network
interface name, for example, "nic0". This attribute was added in
vSphere API 6.7.
When clients pass a value of this class as a parameter, the key in
the attribute :class:`dict` must be an identifier for the resource
type: ``com.vmware.appliance.networking.interfaces``. When methods
return a value of this class as a return value, the key in the
attribute :class:`dict` will be an identifier for the resource
type: ``com.vmware.appliance.networking.interfaces``.
"""
self.dns = dns
self.interfaces = interfaces
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.appliance.networking.info', {
'dns': type.ReferenceType(__name__, 'Networking.DNSInfo'),
'interfaces': type.MapType(type.IdType(), type.ReferenceType('com.vmware.appliance.networking_client', 'Interfaces.InterfaceInfo')),
},
Info,
False,
None))
class UpdateSpec(VapiStruct):
"""
The ``Networking.UpdateSpec`` class describes whether to enable or disable
IPv6 on interfaces. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
ipv6_enabled=None,
):
"""
:type ipv6_enabled: :class:`bool` or ``None``
:param ipv6_enabled: IPv6 Enabled or not. This attribute was added in vSphere API 6.7.
If None, the current state of IPv6 is left unchanged.
"""
self.ipv6_enabled = ipv6_enabled
VapiStruct.__init__(self)
UpdateSpec._set_binding_type(type.StructType(
'com.vmware.appliance.networking.update_spec', {
'ipv6_enabled': type.OptionalType(type.BooleanType()),
},
UpdateSpec,
False,
None))
class ChangeSpec(VapiStruct):
"""
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_canonical_to_pep_names = {
'SSO_user': 'sso_user',
'SSO_password': 'sso_password',
}
def __init__(self,
hostname=None,
sso_user=None,
sso_password=None,
dns=None,
ipv4=None,
ipv6=None,
):
"""
:type hostname: :class:`str`
:param hostname: New hostname to assign to the management network of vCenter
appliance. This attribute was added in vSphere API 6.7.3.
:type sso_user: :class:`str`
:param sso_user: vCenter Server SSO administrator username. This attribute was added
in vSphere API 6.7.3.
:type sso_password: :class:`str`
:param sso_password: vCenter Server SSO administrator password. This attribute was added
in vSphere API 6.7.3.
:type dns: :class:`com.vmware.appliance.networking.dns_client.Servers.DNSServerConfig` or ``None``
:param dns: DNS Configuration to set for the machine. This attribute was added
in vSphere API 6.7.3.
If None, DNS settings will not be changed.
:type ipv4: :class:`com.vmware.appliance.networking.interfaces_client.Ipv4.Config` or ``None``
:param ipv4: IPv4 Configuration to set for the machine. This attribute was added
in vSphere API 6.7.3.
If None, IPv4 settings will not be changed.
:type ipv6: :class:`com.vmware.appliance.networking.interfaces_client.Ipv6.Config` or ``None``
:param ipv6: IPv6 Configuration to set for the machine. This attribute was added
in vSphere API 6.7.3.
If None, IPv6 settings will not be changed.
"""
self.hostname = hostname
self.sso_user = sso_user
self.sso_password = sso_password
self.dns = dns
self.ipv4 = ipv4
self.ipv6 = ipv6
VapiStruct.__init__(self)
ChangeSpec._set_binding_type(type.StructType(
'com.vmware.appliance.networking.change_spec', {
'hostname': type.StringType(),
'SSO_user': type.StringType(),
'SSO_password': type.SecretType(),
'dns': type.OptionalType(type.ReferenceType('com.vmware.appliance.networking.dns_client', 'Servers.DNSServerConfig')),
'ipv4': type.OptionalType(type.ReferenceType('com.vmware.appliance.networking.interfaces_client', 'Ipv4.Config')),
'ipv6': type.OptionalType(type.ReferenceType('com.vmware.appliance.networking.interfaces_client', 'Ipv6.Config')),
},
ChangeSpec,
False,
None))
def get(self):
"""
Get Networking information for all configured interfaces. This method
was added in vSphere API 6.7.
:rtype: :class:`Networking.Info`
:return: The Map of network configuration info for all interfaces.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error.
"""
return self._invoke('get', None)
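# Example (sketch): reading the appliance network configuration. ``stub_config``
# is an assumed authenticated stub configuration:
#
#     networking = Networking(stub_config)
#     info = networking.get()
#     print(info.dns.hostname, info.dns.mode)
#     for nic, nic_info in info.interfaces.items():
#         print(nic, nic_info)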
def update(self,
spec,
):
"""
Enable or disable IPv6 on all interfaces. This method was added in
vSphere API 6.7.
:type spec: :class:`Networking.UpdateSpec`
:param spec: Update spec with an optional boolean value
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error.
"""
return self._invoke('update',
{
'spec': spec,
})
def reset(self):
"""
Resets and restarts the network configuration on all interfaces; this
also renews the DHCP lease for any DHCP-assigned IP address. This method
was added in vSphere API 6.7.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error.
"""
return self._invoke('reset', None)
def change_task(self,
spec,
):
"""
Changes the Hostname/IP of the management network of vCenter appliance.
The Hostname/IP change invokes the PNID change process which involves
LDAP entry modification, updating registry entries, configuration files
modification, and network configuration changes. vCenter Server is
expected to be down for a few minutes during these changes. This method
was added in vSphere API 6.7.3.
:type spec: :class:`Networking.ChangeSpec`
:param spec: Information required to change the hostname.
:raise: :class:`com.vmware.vapi.std.errors_client.Unsupported`
if it's not embedded node
:raise: :class:`com.vmware.vapi.std.errors_client.InvalidArgument`
if passed arguments are invalid.
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthenticated`
if the user is not authenticated.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
if another change task is in progress
"""
task_id = self._invoke('change$task',
{
'spec': spec,
})
task_svc = Tasks(self._config)
task_instance = Task(task_id, task_svc, type.VoidType())
return task_instance
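# Example (sketch): starting a PNID change. The hostname and SSO values shown
# are placeholders; the returned Task must be polled for completion, and the
# exact task-client API to do so is assumed, not shown:
#
#     spec = Networking.ChangeSpec(
#         hostname='vc2.example.com',
#         sso_user='administrator@vsphere.local',
#         sso_password='...',
#     )
#     task = networking.change_task(spec)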
class Ntp(VapiInterface):
"""
The ``Ntp`` class provides methods to get the NTP configuration status and
test connections to NTP servers. This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.ntp'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _NtpStub)
self._VAPI_OPERATION_IDS = {}
class ServerStatus(Enum):
"""
The ``Ntp.ServerStatus`` class describes the status of a server during a
test. This enumeration was added in vSphere API 6.7.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
SERVER_REACHABLE = None
"""
Server is reachable. This class attribute was added in vSphere API 6.7.
"""
SERVER_UNREACHABLE = None
"""
Server is unreachable. This class attribute was added in vSphere API 6.7.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`ServerStatus` instance.
"""
Enum.__init__(string)
ServerStatus._set_values([
ServerStatus('SERVER_REACHABLE'),
ServerStatus('SERVER_UNREACHABLE'),
])
ServerStatus._set_binding_type(type.EnumType(
'com.vmware.appliance.ntp.server_status',
ServerStatus))
class LocalizableMessage(VapiStruct):
"""
The ``Ntp.LocalizableMessage`` class is a structure representing a
localizable message. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
id=None,
default_message=None,
args=None,
):
"""
:type id: :class:`str`
:param id: Identifier of the message in the message bundle. This attribute was added in vSphere API 6.7.
:type default_message: :class:`str`
:param default_message: Text of the message in English. This attribute was added in vSphere API 6.7.
:type args: :class:`list` of :class:`str`
:param args: Nested data arguments for the message. This attribute was added in vSphere API 6.7.
"""
self.id = id
self.default_message = default_message
self.args = args
VapiStruct.__init__(self)
LocalizableMessage._set_binding_type(type.StructType(
'com.vmware.appliance.ntp.localizable_message', {
'id': type.StringType(),
'default_message': type.StringType(),
'args': type.ListType(type.StringType()),
},
LocalizableMessage,
False,
None))
class TestRunStatus(VapiStruct):
"""
The ``Ntp.TestRunStatus`` class describes the status of a test run. This
class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
server=None,
status=None,
message=None,
):
"""
:type server: :class:`str`
:param server: Server name associated with the test run. This attribute was added
in vSphere API 6.7.
:type status: :class:`Ntp.ServerStatus`
:param status: Server status. This attribute was added in vSphere API 6.7.
:type message: :class:`Ntp.LocalizableMessage`
:param message: Message associated with status. This attribute was added in vSphere
API 6.7.
"""
self.server = server
self.status = status
self.message = message
VapiStruct.__init__(self)
TestRunStatus._set_binding_type(type.StructType(
'com.vmware.appliance.ntp.test_run_status', {
'server': type.StringType(),
'status': type.ReferenceType(__name__, 'Ntp.ServerStatus'),
'message': type.ReferenceType(__name__, 'Ntp.LocalizableMessage'),
},
TestRunStatus,
False,
None))
def test(self,
servers,
):
"""
Test the connection to a list of NTP servers. This method was added in
vSphere API 6.7.
:type servers: :class:`list` of :class:`str`
:param servers: List of host names or IP addresses of NTP servers.
:rtype: :class:`list` of :class:`Ntp.TestRunStatus`
:return: List of test run statuses.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('test',
{
'servers': servers,
})
def set(self,
servers,
):
"""
Set NTP servers. This method replaces the NTP servers currently in the
configuration with the given servers. If NTP-based time synchronization
is in use, the NTP daemon will be restarted to reload the new NTP
configuration; otherwise, this method only replaces the servers in the
NTP configuration. This method was added in vSphere API 6.7.
:type servers: :class:`list` of :class:`str`
:param servers: List of host names or IP addresses of NTP servers.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('set',
{
'servers': servers,
})
def get(self):
"""
Get the NTP configuration status. Running the 'timesync.get' command
retrieves the current time synchronization method (NTP- or VMware
Tools-based). The 'ntp' command always returns the NTP server
information, even when the time synchronization mode is not set to NTP;
in that case the NTP server status is reported as down. This method was
added in vSphere API 6.7.
:rtype: :class:`list` of :class:`str`
:return: List of NTP servers.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('get', None)
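The ``test``, ``set``, and ``get`` methods above are typically combined: test candidate servers, then keep only the reachable ones. A minimal, hypothetical sketch of acting on ``test`` results, using a namedtuple stand-in for ``Ntp.TestRunStatus`` so it runs without the SDK or a live appliance (a real call would be ``Ntp(stub_config).test(servers)``):

```python
from collections import namedtuple

# Stand-in for Ntp.TestRunStatus; status carries the ServerStatus value.
TestRunStatus = namedtuple('TestRunStatus', ['server', 'status', 'message'])

def unreachable_servers(results):
    """Return names of servers whose test status is not SERVER_REACHABLE."""
    return [r.server for r in results if r.status != 'SERVER_REACHABLE']

results = [
    TestRunStatus('0.pool.ntp.org', 'SERVER_REACHABLE', None),
    TestRunStatus('10.0.0.99', 'SERVER_UNREACHABLE', 'connection timed out'),
]
print(unreachable_servers(results))  # ['10.0.0.99']
```

With the real bindings, ``r.status`` would be compared against ``Ntp.ServerStatus.SERVER_REACHABLE`` rather than a plain string.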
class Recovery(VapiInterface):
"""
The ``Recovery`` class provides methods to invoke an appliance recovery
(backup and restore). This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.recovery'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _RecoveryStub)
self._VAPI_OPERATION_IDS = {}
class Info(VapiStruct):
"""
The ``Recovery.Info`` class contains the information about the appliance
recovery environment. This class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
supported=None,
):
"""
:type supported: :class:`bool`
:param supported: Is recovery supported in this appliance. This attribute was added
in vSphere API 6.7.
"""
self.supported = supported
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.appliance.recovery.info', {
'supported': type.BooleanType(),
},
Info,
False,
None))
def get(self):
"""
Gets the properties of the appliance Recovery subsystem. This method
was added in vSphere API 6.7.
:rtype: :class:`Recovery.Info`
:return: Structure containing the properties of the Recovery subsystem.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
if any error occurs during the execution of the operation.
"""
return self._invoke('get', None)
class Services(VapiInterface):
"""
The ``Services`` class provides methods to manage a single appliance
service or a set of appliance services. This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.services'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _ServicesStub)
self._VAPI_OPERATION_IDS = {}
class State(Enum):
"""
The ``Services.State`` class defines valid Run State for services. This
enumeration was added in vSphere API 6.7.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
STARTING = None
"""
Service Run State is Starting; it is not yet functional. This class
attribute was added in vSphere API 6.7.
"""
STOPPING = None
"""
Service Run State is Stopping; it is not functional. This class attribute
was added in vSphere API 6.7.
"""
STARTED = None
"""
Service Run State is Started; it is fully functional. This class attribute
was added in vSphere API 6.7.
"""
STOPPED = None
"""
Service Run State is Stopped. This class attribute was added in vSphere API
6.7.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`State` instance.
"""
Enum.__init__(self, string)
State._set_values([
State('STARTING'),
State('STOPPING'),
State('STARTED'),
State('STOPPED'),
])
State._set_binding_type(type.EnumType(
'com.vmware.appliance.services.state',
State))
class Info(VapiStruct):
"""
The ``Services.Info`` class contains information about a service. This
class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
description=None,
state=None,
):
"""
:type description: :class:`str`
:param description: Service description. This attribute was added in vSphere API 6.7.
:type state: :class:`Services.State`
:param state: Running State. This attribute was added in vSphere API 6.7.
"""
self.description = description
self.state = state
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.appliance.services.info', {
'description': type.StringType(),
'state': type.ReferenceType(__name__, 'Services.State'),
},
Info,
False,
None))
def start(self,
service,
):
"""
Starts a service. This method was added in vSphere API 6.7.
:type service: :class:`str`
:param service: identifier of the service to start
The parameter must be an identifier for the resource type:
``com.vmware.appliance.services``.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
if the service associated with ``service`` does not exist.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
if the operation is denied in the current state of the service. If
a stop or restart operation is in progress, the start operation
will not be allowed.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
if start operation is issued on a service which has startup type
null.
:raise: :class:`com.vmware.vapi.std.errors_client.TimedOut`
if any timeout occurs during the execution of the start operation.
Timeout occurs when the service takes longer than StartTimeout to
start.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
if any other error occurs during the execution of the operation.
"""
return self._invoke('start',
{
'service': service,
})
def stop(self,
service,
):
"""
Stops a service. This method was added in vSphere API 6.7.
:type service: :class:`str`
:param service: identifier of the service to stop
The parameter must be an identifier for the resource type:
``com.vmware.appliance.services``.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
if the service associated with ``service`` does not exist.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
if any other error occurs during the execution of the operation.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
if the operation is denied in the current state of the service. If
a stop operation is in progress, issuing another stop operation
will lead to this error.
"""
return self._invoke('stop',
{
'service': service,
})
def restart(self,
service,
):
"""
Restarts a service. This method was added in vSphere API 6.7.
:type service: :class:`str`
:param service: identifier of the service to restart
The parameter must be an identifier for the resource type:
``com.vmware.appliance.services``.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
if the service associated with ``service`` does not exist.
:raise: :class:`com.vmware.vapi.std.errors_client.TimedOut`
if any timeout occurs during the execution of the restart
operation.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
if the operation is denied in the current state of the service. If
a stop or start operation is in progress, issuing a restart
operation will lead to this error.
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
if a restart operation is issued on a service which has startup
type null
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
if any other error occurs during the execution of the operation.
"""
return self._invoke('restart',
{
'service': service,
})
def get(self,
service,
):
"""
Returns the state of a service. This method was added in vSphere API
6.7.
:type service: :class:`str`
:param service: identifier of the service whose state is being queried.
The parameter must be an identifier for the resource type:
``com.vmware.appliance.services``.
:rtype: :class:`Services.Info`
:return: Service Info structure.
:raise: :class:`com.vmware.vapi.std.errors_client.NotFound`
if the service associated with ``service`` does not exist.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
if any other error occurs during the execution of the operation.
"""
return self._invoke('get',
{
'service': service,
})
def list(self):
"""
Lists details of vCenter services. This method was added in vSphere API
6.7.
:rtype: :class:`dict` of :class:`str` and :class:`Services.Info`
:return: Map of service identifiers to service Info structures.
The key in the return value :class:`dict` will be an identifier for
the resource type: ``com.vmware.appliance.services``.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
if any error occurs during the execution of the operation.
"""
return self._invoke('list', None)
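Because ``start`` can return before the service is fully functional (``Services.State`` distinguishes STARTING from STARTED), callers often poll ``get`` until the state settles. A hypothetical polling helper, written against a caller-supplied ``get_state`` callable so the sketch runs without a live appliance (with the real bindings it would be ``lambda: services.get('vpxd').state``, where ``'vpxd'`` is just an illustrative service identifier):

```python
import time

def wait_until_started(get_state, timeout=60.0, interval=1.0, sleep=time.sleep):
    """Poll get_state() until STARTED (True), STOPPED (False), or timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state()
        if state == 'STARTED':
            return True
        if state == 'STOPPED':
            return False  # service went down; no point waiting further
        sleep(interval)
    raise TimeoutError('service did not reach STARTED in time')

# Simulated state sequence instead of a live service:
states = iter(['STARTING', 'STARTING', 'STARTED'])
print(wait_until_started(lambda: next(states), sleep=lambda _: None))  # True
```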
class Shutdown(VapiInterface):
"""
The ``Shutdown`` class provides methods to perform reboot and shutdown
operations on the appliance. This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.shutdown'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _ShutdownStub)
self._VAPI_OPERATION_IDS = {}
class ShutdownConfig(VapiStruct):
"""
The ``Shutdown.ShutdownConfig`` class is a structure describing the
shutdown configuration returned by the ``Shutdown.get`` operation. This class
was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
def __init__(self,
shutdown_time=None,
action=None,
reason=None,
):
"""
:type shutdown_time: :class:`datetime.datetime` or ``None``
:param shutdown_time: Time at which the pending shutdown is scheduled, or None if
no shutdown is pending. This attribute was added in vSphere API 6.7.
:type action: :class:`str`
:param action: The pending shutdown operation: 'poweroff', 'reboot', or an
empty string if no operation is pending. This attribute was
added in vSphere API 6.7.
:type reason: :class:`str`
:param reason: The reason behind the shutdown action. This attribute was added in
vSphere API 6.7.
"""
self.shutdown_time = shutdown_time
self.action = action
self.reason = reason
VapiStruct.__init__(self)
ShutdownConfig._set_binding_type(type.StructType(
'com.vmware.appliance.shutdown.shutdown_config', {
'shutdown_time': type.OptionalType(type.DateTimeType()),
'action': type.StringType(),
'reason': type.StringType(),
},
ShutdownConfig,
False,
None))
def cancel(self):
"""
Cancel pending shutdown action. This method was added in vSphere API
6.7.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('cancel', None)
def poweroff(self,
delay,
reason,
):
"""
Power off the appliance. This method was added in vSphere API 6.7.
:type delay: :class:`long`
:param delay: Minutes after which poweroff should start. If 0 is specified,
poweroff will start immediately.
:type reason: :class:`str`
:param reason: Reason for performing the poweroff.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('poweroff',
{
'delay': delay,
'reason': reason,
})
def reboot(self,
delay,
reason,
):
"""
Reboot the appliance. This method was added in vSphere API 6.7.
:type delay: :class:`long`
:param delay: Minutes after which reboot should start. If 0 is specified, reboot
will start immediately.
:type reason: :class:`str`
:param reason: Reason for performing the reboot.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('reboot',
{
'delay': delay,
'reason': reason,
})
def get(self):
"""
Get details about the pending shutdown action. This method was added in
vSphere API 6.7.
:rtype: :class:`Shutdown.ShutdownConfig`
:return: Configuration of pending shutdown action.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('get', None)
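A small, hypothetical helper for rendering the data returned by ``Shutdown.get`` (per the ``ShutdownConfig`` docstring above, an empty ``action`` string is taken here to mean no shutdown is pending):

```python
from datetime import datetime

def describe_shutdown(action, shutdown_time, reason):
    """Render a ShutdownConfig-style triple as a one-line summary."""
    if action == '':
        return 'no shutdown pending'
    return f'{action} at {shutdown_time:%Y-%m-%d %H:%M} ({reason})'

when = datetime(2024, 1, 1, 3, 0)
print(describe_shutdown('reboot', when, 'monthly patching'))
# reboot at 2024-01-01 03:00 (monthly patching)
print(describe_shutdown('', None, ''))  # no shutdown pending
```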
class Timesync(VapiInterface):
"""
The ``Timesync`` class provides methods to configure time synchronization.
This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.timesync'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _TimesyncStub)
self._VAPI_OPERATION_IDS = {}
class TimeSyncMode(Enum):
"""
The ``Timesync.TimeSyncMode`` class defines time synchronization modes.
This enumeration was added in vSphere API 6.7.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
DISABLED = None
"""
Time synchronization is disabled. This class attribute was added in vSphere
API 6.7.
"""
NTP = None
"""
NTP-based time synchronization. This class attribute was added in vSphere
API 6.7.
"""
HOST = None
"""
VMware Tools-based time synchronization. This class attribute was added in
vSphere API 6.7.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`TimeSyncMode` instance.
"""
Enum.__init__(self, string)
TimeSyncMode._set_values([
TimeSyncMode('DISABLED'),
TimeSyncMode('NTP'),
TimeSyncMode('HOST'),
])
TimeSyncMode._set_binding_type(type.EnumType(
'com.vmware.appliance.timesync.time_sync_mode',
TimeSyncMode))
def set(self,
mode,
):
"""
Set time synchronization mode. This method was added in vSphere API
6.7.
:type mode: :class:`Timesync.TimeSyncMode`
:param mode: Time synchronization mode.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('set',
{
'mode': mode,
})
def get(self):
"""
Get time synchronization mode. This method was added in vSphere API
6.7.
:rtype: :class:`Timesync.TimeSyncMode`
:return: Time synchronization mode.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
"""
return self._invoke('get', None)
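Since ``Ntp.set`` only restarts the NTP daemon when NTP-based synchronization is already in use, one reasonable ordering when enabling NTP sync is to configure the servers first and then flip the mode. A hypothetical sketch using ``unittest.mock`` stand-ins for the ``Ntp`` and ``Timesync`` service stubs (real code would pass ``Timesync.TimeSyncMode.NTP`` rather than the plain string):

```python
from unittest import mock

def enable_ntp_sync(ntp_svc, timesync_svc, servers):
    """Configure NTP servers, then switch the sync mode to NTP."""
    ntp_svc.set(servers)
    timesync_svc.set('NTP')

ntp_svc, timesync_svc = mock.Mock(), mock.Mock()
enable_ntp_sync(ntp_svc, timesync_svc, ['0.pool.ntp.org'])
ntp_svc.set.assert_called_once_with(['0.pool.ntp.org'])
timesync_svc.set.assert_called_once_with('NTP')
```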
class Update(VapiInterface):
"""
The ``Update`` class provides methods to get the status of the appliance
update. This class was added in vSphere API 6.7.
"""
_VAPI_SERVICE_ID = 'com.vmware.appliance.update'
"""
Identifier of the service in canonical form.
"""
def __init__(self, config):
"""
:type config: :class:`vmware.vapi.bindings.stub.StubConfiguration`
:param config: Configuration to be used for creating the stub.
"""
VapiInterface.__init__(self, config, _UpdateStub)
self._VAPI_OPERATION_IDS = {}
class State(Enum):
"""
The ``Update.State`` class defines the various states the appliance update
can be in. This enumeration was added in vSphere API 6.7.
.. note::
This class represents an enumerated type in the interface language
definition. The class contains class attributes which represent the
values in the current version of the enumerated type. Newer versions of
the enumerated type may contain new values. To use new values of the
enumerated type in communication with a server that supports the newer
version of the API, you instantiate this class. See :ref:`enumerated
type description page <enumeration_description>`.
"""
UP_TO_DATE = None
"""
The appliance is up to date. This class attribute was added in vSphere API
6.7.
"""
UPDATES_PENDING = None
"""
A new update is available. This class attribute was added in vSphere API
6.7.
"""
STAGE_IN_PROGRESS = None
"""
The appliance is in the process of downloading an update. This class
attribute was added in vSphere API 6.7.
"""
INSTALL_IN_PROGRESS = None
"""
The appliance is in the process of installing an update. This class
attribute was added in vSphere API 6.7.
"""
INSTALL_FAILED = None
"""
The appliance update failed and cannot recover. This class attribute was
added in vSphere API 6.7.
"""
ROLLBACK_IN_PROGRESS = None
"""
The appliance update failed and recovery is in progress. This class
attribute was added in vSphere API 6.7.
"""
def __init__(self, string):
"""
:type string: :class:`str`
:param string: String value for the :class:`State` instance.
"""
Enum.__init__(self, string)
State._set_values([
State('UP_TO_DATE'),
State('UPDATES_PENDING'),
State('STAGE_IN_PROGRESS'),
State('INSTALL_IN_PROGRESS'),
State('INSTALL_FAILED'),
State('ROLLBACK_IN_PROGRESS'),
])
State._set_binding_type(type.EnumType(
'com.vmware.appliance.update.state',
State))
class Info(VapiStruct):
"""
The ``Update.Info`` class describes the state of the appliance update. This
class was added in vSphere API 6.7.
.. tip::
The arguments are used to initialize data attributes with the same
names.
"""
_validator_list = [
UnionValidator(
'state',
{
'UP_TO_DATE' : [],
'UPDATES_PENDING' : [],
'STAGE_IN_PROGRESS' : [],
'INSTALL_IN_PROGRESS' : [],
'INSTALL_FAILED' : [],
'ROLLBACK_IN_PROGRESS' : [],
}
),
]
def __init__(self,
state=None,
task=None,
version=None,
latest_query_time=None,
):
"""
:type state: :class:`Update.State`
:param state: State of the appliance update. This attribute was added in vSphere
API 6.7.
:type task: :class:`TaskInfo` or ``None``
:param task: The running or completed update task. This attribute was added in
vSphere API 6.7.
:type version: :class:`str`
:param version: Version of the base appliance if state is UP_TO_DATE; version
of the update being staged or installed if state is
STAGE_IN_PROGRESS or INSTALL_IN_PROGRESS; version of the staged
update if state is UPDATES_PENDING; version of the failed update
if state is INSTALL_FAILED or ROLLBACK_IN_PROGRESS. This
attribute was added in vSphere API 6.7.
:type latest_query_time: :class:`datetime.datetime` or ``None``
:param latest_query_time: Timestamp of latest query to update repository. This attribute was
added in vSphere API 6.7.
If None, the update repository was never queried
"""
self.state = state
self.task = task
self.version = version
self.latest_query_time = latest_query_time
VapiStruct.__init__(self)
Info._set_binding_type(type.StructType(
'com.vmware.appliance.update.info', {
'state': type.ReferenceType(__name__, 'Update.State'),
'task': type.OptionalType(type.ReferenceType(__name__, 'TaskInfo')),
'version': type.StringType(),
'latest_query_time': type.OptionalType(type.DateTimeType()),
},
Info,
False,
None))
def get(self):
"""
Gets the current status of the appliance update. This method was added
in vSphere API 6.7.
:rtype: :class:`Update.Info`
:return: Info structure containing the status information about the
appliance.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthenticated`
session is not authenticated
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
session is not authorized to perform this operation
"""
return self._invoke('get', None)
def cancel(self):
"""
Request cancellation of the update operation that is currently in
progress. This method was added in vSphere API 6.7.
:raise: :class:`com.vmware.vapi.std.errors_client.Error`
Generic error
:raise: :class:`com.vmware.vapi.std.errors_client.NotAllowedInCurrentState`
Current task is not cancellable
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthenticated`
session is not authenticated
:raise: :class:`com.vmware.vapi.std.errors_client.Unauthorized`
session is not authorized to perform this operation
"""
return self._invoke('cancel', None)
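The ``Update.State`` values above split naturally into terminal and in-progress states, which is what a caller needs to decide, for example, whether ``cancel`` makes sense. A hypothetical predicate over the state names (real code would compare ``Update.State`` enum values rather than strings):

```python
# States during which an update task is running, per the enum above.
IN_PROGRESS_STATES = {'STAGE_IN_PROGRESS', 'INSTALL_IN_PROGRESS',
                      'ROLLBACK_IN_PROGRESS'}

def update_in_progress(state):
    """Return True if an appliance update operation is currently running."""
    return state in IN_PROGRESS_STATES

print(update_in_progress('INSTALL_IN_PROGRESS'))  # True
print(update_in_progress('UP_TO_DATE'))           # False
```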
class _HealthStub(ApiInterfaceStub):
def __init__(self, config):
# properties for messages operation
messages_input_type = type.StructType('operation-input', {
'item': type.IdType(resource_types='com.vmware.appliance.health'),
})
messages_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
messages_input_value_validator_list = [
]
messages_output_validator_list = [
]
messages_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/health/{item}/messages',
path_variables={
'item': 'item',
},
query_parameters={
}
)
operations = {
'messages': {
'input_type': messages_input_type,
'output_type': type.ListType(type.ReferenceType(__name__, 'Notification')),
'errors': messages_error_dict,
'input_value_validator_list': messages_input_value_validator_list,
'output_validator_list': messages_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'messages': messages_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.health',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
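The ``OperationRestMetadata`` entries above pair a ``url_template`` with ``path_variables`` that name which operation inputs fill each template slot. The substitution itself happens inside the vAPI REST protocol layer, not in user code; this hypothetical sketch just illustrates the mapping:

```python
def render_url(template, path_values):
    """Fill {name} slots in a url_template with path variable values."""
    url = template
    for name, value in path_values.items():
        url = url.replace('{' + name + '}', value)
    return url

print(render_url('/appliance/local-accounts/{username}', {'username': 'root'}))
# /appliance/local-accounts/root
```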
class _LocalAccountsStub(ApiInterfaceStub):
def __init__(self, config):
# properties for get operation
get_input_type = type.StructType('operation-input', {
'username': type.StringType(),
})
get_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/local-accounts/{username}',
path_variables={
'username': 'username',
},
query_parameters={
}
)
# properties for list operation
list_input_type = type.StructType('operation-input', {})
list_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
list_input_value_validator_list = [
]
list_output_validator_list = [
]
list_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/local-accounts',
path_variables={
},
query_parameters={
}
)
# properties for create operation
create_input_type = type.StructType('operation-input', {
'username': type.IdType(resource_types='com.vmware.appliance.local_accounts'),
'config': type.ReferenceType(__name__, 'LocalAccounts.Config'),
})
create_error_dict = {
'com.vmware.vapi.std.errors.already_exists':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'AlreadyExists'),
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
create_input_value_validator_list = [
]
create_output_validator_list = [
]
create_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/local-accounts/{username}',
path_variables={
'username': 'username',
},
query_parameters={
}
)
# properties for set operation
set_input_type = type.StructType('operation-input', {
'username': type.IdType(resource_types='com.vmware.appliance.local_accounts'),
'config': type.ReferenceType(__name__, 'LocalAccounts.Config'),
})
set_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
set_input_value_validator_list = [
]
set_output_validator_list = [
]
set_rest_metadata = OperationRestMetadata(
http_method='PUT',
url_template='/appliance/local-accounts/{username}',
path_variables={
'username': 'username',
},
query_parameters={
}
)
# properties for update operation
update_input_type = type.StructType('operation-input', {
'username': type.IdType(resource_types='com.vmware.appliance.local_accounts'),
'config': type.ReferenceType(__name__, 'LocalAccounts.UpdateConfig'),
})
update_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
update_input_value_validator_list = [
]
update_output_validator_list = [
]
update_rest_metadata = OperationRestMetadata(
http_method='PATCH',
url_template='/appliance/local-accounts/{username}',
path_variables={
'username': 'username',
},
query_parameters={
}
)
# properties for delete operation
delete_input_type = type.StructType('operation-input', {
'username': type.IdType(resource_types='com.vmware.appliance.local_accounts'),
})
delete_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
delete_input_value_validator_list = [
]
delete_output_validator_list = [
]
delete_rest_metadata = OperationRestMetadata(
http_method='DELETE',
url_template='/appliance/local-accounts/{username}',
path_variables={
'username': 'username',
},
query_parameters={
}
)
operations = {
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'LocalAccounts.Info'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
'list': {
'input_type': list_input_type,
'output_type': type.ListType(type.IdType()),
'errors': list_error_dict,
'input_value_validator_list': list_input_value_validator_list,
'output_validator_list': list_output_validator_list,
'task_type': TaskType.NONE,
},
'create': {
'input_type': create_input_type,
'output_type': type.VoidType(),
'errors': create_error_dict,
'input_value_validator_list': create_input_value_validator_list,
'output_validator_list': create_output_validator_list,
'task_type': TaskType.NONE,
},
'set': {
'input_type': set_input_type,
'output_type': type.VoidType(),
'errors': set_error_dict,
'input_value_validator_list': set_input_value_validator_list,
'output_validator_list': set_output_validator_list,
'task_type': TaskType.NONE,
},
'update': {
'input_type': update_input_type,
'output_type': type.VoidType(),
'errors': update_error_dict,
'input_value_validator_list': update_input_value_validator_list,
'output_validator_list': update_output_validator_list,
'task_type': TaskType.NONE,
},
'delete': {
'input_type': delete_input_type,
'output_type': type.VoidType(),
'errors': delete_error_dict,
'input_value_validator_list': delete_input_value_validator_list,
'output_validator_list': delete_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'get': get_rest_metadata,
'list': list_rest_metadata,
'create': create_rest_metadata,
'set': set_rest_metadata,
'update': update_rest_metadata,
'delete': delete_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.local_accounts',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _MonitoringStub(ApiInterfaceStub):
def __init__(self, config):
# properties for query operation
query_input_type = type.StructType('operation-input', {
'item': type.ReferenceType(__name__, 'Monitoring.MonitoredItemDataRequest'),
})
query_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
query_input_value_validator_list = [
]
query_output_validator_list = [
]
query_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/monitoring/query',
path_variables={
},
query_parameters={
}
)
# properties for list operation
list_input_type = type.StructType('operation-input', {})
list_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
list_input_value_validator_list = [
]
list_output_validator_list = [
]
list_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/monitoring',
path_variables={
},
query_parameters={
}
)
# properties for get operation
get_input_type = type.StructType('operation-input', {
'stat_id': type.IdType(resource_types='com.vmware.appliance.monitoring'),
})
get_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/monitoring/{stat_id}',
path_variables={
'stat_id': 'stat_id',
},
query_parameters={
}
)
operations = {
'query': {
'input_type': query_input_type,
'output_type': type.ListType(type.ReferenceType(__name__, 'Monitoring.MonitoredItemData')),
'errors': query_error_dict,
'input_value_validator_list': query_input_value_validator_list,
'output_validator_list': query_output_validator_list,
'task_type': TaskType.NONE,
},
'list': {
'input_type': list_input_type,
'output_type': type.ListType(type.ReferenceType(__name__, 'Monitoring.MonitoredItem')),
'errors': list_error_dict,
'input_value_validator_list': list_input_value_validator_list,
'output_validator_list': list_output_validator_list,
'task_type': TaskType.NONE,
},
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Monitoring.MonitoredItem'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'query': query_rest_metadata,
'list': list_rest_metadata,
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.monitoring',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
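Every stub in this module follows the same registration pattern: an `operations` dict describing per-operation input/output types, error maps, validators, and task type, plus a parallel `rest_metadata` dict keyed by the same operation names. A minimal standalone sketch of that invariant (plain dicts and placeholders stand in for the real vAPI type objects, which is an assumption for illustration):

```python
# Keys every entry in an `operations` registry carries in this module.
REQUIRED_KEYS = {"input_type", "output_type", "errors",
                 "input_value_validator_list", "output_validator_list",
                 "task_type"}

def check_registry(operations, rest_metadata):
    """Report operations that are incomplete or lack REST metadata."""
    problems = []
    for name, entry in operations.items():
        missing = REQUIRED_KEYS - entry.keys()
        if missing:
            problems.append((name, "missing keys: %s" % sorted(missing)))
        if name not in rest_metadata:
            problems.append((name, "no rest_metadata"))
    return problems

# Placeholder values stand in for the real type objects.
ops = {"get": {key: None for key in REQUIRED_KEYS}}
assert check_registry(ops, {"get": object()}) == []
```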
class _NetworkingStub(ApiInterfaceStub):
def __init__(self, config):
# properties for get operation
get_input_type = type.StructType('operation-input', {})
get_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/networking',
path_variables={
},
query_parameters={
}
)
# properties for update operation
update_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Networking.UpdateSpec'),
})
update_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
update_input_value_validator_list = [
]
update_output_validator_list = [
]
update_rest_metadata = OperationRestMetadata(
http_method='PATCH',
url_template='/appliance/networking',
path_variables={
},
query_parameters={
}
)
# properties for reset operation
reset_input_type = type.StructType('operation-input', {})
reset_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
reset_input_value_validator_list = [
]
reset_output_validator_list = [
]
reset_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/networking?action=reset',
path_variables={
},
query_parameters={
}
)
# properties for change operation
change_input_type = type.StructType('operation-input', {
'spec': type.ReferenceType(__name__, 'Networking.ChangeSpec'),
})
change_error_dict = {
'com.vmware.vapi.std.errors.unsupported':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unsupported'),
'com.vmware.vapi.std.errors.invalid_argument':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'InvalidArgument'),
'com.vmware.vapi.std.errors.unauthenticated':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthenticated'),
'com.vmware.vapi.std.errors.not_allowed_in_current_state':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
}
change_input_value_validator_list = [
]
change_output_validator_list = [
]
change_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/networking?action=change',
path_variables={
},
query_parameters={
}
)
operations = {
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Networking.Info'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
'update': {
'input_type': update_input_type,
'output_type': type.VoidType(),
'errors': update_error_dict,
'input_value_validator_list': update_input_value_validator_list,
'output_validator_list': update_output_validator_list,
'task_type': TaskType.NONE,
},
'reset': {
'input_type': reset_input_type,
'output_type': type.VoidType(),
'errors': reset_error_dict,
'input_value_validator_list': reset_input_value_validator_list,
'output_validator_list': reset_output_validator_list,
'task_type': TaskType.NONE,
},
'change$task': {
'input_type': change_input_type,
'output_type': type.IdType(resource_types='com.vmware.cis.TASK'),
'errors': change_error_dict,
'input_value_validator_list': change_input_value_validator_list,
'output_validator_list': [],
'task_type': TaskType.TASK_ONLY,
},
}
rest_metadata = {
'get': get_rest_metadata,
'update': update_rest_metadata,
'reset': reset_rest_metadata,
'change': change_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.networking',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
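Note the naming convention above: the task-only variant is registered under `'change$task'` in `operations` (with `TaskType.TASK_ONLY` and a task-ID output type), while `rest_metadata` keys the same call under plain `'change'`. A hedged sketch of how a dispatcher could normalize the operation name before the metadata lookup (the helper name is an assumption, not SDK API):

```python
def rest_metadata_key(operation_name):
    """Map an operation name like 'change$task' to its REST-metadata key."""
    base, _, suffix = operation_name.partition("$")
    return base if suffix == "task" else operation_name

assert rest_metadata_key("change$task") == "change"
assert rest_metadata_key("reset") == "reset"
```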
class _NtpStub(ApiInterfaceStub):
def __init__(self, config):
# properties for test operation
test_input_type = type.StructType('operation-input', {
'servers': type.ListType(type.StringType()),
})
test_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
test_input_value_validator_list = [
]
test_output_validator_list = [
]
test_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/ntp/test',
path_variables={
},
query_parameters={
}
)
# properties for set operation
set_input_type = type.StructType('operation-input', {
'servers': type.ListType(type.StringType()),
})
set_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
set_input_value_validator_list = [
]
set_output_validator_list = [
]
set_rest_metadata = OperationRestMetadata(
http_method='PUT',
url_template='/appliance/ntp',
path_variables={
},
query_parameters={
}
)
# properties for get operation
get_input_type = type.StructType('operation-input', {})
get_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/ntp',
path_variables={
},
query_parameters={
}
)
operations = {
'test': {
'input_type': test_input_type,
'output_type': type.ListType(type.ReferenceType(__name__, 'Ntp.TestRunStatus')),
'errors': test_error_dict,
'input_value_validator_list': test_input_value_validator_list,
'output_validator_list': test_output_validator_list,
'task_type': TaskType.NONE,
},
'set': {
'input_type': set_input_type,
'output_type': type.VoidType(),
'errors': set_error_dict,
'input_value_validator_list': set_input_value_validator_list,
'output_validator_list': set_output_validator_list,
'task_type': TaskType.NONE,
},
'get': {
'input_type': get_input_type,
'output_type': type.ListType(type.StringType()),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'test': test_rest_metadata,
'set': set_rest_metadata,
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.ntp',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _RecoveryStub(ApiInterfaceStub):
def __init__(self, config):
# properties for get operation
get_input_type = type.StructType('operation-input', {})
get_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/recovery',
path_variables={
},
query_parameters={
}
)
operations = {
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Recovery.Info'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.recovery',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _ServicesStub(ApiInterfaceStub):
def __init__(self, config):
# properties for start operation
start_input_type = type.StructType('operation-input', {
'service': type.IdType(resource_types='com.vmware.appliance.services'),
})
start_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.not_allowed_in_current_state':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
'com.vmware.vapi.std.errors.timed_out':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'TimedOut'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
start_input_value_validator_list = [
]
start_output_validator_list = [
]
start_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/services/{id}/start',
path_variables={
'service': 'id',
},
query_parameters={
}
)
# properties for stop operation
stop_input_type = type.StructType('operation-input', {
'service': type.IdType(resource_types='com.vmware.appliance.services'),
})
stop_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
'com.vmware.vapi.std.errors.not_allowed_in_current_state':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
}
stop_input_value_validator_list = [
]
stop_output_validator_list = [
]
stop_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/services/{id}/stop',
path_variables={
'service': 'id',
},
query_parameters={
}
)
# properties for restart operation
restart_input_type = type.StructType('operation-input', {
'service': type.IdType(resource_types='com.vmware.appliance.services'),
})
restart_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.timed_out':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'TimedOut'),
'com.vmware.vapi.std.errors.not_allowed_in_current_state':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
restart_input_value_validator_list = [
]
restart_output_validator_list = [
]
restart_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/services/{id}/restart',
path_variables={
'service': 'id',
},
query_parameters={
}
)
# properties for get operation
get_input_type = type.StructType('operation-input', {
'service': type.IdType(resource_types='com.vmware.appliance.services'),
})
get_error_dict = {
'com.vmware.vapi.std.errors.not_found':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotFound'),
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/services/{id}',
path_variables={
'service': 'id',
},
query_parameters={
}
)
# properties for list operation
list_input_type = type.StructType('operation-input', {})
list_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
list_input_value_validator_list = [
]
list_output_validator_list = [
]
list_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/services',
path_variables={
},
query_parameters={
}
)
operations = {
'start': {
'input_type': start_input_type,
'output_type': type.VoidType(),
'errors': start_error_dict,
'input_value_validator_list': start_input_value_validator_list,
'output_validator_list': start_output_validator_list,
'task_type': TaskType.NONE,
},
'stop': {
'input_type': stop_input_type,
'output_type': type.VoidType(),
'errors': stop_error_dict,
'input_value_validator_list': stop_input_value_validator_list,
'output_validator_list': stop_output_validator_list,
'task_type': TaskType.NONE,
},
'restart': {
'input_type': restart_input_type,
'output_type': type.VoidType(),
'errors': restart_error_dict,
'input_value_validator_list': restart_input_value_validator_list,
'output_validator_list': restart_output_validator_list,
'task_type': TaskType.NONE,
},
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Services.Info'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
'list': {
'input_type': list_input_type,
'output_type': type.MapType(type.IdType(), type.ReferenceType(__name__, 'Services.Info')),
'errors': list_error_dict,
'input_value_validator_list': list_input_value_validator_list,
'output_validator_list': list_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'start': start_rest_metadata,
'stop': stop_rest_metadata,
'restart': restart_rest_metadata,
'get': get_rest_metadata,
'list': list_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.services',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
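In `_ServicesStub`, the `path_variables` map translates the Python parameter name (`service`) into the placeholder used by the URL template (`id`); in `_MonitoringStub` the mapping is the identity (`stat_id` to `stat_id`). A standalone sketch of that substitution (the helper name is an assumption; a real binding would also URL-encode the values):

```python
def fill_url(url_template, path_variables, arguments):
    """Substitute operation arguments into a REST URL template.

    path_variables maps a Python parameter name to a template placeholder.
    Note: values are not URL-encoded here; this is illustrative only.
    """
    url = url_template
    for param_name, placeholder in path_variables.items():
        url = url.replace("{%s}" % placeholder, str(arguments[param_name]))
    return url

url = fill_url("/appliance/services/{id}/start",
               {"service": "id"},
               {"service": "vmware-vpxd"})
assert url == "/appliance/services/vmware-vpxd/start"
```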
class _ShutdownStub(ApiInterfaceStub):
def __init__(self, config):
# properties for cancel operation
cancel_input_type = type.StructType('operation-input', {})
cancel_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
cancel_input_value_validator_list = [
]
cancel_output_validator_list = [
]
cancel_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/shutdown/cancel',
path_variables={
},
query_parameters={
}
)
# properties for poweroff operation
poweroff_input_type = type.StructType('operation-input', {
'delay': type.IntegerType(),
'reason': type.StringType(),
})
poweroff_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
poweroff_input_value_validator_list = [
]
poweroff_output_validator_list = [
]
poweroff_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/shutdown/poweroff',
path_variables={
},
query_parameters={
}
)
# properties for reboot operation
reboot_input_type = type.StructType('operation-input', {
'delay': type.IntegerType(),
'reason': type.StringType(),
})
reboot_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
reboot_input_value_validator_list = [
]
reboot_output_validator_list = [
]
reboot_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/shutdown/reboot',
path_variables={
},
query_parameters={
}
)
# properties for get operation
get_input_type = type.StructType('operation-input', {})
get_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/shutdown',
path_variables={
},
query_parameters={
}
)
operations = {
'cancel': {
'input_type': cancel_input_type,
'output_type': type.VoidType(),
'errors': cancel_error_dict,
'input_value_validator_list': cancel_input_value_validator_list,
'output_validator_list': cancel_output_validator_list,
'task_type': TaskType.NONE,
},
'poweroff': {
'input_type': poweroff_input_type,
'output_type': type.VoidType(),
'errors': poweroff_error_dict,
'input_value_validator_list': poweroff_input_value_validator_list,
'output_validator_list': poweroff_output_validator_list,
'task_type': TaskType.NONE,
},
'reboot': {
'input_type': reboot_input_type,
'output_type': type.VoidType(),
'errors': reboot_error_dict,
'input_value_validator_list': reboot_input_value_validator_list,
'output_validator_list': reboot_output_validator_list,
'task_type': TaskType.NONE,
},
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Shutdown.ShutdownConfig'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'cancel': cancel_rest_metadata,
'poweroff': poweroff_rest_metadata,
'reboot': reboot_rest_metadata,
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.shutdown',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
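The `poweroff` and `reboot` inputs above are plain structs (`delay`, `reason`) with no path or query parameters, so in a vAPI-REST mapping the fields travel in the request body. A rough standalone sketch of assembling such a body (the field handling here is an assumption for illustration, not the SDK's serializer):

```python
import json

def build_body(field_names, arguments):
    """Serialize the operation-input struct fields into a JSON body."""
    return json.dumps({name: arguments[name] for name in field_names},
                      sort_keys=True)

body = build_body(["delay", "reason"], {"delay": 1, "reason": "maintenance"})
assert body == '{"delay": 1, "reason": "maintenance"}'
```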
class _TimesyncStub(ApiInterfaceStub):
def __init__(self, config):
# properties for set operation
set_input_type = type.StructType('operation-input', {
'mode': type.ReferenceType(__name__, 'Timesync.TimeSyncMode'),
})
set_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
set_input_value_validator_list = [
]
set_output_validator_list = [
]
set_rest_metadata = OperationRestMetadata(
http_method='PUT',
url_template='/appliance/timesync',
path_variables={
},
query_parameters={
}
)
# properties for get operation
get_input_type = type.StructType('operation-input', {})
get_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/timesync',
path_variables={
},
query_parameters={
}
)
operations = {
'set': {
'input_type': set_input_type,
'output_type': type.VoidType(),
'errors': set_error_dict,
'input_value_validator_list': set_input_value_validator_list,
'output_validator_list': set_output_validator_list,
'task_type': TaskType.NONE,
},
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Timesync.TimeSyncMode'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'set': set_rest_metadata,
'get': get_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.timesync',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
class _UpdateStub(ApiInterfaceStub):
def __init__(self, config):
# properties for get operation
get_input_type = type.StructType('operation-input', {})
get_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
'com.vmware.vapi.std.errors.unauthenticated':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthenticated'),
'com.vmware.vapi.std.errors.unauthorized':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
}
get_input_value_validator_list = [
]
get_output_validator_list = [
]
get_rest_metadata = OperationRestMetadata(
http_method='GET',
url_template='/appliance/update',
path_variables={
},
query_parameters={
}
)
# properties for cancel operation
cancel_input_type = type.StructType('operation-input', {})
cancel_error_dict = {
'com.vmware.vapi.std.errors.error':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Error'),
'com.vmware.vapi.std.errors.not_allowed_in_current_state':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'NotAllowedInCurrentState'),
'com.vmware.vapi.std.errors.unauthenticated':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthenticated'),
'com.vmware.vapi.std.errors.unauthorized':
type.ReferenceType('com.vmware.vapi.std.errors_client', 'Unauthorized'),
}
cancel_input_value_validator_list = [
]
cancel_output_validator_list = [
]
cancel_rest_metadata = OperationRestMetadata(
http_method='POST',
url_template='/appliance/update?action=cancel',
path_variables={
},
query_parameters={
}
)
operations = {
'get': {
'input_type': get_input_type,
'output_type': type.ReferenceType(__name__, 'Update.Info'),
'errors': get_error_dict,
'input_value_validator_list': get_input_value_validator_list,
'output_validator_list': get_output_validator_list,
'task_type': TaskType.NONE,
},
'cancel': {
'input_type': cancel_input_type,
'output_type': type.VoidType(),
'errors': cancel_error_dict,
'input_value_validator_list': cancel_input_value_validator_list,
'output_validator_list': cancel_output_validator_list,
'task_type': TaskType.NONE,
},
}
rest_metadata = {
'get': get_rest_metadata,
'cancel': cancel_rest_metadata,
}
ApiInterfaceStub.__init__(
self, iface_name='com.vmware.appliance.update',
config=config, operations=operations, rest_metadata=rest_metadata,
is_vapi_rest=True)
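Each operation above pairs with an error dict mapping fully qualified vAPI error names (e.g. `com.vmware.vapi.std.errors.unauthorized`) to binding classes. A simplified sketch of dispatching a response error through such a dict; the exception classes here are stand-ins, not the real `errors_client` bindings:

```python
class VapiError(Exception):
    """Stand-in for the generic binding of com.vmware.vapi.std.errors.error."""

class Unauthorized(VapiError):
    """Stand-in for the Unauthorized binding."""

ERRORS = {
    "com.vmware.vapi.std.errors.unauthorized": Unauthorized,
}

def raise_for_error(error_name, message):
    """Raise the bound exception type, falling back to the generic one."""
    raise ERRORS.get(error_name, VapiError)(message)

try:
    raise_for_error("com.vmware.vapi.std.errors.unauthorized", "denied")
except Unauthorized as exc:
    assert str(exc) == "denied"
```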
class StubFactory(StubFactoryBase):
_attrs = {
'Health': Health,
'LocalAccounts': LocalAccounts,
'Monitoring': Monitoring,
'Networking': Networking,
'Ntp': Ntp,
'Recovery': Recovery,
'Services': Services,
'Shutdown': Shutdown,
'Timesync': Timesync,
'Update': Update,
'access': 'com.vmware.appliance.access_client.StubFactory',
'health': 'com.vmware.appliance.health_client.StubFactory',
'localaccounts': 'com.vmware.appliance.localaccounts_client.StubFactory',
'logging': 'com.vmware.appliance.logging_client.StubFactory',
'monitoring': 'com.vmware.appliance.monitoring_client.StubFactory',
'networking': 'com.vmware.appliance.networking_client.StubFactory',
'ntp': 'com.vmware.appliance.ntp_client.StubFactory',
'shutdown': 'com.vmware.appliance.shutdown_client.StubFactory',
'system': 'com.vmware.appliance.system_client.StubFactory',
'tymesync': 'com.vmware.appliance.tymesync_client.StubFactory',
'update': 'com.vmware.appliance.update_client.StubFactory',
'local_accounts': 'com.vmware.appliance.local_accounts_client.StubFactory',
}
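In `StubFactory._attrs`, sibling services are bound directly to classes, while child namespaces are recorded as dotted-path strings resolved only when first accessed. A standalone sketch of that lazy resolution, demonstrated against a synthetic module so it runs anywhere (the helper name is an assumption, not the SDK's resolver):

```python
import importlib
import sys
import types

def resolve_attr(value):
    """Return classes as-is; import 'pkg.module.Attr' strings lazily."""
    if not isinstance(value, str):
        return value
    module_path, _, attr_name = value.rpartition(".")
    return getattr(importlib.import_module(module_path), attr_name)

# Synthetic stand-in for a generated *_client module.
fake = types.ModuleType("fake_ns_client")
fake.StubFactory = type("StubFactory", (), {})
sys.modules["fake_ns_client"] = fake

assert resolve_attr("fake_ns_client.StubFactory") is fake.StubFactory
```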
65584ff5949577690e4de2e43dd3c53f2b5c1122 | 225 | py | Python | ztq_worker/ztq_worker/system_info/__init__.py | audoe/ztq | MIT
# -*- encoding: utf-8 -*-
import sys

if sys.platform.startswith('win'):
    from win import get_cpu_style, get_cpu_usage, get_mem_usage, get_ip
else:
    from linux import get_cpu_style, get_cpu_usage, get_mem_usage, get_ip
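The conditional import above selects a platform-specific backend module at import time. The same dispatch can be expressed as a small testable function mirroring the snippet's logic (note that, like the original, anything that is not Windows falls through to the `linux` backend):

```python
import sys

def implementation_module_name(platform=None):
    """Pick the backend module name the way the conditional import does."""
    platform = sys.platform if platform is None else platform
    return "win" if platform.startswith("win") else "linux"

assert implementation_module_name("win32") == "win"
assert implementation_module_name("linux") == "linux"
assert implementation_module_name("darwin") == "linux"
```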
656422744a81c3e23e69a46796d3dc52ea0985fb | 93 | py | Python | python/testData/resolve/PercentPositionalArgs.py | jnthn/intellij-community | Apache-2.0 | 2 stars
"in percent string it's %<ref>s argument, but i want to pass %d arguments" % ("string", 1423)
6564bd3bdb768c8cba30b5f79b8fefd42ec55449 | 64 | py | Python | utils/data/__init__.py | kunalsinha/wings | MIT
from .dataset import Dataset
from .dataloader import DataLoader
65bb9da7169aea5c86f61cb8380a85279fe57246 | 73 | py | Python | src/hub/dataload/sources/unii/__init__.py | ravila4/mychem.info | Apache-2.0 | 10 stars
from .unii_dump import UniiDumper
from .unii_upload import UniiUploader
65c681eb03213e0a000c34071a6727301d7396e3 | 4228 | py | Python | classify.py | kainhuck/redisk | MIT | 3 stars
# classify
class String(object):
    def __init__(self, master):
        self._master = master
    def set(self, key: str, value: str, expire: int = -1) -> str:
        return self._master.set(key, value, expire)
    def get(self, key: str) -> str:
        return self._master.get(key)
    def incr(self, key: str) -> int:
        return self._master.incr(key)
    def incrby(self, key: str, increment: int) -> int:
        return self._master.incrby(key, increment)


class Hash(object):
    def __init__(self, master):
        self._master = master
    def hset(self, key: str, items: dict) -> int:
        return self._master.hset(key, items)
    def hget(self, key: str, field: str) -> object:
        return self._master.hget(key, field)
    def hmset(self, key: str, items: dict) -> str:
        return self._master.hmset(key, items)
    def hmget(self, key: str, *args) -> [str]:
        return self._master.hmget(key, *args)
    def hgetall(self, key: str) -> dict:
        return self._master.hgetall(key)


class List(object):
    def __init__(self, master):
        self._master = master
    def lpush(self, key: str, *args) -> int:
        return self._master.lpush(key, *args)
    def lrange(self, key: str, start: int, stop: int) -> [str]:
        return self._master.lrange(key, start, stop)
    def rpush(self, key: str, *args) -> int:
        return self._master.rpush(key, *args)
    def ltrim(self, key: str, start: int, stop: int) -> str:
        return self._master.ltrim(key, start, stop)
    def lpop(self, key: str) -> str:
        return self._master.lpop(key)
    def llen(self, key: str) -> int:
        return self._master.llen(key)


class Set(object):
    def __init__(self, master):
        self._master = master
    def sadd(self, key: str, *args) -> int:
        return self._master.sadd(key, *args)
    def smembers(self, key: str) -> {str}:
        return self._master.smembers(key)
    def sdiff(self, key: str, *args) -> {str}:
        return self._master.sdiff(key, *args)
    def sinter(self, key: str, *args) -> {str}:
        return self._master.sinter(key, *args)
    def sunion(self, key: str, *args) -> {str}:
        return self._master.sunion(key, *args)
    def scard(self, key: str) -> int:
        return self._master.scard(key)


class Zset(object):
    def __init__(self, master):
        self._master = master
    def zadd(self, key: str, items: dict) -> int:
        return self._master.zadd(key, items)
    def zrange(self, key: str, start: int, stop: int, withscores: bool = False) -> list:
        return self._master.zrange(key, start, stop, withscores)
    def zrevrange(self, key: str, start: int, stop: int, withscores: bool = False) -> list:
        return self._master.zrevrange(key, start, stop, withscores)
    def zcard(self, key: str) -> int:
        return self._master.zcard(key)


class Key(object):
    def __init__(self, master):
        self._master = master
    def keys(self, pattern: str) -> [str]:
        return self._master.keys(pattern)
    def getAllKeys(self) -> [str]:
        return self._master.getAllKeys()
    def rename(self, key: str, newkey: str):
        return self._master.rename(key, newkey)
    def renamenx(self, key: str, newkey: str) -> int:
        return self._master.renamenx(key, newkey)
    def dump(self, key: str) -> bytes:
        return self._master.dump(key)
    def delete(self, key: str, *args) -> int:
        return self._master.delete(key, *args)
    def exists(self, key: str, *args) -> int:
        return self._master.exists(key, *args)
    def expire(self, key: str, seconds: int) -> int:
        return self._master.expire(key, seconds)
    def pexpire(self, key: str, milliseconds: int) -> int:
        return self._master.pexpire(key, milliseconds)
    def persist(self, key: str) -> int:
        return self._master.persist(key)
    def ttl(self, key: str) -> int:
        return self._master.ttl(key)
    def pttl(self, key: str) -> int:
        return self._master.pttl(key)
    def randomKey(self) -> str:
        return self._master.randomKey()
    def typeOf(self, key: str) -> str:
        return self._master.typeOf(key)
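Each wrapper class only forwards to the injected `master` connection, so any object with the matching methods will do. A hypothetical in-memory master is enough to exercise the `String` wrapper (the wrapper is repeated here so the sketch runs standalone; `FakeMaster` is an illustration, not part of redisk):

```python
class String(object):
    # repeated from the wrapper above so this sketch is self-contained
    def __init__(self, master):
        self._master = master
    def set(self, key, value, expire=-1):
        return self._master.set(key, value, expire)
    def get(self, key):
        return self._master.get(key)

class FakeMaster:
    """Hypothetical in-memory stand-in for the real master connection."""
    def __init__(self):
        self._data = {}
    def set(self, key, value, expire):
        self._data[key] = value
        return "OK"
    def get(self, key):
        return self._data.get(key)

s = String(FakeMaster())
assert s.set("lang", "python") == "OK"
assert s.get("lang") == "python"
```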
65cd90955c3b7261bb1f3464f5419a284d43508b | 58 | py | Python | datatracker/exceptions.py | stefanomunarini/bibxml-service | BSD-3-Clause
class UnexpectedDatatrackerResponse(ValueError):
    pass
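Because the exception subclasses `ValueError`, callers that already catch `ValueError` keep working while more precise handlers remain possible. A quick standalone check (the class is re-declared so the sketch runs on its own):

```python
class UnexpectedDatatrackerResponse(ValueError):
    pass

try:
    raise UnexpectedDatatrackerResponse("unexpected payload shape")
except ValueError as exc:  # a generic ValueError handler still catches it
    assert isinstance(exc, UnexpectedDatatrackerResponse)
```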