hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
873ec9ba303e141273950fd51d91a34b5133d14b | 1,511 | py | Python | setup.py | drfb/wca-django-allauth | f1789e76448209b28bb92287e01c8d42a7dd98ac | ["MIT"] | null | null | null | setup.py | drfb/wca-django-allauth | f1789e76448209b28bb92287e01c8d42a7dd98ac | ["MIT"] | null | null | null | setup.py | drfb/wca-django-allauth | f1789e76448209b28bb92287e01c8d42a7dd98ac | ["MIT"] | null | null | null |
#!/usr/bin/env python
from setuptools import setup, find_packages
with open("README.rst") as readme_file:
    readme = readme_file.read()
requirements = ["Django >= 1.11", "django-allauth"]
setup(
    name="wca-django-allauth",
    version="1.0.1",
    description="World Cube Association OAuth2 provider for django-allauth.",
    long_description=readme,
    author="Dhan-Rheb Belza",
    author_email="dhanrheb@gmail.com",
    url="https://github.com/drfb/wca-django-allauth",
    license="MIT license",
    packages=find_packages(include=["wca_allauth", "wca_allauth.*"]),
    keywords="world cube association allauth",
    classifiers=[
        "Development Status :: 4 - Beta",
        "Intended Audience :: Developers",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
        "Programming Language :: Python :: 2",
        "Programming Language :: Python :: 2.7",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.5",
        "Programming Language :: Python :: 3.6",
        "Programming Language :: Python :: 3.7",
        "Programming Language :: Python :: 3.8",
        "Programming Language :: Python :: 3.9",
        "Framework :: Django",
        "Framework :: Django :: 1.11",
        "Framework :: Django :: 2.0",
        "Framework :: Django :: 2.1",
        "Framework :: Django :: 2.2",
        "Framework :: Django :: 3.0",
    ],
    install_requires=requirements,
    include_package_data=True,
)
| 34.340909 | 77 | 0.611516 | 165 | 1,511 | 5.533333 | 0.478788 | 0.166484 | 0.219058 | 0.170865 | 0.059146 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028522 | 0.234282 | 1,511 | 43 | 78 | 35.139535 | 0.760588 | 0.013236 | 0 | 0 | 0 | 0 | 0.559732 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.026316 | 0 | 0.026316 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
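Each row closes with a block of per-file statistics (`avg_line_length`, `max_line_length`, `alphanum_fraction`, then the `qsc_*` quality signals). The dataset's exact definitions are not given here, but the three leading statistics can plausibly be computed as below; the helper name and the handling of the trailing newline are assumptions.

```python
def basic_stats(source: str) -> dict:
    # Hypothetical reconstruction of the first three per-file statistics:
    # mean and max line length over the file's lines, plus the fraction
    # of alphanumeric characters over the whole text.
    lines = source.splitlines()
    return {
        "avg_line_length": sum(len(line) for line in lines) / len(lines),
        "max_line_length": max(len(line) for line in lines),
        "alphanum_fraction": sum(c.isalnum() for c in source) / len(source),
    }

stats = basic_stats("a = 1\nprint(a)\n")
```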
87407ef385f4c8a82cb9204406f570505da69037 | 7,840 | py | Python | temboo/core/Library/Wordnik/Words/ReverseDictionary.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | ["Apache-2.0"] | 7 | 2016-03-07T02:07:21.000Z | 2022-01-21T02:22:41.000Z | temboo/core/Library/Wordnik/Words/ReverseDictionary.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | ["Apache-2.0"] | null | null | null | temboo/core/Library/Wordnik/Words/ReverseDictionary.py | jordanemedlock/psychtruths | 52e09033ade9608bd5143129f8a1bfac22d634dd | ["Apache-2.0"] | 8 | 2016-06-14T06:01:11.000Z | 2020-04-22T09:21:44.000Z |
# -*- coding: utf-8 -*-
###############################################################################
#
# ReverseDictionary
# Retrieves a reverse dictionary search for a given word.
#
# Python versions 2.6, 2.7, 3.x
#
# Copyright 2014, Temboo Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND,
# either express or implied. See the License for the specific
# language governing permissions and limitations under the License.
#
#
###############################################################################
from temboo.core.choreography import Choreography
from temboo.core.choreography import InputSet
from temboo.core.choreography import ResultSet
from temboo.core.choreography import ChoreographyExecution
import json
class ReverseDictionary(Choreography):

    def __init__(self, temboo_session):
        """
        Create a new instance of the ReverseDictionary Choreo. A TembooSession object, containing a valid
        set of Temboo credentials, must be supplied.
        """
        super(ReverseDictionary, self).__init__(temboo_session, '/Library/Wordnik/Words/ReverseDictionary')

    def new_input_set(self):
        return ReverseDictionaryInputSet()

    def _make_result_set(self, result, path):
        return ReverseDictionaryResultSet(result, path)

    def _make_execution(self, session, exec_id, path):
        return ReverseDictionaryChoreographyExecution(session, exec_id, path)


class ReverseDictionaryInputSet(InputSet):
    """
    An InputSet with methods appropriate for specifying the inputs to the ReverseDictionary
    Choreo. The InputSet object is used to specify input parameters when executing this Choreo.
    """

    def set_APIKey(self, value):
        """
        Set the value of the APIKey input for this Choreo. ((required, string) The API Key from Wordnik.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('APIKey', value)

    def set_ExcludePartOfSpeech(self, value):
        """
        Set the value of the ExcludePartOfSpeech input for this Choreo. ((optional, string) Excludes the specified comma-delimited parts of speech from the results returned. Acceptable values include: adjective, noun, etc. See docs for full list.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('ExcludePartOfSpeech', value)

    def set_ExcludeSource(self, value):
        """
        Set the value of the ExcludeSource input for this Choreo. ((optional, string) Exclude these comma-delimited source dictionaries: ahd, century, wiktionary,webster, wordnet.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('ExcludeSource', value)

    def set_ExpandTerms(self, value):
        """
        Set the value of the ExpandTerms input for this Choreo. ((optional, any) Expand terms by either: synonym or hypernym.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('ExpandTerms', value)

    def set_IncludePartOfSpeech(self, value):
        """
        Set the value of the IncludePartOfSpeech input for this Choreo. ((optional, string) Only includes the specified comma-delimited parts of speech. Acceptable values include: adjective, noun, etc. See docs for full list.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('IncludePartOfSpeech', value)

    def set_IncludeSource(self, value):
        """
        Set the value of the IncludeSource input for this Choreo. ((optional, string) Only include these comma-delimited source dictionaries: ahd, century, wiktionary,webster, wordnet.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('IncludeSource', value)

    def set_Limit(self, value):
        """
        Set the value of the Limit input for this Choreo. ((optional, integer) Maximum number of results to return. Defaults to 10.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('Limit', value)

    def set_MaxCorpus(self, value):
        """
        Set the value of the MaxCorpus input for this Choreo. ((optional, integer) Results include a corpus frequency count for each word returned. When this input is specified, results are limited to words with a corpus frequency count below the given number.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('MaxCorpus', value)

    def set_MaxLength(self, value):
        """
        Set the value of the MaxLength input for this Choreo. ((optional, integer) Maximum word length.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('MaxLength', value)

    def set_MinCorpus(self, value):
        """
        Set the value of the MinCorpus input for this Choreo. ((optional, integer) Results include a corpus frequency count for each word returned. When this input is specified, results are limited to words with a corpus frequency count above the given number.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('MinCorpus', value)

    def set_MinLength(self, value):
        """
        Set the value of the MinLength input for this Choreo. ((optional, integer) Minimum word length.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('MinLength', value)

    def set_Query(self, value):
        """
        Set the value of the Query input for this Choreo. ((required, string) Word or fragment to query for in Wordnik.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('Query', value)

    def set_ResponseType(self, value):
        """
        Set the value of the ResponseType input for this Choreo. ((optional, string) Response can be either JSON or XML. Defaults to JSON.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('ResponseType', value)

    def set_Skip(self, value):
        """
        Set the value of the Skip input for this Choreo. ((optional, integer) Number of results to skip.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('Skip', value)

    def set_SortBy(self, value):
        """
        Set the value of the SortBy input for this Choreo. ((optional, string) Results can be sorted by: alpha, count, or length.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('SortBy', value)

    def set_SortOrder(self, value):
        """
        Set the value of the SortOrder input for this Choreo. ((optional, string) Indicate the order to sort, either asc (ascending) or desc (descending).)
        """
        super(ReverseDictionaryInputSet, self)._set_input('SortOrder', value)

    def set_WordSense(self, value):
        """
        Set the value of the WordSense input for this Choreo. ((optional, string) Restricts words and finds the closest sense to the one indicated.)
        """
        super(ReverseDictionaryInputSet, self)._set_input('WordSense', value)


class ReverseDictionaryResultSet(ResultSet):
    """
    A ResultSet with methods tailored to the values returned by the ReverseDictionary Choreo.
    The ResultSet object is used to retrieve the results of a Choreo execution.
    """

    def getJSONFromString(self, str):
        return json.loads(str)

    def get_Response(self):
        """
        Retrieve the value for the "Response" output from this Choreo execution. (The response from Wordnik.)
        """
        return self._output.get('Response', None)


class ReverseDictionaryChoreographyExecution(ChoreographyExecution):

    def _make_result_set(self, response, path):
        return ReverseDictionaryResultSet(response, path)
| 48.09816 | 261 | 0.687117 | 927 | 7,840 | 5.731392 | 0.243797 | 0.017881 | 0.038396 | 0.047995 | 0.476002 | 0.392057 | 0.303971 | 0.161114 | 0.140034 | 0.140034 | 0 | 0.002576 | 0.207653 | 7,840 | 162 | 262 | 48.395062 | 0.852543 | 0.494005 | 0 | 0 | 0 | 0 | 0.065073 | 0.012107 | 0 | 0 | 0 | 0 | 0 | 1 | 0.421053 | false | 0 | 0.087719 | 0.087719 | 0.684211 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
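The Choreo above is almost entirely boilerplate around one pattern: every `set_*` method forwards a named value to `InputSet._set_input`. A stripped-down sketch of that pattern (class names reused for illustration only; this is not the real Temboo SDK):

```python
class InputSet:
    # Minimal stand-in for temboo.core.choreography.InputSet:
    # it only records named inputs in a dict.
    def __init__(self):
        self._inputs = {}

    def _set_input(self, name, value):
        self._inputs[name] = value


class ReverseDictionaryInputSet(InputSet):
    # Two representative setters from the generated class above.
    def set_APIKey(self, value):
        self._set_input("APIKey", value)

    def set_Query(self, value):
        self._set_input("Query", value)


inputs = ReverseDictionaryInputSet()
inputs.set_APIKey("my-api-key")
inputs.set_Query("cube")
```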
8744ad85049280d3d7e84e12f0756b1db2c5d62e | 133 | py | Python | contest/abc093/C.py | mola1129/atcoder | 1d3b18cb92d0ba18c41172f49bfcd0dd8d29f9db | ["MIT"] | null | null | null | contest/abc093/C.py | mola1129/atcoder | 1d3b18cb92d0ba18c41172f49bfcd0dd8d29f9db | ["MIT"] | null | null | null | contest/abc093/C.py | mola1129/atcoder | 1d3b18cb92d0ba18c41172f49bfcd0dd8d29f9db | ["MIT"] | null | null | null |
a, b, c = map(int, input().split())
x = max(a, b, c)
total = a + b + c
if x % 2 != total % 2:
    x += 1
print((3 * x - total) // 2)
| 19 | 35 | 0.443609 | 28 | 133 | 2.107143 | 0.535714 | 0.101695 | 0.152542 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053763 | 0.300752 | 133 | 6 | 36 | 22.166667 | 0.580645 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
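The solution above is for AtCoder ABC093 C: make `a`, `b`, `c` equal using operations that either add 1 to two of the numbers or add 2 to one of them. Every operation raises the total by 2, so the common final value `x` must satisfy `3x ≡ total (mod 2)`, and the answer is `(3x - total) // 2` with `x` the (possibly parity-adjusted) maximum. A small brute-force cross-check of that formula (the BFS is for verification only, not part of the submission):

```python
from collections import deque


def min_ops(a, b, c):
    # Closed-form answer from the submission above.
    total = a + b + c
    x = max(a, b, c)
    if x % 2 != total % 2:
        x += 1
    return (3 * x - total) // 2


def min_ops_bfs(a, b, c):
    # Exhaustive shortest-path search over the two allowed operations,
    # capped at a small bound; used only to cross-check the formula.
    start = tuple(sorted((a, b, c)))
    cap = max(start) + 4
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        state, dist = queue.popleft()
        if state[0] == state[1] == state[2]:
            return dist
        p, q, r = state
        candidates = [
            (p + 1, q + 1, r), (p + 1, q, r + 1), (p, q + 1, r + 1),  # +1 to two
            (p + 2, q, r), (p, q + 2, r), (p, q, r + 2),              # +2 to one
        ]
        for nxt in candidates:
            nxt = tuple(sorted(nxt))
            if max(nxt) <= cap and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return None
```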
8746a3928d35e5735d7f5b5e631c4aa6aa6f9a14 | 605 | py | Python | python/sklearn/setup.py | stephchen/weld | bfa02e2793e128052e59cdc99b27d9441ae40a1e | ["BSD-3-Clause"] | null | null | null | python/sklearn/setup.py | stephchen/weld | bfa02e2793e128052e59cdc99b27d9441ae40a1e | ["BSD-3-Clause"] | null | null | null | python/sklearn/setup.py | stephchen/weld | bfa02e2793e128052e59cdc99b27d9441ae40a1e | ["BSD-3-Clause"] | null | null | null |
import os
import platform
import shutil
import subprocess
import sys
from setuptools import setup, Distribution
import setuptools.command.build_ext as _build_ext
from setuptools.command.install import install
class Install(install):
    def run(self):
        install.run(self)


python_executable = sys.executable


class BinaryDistribution(Distribution):
    def has_ext_modules(self):
        return True


setup(name='weldsklearn',
      version='0.0.1',
      packages=['weldsklearn'],
      cmdclass={"install": Install},
      distclass=BinaryDistribution,
      install_requires=['pyweld'])
| 23.269231 | 49 | 0.728926 | 69 | 605 | 6.289855 | 0.521739 | 0.064516 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006061 | 0.181818 | 605 | 25 | 50 | 24.2 | 0.870707 | 0 | 0 | 0 | 0 | 0 | 0.066116 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.095238 | false | 0 | 0.380952 | 0.047619 | 0.619048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
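The `BinaryDistribution` subclass exists to force `has_ext_modules()` to return `True`, so that wheel-building tools tag the package as platform-specific even though no Python extension modules are declared (the package ships a prebuilt native library instead). The pattern in isolation:

```python
from setuptools import Distribution


class BinaryDistribution(Distribution):
    # Reporting extension modules even though none are declared makes
    # bdist_wheel produce a platform-tagged wheel instead of a
    # "pure Python" universal one.
    def has_ext_modules(self):
        return True


dist = BinaryDistribution()
```

In a real `setup.py` this class is wired in via `setup(distclass=BinaryDistribution, ...)`, exactly as the file above does.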
87470808dc34b81ad1b5c3b1f8407fb03a69f602 | 66 | py | Python | torch/ao/nn/sparse/__init__.py | brooks-anderson/pytorch | dd928097938b6368fc7e2dc67721550d50ab08ea | ["Intel"] | 7 | 2021-05-29T16:31:51.000Z | 2022-02-21T18:52:25.000Z | torch/ao/nn/sparse/__init__.py | brooks-anderson/pytorch | dd928097938b6368fc7e2dc67721550d50ab08ea | ["Intel"] | 1 | 2021-05-10T01:18:33.000Z | 2021-05-10T01:18:33.000Z | torch/ao/nn/sparse/__init__.py | brooks-anderson/pytorch | dd928097938b6368fc7e2dc67721550d50ab08ea | ["Intel"] | 1 | 2021-08-06T22:50:37.000Z | 2021-08-06T22:50:37.000Z |
# Folders
from . import quantized
__all__ = [
    'quantized',
]
| 9.428571 | 23 | 0.636364 | 6 | 66 | 6.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.242424 | 66 | 6 | 24 | 11 | 0.76 | 0.106061 | 0 | 0 | 0 | 0 | 0.157895 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
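This `__init__.py` is the standard subpackage re-export idiom: import the child package and list it in `__all__` so `from pkg import *` exposes it. The effect can be demonstrated by building a throwaway package with the same shape on disk (the `demo_pkg` name is made up for the sketch):

```python
import importlib
import os
import sys
import tempfile

# Build a disposable package mirroring torch/ao/nn/sparse: a package
# whose __init__ imports a subpackage and re-exports it via __all__.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "demo_pkg", "quantized"))
with open(os.path.join(root, "demo_pkg", "quantized", "__init__.py"), "w") as f:
    f.write("VALUE = 1\n")
with open(os.path.join(root, "demo_pkg", "__init__.py"), "w") as f:
    f.write("from . import quantized\n__all__ = ['quantized']\n")

sys.path.insert(0, root)
demo_pkg = importlib.import_module("demo_pkg")
```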
874ee7911c79aa69f7ef41f0b7ba2b4e4742187d | 5,460 | py | Python | trava_erpnext/hooks.py | trava-sport/trava_erpnext_12 | 6ee070254a8e9bc5999fde925c21349f6ba0a782 | ["MIT"] | null | null | null | trava_erpnext/hooks.py | trava-sport/trava_erpnext_12 | 6ee070254a8e9bc5999fde925c21349f6ba0a782 | ["MIT"] | null | null | null | trava_erpnext/hooks.py | trava-sport/trava_erpnext_12 | 6ee070254a8e9bc5999fde925c21349f6ba0a782 | ["MIT"] | 1 | 2022-02-12T20:21:57.000Z | 2022-02-12T20:21:57.000Z |
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from . import __version__ as app_version
app_name = "trava_erpnext"
app_title = "Trava Erpnext"
app_publisher = "trava"
app_description = "Trava Erpnext System"
app_icon = "octicon octicon-file-directory"
app_color = "grey"
app_email = "belyerin@ya.ru"
app_license = "MIT"
fixtures = ['Report', 'Role Profile', 'Role', 'Custom Field', 'Custom Script', 'Property Setter', 'Workflow', 'Workflow State', 'Workflow Action']
# Includes in <head>
# ------------------
# include js, css files in header of desk.html
# app_include_css = "/assets/trava_erpnext/css/trava_erpnext.css"
# app_include_js = "assets/js/trava_erpnext.min.js"
# include js, css files in header of web template
# web_include_css = "/assets/trava_erpnext/css/trava_erpnext.css"
# web_include_js = "/assets/trava_erpnext/js/trava_erpnext.js"
# include custom scss in every website theme (without file extension ".scss")
# website_theme_scss = "trava_erpnext/public/scss/website"
# include js, css files in header of web form
# webform_include_js = {"doctype": "public/js/doctype.js"}
# webform_include_css = {"doctype": "public/css/doctype.css"}
# include js in page
# page_js = {"page" : "public/js/file.js"}
# include js in doctype views
# doctype_js = {"doctype" : "public/js/doctype.js"}
# doctype_list_js = {"doctype" : "public/js/doctype_list.js"}
# doctype_tree_js = {"doctype" : "public/js/doctype_tree.js"}
# doctype_calendar_js = {"doctype" : "public/js/doctype_calendar.js"}
doctype_js = {
    "Sales Invoice" : "public/js/sales_invoice.js",
    "Sales Order" : "public/js/sales_order.js",
    "Packing Slip" : "public/js/packing_slip.js"
}
# Home Pages
# ----------
# application home page (will override Website Settings)
# home_page = "login"
# website user home page (by Role)
# role_home_page = {
# "Role": "home_page"
# }
# Generators
# ----------
# automatically create page for each record of this doctype
# website_generators = ["Web Page"]
# Installation
# ------------
# before_install = "trava_erpnext.install.before_install"
# after_install = "trava_erpnext.install.after_install"
# Desk Notifications
# ------------------
# See frappe.core.notifications.get_notification_config
# notification_config = "trava_erpnext.notifications.get_notification_config"
# Permissions
# -----------
# Permissions evaluated in scripted ways
# permission_query_conditions = {
# "Event": "frappe.desk.doctype.event.event.get_permission_query_conditions",
# }
#
# has_permission = {
# "Event": "frappe.desk.doctype.event.event.has_permission",
# }
# DocType Class
# ---------------
# Override standard doctype classes
# override_doctype_class = {
# "ToDo": "custom_app.overrides.CustomToDo"
# }
# override_doctype_class = {
# "Sales Invoice": "trava_erpnext.overrides.sales_invoice.CustomSalesInvoice",
# "Packing Slip": "trava_erpnext.overrides.packing_slip.CustomPackingSlip"
# }
# Document Events
# ---------------
# Hook on document methods and events
doc_events = {
    "Sales Invoice": {
        "before_insert": ["trava_erpnext.overrides.sales_invoice.build_my_thing"]
    },
}
#"validate": "trava_erpnext.trava_erpnext.trava_integrations.doctype.wb_settings.sales_order.validate",
# Scheduled Tasks
# ---------------
scheduler_events = {
    "cron": {
        "2/30 * * * *": [
            "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_orders_daily",
            "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_sales_daily"
        ],
        "40 6,11,16,23 * * *": [
            "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_stocks",
        ],
        "10 3 4 * *": [
            "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_orders_monthly",
            "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_sales_monthly",
        ],
        "10 3 * * 2": [
            "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_sales_by_sales",
        ],
        "10 5 * * 2": [
            "trava_erpnext.trava_integrations.doctype.wb_settings.wb_methods.schedule_create_report_commission_from_wb_sbs",
        ]
    },
    # "all": [
    #     "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_orders_daily"
    #     "trava_erpnext.trava_integrations.doctype.wb_settings.wb_settings.schedule_get_report_sales_daily",
    # ],
    # "daily": [
    #     "trava_erpnext.tasks.daily"
    # ],
    # "hourly": [
    #     "trava_erpnext.tasks.hourly"
    # ],
    # "weekly": [
    #     "trava_erpnext.tasks.weekly"
    # ]
    # "monthly": [
    #     "trava_erpnext.tasks.monthly"
    # ]
}
# Testing
# -------
# before_tests = "trava_erpnext.install.before_tests"
# Overriding Methods
# ------------------------------
#
# override_whitelisted_methods = {
# "frappe.desk.doctype.event.event.get_events": "trava_erpnext.event.get_events"
# }
#override_whitelisted_methods = {
# "erpnext.selling.page.point_of_sale.point_of_sale.search_serial_or_batch_or_barcode_number": "trava_erpnext.overrides.point_of_sale.search_serial_or_batch_or_barcode_number"
#}
#
# each overriding function accepts a `data` argument;
# generated from the base implementation of the doctype dashboard,
# along with any modifications made in other Frappe apps
# override_doctype_dashboards = {
# "Task": "trava_erpnext.task.get_dashboard_data"
# }
# exempt linked doctypes from being automatically cancelled
#
# auto_cancel_exempted_doctypes = ["Auto Repeat"]
| 30.502793 | 175 | 0.729487 | 682 | 5,460 | 5.52346 | 0.294721 | 0.11468 | 0.049642 | 0.076984 | 0.36103 | 0.306876 | 0.267321 | 0.247943 | 0.232015 | 0.194319 | 0 | 0.005187 | 0.117216 | 5,460 | 178 | 176 | 30.674157 | 0.776349 | 0.65641 | 0 | 0.095238 | 0 | 0 | 0.646693 | 0.473714 | 0 | 0 | 0 | 0.005618 | 0 | 1 | 0 | false | 0 | 0.047619 | 0 | 0.047619 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
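The `scheduler_events` hook keys its `cron` entries by standard five-field cron expressions (minute, hour, day-of-month, month, day-of-week), so `"2/30 * * * *"` fires at minutes 2 and 32 of every hour and `"40 6,11,16,23 * * *"` at minute 40 of four fixed hours. A small sketch of how the minute field expands (illustrative only; Frappe's scheduler does the real parsing):

```python
def expand_minute_field(field):
    # Expand a cron minute field of the form "n", "*", "*/step",
    # "start/step", or a comma-separated combination of those,
    # into the sorted list of minutes it matches.
    minutes = set()
    for part in field.split(","):
        if "/" in part:
            start, step = part.split("/")
            base = 0 if start == "*" else int(start)
            minutes.update(range(base, 60, int(step)))
        elif part == "*":
            minutes.update(range(60))
        else:
            minutes.add(int(part))
    return sorted(minutes)
```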
8769cfedcaa29cd1f526640c0788831069ecb4df | 1,111 | py | Python | shotgrid_leecher/record/new_asset_event.py | Ellipsanime/openpype-shotgun | 21c753d3dd4d55923462f3c833f650678b138b6e | ["MIT"] | 6 | 2021-09-01T16:04:30.000Z | 2022-02-14T21:41:51.000Z | shotgrid_leecher/record/new_asset_event.py | Ellipsanime/shotgrid-leecher | a2272e77d1f1326026bf1ccb19adeb32ea05f676 | ["MIT"] | null | null | null | shotgrid_leecher/record/new_asset_event.py | Ellipsanime/shotgrid-leecher | a2272e77d1f1326026bf1ccb19adeb32ea05f676 | ["MIT"] | null | null | null |
import datetime
import json
from typing import Any, Dict
import attr
from shotgrid_leecher.record.enums import ShotgridEvents
from shotgrid_leecher.record.shotgrid_subtypes import (
    ShotgridProject,
    ShotgridUser,
    ShotgridEntity,
)
from shotgrid_leecher.utils.encoders import DataclassJSONEncoder


@attr.s(auto_attribs=True, frozen=True)
class NewAssetEvent:
    shotgrid_id: int
    shotgrid_name: str
    shotgrid_creation_date: datetime.datetime
    shotgrid_user: ShotgridUser
    shotgrid_project: ShotgridProject
    shotgrid_entity: ShotgridEntity
    type: str = ShotgridEvents.NEW_ASSET.value

    def get_unique_id(self) -> str:
        return "".join(
            [
                self.shotgrid_project.name,
                "/",
                self.shotgrid_name,
                "/",
                self.type,
            ]
        )

    def to_json(self) -> str:
        return json.dumps(self, cls=DataclassJSONEncoder)

    def to_dict(self) -> Dict[str, Any]:
        # TODO use more optimized way to convert dataclass to dict struct
        return json.loads(self.to_json())
| 25.837209 | 73 | 0.664266 | 122 | 1,111 | 5.885246 | 0.47541 | 0.050139 | 0.079387 | 0.069638 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.261026 | 1,111 | 42 | 74 | 26.452381 | 0.874543 | 0.056706 | 0 | 0.058824 | 0 | 0 | 0.001912 | 0 | 0 | 0 | 0 | 0.02381 | 0 | 1 | 0.088235 | false | 0 | 0.205882 | 0.088235 | 0.617647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
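`NewAssetEvent` relies on the project's `DataclassJSONEncoder` to serialize attrs instances and datetimes. A stand-in built on stdlib `dataclasses` shows the same idea; the real encoder lives in `shotgrid_leecher.utils.encoders`, so everything below (class body included) is an assumption for illustration:

```python
import dataclasses
import datetime
import json


class DataclassJSONEncoder(json.JSONEncoder):
    # Hypothetical stand-in for shotgrid_leecher's encoder: serialize
    # dataclass instances as dicts and datetimes as ISO-8601 strings.
    def default(self, o):
        if dataclasses.is_dataclass(o):
            return dataclasses.asdict(o)
        if isinstance(o, datetime.datetime):
            return o.isoformat()
        return super().default(o)


@dataclasses.dataclass(frozen=True)
class DemoEvent:
    # Trimmed-down event carrying only the scalar fields from above.
    shotgrid_id: int
    shotgrid_name: str
    shotgrid_creation_date: datetime.datetime


event = DemoEvent(1, "Tree", datetime.datetime(2021, 9, 1))
payload = json.loads(json.dumps(event, cls=DataclassJSONEncoder))
```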
876a8f3d7c4d0c8a21f36beed4653143aece646e | 2,556 | py | Python | acceptance_tests/features/pages/inbox_internal.py | ONSdigital/ras-integration-tests | ac31b8c2e79fec70e1475731edabc5acd54e8874 | ["MIT"] | 2 | 2017-11-03T13:10:46.000Z | 2018-01-29T13:12:12.000Z | acceptance_tests/features/pages/inbox_internal.py | ONSdigital/rasrm-acceptance-tests | ac31b8c2e79fec70e1475731edabc5acd54e8874 | ["MIT"] | 99 | 2018-02-23T10:52:22.000Z | 2021-02-03T11:35:54.000Z | acceptance_tests/features/pages/inbox_internal.py | ONSdigital/ras-integration-tests | ac31b8c2e79fec70e1475731edabc5acd54e8874 | ["MIT"] | 1 | 2021-04-11T07:55:47.000Z | 2021-04-11T07:55:47.000Z |
from acceptance_tests import browser
from common.browser_utilities import wait_for_url_matches
from config import Config
def go_to_using_context(context, conversation_tab='open'):
    tab = conversation_tab.replace(' ', '+')
    target_url = f"{Config.RESPONSE_OPERATIONS_UI}/messages/{context.short_name}?conversation_tab={tab}"
    browser.visit(target_url)
    wait_for_url_matches(target_url, timeout=3, retry=0.5, post_change_delay=0.1)


def go_to_select_survey():
    target_url = f"{Config.RESPONSE_OPERATIONS_UI}/messages/select-survey"
    browser.visit(target_url)
    wait_for_url_matches(target_url, timeout=3, retry=0.5, post_change_delay=0.1)


def get_page_title():
    return browser.title


def get_messages():
    messages = []
    table = browser.find_by_id('tbl-messages')
    rows = table.find_by_tag('tbody').find_by_tag('tr')
    for row in rows:
        messages.append({
            'ru_ref': row.find_by_name('tbl-messages-RU_Ref').value,
            'business_name': row.find_by_name('tbl-messages-business').value,
            'subject': row.find_by_name('tbl-messages-subject').value,
            'from': row.find_by_name('tbl-messages-from').value,
            'to': row.find_by_name('tbl-messages-to').value,
            'received': row.find_by_name('tbl-messages-received').value
        })
    return messages


def get_table_heading():
    table = browser.find_by_id('tbl-messages')
    headings = table.find_by_tag('thead').find_by_tag('tr')
    return headings[0].value


def get_no_messages_text():
    return browser.find_by_text('No new conversations')


def get_no_closed_conversations_text():
    return browser.find_by_text('No closed conversations')


def get_no_my_conversations_text():
    return browser.find_by_text('There are currently no messages')


def get_dropdown_list():
    return browser.driver.find_element_by_id('survey-list')


def get_filter_page_title():
    return browser.find_by_text('Filter messages by survey')


def get_unread_messages():
    return browser.find_by_name('message-unread')


def get_pagination_previous_link():
    return browser.driver.find_element_by_class_name('previous')


def get_pagination():
    return browser.driver.find_element_by_class_name('pagination')


def get_pagination_next_link():
    return browser.driver.find_element_by_class_name('next')


def get_message_link_index(number_of_messages):
    return browser.find_by_id(f"message-link-{number_of_messages}")


def closed_tab_present():
    return browser.driver.find_element_by_link_text('Closed')
| 29.37931 | 104 | 0.739828 | 371 | 2,556 | 4.74124 | 0.245283 | 0.061399 | 0.059125 | 0.044343 | 0.478113 | 0.434338 | 0.316089 | 0.217737 | 0.1444 | 0.093235 | 0 | 0.005023 | 0.143192 | 2,556 | 86 | 105 | 29.72093 | 0.798174 | 0 | 0 | 0.111111 | 0 | 0 | 0.203443 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 1 | 0.296296 | false | 0 | 0.055556 | 0.222222 | 0.611111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
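`wait_for_url_matches` (imported from `common.browser_utilities`) is a poll-until-true helper taking `timeout`, `retry`, and `post_change_delay` keywords. Its implementation is internal to that test suite; a generic sketch of the same polling shape, with the parameter names assumed to mean what they do above:

```python
import time


def wait_for(predicate, timeout=3, retry=0.5, post_change_delay=0.0):
    # Poll `predicate` every `retry` seconds until it returns truthy or
    # `timeout` seconds elapse; report whether it ever became truthy.
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if predicate():
            time.sleep(post_change_delay)
            return True
        time.sleep(retry)
    return False


# A predicate that flips to True on its second call, to exercise polling.
calls = {"n": 0}


def flaky():
    calls["n"] += 1
    return calls["n"] >= 2


succeeded = wait_for(flaky, timeout=2, retry=0.01)
failed = wait_for(lambda: False, timeout=0.05, retry=0.01)
```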
877496e08dadfd47ed6d3e6c041c27f8e11022ab | 623 | py | Python | plants_api/users/migrations/0005_auto_20191026_2044.py | Javen17/plants_api | 08e68aa6a1d350f00879b645bbfdc37b900e9464 | ["MIT"] | 2 | 2019-09-29T04:19:32.000Z | 2019-10-27T23:44:21.000Z | plants_api/users/migrations/0005_auto_20191026_2044.py | Javen17/plants_api | 08e68aa6a1d350f00879b645bbfdc37b900e9464 | ["MIT"] | 12 | 2020-03-28T00:13:21.000Z | 2022-02-10T08:33:33.000Z | plants_api/users/migrations/0005_auto_20191026_2044.py | Javen17/plants_api | 08e68aa6a1d350f00879b645bbfdc37b900e9464 | ["MIT"] | 1 | 2019-09-28T20:27:45.000Z | 2019-09-28T20:27:45.000Z |
# Generated by Django 2.2.5 on 2019-10-27 01:44
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('users', '0004_remove_user_name'),
    ]

    operations = [
        migrations.RemoveField(
            model_name='profile',
            name='email',
        ),
        migrations.RemoveField(
            model_name='profile',
            name='name',
        ),
        migrations.AddField(
            model_name='user',
            name='name',
            field=models.CharField(blank=True, max_length=255, verbose_name='Name of User'),
        ),
    ]
| 23.074074 | 92 | 0.555377 | 63 | 623 | 5.365079 | 0.619048 | 0.079882 | 0.153846 | 0.177515 | 0.242604 | 0.242604 | 0 | 0 | 0 | 0 | 0 | 0.052381 | 0.325843 | 623 | 26 | 93 | 23.961538 | 0.752381 | 0.072231 | 0 | 0.45 | 1 | 0 | 0.119792 | 0.036458 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.05 | 0 | 0.2 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e40b831d6001505e617a19a3e45b3bab3eb25db | 784 | py | Python | player_registry/views.py | brunosmmm/chainball-server | c272532bfc515044e772e1870ecf54dad88bf1e9 | [
"MIT"
] | null | null | null | player_registry/views.py | brunosmmm/chainball-server | c272532bfc515044e772e1870ecf54dad88bf1e9 | [
"MIT"
] | 7 | 2020-02-12T00:29:50.000Z | 2022-03-12T01:02:46.000Z | player_registry/views.py | brunosmmm/chainball-server | c272532bfc515044e772e1870ecf54dad88bf1e9 | [
"MIT"
] | null | null | null | """Player registry views."""
from .serializers import PlayerSerializer
from .models import Player

from rest_framework import viewsets
from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.permissions import IsAuthenticated
from rest_framework_api_key.permissions import HasAPIKey


class PlayerViewSet(viewsets.ReadOnlyModelViewSet):
    """Player viewset."""

    permission_classes = [HasAPIKey | IsAuthenticated]
    queryset = Player.objects.all()
    serializer_class = PlayerSerializer

    @action(detail=True)
    def get_sfx_data(self, request, pk=None):
        """Get SFX data."""
        player = self.get_object()
        sfx_data = player.sfx_data_b64
        return Response({"status": "ok", "data": sfx_data})
| 30.153846 | 59 | 0.748724 | 90 | 784 | 6.344444 | 0.488889 | 0.070053 | 0.148862 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003044 | 0.16199 | 784 | 25 | 60 | 31.36 | 0.866058 | 0.066327 | 0 | 0 | 0 | 0 | 0.01676 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.4375 | 0 | 0.8125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
5e6b7838b2aa8ca3128f189e4c1e7419781308f3 | 2,138 | py | Python | handybeam/opencl_wrappers/propagator_wrappers.py | ultraleap/HandyBeam | 9f80b97742cde4b75d3478d554dc9bc2cd9dfd96 | [
"ECL-2.0",
"Apache-2.0"
] | 2 | 2019-10-20T09:15:46.000Z | 2020-12-03T00:31:23.000Z | handybeam/opencl_wrappers/propagator_wrappers.py | ultraleap/HandyBeam | 9f80b97742cde4b75d3478d554dc9bc2cd9dfd96 | [
"ECL-2.0",
"Apache-2.0"
] | 3 | 2020-04-04T18:36:54.000Z | 2021-10-12T22:57:34.000Z | handybeam/opencl_wrappers/propagator_wrappers.py | ultraleap/HandyBeam | 9f80b97742cde4b75d3478d554dc9bc2cd9dfd96 | [
"ECL-2.0",
"Apache-2.0"
] | 5 | 2019-11-29T16:05:26.000Z | 2021-07-01T22:56:39.000Z | ## Imports

import handybeam
import handybeam.opencl_wrappers.abstract_wrapper
import handybeam.propagator_mixins
import handybeam.propagator_mixins.clist_propagator
import handybeam.propagator_mixins.rect_propagator
import handybeam.propagator_mixins.hex_propagator
import handybeam.propagator_mixins.lamb_propagator
import handybeam.tx_array
import handybeam.cl_system

## Global variables

## Class


class Propagator(
        handybeam.opencl_wrappers.abstract_wrapper.Wrapper,
        handybeam.propagator_mixins.clist_propagator.ClistPropMixin,
        handybeam.propagator_mixins.rect_propagator.RectPropMixin,
        handybeam.propagator_mixins.hex_propagator.HexPropMixin,
        handybeam.propagator_mixins.lamb_propagator.LambPropMixin
        ):
    '''This is a wrapper class which inherits from the template wrapper class Wrapper and the
    OpenCL propagator mixin classes. An instance of this class is initialised when a world
    object is initialised.
    '''

    def __init__(self, parent=None, use_device=0, use_platform=0):
        '''This method initialises an instance of the Propagator class. During the initialisation
        process, the compiled OpenCL propagator kernels are assigned to the appropriate propagator
        mixin classes.

        .. TODO::

            Provide description and type for the handybeam world object.

        Parameters
        ----------

        parent : handybeam world object
            DESCRIPTION

        '''
        # Inherits the OpenCL wrappers - i.e. the mixin classes
        super(Propagator, self).__init__()

        self.parent = parent
        self.cl_system = handybeam.cl_system.OpenCLSystem(
            parent=self.parent, use_device=self.parent.device, use_platform=self.parent.platform)

        # Run the _register methods for each of the mixin classes to initialise
        # the high-performance OpenCL kernels.
        self._register_clist_propagator()
        self._register_rect_propagator()
        self._register_hex_propagator()
        self._register_lamb_propagator()
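The class above composes a base wrapper with several mixins and then calls each mixin's `_register` method from `__init__`. The cooperative pattern can be sketched in plain Python (all names below are hypothetical stand-ins, not the real handybeam API):

```python
class Wrapper:
    """Base wrapper holding a registry of compiled kernels."""
    def __init__(self):
        self.kernels = {}


class ClistPropMixin:
    def _register_clist_propagator(self):
        # In the real code this would attach a compiled OpenCL kernel.
        self.kernels['clist'] = 'compiled-clist-kernel'


class RectPropMixin:
    def _register_rect_propagator(self):
        self.kernels['rect'] = 'compiled-rect-kernel'


class Propagator(Wrapper, ClistPropMixin, RectPropMixin):
    def __init__(self):
        super().__init__()
        # Each mixin registers its kernel against the shared registry.
        self._register_clist_propagator()
        self._register_rect_propagator()


p = Propagator()
print(sorted(p.kernels))  # ['clist', 'rect']
```

The mixins never define `__init__` themselves; they only contribute methods that assume the base class has already created the shared state.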
| 33.40625 | 146 | 0.707203 | 233 | 2,138 | 6.274678 | 0.339056 | 0.092339 | 0.153899 | 0.106019 | 0.277702 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001221 | 0.233863 | 2,138 | 63 | 147 | 33.936508 | 0.891331 | 0.340037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015873 | 0 | 1 | 0.041667 | false | 0 | 0.375 | 0 | 0.458333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
5e80811b5ad4b113c2a63a7b61b281e654ba8d66 | 3,391 | py | Python | aries_cloudagent/tails/tests/test_indy.py | ldej/aries-cloudagent-python | 25b7a9c08921e67b0962c434102489884ac403b2 | [
"Apache-2.0"
] | null | null | null | aries_cloudagent/tails/tests/test_indy.py | ldej/aries-cloudagent-python | 25b7a9c08921e67b0962c434102489884ac403b2 | [
"Apache-2.0"
] | 1 | 2020-06-16T20:20:55.000Z | 2020-06-16T20:20:55.000Z | aries_cloudagent/tails/tests/test_indy.py | ldej/aries-cloudagent-python | 25b7a9c08921e67b0962c434102489884ac403b2 | [
"Apache-2.0"
] | 1 | 2020-04-30T08:22:22.000Z | 2020-04-30T08:22:22.000Z | from asynctest import TestCase as AsyncTestCase
from asynctest import mock as async_mock

from ...config.injection_context import InjectionContext

from .. import indy_tails_server as test_module

TEST_DID = "55GkHamhTU1ZbTbV2ab9DE"
CRED_DEF_ID = f"{TEST_DID}:3:CL:1234:default"
REV_REG_ID = f"{TEST_DID}:4:{CRED_DEF_ID}:CL_ACCUM:0"


class TestIndyTailsServer(AsyncTestCase):
    async def test_upload_no_tails_base_url_x(self):
        context = InjectionContext(settings={"ledger.genesis_transactions": "dummy"})
        indy_tails = test_module.IndyTailsServer()

        with self.assertRaises(test_module.TailsServerNotConfiguredError):
            await indy_tails.upload_tails_file(context, REV_REG_ID, "/tmp/dummy/path")

    async def test_upload(self):
        context = InjectionContext(
            settings={
                "ledger.genesis_transactions": "dummy",
                "tails_server_base_url": "http://1.2.3.4:8088",
            }
        )
        indy_tails = test_module.IndyTailsServer()

        with async_mock.patch(
            "builtins.open", async_mock.MagicMock()
        ) as mock_open, async_mock.patch.object(
            test_module.aiohttp, "ClientSession", async_mock.MagicMock()
        ) as mock_cli_session:
            mock_open.return_value = async_mock.MagicMock(
                __enter__=async_mock.MagicMock()
            )
            mock_cli_session.return_value = async_mock.MagicMock(
                __aenter__=async_mock.CoroutineMock(
                    return_value=async_mock.MagicMock(
                        put=async_mock.MagicMock(
                            return_value=async_mock.MagicMock(
                                __aenter__=async_mock.CoroutineMock(
                                    return_value=async_mock.MagicMock(status=200)
                                )
                            )
                        )
                    )
                )
            )
            (ok, reason) = await indy_tails.upload_tails_file(
                context, REV_REG_ID, "/tmp/dummy/path"
            )
            assert ok
            assert reason is None

        with async_mock.patch(
            "builtins.open", async_mock.MagicMock()
        ) as mock_open, async_mock.patch.object(
            test_module.aiohttp, "ClientSession", async_mock.MagicMock()
        ) as mock_cli_session:
            mock_open.return_value = async_mock.MagicMock(
                __enter__=async_mock.MagicMock()
            )
            mock_cli_session.return_value = async_mock.MagicMock(
                __aenter__=async_mock.CoroutineMock(
                    return_value=async_mock.MagicMock(
                        put=async_mock.MagicMock(
                            return_value=async_mock.MagicMock(
                                __aenter__=async_mock.CoroutineMock(
                                    return_value=async_mock.MagicMock(
                                        status=403, reason="Unauthorized"
                                    )
                                )
                            )
                        )
                    )
                )
            )
            (ok, reason) = await indy_tails.upload_tails_file(
                context, REV_REG_ID, "/tmp/dummy/path"
            )
            assert not ok
            assert reason == "Unauthorized"
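The deeply nested mocks in the test above build a fake HTTP client whose `put()` call yields a context manager producing a response-like object with a `status` attribute. The same nesting idea can be shown with just the standard library's `unittest.mock` (synchronous here, whereas the test uses asynctest's async variants; the URL is a placeholder):

```python
from unittest import mock

# Build a client whose put() returns a context manager yielding a
# response-like object, mirroring the mock chain in the test above.
client = mock.MagicMock()
client.put.return_value.__enter__.return_value = mock.MagicMock(status=200)
client.put.return_value.__exit__.return_value = False

with client.put("http://example.invalid/tails") as resp:
    print(resp.status)  # 200
```

Configuring `return_value.__enter__` directly is often easier to read than constructing the whole chain in one nested expression, and it behaves identically.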
| 39.894118 | 86 | 0.545562 | 318 | 3,391 | 5.437107 | 0.257862 | 0.140544 | 0.187392 | 0.115674 | 0.710237 | 0.710237 | 0.668595 | 0.668595 | 0.593407 | 0.593407 | 0 | 0.012316 | 0.37747 | 3,391 | 84 | 87 | 40.369048 | 0.806727 | 0 | 0 | 0.44 | 0 | 0 | 0.092008 | 0.047774 | 0 | 0 | 0 | 0 | 0.066667 | 1 | 0 | false | 0 | 0.053333 | 0 | 0.066667 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e82af02f8365763472be7467cf284a1826edaa4 | 13,733 | py | Python | eos_db/test/test_user_api.py | cedadev/eos-db | b97b1b7c469779e370aab8ad68cf7e8d2e6ff8e6 | [
"BSD-3-Clause"
] | null | null | null | eos_db/test/test_user_api.py | cedadev/eos-db | b97b1b7c469779e370aab8ad68cf7e8d2e6ff8e6 | [
"BSD-3-Clause"
] | null | null | null | eos_db/test/test_user_api.py | cedadev/eos-db | b97b1b7c469779e370aab8ad68cf7e8d2e6ff8e6 | [
"BSD-3-Clause"
] | null | null | null | """Tests for DB API behaviour when logged in as user.
Need to sort out which tests live here and which in test_vm_actions_http
"""
import os
import unittest
from eos_db import server
from webtest import TestApp
from pyramid.paster import get_app
from http.cookiejar import DefaultCookiePolicy
# Depend on test.ini in the same dir as thsi file.
test_ini = os.path.join(os.path.dirname(__file__), 'test.ini')
class TestUserAPI(unittest.TestCase):
"""Tests API functions associated with actions a regular user can take.
Note that all tests are in-process, we don't actually start a HTTP server.
All administrative requirements will be set up with direct calls
to eos_db.server, and all user calls will be done via self.app.
"""
def setUp(self):
"""Launch app using webtest with test settings"""
self.appconf = get_app(test_ini)
self.app = TestApp(self.appconf)
#All auth via BasicAuth - never return the session cookie.
self.app.cookiejar.set_policy(DefaultCookiePolicy(allowed_domains=[]))
# This sets global var "engine" - in the case of SQLite this is a fresh RAM
# DB each time. If we only did this on class instantiation the database would
# be dirty and one test could influence another.
# TODO - add a test that tests this.
server.choose_engine("SQLite")
# Punch in new user account with direct server call
# This will implicitly generate the tables.
user_id = self.create_user("testuser")
#Here is what the user should look like when inspected
self.user_json = { "name" : "testuser testuser",
"handle" : "testuser@example.com",
"id" : 1,
"credits" : 0,
"username": "testuser"}
#print("user_id is %s" % str(user_id))
#print("user_from_db_is %s" % server.get_user_id_from_name("testuser"))
server.touch_to_add_password(user_id, "asdf")
# And log in as this user for all tests (via BasicAuth)
# FIXME - switch to token auth to speed up the tests.
self.app.authorization = ('Basic', ('testuser', 'asdf'))
"""Unauthenticated API functions.
Should respond the same regardless of authentication.
"""
def test_home_view(self):
""" Home view should respond with 200 OK. """
response = self.app.get('/', status=200)
# Not sure why Ben implemented options, but it should still work.
def test_options(self):
""" Options should respond with 200 OK. """
response = self.app.options('/', status=200)
"""User API functions.
The user functions in the API are primarily used by system utilities.
Creating a user and password, and validating against the database in
order to receive an access token, are prerequisites for using functions
in later sections of the API. These can only be called by an
administrator."""
def test_whoami(self):
""" How do I find out who I am? """
response = self.app.get('/user')
#We expect to be user 1, as the database is fresh.
#All other items should be as per create_user("testuser")
self.assertEqual( response.json, self.user_json )
def test_retrieve_my_info(self):
""" Retrieving my own user info by name should give the same result
as above."""
response = self.app.get('/users/testuser', status=200)
#We expect to be user 1, as the database is fresh.
#All other items should be as per create_user("testuser")
self.assertEqual( response.json, self.user_json )
def test_retrieve_other_user_info(self):
""" Retrieving info for another user should respond 200 OK. """
self.create_user("anotheruser")
self.assertEqual(self.app.get('/users/anotheruser').json['name'], "anotheruser anotheruser")
def test_retrieve_users(self):
""" Add another couple of users. Three records should be returned, as
there is already a testuser. """
self.create_user("foo")
self.create_user("bar")
response = self.app.get('/users')
self.assertEqual(len(response.json), 3)
#Unimplemented just now.
@unittest.expectedFailure
def test_delete_user(self):
""" Delete a user. Should fail because the account does not have permission,
but it actually fails because deletion is unimplemented.
"""
self.create_user("anotheruser")
response = self.app.delete('/users/anotheruser', status=404)
def test_change_my_password(self):
""" Apply a password to our user. Check that we receive a 200 OK.
Check we can log in with the new password but not the old.
"""
response = self.app.put('/user/password',
{'password': 'newpass'})
#This should work
self.app.authorization = ('Basic', ('testuser', 'newpass'))
self.app.get('/users/testuser')
#This should fail as the password is now wrong.
self.app.authorization = ('Basic', ('testuser', 'asdf'))
self.app.get('/users/testuser', status=401)
def test_change_other_password(self):
""" Try to change password for another user, which should fail.
"""
self.create_user("anotheruser")
response = self.app.put('/users/anotheruser/password',
{'password': 'newpass'},
status=401)
def test_retrieve_user_credit(self):
""" If administrator adds credit, I should be able to see it.
See full credit tests in test_credit.py
"""
self.add_credit(123, 'testuser')
#And retrieve it back
response = self.app.get('/user')
user_json = self.user_json.copy()
user_json['credits'] = 123
self.assertEqual( response.json, user_json )
    def test_retrieve_servers(self):
        """ A user can request a list of servers that they own.
        """
        server_id = self.create_server('fooserver', 'testuser')

        my_servers = self.app.get('/servers').json
        self.assertTrue(server_id)
        self.assertEqual(len(my_servers), 1)
        self.assertEqual(my_servers[0]['artifact_name'], 'fooserver')

    def test_retrieve_user_touches(self):
        """ Retrieve a list of touches that the user has made to the database.
            This can only be requested by the user themselves, an agent or an
            administrator. """

    def test_create_server(self):
        """ A regular user cannot create a server or give themselves ownership
            of a server, so this should produce an appropriate error.
        """

    def test_create_server_owner(self):
        """ Add an owner to a server.  Ensure that a 200 OK response results.
        """
        # FIXME - move this to administrator tests.

    """ Server State-Change Functions. """

    def test_start_server(self):
        """ Check that a server appears in state 'Starting' after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """
        server_id = self.create_server('fooserver', 'testuser')
        self.app.post('/servers/fooserver/Starting')

        # 1 - server should appear to be Starting in the list of my servers.
        my_servers = self.app.get('/servers').json
        self.assertEqual(len(my_servers), 1)
        self.assertEqual(my_servers[0]['state'], 'Starting')

        # 2 - server should appear to be Starting if I look at it directly.
        my_server = self.app.get('/servers/fooserver').json
        self.assertEqual(my_server['state'], 'Starting')

        # 3 - server should appear as the only server in state Starting.
        servers_in_state = self.app.get('/states/Starting').json
        self.assertEqual(len(servers_in_state), 1)
        self.assertEqual(servers_in_state[0]['artifact_name'], 'fooserver')

    def test_error_by_id_server(self):
        """ Check that I can put a server into the Error state, referenced by ID.
        """
        server_id = self.create_server('fooserver', 'testuser')
        self.app.post('/servers/by_id/%i/Error' % server_id)

        server_info = self.app.get('/servers/fooserver').json
        self.assertEqual(server_info['state'], 'Error')

    def test_boost_deboost_server(self):
        """ Test the boost call, which is done by putting the server into /Preparing.
            After this the server should be in the Preparing state and the user should
            have fewer credits.
            Also test the deboost.
        """
        server_id = self.create_server('boostme', 'testuser')
        self.add_credit(123, 'testuser')
        self.app.post('/servers/boostme/Preparing', params=dict(hours=20, cores=2, ram=40))

        # Check the user.
        user_info = self.app.get('/user').json
        self.assertEqual(user_info['credits'], 103)

        # Check the server.
        info_expected = dict(boosted="Boosted", boostremaining="19 hrs, 59 min",
                             ram="40 GB", cores="2")
        server_info = self.app.get('/servers/boostme').json

        # Remove items I don't want to compare from server_info.
        info_got = {k: str(server_info[k]) for k in server_info if k in info_expected}
        self.assertEqual(info_got, info_expected)

        # Deboost and check again.
        self.app.post('/servers/boostme/Pre_Deboosting')

        # Check the user - we should be down 1 credit.
        user_info = self.app.get('/user').json
        self.assertEqual(user_info['credits'], 122)

        # Check the server once more.
        info_expected = dict(boosted="Unboosted", boostremaining="N/A",
                             ram="16 GB", cores="1")
        server_info = self.app.get('/servers/boostme').json

        # Remove items I don't want to compare from server_info.
        info_got = {k: str(server_info[k]) for k in server_info if k in info_expected}
        self.assertEqual(info_got, info_expected)
    def test_restart_server(self):
        """ Check that a server appears in state 'Restarted' after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_stop_server(self):
        """ Check that a server appears in state 'Stopped' after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_prepare_server(self):
        """ Check that a server appears in state 'Prepared' after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_pre_deboost_server(self):
        """ Check that a server appears in the relevant state after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_stopped_server(self):
        """ Check that a server appears in the relevant state after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_started_server(self):
        """ Check that a server appears in the relevant state after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_prepared_server(self):
        """ Check that a server appears in the relevant state after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_predeboosted_server(self):
        """ Check that a server appears in the relevant state after using the
            relevant API call.  This also tests the function 'retrieve_servers_in_state'.
        """

    def test_retrieve_server(self):
        """ Pull back details of our server by name. """

    def test_retrieve_server_by_id(self):
        """ Our server will have ID 1.  Check that we can retrieve details of
            it."""

    def test_update_server(self):
        """ Not currently implemented. """

    def test_delete_server(self):
        """ Not currently implemented. """

    def test_set_server_specification(self):
        """ Follows hard-coded rules for machine behaviour.
            Set machine CPUs to 2.  Check, should pass.
            Set machine CPUs to 65000.  Check, should fail.
            Set machine RAM to 16.  Check, should pass.
            Set machine RAM to 65000.  Check, should fail."""

    def test_get_server_specification(self):
        """ Check that machine RAM and Cores are 2 and 16 as above. """

    def test_retrieve_job_progress(self):
        """ Not currently implemented. """

    def test_retrieve_server_touches(self):
        """ Not currently implemented. """

    ###########################################################################
    # Support Functions, calling the server code directly                     #
    ###########################################################################

    def create_user(self, name):
        # Since we are not logged in as the administrator, do this directly.
        return server.create_user("users", name + "@example.com", name + " " + name, name)

    # Servers should not normally have uuid set to name, but maybe for testing
    # it doesn't matter?
    def create_server(self, name, owner):
        owner_id = server.get_user_id_from_name(owner)
        server_id = server.create_appliance(name, name)
        server.touch_to_add_ownership(server_id, owner_id)
        return server_id

    def add_credit(self, amount, owner):
        owner_id = server.get_user_id_from_name(owner)
        server.touch_to_add_credit(owner_id, int(amount))


if __name__ == '__main__':
    unittest.main()
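The `self.app.authorization = ('Basic', ('testuser', 'asdf'))` line above makes webtest send a standard HTTP Basic `Authorization` header on every request. What actually goes on the wire can be reproduced with the standard library alone (RFC 7617 encoding; the function name is just for illustration):

```python
import base64


def basic_auth_header(username, password):
    # RFC 7617: base64-encode "username:password" and prefix with "Basic ".
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"


print(basic_auth_header("testuser", "asdf"))  # Basic dGVzdHVzZXI6YXNkZg==
```

This is also why the tests can simply reassign `self.app.authorization` to switch credentials: each request recomputes the header from the tuple.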
| 38.903683 | 104 | 0.638972 | 1,825 | 13,733 | 4.675068 | 0.203288 | 0.027075 | 0.019925 | 0.022269 | 0.383849 | 0.340483 | 0.303446 | 0.279887 | 0.253165 | 0.234412 | 0 | 0.009292 | 0.255516 | 13,733 | 352 | 105 | 39.014205 | 0.825215 | 0.393213 | 0 | 0.21374 | 0 | 0 | 0.121706 | 0.018986 | 0 | 0 | 0 | 0.005682 | 0.137405 | 1 | 0.282443 | false | 0.061069 | 0.045802 | 0.007634 | 0.351145 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
5e82e3c1af92dd0f53d2ece337b738f0ce2cd9c3 | 310 | py | Python | sandcastle/__init__.py | FrNecas/sandcastle | ece539650770fea057877f0c97074acf506fada4 | [
"MIT"
] | 5 | 2019-06-19T11:14:54.000Z | 2019-08-06T14:33:28.000Z | sandcastle/__init__.py | FrNecas/sandcastle | ece539650770fea057877f0c97074acf506fada4 | [
"MIT"
] | 97 | 2020-08-03T14:31:03.000Z | 2022-03-28T10:42:19.000Z | sandcastle/__init__.py | FrNecas/sandcastle | ece539650770fea057877f0c97074acf506fada4 | [
"MIT"
] | 8 | 2019-06-18T13:19:46.000Z | 2020-02-06T11:38:17.000Z | # Copyright Contributors to the Packit project.
# SPDX-License-Identifier: MIT

from sandcastle.api import Sandcastle, VolumeSpec, MappedDir  # NOQA
from sandcastle.exceptions import (  # NOQA
    SandcastleException,
    SandcastleTimeoutReached,
    SandcastleCommandFailed,
    SandcastleExecutionError,
)
| 25.833333 | 68 | 0.783871 | 27 | 310 | 9 | 0.814815 | 0.115226 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.158065 | 310 | 11 | 69 | 28.181818 | 0.931034 | 0.270968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.285714 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5e84c79008d8784b138388c093541e267e10dd22 | 5,703 | py | Python | src/sage/modules/module_functors.py | bopopescu/sage | 2d495be78e0bdc7a0a635454290b27bb4f5f70f0 | [
"BSL-1.0"
] | 1,742 | 2015-01-04T07:06:13.000Z | 2022-03-30T11:32:52.000Z | src/sage/modules/module_functors.py | Ivo-Maffei/sage | 467fbc70a08b552b3de33d9065204ee9cbfb02c7 | [
"BSL-1.0"
] | 66 | 2015-03-19T19:17:24.000Z | 2022-03-16T11:59:30.000Z | src/sage/modules/module_functors.py | dimpase/sage | 468f23815ade42a2192b0a9cd378de8fdc594dcd | [
"BSL-1.0"
] | 495 | 2015-01-10T10:23:18.000Z | 2022-03-24T22:06:11.000Z | """
Module Functors

AUTHORS:

- Travis Scrimshaw (2017-10): Initial implementation of
  :class:`QuotientModuleFunctor`
"""

#*****************************************************************************
#       Copyright (C) 2017 Travis Scrimshaw <tcscrims at gmail.com>
#
#  This program is free software: you can redistribute it and/or modify
#  it under the terms of the GNU General Public License as published by
#  the Free Software Foundation, either version 2 of the License, or
#  (at your option) any later version.
#                  http://www.gnu.org/licenses/
#*****************************************************************************

##############################################################
# Construction functor for quotient modules
##############################################################

from sage.categories.pushout import ConstructionFunctor
from sage.categories.modules import Modules


class QuotientModuleFunctor(ConstructionFunctor):
    r"""
    Construct the quotient of a module by a submodule.

    INPUT:

    - ``relations`` -- a module

    .. NOTE::

        This construction functor keeps track of the basis of defining
        ``relations``.  It can only be applied to free modules into which
        this basis coerces.

    EXAMPLES::

        sage: A = (1/2)*ZZ^2
        sage: B = 2*ZZ^2
        sage: Q = A / B
        sage: F = Q.construction()[0]
        sage: F
        QuotientModuleFunctor
        sage: F(A) == Q
        True

    The modules are constructed from the cover not the ambient module::

        sage: F(B.ambient_module()) == Q
        False

    We can construct quotients from different modules::

        sage: F((1/2)*ZZ^2)
        Finitely generated module V/W over Integer Ring with invariants (4, 4)
        sage: F(ZZ^2)
        Finitely generated module V/W over Integer Ring with invariants (2, 2)
        sage: F(2*ZZ^2)
        Finitely generated module V/W over Integer Ring with invariants ()

    This functor is used for constructing pushouts::

        sage: A = ZZ^3
        sage: x,y,z = A.basis()
        sage: A1 = A.submodule([x])
        sage: A2 = A.submodule([y, 2*x])
        sage: B1 = A.submodule([])
        sage: B2 = A.submodule([2*x])
        sage: Q1 = A1 / B1
        sage: Q2 = A2 / B2
        sage: q3 = Q1.an_element() + Q2.an_element()
    """
    rank = 5  # ranking of functor, not rank of module

    def __init__(self, relations):
        """
        Initialization of ``self``.

        TESTS::

            sage: from sage.modules.module_functors import QuotientModuleFunctor
            sage: B = (2/3)*ZZ^2
            sage: F = QuotientModuleFunctor(B)
            sage: TestSuite(F).run()
        """
        R = relations.category().base_ring()
        ConstructionFunctor.__init__(self, Modules(R), Modules(R))
        self._relations = relations

    def relations(self):
        """
        Return the defining relations of ``self``.

        EXAMPLES::

            sage: A = (ZZ**2) / span([[4,0],[0,3]], ZZ)
            sage: A.construction()[0].relations()
            Free module of degree 2 and rank 2 over Integer Ring
            Echelon basis matrix:
            [4 0]
            [0 3]
        """
        return self._relations

    def _apply_functor(self, ambient):
        """
        Apply the functor to an object of ``self``'s domain.

        TESTS::

            sage: A = ZZ^3
            sage: B = 2 * A
            sage: C = 4 * A
            sage: D = B / C
            sage: F = D.construction()[0]
            sage: D == F(D.construction()[1])
            True
        """
        return ambient.quotient(self._relations)

    def __eq__(self, other):
        """
        The quotient functor ``self`` is equal to ``other`` if
        it is a :class:`QuotientModuleFunctor` and the relations
        subspaces are equal.

        EXAMPLES::

            sage: F1 = ((ZZ^3) / (4*ZZ^3)).construction()[0]
            sage: F2 = ((2*ZZ^3) / (4*ZZ^3)).construction()[0]
            sage: F1 == F2
            True
            sage: F3 = ((ZZ^3) / (8*ZZ^3)).construction()[0]
            sage: F1 == F3
            False
        """
        if not isinstance(other, QuotientModuleFunctor):
            return False
        return self._relations == other._relations

    def __ne__(self, other):
        r"""
        Check whether ``self`` is not equal to ``other``.

        EXAMPLES::

            sage: F1 = ((ZZ^3) / (4*ZZ^3)).construction()[0]
            sage: F2 = ((2*ZZ^3) / (4*ZZ^3)).construction()[0]
            sage: F1 != F2
            False
            sage: F3 = ((ZZ^3) / (8*ZZ^3)).construction()[0]
            sage: F1 != F3
            True
        """
        return not (self == other)

    def merge(self, other):
        r"""
        Merge the construction functors ``self`` and ``other``.

        EXAMPLES::

            sage: A = ZZ^3
            sage: x,y,z = A.basis()
            sage: A1 = A.submodule([x])
            sage: A2 = A.submodule([y, 2*x])
            sage: B1 = A.submodule([])
            sage: B2 = A.submodule([2*x])
            sage: Q1 = A1 / B1
            sage: Q2 = A2 / B2
            sage: F1 = Q1.construction()[0]
            sage: F2 = Q2.construction()[0]
            sage: F3 = F1.merge(F2)
            sage: q3 = Q1.an_element() + Q2.an_element()
            sage: q3.parent() == F3(A1 + A2)
            True

            sage: G = A1.construction()[0]; G
            SubspaceFunctor
            sage: F1.merge(G)
            sage: F2.merge(G)
        """
        if isinstance(other, QuotientModuleFunctor):
            return QuotientModuleFunctor(self._relations + other._relations)
| 29.858639 | 80 | 0.507277 | 669 | 5,703 | 4.276532 | 0.243647 | 0.015729 | 0.05942 | 0.033555 | 0.235232 | 0.231038 | 0.231038 | 0.231038 | 0.212863 | 0.212863 | 0 | 0.035696 | 0.327021 | 5,703 | 190 | 81 | 30.015789 | 0.709745 | 0.673681 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.083333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
5e8c26d213acbf1dfb2bea3ade3d42a27bed739f | 4,621 | py | Python | academic_html/send_academic.py | aungbobo04/emailer | 2b60917300d528e12263d0d15a15b944088eb193 | [
"MIT"
] | null | null | null | academic_html/send_academic.py | aungbobo04/emailer | 2b60917300d528e12263d0d15a15b944088eb193 | [
"MIT"
] | null | null | null | academic_html/send_academic.py | aungbobo04/emailer | 2b60917300d528e12263d0d15a15b944088eb193 | [
"MIT"
] | null | null | null | #!/usr/bin/python3
import yagmail
import csv
import termcolor
import os
import sys
os.system('color')
def load_applicants(filename):
with open(filename, mode="r") as f:
csv_reader = csv.reader(f)
list_from_csv = []
for row in csv_reader:
for string in row:
list_from_csv.append(string.replace(" ", ""))
return list_from_csv
def find_pdf(filename):
for root, dirs, files in os.walk(r"./"):
for name in files:
if name == filename:
return os.path.abspath(os.path.join(root, name))
# mailing process
def send_mail(applicants, content, attachment):
try:
yag = yagmail.SMTP("cookieacademy21@gmail.com")
# validate attachment
if find_pdf(attachment) == None:
sys.exit(1)
else:
# Event Invitation
yag.send(to=applicants, cc="cookieacademy2020@gmail.com", bcc="cookieacademy19@gmail.com", subject="Students ID & Classroom Invitation Link", contents=content, attachments=find_pdf(attachment))
print(termcolor.colored("***** Sent Invitation Successfully *****", "green", attrs=["bold"]))
except SystemExit:
print(termcolor.colored("Cannot find attachment. Please recheck the attachment filename. Aborting...", "red"))
except:
print(termcolor("Error, email was not sent", "red"))
def confirm(applicants):
print(termcolor.colored(f"Recipients: {applicants}", "yellow", attrs=["bold"]))
if len(applicants) == 0:
print(termcolor.colored("Please enter recipients in academic_applicants.csv", "red"))
else:
answer = ""
while answer not in ["y", "n"]:
answer = input("OK to push to continue [Y/N]? ").lower()
return answer == "y"
if __name__ == "__main__":
applicants = load_applicants("academic_applicants.csv")
class_name = input("Enter the Name of the Class : ")
classroom_link = input("Enter the Google Classroom link : ")
attachment = input("Enter the Name of Attachment : ")
content = [f"""<!DOCTYPE html><html lang="en"><head><meta charset="UTF-8"><meta http-equiv="X-UA-Compatible" content="IE=edge"><meta name="viewport" content="width=device-width, initial-scale=1.0"><link href="https://fonts.googleapis.com/css2?family=Roboto" rel="stylesheet"><title>Students ID & Classroom Invitation!</title></head>
<body><div style="font-family:'Roboto'; box-shadow: rgb(212, 212, 212) 4px 4px; background-color:rgb(255, 255, 255); color: black; max-width: 500px; margin:auto; border: 2px solid #253a6d;"><div style="text-align:center; margin-top: 25px"><img style="width: 100%; max-width: 100px;" src="https://i.imgur.com/jfyuaOh.png" alt="cookie-logo" /><div style="max-width: 65%; font-family: 'Roboto'; font-weight: 300; font-size: 15px; color: #253a6d; margin: 20px auto;"><p style="text-align: center;">Thank You for joining {class_name} at <i>Cookie Academy</i>. You can check your name, email and student ID number in the attachment below.</p></div><a style="text-decoration: none; color: white; font-size: 14px; padding: 10px 15px; background-color: #00b9e7; border-radius: 10px;" href="{classroom_link}">Join Classroom</a><p style="font-family: 'Roboto'; font-weight: 300;font-size: 16px; margin-top: 30px;">Follow us on Social Media</p><div style="display:flex; margin-top: 10px; margin-bottom: 15px;"><div style="display:flex; margin:auto"><a href="https://www.facebook.com/cookieacademymm" target="blank"><img alt="facebook" height="30" src="https://i.imgur.com/on4VXmE.png" style="display:block; height:auto; padding-right: 25px; border:0;" title="Facebook" width="30" /></a><a href="#" target="blank"><img alt="instagram" height="30" src="https://i.imgur.com/TRncuPi.png" style="display:block; padding-right:25px; height:auto; border:0;" title="Instagram" width="30" /></a><a href="https://discord.gg/HDk38j7wJv" target="blank"><img alt="discord" height="30" src="https://i.imgur.com/B1rsYUF.png" style="display:block; height:auto; border:0;" title="Discord" width="30" /></a></div></div><div style=" font-family: 'Roboto' ; font-weight: 300; font-size: 13px; color: #7a7a7a; padding-top: 10px; padding-bottom: 35px;"><p style="margin: 3px;">Email: cookieacademy21@gmail.com</p><p style="margin: 3px;">Academic Department</p><p style="margin: 3px;">Cookie Academy © 
2021</p></div></div></div></body>
</html>"""]
if confirm(applicants):
    send_mail(applicants, content, attachment)
else:
    print(termcolor.colored("Program Abort", "red"))

# File: cifti/tests/test_io.py (repo: pauldmccarthy/cifti, license: MIT)
from .. import io, axis
from nibabel.tests.nibabel_data import get_nibabel_data, needs_nibabel_data
import os
import numpy as np
import tempfile
dirname = os.path.join(get_nibabel_data(), 'nitest-cifti2')
hcp_labels = ['CortexLeft', 'CortexRight', 'AccumbensLeft', 'AccumbensRight', 'AmygdalaLeft', 'AmygdalaRight',
'brain_stem', 'CaudateLeft', 'CaudateRight', 'CerebellumLeft', 'CerebellumRight',
'Diencephalon_ventral_left', 'Diencephalon_ventral_right', 'HippocampusLeft', 'HippocampusRight',
'PallidumLeft', 'PallidumRight', 'PutamenLeft', 'PutamenRight', 'ThalamusLeft', 'ThalamusRight']
hcp_n_elements = [29696, 29716, 135, 140, 315, 332, 3472, 728, 755, 8709, 9144, 706,
712, 764, 795, 297, 260, 1060, 1010, 1288, 1248]
hcp_affine = np.array([[ -2., 0., 0., 90.],
[ 0., 2., 0., -126.],
[ 0., 0., 2., -72.],
[ 0., 0., 0., 1.]])
def check_hcp_grayordinates(brain_model):
"""Checks that a BrainModel matches the expected 32k HCP grayordinates
"""
assert isinstance(brain_model, axis.BrainModel)
structures = list(brain_model.iter_structures())
assert len(structures) == len(hcp_labels)
idx_start = 0
for idx, (name, _, bm), label, nel in zip(range(len(structures)), structures, hcp_labels, hcp_n_elements):
if idx < 2:
assert name in bm.nvertices.keys()
assert (bm.voxel == -1).all()
assert (bm.vertex != -1).any()
assert bm.nvertices[name] == 32492
else:
assert name not in bm.nvertices.keys()
assert (bm.voxel != -1).any()
assert (bm.vertex == -1).all()
assert (bm.affine == hcp_affine).all()
assert bm.volume_shape == (91, 109, 91)
assert name == axis.BrainModel.to_cifti_brain_structure_name(label)
assert len(bm) == nel
assert (bm.arr == brain_model.arr[idx_start:idx_start + nel]).all()
idx_start += nel
assert idx_start == len(brain_model)
assert (brain_model.arr[:5]['vertex'] == np.arange(5)).all()
assert structures[0][2].vertex[-1] == 32491
assert structures[1][2].vertex[0] == 0
assert structures[1][2].vertex[-1] == 32491
assert (structures[-1][2].arr[-1] == brain_model.arr[-1]).all()
assert (brain_model.arr[-1]['voxel'] == [38, 55, 46]).all()
assert (brain_model.arr[70000]['voxel'] == [56, 22, 19]).all()
def check_Conte69(brain_model):
"""Checks that the BrainModel matches the expected Conte69 surface coordinates
"""
assert isinstance(brain_model, axis.BrainModel)
structures = list(brain_model.iter_structures())
assert len(structures) == 2
assert structures[0][0] == 'CIFTI_STRUCTURE_CORTEX_LEFT'
assert structures[0][2].is_surface.all()
assert structures[1][0] == 'CIFTI_STRUCTURE_CORTEX_RIGHT'
assert structures[1][2].is_surface.all()
assert (brain_model.voxel == -1).all()
assert (brain_model.arr[:5]['vertex'] == np.arange(5)).all()
assert structures[0][2].vertex[-1] == 32491
assert structures[1][2].vertex[0] == 0
assert structures[1][2].vertex[-1] == 32491
def check_rewrite(arr, axes, extension='.nii'):
(fd, name) = tempfile.mkstemp(extension)
io.write(name, arr, axes)
arr2, axes2 = io.read(name)
assert (arr == arr2).all()
assert (axes == axes2)
assert (axes == io.get_axes(name))
return arr2, axes2
@needs_nibabel_data('nitest-cifti2')
def test_read_ones():
arr, axes = io.read(os.path.join(dirname, 'ones.dscalar.nii'))
assert (arr == 1).all()
assert isinstance(axes[0], axis.Scalar)
assert len(axes[0]) == 1
assert axes[0].name[0] == 'ones'
assert axes[0].meta[0] == {}
check_hcp_grayordinates(axes[1])
arr2, axes2 = check_rewrite(arr, axes)
check_hcp_grayordinates(axes2[1])
@needs_nibabel_data('nitest-cifti2')
def test_read_conte69_dscalar():
arr, axes = io.read(os.path.join(dirname, 'Conte69.MyelinAndCorrThickness.32k_fs_LR.dscalar.nii'))
assert isinstance(axes[0], axis.Scalar)
assert len(axes[0]) == 2
assert axes[0].name[0] == 'MyelinMap_BC_decurv'
assert axes[0].name[1] == 'corrThickness'
assert axes[0].meta[0] == {'PaletteColorMapping': '<PaletteColorMapping Version="1">\n <ScaleMode>MODE_AUTO_SCALE_PERCENTAGE</ScaleMode>\n <AutoScalePercentageValues>98.000000 2.000000 2.000000 98.000000</AutoScalePercentageValues>\n <UserScaleValues>-100.000000 0.000000 0.000000 100.000000</UserScaleValues>\n <PaletteName>ROY-BIG-BL</PaletteName>\n <InterpolatePalette>true</InterpolatePalette>\n <DisplayPositiveData>true</DisplayPositiveData>\n <DisplayZeroData>false</DisplayZeroData>\n <DisplayNegativeData>true</DisplayNegativeData>\n <ThresholdTest>THRESHOLD_TEST_SHOW_OUTSIDE</ThresholdTest>\n <ThresholdType>THRESHOLD_TYPE_OFF</ThresholdType>\n <ThresholdFailureInGreen>false</ThresholdFailureInGreen>\n <ThresholdNormalValues>-1.000000 1.000000</ThresholdNormalValues>\n <ThresholdMappedValues>-1.000000 1.000000</ThresholdMappedValues>\n <ThresholdMappedAvgAreaValues>-1.000000 1.000000</ThresholdMappedAvgAreaValues>\n <ThresholdDataName></ThresholdDataName>\n <ThresholdRangeMode>PALETTE_THRESHOLD_RANGE_MODE_MAP</ThresholdRangeMode>\n</PaletteColorMapping>'}
check_Conte69(axes[1])
check_rewrite(arr, axes)
@needs_nibabel_data('nitest-cifti2')
def test_read_conte69_dtseries():
arr, axes = io.read(os.path.join(dirname, 'Conte69.MyelinAndCorrThickness.32k_fs_LR.dtseries.nii'))
assert isinstance(axes[0], axis.Series)
assert len(axes[0]) == 2
assert axes[0].start == 0
assert axes[0].step == 1
assert axes[0].size == arr.shape[0]
assert (axes[0].arr == [0, 1]).all()
check_Conte69(axes[1])
check_rewrite(arr, axes)
@needs_nibabel_data('nitest-cifti2')
def test_read_conte69_dlabel():
arr, axes = io.read(os.path.join(dirname, 'Conte69.parcellations_VGD11b.32k_fs_LR.dlabel.nii'))
assert isinstance(axes[0], axis.Label)
assert len(axes[0]) == 3
assert (axes[0].name == ['Composite Parcellation-lh (FRB08_OFP03_retinotopic)',
'Brodmann lh (from colin.R via pals_R-to-fs_LR)', 'MEDIAL WALL lh (fs_LR)']).all()
assert axes[0].label[1][70] == ('19_B05', (1.0, 0.867, 0.467, 1.0))
assert (axes[0].meta == [{}] * 3).all()
check_Conte69(axes[1])
check_rewrite(arr, axes)
@needs_nibabel_data('nitest-cifti2')
def test_read_conte69_ptseries():
arr, axes = io.read(os.path.join(dirname, 'Conte69.MyelinAndCorrThickness.32k_fs_LR.ptseries.nii'))
assert isinstance(axes[0], axis.Series)
assert len(axes[0]) == 2
assert axes[0].start == 0
assert axes[0].step == 1
assert axes[0].size == arr.shape[0]
assert (axes[0].arr == [0, 1]).all()
assert len(axes[1]) == 54
voxels, vertices = axes[1]['ER_FRB08']
assert voxels.shape == (0, 3)
assert len(vertices) == 2
assert vertices['CIFTI_STRUCTURE_CORTEX_LEFT'].shape == (206 // 2, )
assert vertices['CIFTI_STRUCTURE_CORTEX_RIGHT'].shape == (206 // 2, )
check_rewrite(arr, axes)
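The `check_rewrite` helper above encodes a write/read round-trip invariant: serialize, deserialize, assert equality, and hand back the reloaded objects for further checks. A minimal stdlib-only sketch of the same pattern, with hypothetical `save`/`load` stand-ins (JSON instead of CIFTI) — note it also closes the descriptor that `tempfile.mkstemp` leaves open:

```python
import json
import os
import tempfile

def save(name, data):
    # Stand-in for io.write: serialize to disk.
    with open(name, "w") as f:
        json.dump(data, f)

def load(name):
    # Stand-in for io.read: deserialize from disk.
    with open(name) as f:
        return json.load(f)

def check_rewrite(data, extension=".json"):
    # Same shape as the CIFTI helper: write, read back, assert equality,
    # and return the reloaded object so callers can inspect it further.
    fd, name = tempfile.mkstemp(extension)
    os.close(fd)  # mkstemp leaves the low-level descriptor open
    try:
        save(name, data)
        data2 = load(name)
        assert data == data2
        return data2
    finally:
        os.remove(name)

roundtripped = check_rewrite({"labels": ["a", "b"], "values": [1, 2]})
```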

# File: mininlp/metrics.py (repo: davidleonfdez/mini-nlp-framework, license: MIT)
from abc import ABC, abstractmethod
from functools import partial
from mininlp.predict import predict_binary, predict_dl, predict_multiclass
from sklearn.metrics import accuracy_score, f1_score
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from typing import Callable
def metric_lower_is_better(metric_fn:Callable):
"Indicates whether a lower value returned by `metric_fn` implies better predictions."
if metric_fn in (accuracy_score, f1_score):
return False
#if metric_fn in (,):
# return True
raise ValueError("Unsupported metric function")
class Metric(ABC):
"Base metric class. Child classes must wrap a callable metric function."
@property
@abstractmethod
def lower_is_better(self) -> bool:
pass
@property
@abstractmethod
def name(self) -> str:
pass
@abstractmethod
def __call__(self, model:nn.Module, dl:DataLoader, **predict_kwargs) -> float:
"Calculate the value of the metric using the inputs and labels given by `dl`."
class BinaryClassificationMetric(Metric):
"Metric appropriate for binary classification problems. Default metric function is F1-score."
def __init__(self, metric_fn=f1_score):
self.metric_fn = metric_fn
def __call__(self, model:nn.Module, dl:DataLoader, **predict_kwargs) -> float:
with torch.no_grad():
preds, y = predict_dl(model, dl, **predict_kwargs)
return self.metric_fn(y, preds)
@property
def lower_is_better(self) -> bool:
return metric_lower_is_better(self.metric_fn)
@property
def name(self) -> str:
return self.metric_fn.__name__
class MulticlassClassificationMetric(Metric):
"Metric appropriate for multi-class classification problems. Default metric function is F1-score."
def __init__(self, metric_fn=partial(f1_score, average='weighted')):
self.metric_fn = metric_fn
self.inner_metric_fn = metric_fn
while isinstance(self.inner_metric_fn, partial):
self.inner_metric_fn = self.inner_metric_fn.func
def __call__(self, model:nn.Module, dl:DataLoader, **predict_kwargs) -> float:
with torch.no_grad():
preds, y = predict_dl(model, dl, predict=predict_multiclass, **predict_kwargs)
return self.metric_fn(y, preds)
@property
def lower_is_better(self) -> bool:
return metric_lower_is_better(self.inner_metric_fn)
@property
def name(self) -> str:
return self.inner_metric_fn.__name__
class LanguageModelMetric(Metric):
"Metric appropriate for language modeling problems. Default metric function is accuracy."
def __init__(self, metric_fn=accuracy_score, pad_idx=0):
self.metric_fn = metric_fn
self.pad_idx = pad_idx
def __call__(self, model:nn.Module, dl:DataLoader, **predict_kwargs) -> float:
with torch.no_grad():
preds, y = predict_dl(model, dl, predict=predict_multiclass, **predict_kwargs)
mask = y.view(-1) != self.pad_idx
return self.metric_fn(y.view(-1)[mask], preds.view(-1)[mask])
@property
def lower_is_better(self) -> bool:
return metric_lower_is_better(self.metric_fn)
@property
def name(self) -> str:
return self.metric_fn.__name__
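`MulticlassClassificationMetric.__init__` above unwraps nested `functools.partial` objects to recover the plain function underneath, so that `lower_is_better` and `name` still work when the metric comes pre-configured with keyword arguments. The unwrapping loop in isolation, with a toy metric standing in for `sklearn`'s `f1_score` (names here are illustrative, not from the library):

```python
from functools import partial

def unwrap_partial(fn):
    # Follow .func until we reach the plain callable underneath.
    while isinstance(fn, partial):
        fn = fn.func
    return fn

def toy_score(y_true, y_pred, average=None):
    # Hypothetical stand-in for sklearn.metrics.f1_score.
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Nested partials, e.g. partial(f1_score, average='weighted') wrapped again.
configured = partial(partial(toy_score, average="weighted"))
inner = unwrap_partial(configured)
# inner is toy_score again, so inner.__name__ is usable for reporting.
```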

# File: api/league/migrations/0001_initial.py (repo: wepickheroes/wepickheroes.github.io, license: MIT)
# Generated by Django 2.0 on 2018-03-24 21:43
from django.db import migrations, models
import django.db.models.deletion
import django.utils.timezone
import uuid
class Migration(migrations.Migration):
initial = True
dependencies = [
('nucleus', '0003_auto_20180324_2100'),
('teams', '0003_auto_20180324_2100'),
]
operations = [
migrations.CreateModel(
name='Division',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('created', models.DateTimeField(default=django.utils.timezone.now, editable=False)),
('updated', models.DateTimeField(blank=True, editable=False, null=True)),
('number', models.IntegerField()),
('name', models.CharField(blank=True, max_length=256, null=True)),
],
options={
'ordering': ('number',),
},
),
migrations.CreateModel(
name='DivisionSeason',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('created', models.DateTimeField(default=django.utils.timezone.now, editable=False)),
('updated', models.DateTimeField(blank=True, editable=False, null=True)),
('division', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='league.Division')),
],
options={
'ordering': ('division__number', 'season__number'),
},
),
migrations.CreateModel(
name='League',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('created', models.DateTimeField(default=django.utils.timezone.now, editable=False)),
('updated', models.DateTimeField(blank=True, editable=False, null=True)),
('name', models.CharField(max_length=256)),
('num_series_per_season', models.IntegerField()),
('num_games_per_series', models.IntegerField()),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='LeagueRegistration',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('created', models.DateTimeField(default=django.utils.timezone.now, editable=False)),
('updated', models.DateTimeField(blank=True, editable=False, null=True)),
('league', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='league.League')),
('registered_by', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='nucleus.TeamMember')),
('team', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='teams.Team')),
],
options={
'abstract': False,
},
),
migrations.CreateModel(
name='Match',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('created', models.DateTimeField(default=django.utils.timezone.now, editable=False)),
('updated', models.DateTimeField(blank=True, editable=False, null=True)),
('matchid', models.CharField(blank=True, max_length=32, null=True)),
('loser', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='matches_lost', to='teams.Team')),
],
options={
'verbose_name_plural': 'Matches',
},
),
migrations.CreateModel(
name='Season',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('created', models.DateTimeField(default=django.utils.timezone.now, editable=False)),
('updated', models.DateTimeField(blank=True, editable=False, null=True)),
('number', models.IntegerField(editable=False, unique=True)),
('start_date', models.DateField()),
('end_date', models.DateField()),
],
options={
'ordering': ('number',),
},
),
migrations.CreateModel(
name='Series',
fields=[
('id', models.UUIDField(default=uuid.uuid4, editable=False, primary_key=True, serialize=False)),
('created', models.DateTimeField(default=django.utils.timezone.now, editable=False)),
('updated', models.DateTimeField(blank=True, editable=False, null=True)),
('start_date', models.DateField()),
('end_date', models.DateField()),
('division_season', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='league.DivisionSeason')),
('loser', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='series_lost', to='teams.Team')),
('team_a', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='series_as_team_a', to='teams.Team')),
('team_b', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='series_as_team_b', to='teams.Team')),
('winner', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='series_won', to='teams.Team')),
],
options={
'verbose_name_plural': 'Series',
'ordering': ('division_season__division__number', 'division_season__season__number', 'start_date'),
},
),
migrations.AddField(
model_name='match',
name='series',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='league.Series'),
),
migrations.AddField(
model_name='match',
name='winner',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.CASCADE, related_name='matches_won', to='teams.Team'),
),
migrations.AddField(
model_name='league',
name='seasons',
field=models.ManyToManyField(blank=True, related_name='leagues', to='league.Season'),
),
migrations.AddField(
model_name='divisionseason',
name='season',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='league.Season'),
),
migrations.AddField(
model_name='divisionseason',
name='teams',
field=models.ManyToManyField(related_name='division_seasons', to='teams.Team'),
),
migrations.AddField(
model_name='division',
name='league',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='league.League'),
),
migrations.AddField(
model_name='division',
name='seasons',
field=models.ManyToManyField(related_name='divisions', through='league.DivisionSeason', to='league.Season'),
),
migrations.AlterUniqueTogether(
name='division',
unique_together={('league', 'number')},
),
]

# File: authlib/jose/rfc7516/models.py (repo: YPCrumble/authlib, license: BSD-3-Clause)
import os
class JWEAlgorithm(object):
"""Interface for JWE algorithm. JWA specification (RFC7518) SHOULD
implement the algorithms for JWE with this base implementation.
"""
EXTRA_HEADERS = None
name = None
description = None
algorithm_type = 'JWE'
algorithm_location = 'alg'
def prepare_key(self, raw_data):
raise NotImplementedError
def wrap(self, enc_alg, headers, key):
raise NotImplementedError
def unwrap(self, enc_alg, ek, headers, key):
raise NotImplementedError
class JWEEncAlgorithm(object):
name = None
description = None
algorithm_type = 'JWE'
algorithm_location = 'enc'
IV_SIZE = None
CEK_SIZE = None
def generate_cek(self):
return os.urandom(self.CEK_SIZE // 8)
def generate_iv(self):
return os.urandom(self.IV_SIZE // 8)
def check_iv(self, iv):
if len(iv) * 8 != self.IV_SIZE:
raise ValueError('Invalid "iv" size')
def encrypt(self, msg, aad, iv, key):
"""Encrypt the given "msg" text.
:param msg: text to be encrypt in bytes
:param aad: additional authenticated data in bytes
:param iv: initialization vector in bytes
:param key: encrypted key in bytes
:return: (ciphertext, iv, tag)
"""
raise NotImplementedError
def decrypt(self, ciphertext, aad, iv, tag, key):
"""Decrypt the given cipher text.
:param ciphertext: ciphertext in bytes
:param aad: additional authenticated data in bytes
:param iv: initialization vector in bytes
:param tag: authentication tag in bytes
:param key: encrypted key in bytes
:return: message
"""
raise NotImplementedError
class JWEZipAlgorithm(object):
name = None
description = None
algorithm_type = 'JWE'
algorithm_location = 'zip'
def compress(self, s):
raise NotImplementedError
def decompress(self, s):
raise NotImplementedError
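The `// 8` and `* 8` arithmetic in `JWEEncAlgorithm` converts between the bit sizes JWA specifies and the byte counts `os.urandom` and `len` work with. A minimal concrete subclass-style sketch (sizes chosen to resemble an AES-256-GCM-style algorithm; this demo class is an illustration, not part of authlib):

```python
import os

class DemoGCMEnc:
    # Sizes are expressed in bits, as in the JWA spec:
    IV_SIZE = 96    # 96-bit initialization vector
    CEK_SIZE = 256  # 256-bit content encryption key

    def generate_cek(self):
        return os.urandom(self.CEK_SIZE // 8)  # 32 random bytes

    def generate_iv(self):
        return os.urandom(self.IV_SIZE // 8)   # 12 random bytes

    def check_iv(self, iv):
        # len() counts bytes, the constant counts bits, hence the * 8.
        if len(iv) * 8 != self.IV_SIZE:
            raise ValueError('Invalid "iv" size')

enc = DemoGCMEnc()
enc.check_iv(enc.generate_iv())  # a freshly generated IV always passes
```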

# File: app.py (repo: bfontaine/RemindMe, license: MIT)
# -*- coding: UTF-8 -*-
from flask import Flask, render_template, g, request, flash, session
from flask.ext.assets import Environment, Bundle
from flask.ext.babel import Babel, gettext
from webassets_iife import IIFE
from remindme import ajax, store
from remindme.core import schedule_sms, SMSException
from remindme.flaskutils import logged_only, unlogged_only, redirect_for, \
retrieve_session, user, title, set_debug_mode, is_debug_mode
from remindme.log import mkLogger
app = Flask(__name__)
app.config.from_pyfile('remindme.cfg', silent=True)
logger = mkLogger('app')
# i18n
babel = Babel(app)
# assets
assets = Environment(app)
js_filters = []
css_filters = []
if not app.config['DEBUG']:
    js_filters.extend([IIFE, 'closure_js'])
    css_filters.extend(['cssmin'])
# - JS
js = Bundle(
# Bootstrap/Bootflat
'js/vendor/jquery.js',
'js/vendor/html5shiv.js',
'js/vendor/bootstrap.min.js',
'js/vendor/angular.js',
'js/vendor/angular-animate.js',
'js/vendor/mousetrap.js',
'js/vendor/wMousetrap.js',
'js/vendor/ui-bootstrap-tpls-0.11.0.js',
# Our JS
'js/utils.js',
'js/app.js',
filters=js_filters,
output='js/rm.js')
assets.register('js_all', js)
# - CSS
css = Bundle(
# Bootstrap/Bootflat
'css/bootstrap.min.css',
'css/bootflat.min.css',
# Our JS
'css/app.css',
filters=css_filters,
output='css/rm.css')
assets.register('css_all', css)
@app.before_request
def set_current_user():
_id = session.get('_id')
if _id and '/static/' not in request.path:
setattr(g, 'user', store.get_user(_id=_id))
@app.before_request
def set_g_locale():
if '/static/' not in request.path:
logger.debug("setting user locale...")
setattr(g, 'locale', babel.locale_selector_func())
@babel.localeselector
def get_locale():
trs = [str(t) for t in babel.list_translations()]
# 1. ?locale=
locale_param = request.args.get('locale') or request.args.get('lang')
if locale_param:
if locale_param[:2] in trs:
logger.debug("Known locale param: %s", locale_param)
return locale_param
logger.debug("Unknown locale param: %s", locale_param)
# 2. user.locale
u = user()
if u and u.locale:
logger.debug("Using user locale")
return u.locale
# 3. request header
logger.debug("locale: fall back in headers")
return request.accept_languages.best_match(trs)
@app.route('/')
@unlogged_only
def index():
return render_template('main.html')
@app.route('/home', methods=['GET', 'POST'])
@logged_only
def app_index():
set_debug_mode(request.args.get('debug_mode'))
if (is_debug_mode()):
flash(gettext("Debug mode"), 'warning')
fields = retrieve_session('app_index')
if request.method == 'POST':
key, pswd = g.user.api_username, g.user.api_password
try:
schedule_sms(request.form['text'], request.form['when'],
{'user': key, 'pass': pswd})
except SMSException as e:
print e
flash(gettext("Oops, error."), 'danger')
return redirect_for('app_index')
else:
flash(gettext("Scheduled!"), 'success')
return redirect_for('app_index')
else:
return render_template('app_main.html', fields=fields)
@app.route('/settings')
@logged_only
def user_settings():
# TODO prefill fields
return render_template('app_user_settings.html', fields={})
@app.route('/login', methods=['GET', 'POST'])
@title('Login')
@unlogged_only
def login():
err = gettext('Wrong email or password.')
if request.method == 'POST':
email = request.form['email']
user = store.get_user(email=email)
if not user or not user.check_password(request.form['password']):
flash(err, 'danger')
return redirect_for('login')
session['_id'] = str(user._id)
return redirect_for('app_index')
fields = retrieve_session('login')
return render_template('login.html', fields=fields)
@app.route('/signin', methods=['GET', 'POST'])
@title('Signin')
@unlogged_only
def signin():
fields = retrieve_session('signin')
if request.method == 'POST':
email = request.form['email']
if store.get_user(email=email):
flash(gettext('This email is already registered'), 'danger')
return redirect_for('signin', request.form)
api_user = request.form['api_user']
api_pass = request.form['api_pass']
if store.get_user(api_user=api_user, api_pass=api_pass):
flash(gettext('These API credentials are already registered'),
'danger')
return redirect_for('signin', request.form)
passwd = request.form['password']
user = store.User(email, passwd, api_user, api_pass)
user.save()
flash(gettext('Your account has been successfully created!'),
'success')
return redirect_for('login', {'email': user.email})
return render_template('signin.html', fields=fields)
@app.route('/logout', methods=['POST'])
@logged_only
def logout():
g.user = None
session.clear()
return redirect_for('index')
# API
@app.route('/ajax/sms/schedule', methods=['POST'])
@logged_only
def ajax_sms_schedule(): # POST text (string), when (UTC date string)
return ajax.api_schedule_sms(request.data)
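The `get_locale` selector above resolves the locale in three steps: an explicit `?locale=`/`?lang=` parameter (if it names a known translation), then the logged-in user's saved locale, then the best match from the `Accept-Language` header. The same first-hit fallback chain, stripped of Flask and Babel (a hypothetical helper, for illustration only):

```python
def resolve_locale(param, user_locale, header_best, known=("en", "fr")):
    # 1. explicit ?locale=/?lang= parameter, if its language code is known
    if param and param[:2] in known:
        return param
    # 2. the locale stored on the user's account
    if user_locale:
        return user_locale
    # 3. best match negotiated from the Accept-Language header
    return header_best
```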

# File: lm_eval/tasks/gem_wikilingua.py (repo: bigscience-workshop/lm-evaluation-harness, license: MIT)
"""
WikiLingua: A New Benchmark Dataset for Cross-Lingual Abstractive Summarization
https://arxiv.org/pdf/2010.03093.pdf
Wikilingua is a large-scale (~770k article-summary pairs), multilingual dataset for the evaluation of cross-lingual abstractive systems.
It consists of parallel articles and summaries (article-summary pairs) from WikiHow across 18 languages (i.e. all the languages available on WikiHow).
It contains 141,457 unique English articles and each of the other 17 languages has on average, 42,783 articles that align with an article in English.
This dataset is part of the GEM Benchmark. (Description from https://gem-benchmark.com/data_cards/WikiLingua)
Homepage: None, Repo: https://github.com/esdurmus/Wikilingua
"""
import typing
from lm_eval.base import PromptSourceTask
_CITATION = """
@inproceedings{ladhak-wiki-2020,
title={WikiLingua: A New Benchmark Dataset for Multilingual Abstractive Summarization},
author={Faisal Ladhak, Esin Durmus, Claire Cardie and Kathleen McKeown},
booktitle={Findings of EMNLP, 2020},
year={2020}
}"""
class GEMWikiLinguaBase(PromptSourceTask):
VERSION = 0
DATASET_PATH = "GEM/wiki_lingua"
DATASET_NAME = None
def has_training_docs(self):
return True
def has_validation_docs(self):
return True
def has_test_docs(self):
return True
def training_docs(self):
if self.has_training_docs():
if self._training_docs is None:
self._training_docs = list(self.dataset["train"])
return self._training_docs
def validation_docs(self):
if self.has_validation_docs():
return self.dataset["validation"]
def test_docs(self):
if self.has_test_docs():
return self.dataset["test"]
def max_generation_length(self):
return 64
class GEMWikiLinguaAr(GEMWikiLinguaBase):
DATASET_NAME = "ar"
class GEMWikiLinguaCs(GEMWikiLinguaBase):
DATASET_NAME = "cs"
class GEMWikiLinguaDe(GEMWikiLinguaBase):
DATASET_NAME = "de"
class GEMWikiLinguaEn(GEMWikiLinguaBase):
DATASET_NAME = "en"
class GEMWikiLinguaEs(GEMWikiLinguaBase):
DATASET_NAME = "es"
class GEMWikiLinguaFr(GEMWikiLinguaBase):
DATASET_NAME = "fr"
class GEMWikiLinguaHi(GEMWikiLinguaBase):
DATASET_NAME = "hi"
class GEMWikiLinguaId(GEMWikiLinguaBase):
DATASET_NAME = "id"
class GEMWikiLinguaIt(GEMWikiLinguaBase):
DATASET_NAME = "it"
class GEMWikiLinguaJa(GEMWikiLinguaBase):
DATASET_NAME = "ja"
class GEMWikiLinguaKo(GEMWikiLinguaBase):
DATASET_NAME = "ko"
class GEMWikiLinguaNl(GEMWikiLinguaBase):
DATASET_NAME = "nl"
class GEMWikiLinguaPt(GEMWikiLinguaBase):
DATASET_NAME = "pt"
class GEMWikiLinguaRu(GEMWikiLinguaBase):
DATASET_NAME = "ru"
class GEMWikiLinguaTh(GEMWikiLinguaBase):
DATASET_NAME = "th"
class GEMWikiLinguaTr(GEMWikiLinguaBase):
DATASET_NAME = "tr"
class GEMWikiLinguaVi(GEMWikiLinguaBase):
DATASET_NAME = "vi"
class GEMWikiLinguaZh(GEMWikiLinguaBase):
DATASET_NAME = "zh"
WIKILINGUA_TASKS = [
GEMWikiLinguaAr,
GEMWikiLinguaCs,
GEMWikiLinguaDe,
GEMWikiLinguaEn,
GEMWikiLinguaEs,
GEMWikiLinguaFr,
GEMWikiLinguaHi,
GEMWikiLinguaId,
GEMWikiLinguaIt,
GEMWikiLinguaJa,
GEMWikiLinguaKo,
GEMWikiLinguaNl,
GEMWikiLinguaPt,
GEMWikiLinguaRu,
GEMWikiLinguaTh,
GEMWikiLinguaTr,
GEMWikiLinguaVi,
GEMWikiLinguaZh,
]
def construct_tasks() -> typing.Dict[str, GEMWikiLinguaBase]:
"""
Returns a dictionary of tasks keyed by task name, for example:
"GEM/wiki_lingua_ar"
will dispatch to the GEM WikiLingua Arabic class.
"""
tasks = {}
for task_class in WIKILINGUA_TASKS:
benchmark = task_class.DATASET_PATH
lang = task_class.DATASET_NAME
tasks[f'{benchmark}_{lang}'] = task_class
return tasks
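`construct_tasks` above builds its registry by joining each class's `DATASET_PATH` and `DATASET_NAME` into a dispatch key. The same pattern with dummy classes in place of the full task hierarchy:

```python
class TaskBase:
    DATASET_PATH = "GEM/wiki_lingua"
    DATASET_NAME = None

class TaskAr(TaskBase):
    DATASET_NAME = "ar"

class TaskEn(TaskBase):
    DATASET_NAME = "en"

def build_registry(task_classes):
    # Key each class by "<benchmark>_<language>", mirroring construct_tasks.
    tasks = {}
    for task_class in task_classes:
        benchmark = task_class.DATASET_PATH
        lang = task_class.DATASET_NAME
        tasks[f"{benchmark}_{lang}"] = task_class
    return tasks

registry = build_registry([TaskAr, TaskEn])
```

A lookup such as `registry["GEM/wiki_lingua_ar"]` then dispatches to the Arabic task class.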

# File: Codewars/8kyu/simple-calculator/Python/solution2.py (repo: RevansChen/online-judge, license: MIT)
# Python - 3.4.3
def calculator(x, y, op):
    # Build a dict of every operator's result and return the one matching op.
    # Note: all four results are computed eagerly, so y must be nonzero.
    return {
        '+': x + y,
        '-': x - y,
        '*': x * y,
        '/': x / y
}.get(op, 'unknown value') | 19.1 | 30 | 0.39267 | 23 | 191 | 3.26087 | 0.565217 | 0.133333 | 0.12 | 0.16 | 0.106667 | 0.106667 | 0 | 0 | 0 | 0 | 0 | 0.025862 | 0.39267 | 191 | 10 | 30 | 19.1 | 0.62069 | 0.172775 | 0 | 0 | 0 | 0 | 0.108974 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0.142857 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
5ecea3f9b8847a35015662148b19aba1582057af | 385 | py | Python | products/models.py | Kmiokande/estocator | 2f657f33372d6f52ea840ed2fe9760da4b5c52af | [
"MIT"
] | null | null | null | products/models.py | Kmiokande/estocator | 2f657f33372d6f52ea840ed2fe9760da4b5c52af | [
"MIT"
] | 1 | 2021-04-28T06:35:02.000Z | 2021-04-28T06:35:02.000Z | products/models.py | higornobrega/estocator | 5c652cfadb523a19393fa11f533d573bfcc9460f | [
"MIT"
] | 1 | 2021-03-26T17:30:24.000Z | 2021-03-26T17:30:24.000Z | from django.db import models
class Product(models.Model):
    name = models.CharField(max_length=100)
    brand = models.CharField(max_length=50)
    price = models.DecimalField(max_digits=6, decimal_places=2)
    quantity = models.IntegerField()
    bar_code = models.IntegerField(unique=True)

    def __str__(self):
        return self.name + " - Bar code: " + str(self.bar_code)
| 29.615385 | 63 | 0.706494 | 51 | 385 | 5.137255 | 0.627451 | 0.080153 | 0.137405 | 0.183206 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022152 | 0.179221 | 385 | 12 | 64 | 32.083333 | 0.806962 | 0 | 0 | 0 | 0 | 0 | 0.033766 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.111111 | 0.111111 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
0d5a4812f0200fbc6773cee701b6ec0bb64d427f | 4,166 | py | Python | core/networking.py | ikea-lisp-code/firemix | a4e2af316fa3ec7e847bc892256eee71c1d5619c | [
"MIT"
] | 2 | 2018-10-04T18:54:33.000Z | 2019-08-10T22:33:16.000Z | core/networking.py | ikea-lisp-code/firemix | a4e2af316fa3ec7e847bc892256eee71c1d5619c | [
"MIT"
] | null | null | null | core/networking.py | ikea-lisp-code/firemix | a4e2af316fa3ec7e847bc892256eee71c1d5619c | [
"MIT"
] | null | null | null | import sys
import numpy as np
import socket
import array
import struct
import time

from profilehooks import profile

from lib.colors import hls_to_rgb
from lib.buffer_utils import BufferUtils

COMMAND_SET_BGR = 0x10
COMMAND_SET_RGB = 0x20


class Networking:

    def __init__(self, app):
        self._socket = None
        self._app = app
        self.open_socket()
        self._packet_cache = {}

    def open_socket(self):
        self._socket = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def write_commands(self, commands):
        """TODO implement"""
        pass

    @profile
    def write_buffer(self, buffer):
        """
        Performs a bulk strand write.
        Decodes the HLS-Float data according to client settings
        """
        strand_settings = self._app.scene.get_strand_settings()

        # Protect against presets or transitions that write float data.
        buffer_rgb = np.int_(hls_to_rgb(buffer) * 255)

        def fill_packet(intbuffer, start, end, offset, packet, swap_order=False):
            for pixel_index, pixel in enumerate(intbuffer[start:end]):
                buffer_index = offset + pixel_index * 3
                if swap_order:
                    packet[buffer_index] = pixel[2]
                    packet[buffer_index + 1] = pixel[1]
                    packet[buffer_index + 2] = pixel[0]
                else:
                    packet[buffer_index] = pixel[0]
                    packet[buffer_index + 1] = pixel[1]
                    packet[buffer_index + 2] = pixel[2]

        clients = [client for client in self._app.settings['networking']['clients']
                   if client["enabled"]]
        if not clients:
            return

        for strand in range(len(strand_settings)):
            if not strand_settings[strand]["enabled"]:
                continue

            color_mode = strand_settings[strand]["color-mode"]
            start, end = BufferUtils.get_strand_extents(strand)

            packet_header_size = 4
            packet_size = (end - start) * 3 + packet_header_size

            packet = self._packet_cache.get(packet_size, None)
            if packet is None:
                packet = [0] * packet_size
                self._packet_cache[packet_size] = packet

            command = COMMAND_SET_RGB if color_mode == "RGB8" else COMMAND_SET_BGR
            packet[0] = strand
            packet[1] = command
            length = packet_size - packet_header_size
            packet[2] = length & 0x00FF
            packet[3] = (length & 0xFF00) >> 8

            rgb8_packet = None
            bgr8_packet = None
            for client in clients:
                # TODO: Split into smaller packets so that less-than-ideal networks will be OK
                client_color_mode = client["color-mode"]
                if client_color_mode == 'RGB8':
                    if rgb8_packet is None:
                        fill_packet(buffer_rgb, start, end, packet_header_size, packet, False)
                        rgb8_packet = array.array('B', packet)
                    packet = rgb8_packet
                elif client_color_mode == 'BGR8':
                    if bgr8_packet is None:
                        fill_packet(buffer_rgb, start, end, packet_header_size, packet, True)
                        bgr8_packet = array.array('B', packet)
                    packet = bgr8_packet
                else:
                    raise NotImplementedError('Unknown color mode: %s' % client_color_mode)

                try:
                    # print("Sending packet of length %i for strand %i" % (len(packet), strand))
                    self._socket.sendto(packet, (client["host"], client["port"]))
                except IOError as err:
                    print("I/O error({0}): {1}".format(err.errno, err.strerror))
                    # print("On strand %i with length %i" % (strand, len(packet)))
                except ValueError:
                    print("Could not convert data to an integer.")
                except Exception:
                    print("Unexpected error:", sys.exc_info()[0])
                    raise
| 36.867257 | 94 | 0.556889 | 469 | 4,166 | 4.744136 | 0.315565 | 0.036404 | 0.045843 | 0.039551 | 0.132135 | 0.132135 | 0.132135 | 0.132135 | 0.097079 | 0.097079 | 0 | 0.017924 | 0.357177 | 4,166 | 112 | 95 | 37.196429 | 0.81292 | 0.06457 | 0 | 0.072289 | 0 | 0 | 0.045007 | 0 | 0 | 0 | 0.005326 | 0.017857 | 0 | 0 | null | null | 0.012048 | 0.108434 | null | null | 0.036145 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0d63807c9a1635ffcd3f6ef51f80701ce3ba0a1f | 1,012 | py | Python | pylons-emlo/emlo/workspace/indexing/src/solrconfig.py | culturesofknowledge/emlo-server | 8a88ca98a5211086195793e4bed5960550638936 | [
"MIT"
] | null | null | null | pylons-emlo/emlo/workspace/indexing/src/solrconfig.py | culturesofknowledge/emlo-server | 8a88ca98a5211086195793e4bed5960550638936 | [
"MIT"
] | null | null | null | pylons-emlo/emlo/workspace/indexing/src/solrconfig.py | culturesofknowledge/emlo-server | 8a88ca98a5211086195793e4bed5960550638936 | [
"MIT"
] | null | null | null | '''
Created on 27 Aug 2010
@author: dev
solr configurations
'''
solr_base_url = "http://solr:8983/solr/"
solr_urls = {
    'all': solr_base_url + 'all',
    'locations': solr_base_url + 'locations',
    'comments': solr_base_url + 'comments',
    'images': solr_base_url + 'images',
    'works': solr_base_url + 'works',
    'people': solr_base_url + 'people',
    'manifestations': solr_base_url + 'manifestations',
    'institutions': solr_base_url + 'institutions',
    'resources': solr_base_url + 'resources',
}

solr_urls_stage = {
    'all': solr_base_url + 'all_stage',
    'locations': solr_base_url + 'locations_stage',
    'comments': solr_base_url + 'comments_stage',
    'images': solr_base_url + 'images_stage',
    'works': solr_base_url + 'works_stage',
    'people': solr_base_url + 'people_stage',
    'manifestations': solr_base_url + 'manifestations_stage',
    'institutions': solr_base_url + 'institutions_stage',
    'resources': solr_base_url + 'resources_stage',
} | 29.764706 | 62 | 0.666008 | 120 | 1,012 | 5.2 | 0.2 | 0.24359 | 0.334936 | 0.044872 | 0.778846 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012151 | 0.186759 | 1,012 | 34 | 63 | 29.764706 | 0.746051 | 0.056324 | 0 | 0 | 0 | 0 | 0.383966 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0d675892bca79387a4a3c847ade4ee5f800f4f19 | 330 | py | Python | Lista 3/Exercicio 6.py | GiovannaPazello/Projetos-em-Python | 3cf7edbdf2a2350605a775389f7fe2cc7fe8032e | [
"MIT"
] | null | null | null | Lista 3/Exercicio 6.py | GiovannaPazello/Projetos-em-Python | 3cf7edbdf2a2350605a775389f7fe2cc7fe8032e | [
"MIT"
] | null | null | null | Lista 3/Exercicio 6.py | GiovannaPazello/Projetos-em-Python | 3cf7edbdf2a2350605a775389f7fe2cc7fe8032e | [
"MIT"
] | null | null | null | '''Program that reads a number typed by the user, then reports all
prime numbers up to the entered number.'''
n = int(input("Check prime numbers up to:"))
nPrimos = []
for i in range(2, n):
    # i is prime when no value from 2 to i - 1 divides it evenly
    cont = 0
    for j in range(2, i):
        if i % j == 0:
            cont += 1
    if cont == 0:
        nPrimos.append(i)
print(nPrimos) | 22 | 86 | 0.663636 | 53 | 330 | 4.132075 | 0.641509 | 0.127854 | 0.164384 | 0.228311 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019084 | 0.206061 | 330 | 15 | 87 | 22 | 0.816794 | 0.430303 | 0 | 0 | 0 | 0 | 0.15847 | 0 | 0 | 0 | 0 | 0.066667 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0d70000cb2f4d7d413af6a739d2ea1f15071405b | 703 | py | Python | net_fns/__init__.py | crypt3lx2k/amdrl | 3d53b832d10c817eaf2613007d27aaf84381fe25 | [
"MIT"
] | 1 | 2018-03-31T08:14:01.000Z | 2018-03-31T08:14:01.000Z | net_fns/__init__.py | crypt3lx2k/amdrl | 3d53b832d10c817eaf2613007d27aaf84381fe25 | [
"MIT"
] | null | null | null | net_fns/__init__.py | crypt3lx2k/amdrl | 3d53b832d10c817eaf2613007d27aaf84381fe25 | [
"MIT"
] | null | null | null | #! /usr/bin/env python
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function

from . import layer_fns
from . import dense
from . import dqn_nature

__all__ = ['make_net_fn', 'dense', 'dqn_nature']


def make_net_fn(net_params):
    """Makes a net function based on full specification."""
    layers = []
    for params in net_params:
        layer_fn = layer_fns.make_layer_fn(params)
        layers.append(layer_fn)

    def net_fn(features, training=False):
        """Builds full net by chaining the layer functions."""
        net = features
        for layer in layers:
            net = layer(net, training=training)
        return net

    return net_fn
| 22.677419 | 59 | 0.675676 | 95 | 703 | 4.642105 | 0.410526 | 0.045351 | 0.108844 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.236131 | 703 | 30 | 60 | 23.433333 | 0.821229 | 0.125178 | 0 | 0 | 0 | 0 | 0.043046 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.111111 | false | 0 | 0.333333 | 0 | 0.555556 | 0.055556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
0d70a71c10c945ebdf80f96885b21d54305490f9 | 116 | py | Python | w3resource/Basics- Part I/basics010.py | DanielPascualSenties/pythonw3 | f0355d1b640dec19e0b087797538204332111bb5 | [
"MIT"
] | null | null | null | w3resource/Basics- Part I/basics010.py | DanielPascualSenties/pythonw3 | f0355d1b640dec19e0b087797538204332111bb5 | [
"MIT"
] | null | null | null | w3resource/Basics- Part I/basics010.py | DanielPascualSenties/pythonw3 | f0355d1b640dec19e0b087797538204332111bb5 | [
"MIT"
] | null | null | null | print("Insert an integer")
n = input()
nn = n + n
nnn = nn + n
result = int(n) + int(nn) + int(nnn)
print(result) | 19.333333 | 36 | 0.603448 | 22 | 116 | 3.181818 | 0.5 | 0.085714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206897 | 116 | 6 | 37 | 19.333333 | 0.76087 | 0 | 0 | 0 | 0 | 0 | 0.17094 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0d737e0925f78299405a99a5f37eb8eca8a9ccb9 | 291 | py | Python | armitage/armitageApp/ackermann/repository/ackermannRepository.py | sudoFerraz/armitage | 303a5aca64ace9bacd10acabb0ba1d9b99c678ac | [
"CC0-1.0"
] | null | null | null | armitage/armitageApp/ackermann/repository/ackermannRepository.py | sudoFerraz/armitage | 303a5aca64ace9bacd10acabb0ba1d9b99c678ac | [
"CC0-1.0"
] | null | null | null | armitage/armitageApp/ackermann/repository/ackermannRepository.py | sudoFerraz/armitage | 303a5aca64ace9bacd10acabb0ba1d9b99c678ac | [
"CC0-1.0"
] | null | null | null | import sqlalchemy
from sqlalchemy import Column, Integer, Float, String

from ...model import Base


class AckermannCalculations(Base):
    __tablename__ = 'ackermann_calculations'

    id = Column(Integer, primary_key=True)
    time_spent = Column(String(999))
    result = Column(String(999)) | 32.333333 | 53 | 0.756014 | 34 | 291 | 6.264706 | 0.647059 | 0.122066 | 0.140845 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024292 | 0.151203 | 291 | 9 | 54 | 32.333333 | 0.838057 | 0 | 0 | 0 | 0 | 0 | 0.075342 | 0.075342 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2
0d8766a3f9da92b0623ba1afa542ea0af756bc8a | 4,032 | py | Python | custom_components/helios/sensor.py | anekinloewe/homeassistant-helios | abe761a1a0221542dbcac287b26a21f320a356b8 | [
"MIT"
] | 3 | 2020-11-04T22:49:51.000Z | 2021-06-01T19:47:28.000Z | custom_components/helios/sensor.py | anekinloewe/homeassistant-helios | abe761a1a0221542dbcac287b26a21f320a356b8 | [
"MIT"
] | 10 | 2020-07-12T08:34:45.000Z | 2022-03-03T17:43:34.000Z | custom_components/helios/sensor.py | anekinloewe/homeassistant-helios | abe761a1a0221542dbcac287b26a21f320a356b8 | [
"MIT"
] | 5 | 2021-05-02T09:35:19.000Z | 2022-01-23T11:59:09.000Z | from datetime import datetime, date
from homeassistant.const import TEMP_CELSIUS
from homeassistant.core import callback
from homeassistant.helpers.dispatcher import async_dispatcher_connect
from homeassistant.helpers.entity import Entity

from .const import (
    DOMAIN,
    SIGNAL_HELIOS_STATE_UPDATE
)


async def async_setup_entry(hass, entry, async_add_entities):
    client = hass.data[DOMAIN]["client"]
    name = hass.data[DOMAIN]["name"] + ' '
    state_proxy = hass.data[DOMAIN]["state_proxy"]
    async_add_entities(
        [
            HeliosTempSensor(client, name + "Outside Air", "temp_outside_air"),
            HeliosTempSensor(client, name + "Supply Air", "temp_supply_air"),
            HeliosTempSensor(client, name + "Extract Air", "temp_extract_air"),
            HeliosTempSensor(client, name + "Exhaust Air", "temp_outgoing_air"),
            HeliosSensor(client, name + "Extract Air Humidity", "v02136", 2, "%", "mdi:water-percent"),
            HeliosSensor(client, name + "Supply Air Speed", "v00348", 4, "rpm", "mdi:fan"),
            HeliosSensor(client, name + "Extract Air Speed", "v00349", 4, "rpm", "mdi:fan"),
            HeliosFanSpeedSensor(state_proxy, name),
            HeliosBoostTimeSensor(state_proxy, name),
        ],
        update_before_add=False
    )


class HeliosTempSensor(Entity):
    def __init__(self, client, name, metric):
        self._state = None
        self._name = name
        self._metric = metric
        self._client = client

    def update(self):
        self._state = self._client.get_feature(self._metric)

    @property
    def name(self):
        return self._name

    @property
    def state(self):
        return self._state

    @property
    def unit_of_measurement(self):
        return TEMP_CELSIUS


class HeliosSensor(Entity):
    def __init__(self, client, name, var, var_length, units, icon):
        self._state = None
        self._name = name
        self._variable = var
        self._var_length = var_length
        self._units = units
        self._icon = icon
        self._client = client

    def update(self):
        self._state = self._client.get_variable(
            self._variable,
            self._var_length,
            conversion=int
        )

    @property
    def name(self):
        return self._name

    @property
    def state(self):
        return self._state

    @property
    def icon(self):
        return self._icon

    @property
    def unit_of_measurement(self):
        return self._units


class HeliosFanSpeedSensor(Entity):
    def __init__(self, state_proxy, name):
        self._state_proxy = state_proxy
        self._name = name + "Fan Speed"

    @property
    def should_poll(self):
        return False

    async def async_added_to_hass(self):
        async_dispatcher_connect(
            self.hass, SIGNAL_HELIOS_STATE_UPDATE, self._update_callback
        )

    @callback
    def _update_callback(self):
        self.async_schedule_update_ha_state(True)

    @property
    def name(self):
        return self._name

    @property
    def state(self):
        return self._state_proxy.get_speed_percent()

    @property
    def icon(self):
        return "mdi:fan"

    @property
    def unit_of_measurement(self):
        return "%"


class HeliosBoostTimeSensor(Entity):
    def __init__(self, state_proxy, name):
        self._state_proxy = state_proxy
        self._name = name + "Boost Time"

    @property
    def should_poll(self):
        return False

    async def async_added_to_hass(self):
        async_dispatcher_connect(
            self.hass, SIGNAL_HELIOS_STATE_UPDATE, self._update_callback
        )

    @callback
    def _update_callback(self):
        self.async_schedule_update_ha_state(True)

    @property
    def name(self):
        return self._name

    @property
    def state(self):
        return self._state_proxy.get_boost_time()

    @property
    def icon(self):
        return "mdi:clock"

    @property
    def unit_of_measurement(self):
        return "mins"
| 26.012903 | 103 | 0.640129 | 461 | 4,032 | 5.305857 | 0.182213 | 0.076451 | 0.057236 | 0.0278 | 0.543336 | 0.511447 | 0.466476 | 0.385119 | 0.385119 | 0.385119 | 0 | 0.006085 | 0.266369 | 4,032 | 154 | 104 | 26.181818 | 0.820825 | 0 | 0 | 0.536585 | 0 | 0 | 0.068948 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.203252 | false | 0 | 0.04878 | 0.138211 | 0.422764 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
0d8ab18d56859a1efec494f5b33ef1a401ae2fc6 | 190 | py | Python | trim_to_name.py | northWind87/py_blackboard_marking_helper | 4692b0f541ad7cec95da807e6e852527d84a98b4 | [
"MIT"
] | null | null | null | trim_to_name.py | northWind87/py_blackboard_marking_helper | 4692b0f541ad7cec95da807e6e852527d84a98b4 | [
"MIT"
] | null | null | null | trim_to_name.py | northWind87/py_blackboard_marking_helper | 4692b0f541ad7cec95da807e6e852527d84a98b4 | [
"MIT"
] | null | null | null | import os
import shutil

if __name__ == "__main__":
    for f in os.listdir('.'):
        if "2014" in f:
            # Drop everything through the 20-character prefix that begins
            # with "2014" (presumably a timestamp), keeping only the name.
            new_name = f[f.find("2014") + 20:]
            shutil.move(f, new_name)
| 21.111111 | 46 | 0.531579 | 28 | 190 | 3.25 | 0.535714 | 0.087912 | 0.175824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.076923 | 0.315789 | 190 | 8 | 47 | 23.75 | 0.623077 | 0 | 0 | 0 | 0 | 0 | 0.089474 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0d905bbdd7c996bcfd79d3e50b0087711530b5d5 | 2,786 | py | Python | LTUAssistantPlus/skills/add_calendar_event_skill.py | Xyaneon/LTUAssistantPlus | 9fdf2f2b4861450fbed64b90b8c7c69b0173e052 | [
"MIT"
] | null | null | null | LTUAssistantPlus/skills/add_calendar_event_skill.py | Xyaneon/LTUAssistantPlus | 9fdf2f2b4861450fbed64b90b8c7c69b0173e052 | [
"MIT"
] | 24 | 2019-10-14T04:25:35.000Z | 2021-06-06T17:57:10.000Z | LTUAssistantPlus/skills/add_calendar_event_skill.py | Xyaneon/LTUAssistantPlus | 9fdf2f2b4861450fbed64b90b8c7c69b0173e052 | [
"MIT"
] | 1 | 2019-10-24T13:17:07.000Z | 2019-10-24T13:17:07.000Z | #!/usr/bin/python3
from nlp.universal_dependencies import ParsedUniversalDependencies
from services.assistant_services_base import AssistantServicesBase
from services.calendar.calendar_event import CalendarEvent
from .skill import SkillInput, Skill


class AddCalendarEventSkill(Skill):
    """Lets the assistant schedule a calendar event for the user."""

    def __init__(self):
        """Initializes a new instance of the AddCalendarEventSkill class."""
        self._cmd_list = ['schedule', 'remind', 'remind about', 'plan']

    def matches_command(self, skill_input: SkillInput) -> bool:
        """Returns a Boolean value indicating whether this skill can be used to handle the given command."""
        verb = (skill_input.verb or None) and skill_input.verb.lower()
        return verb in self._cmd_list

    def execute_for_command(self, skill_input: SkillInput, services: AssistantServicesBase):
        """Executes this skill on the given command input."""
        verb_object = skill_input.dependencies.noun
        event_str = verb_object
        if event_str == "event":
            event_sentence = self._ask_for_event_name(services, skill_input.verbose)
            day_sentence = self._ask_for_event_day(services, skill_input.verbose)
            time_sentence = self._ask_for_event_start_time(services, skill_input.verbose)
            cal_event = CalendarEvent(event_sentence, day_sentence, time_sentence, "")
            services.calendar_service.add_event(cal_event)
            feedback_sentence = "Alright, I'm putting down " + str(cal_event) + "."
            services.user_interaction_service.speak(feedback_sentence, skill_input.verbose)
        else:
            services.user_interaction_service.speak("Sorry, I am unable to help you schedule this right now.", skill_input.verbose)

    def perform_setup(self, services):
        """Executes any setup work necessary for this skill before it can be used."""
        pass

    def _ask_for_event_day(self, services: AssistantServicesBase, verbose: bool=False) -> str:
        """Asks the user for the day of the event and returns it."""
        return services.user_interaction_service.ask_question("What day will this be on?", verbose)

    def _ask_for_event_name(self, services: AssistantServicesBase, verbose: bool=False) -> str:
        """Asks the user for the name of the event and returns it."""
        return services.user_interaction_service.ask_question("Okay, what is the event called?", verbose)

    def _ask_for_event_start_time(self, services: AssistantServicesBase, verbose: bool=False) -> str:
        """Asks the user for the start time of the event and returns it."""
        return services.user_interaction_service.ask_question("What time will this start at?", verbose) | 56.857143 | 131 | 0.718234 | 358 | 2,786 | 5.360335 | 0.310056 | 0.05211 | 0.034393 | 0.078166 | 0.355915 | 0.219906 | 0.219906 | 0.219906 | 0.219906 | 0.219906 | 0 | 0.000447 | 0.196339 | 2,786 | 49 | 132 | 56.857143 | 0.856632 | 0.18916 | 0 | 0 | 0 | 0 | 0.091032 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.225806 | false | 0.032258 | 0.129032 | 0 | 0.516129 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2
0d93a27b4f7d83455373ca13b948b26819f0696a | 3,519 | py | Python | lankuai/lankuai/lkitsm/web2/lkweb2/models.py | abiner/lankuai | 55a3631528acf1c46a471cb0616e28a5396faab5 | [
"MIT"
] | null | null | null | lankuai/lankuai/lkitsm/web2/lkweb2/models.py | abiner/lankuai | 55a3631528acf1c46a471cb0616e28a5396faab5 | [
"MIT"
] | null | null | null | lankuai/lankuai/lkitsm/web2/lkweb2/models.py | abiner/lankuai | 55a3631528acf1c46a471cb0616e28a5396faab5 | [
"MIT"
] | null | null | null | from django.db import models
from datetime import *

# Create your models here.


# Internal company department
class Departments(models.Model):
    depname = models.CharField(u'Department name', max_length=40)
    depmanagerid = models.CharField(u'Department manager', max_length=40)

    def __str__(self):
        return '%s' % self.depname  # human-readable representation

    class Meta:
        verbose_name = 'Internal department'
        verbose_name_plural = 'Internal departments'


# Company engineer information
class User(models.Model):
    uname = models.CharField(u'Name', max_length=40)
    upasswd = models.CharField(u'System password', max_length=40)
    uCreatedate = models.DateField(u'Hire date', auto_now_add=True)  # creation date, set automatically
    uupoto = models.ImageField(u'Photo')
    userworkid = models.CharField(u'Employee number', max_length=20)
    uidcard = models.IntegerField(u'ID card number')
    utelphone = models.IntegerField(u'Mobile number')
    uemail = models.EmailField(u'Email address')  # email format is validated automatically
    ufirstcontact = models.CharField(u'Emergency contact name', max_length=40)
    uconphone = models.IntegerField(u'Emergency contact number')
    uaddress = models.CharField(u'Home address', max_length=100)
    ugender = models.BooleanField(default=True)  # gender
    ueduschool = models.CharField(u'Graduating school', max_length=40)
    ueducation = models.CharField(u'Education level', max_length=20)
    uorigin = models.CharField(u'Place of origin', max_length=20)
    isDelect = models.BooleanField(default=False)  # employment status (whether still employed)
    uprojectname = models.ForeignKey("Projects", on_delete=models.CASCADE)  # assigned project
    udepname = models.ForeignKey("Departments", on_delete=models.CASCADE)  # department
    ulasttime = models.DateField(auto_now=True)  # last-modified date, set automatically

    def __str__(self):
        return '%s' % self.uname  # human-readable representation

    class Meta:
        verbose_name = 'Engineer information'
        verbose_name_plural = 'Engineer information'


# ***********************************************************************************************
# Not yet registered in the database
# ***********************************************************************************************
# _____________________________________WEBOA_____________________________________________________
# Engineer training and assessment records
# Engineer position change workflow records -- project position transfers
# Compensatory leave / personal leave / sick leave / overtime requests, etc.
# Probation -- onboarding -- permanent -- level promotion requests
# Company service event log


# Project information
class Projects(models.Model):
    pname = models.CharField(u'Project name', max_length=100)
    # the original declared a second field also named pname for the short
    # name, which would silently replace the first; renamed to pshortname
    pshortname = models.CharField(u'Project short name', max_length=100)
    paddress = models.CharField(u'Project address', max_length=100)
    pcreatedate = models.DateField(auto_now_add=True)
    plastdate = models.DateField(auto_now=True)

    def __str__(self):
        return '%s' % self.pname  # human-readable representation

    class Meta:
        verbose_name = 'Project information'
        verbose_name_plural = 'Project information'


# Project customer department information
class CustomerDep(models.Model):
    cdname = models.CharField(u'Project department', max_length=100)
    dpname = models.ForeignKey("Projects", on_delete=models.CASCADE)  # project name

    def __str__(self):
        return '%s' % self.cdname  # human-readable representation

    class Meta:
        verbose_name = 'Project department'
        verbose_name_plural = 'Project departments'


# Project customer information
class Customersess(models.Model):
    cname = models.CharField(u'Customer name', max_length=40)
    cphone = models.IntegerField(u'Customer phone')
    # cpasswd = models.CharField(u'Login password', max_length=40)
    cprojectname = models.ForeignKey("Projects", on_delete=models.CASCADE)  # customer's project
    cdepname = models.ForeignKey("CustomerDep", on_delete=models.CASCADE)  # customer's department

    def __str__(self):
        return '%s' % self.cname  # human-readable representation

    class Meta:
        verbose_name = 'Customer information'
        verbose_name_plural = 'Customer information'


# ______________________________________ITIL______________________________________________________
class Itilworklog(models.Model):
    pass


class Itsla(models.Model):
    pass


class Itfaulttpye(models.Model):
    pass
| 30.868421 | 97 | 0.682012 | 398 | 3,519 | 5.417085 | 0.354271 | 0.111317 | 0.118738 | 0.037106 | 0.20269 | 0.111317 | 0.062616 | 0 | 0 | 0 | 0 | 0.012111 | 0.131856 | 3,519 | 113 | 98 | 31.141593 | 0.693617 | 0.205172 | 0 | 0.185714 | 0 | 0 | 0.069556 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0.057143 | 0.028571 | 0.071429 | 0.814286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
0d945fce07d53f681c1c128c7fb5d4c494fb8df2 | 261 | py | Python | ChangeUserAgent/change-user-agent.py | lucaslegname/mitmproxy-helpers | 3e87249f4380f0d2e8930a23fe7ccf6fdfec0f4e | [
"MIT"
] | 7 | 2019-09-30T09:11:59.000Z | 2022-02-25T02:23:37.000Z | ChangeUserAgent/change-user-agent.py | lucaslegname/mitmproxy-helpers | 3e87249f4380f0d2e8930a23fe7ccf6fdfec0f4e | [
"MIT"
] | 1 | 2022-03-29T15:31:44.000Z | 2022-03-29T15:31:44.000Z | ChangeUserAgent/change-user-agent.py | lucaslegname/mitmproxy-helpers | 3e87249f4380f0d2e8930a23fe7ccf6fdfec0f4e | [
"MIT"
] | null | null | null | from mitmproxy import http
from mitmproxy import ctx


class ChangeUserAgent:
    user_agent = "my-custom-user-agent"

    def request(self, flow: http.HTTPFlow) -> None:
        flow.request.headers["user-agent"] = self.user_agent


addons = [ChangeUserAgent()]
| 23.727273 | 60 | 0.720307 | 33 | 261 | 5.636364 | 0.575758 | 0.193548 | 0.204301 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168582 | 261 | 10 | 61 | 26.1 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0.114943 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.714286 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
0da82358f5c7a06cd16c5a76ec215a778d3f041c | 142 | py | Python | connectedafrica/logs.py | ANCIR/siyazana.co.za | b019bef9d5fa6ae75176444b95dae0fdcfa90463 | [
"MIT"
] | 6 | 2016-01-30T19:07:28.000Z | 2017-12-11T11:52:33.000Z | connectedafrica/logs.py | CodeForAfrica/siyazana.co.za | b019bef9d5fa6ae75176444b95dae0fdcfa90463 | [
"MIT"
] | null | null | null | connectedafrica/logs.py | CodeForAfrica/siyazana.co.za | b019bef9d5fa6ae75176444b95dae0fdcfa90463 | [
"MIT"
] | 2 | 2017-01-06T13:25:39.000Z | 2017-11-14T15:06:05.000Z | import logging
logging.basicConfig(level=logging.DEBUG)
requests_log = logging.getLogger("requests")
requests_log.setLevel(logging.WARNING)
| 20.285714 | 44 | 0.830986 | 17 | 142 | 6.823529 | 0.588235 | 0.189655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06338 | 142 | 6 | 45 | 23.666667 | 0.87218 | 0 | 0 | 0 | 0 | 0 | 0.056338 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
0da8582ccedea5581b0f063ca746918606e43222 | 585 | py | Python | src/utils/utils.py | KatyKasilina/StumbleUpon-Evergreen-DataMining | de8824bb85f00aef5b9ad57690191dbc984b9384 | [
"MIT"
] | null | null | null | src/utils/utils.py | KatyKasilina/StumbleUpon-Evergreen-DataMining | de8824bb85f00aef5b9ad57690191dbc984b9384 | [
"MIT"
] | null | null | null | src/utils/utils.py | KatyKasilina/StumbleUpon-Evergreen-DataMining | de8824bb85f00aef5b9ad57690191dbc984b9384 | [
"MIT"
] | null | null | null | import json
import pickle

import pandas as pd


# Note: the original annotated these helpers with typing.NoReturn, but they
# return normally, so the correct annotation is None.
def read_data(path: str) -> pd.DataFrame:
    data = pd.read_csv(path, sep='\t')
    return data


def save_metrics_to_json(file_path: str, metrics: dict) -> None:
    with open(file_path, "w") as metric_file:
        json.dump(metrics, metric_file)


def save_pkl_file(input_file, output_name: str) -> None:
    with open(output_name, "wb") as f:
        pickle.dump(input_file, f)


def load_pkl_file(input_: str):
    with open(input_, "rb") as fin:
        res = pickle.load(fin)
    return res
| 21.666667 | 68 | 0.682051 | 92 | 585 | 4.130435 | 0.423913 | 0.063158 | 0.084211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206838 | 585 | 26 | 69 | 22.5 | 0.818966 | 0 | 0 | 0 | 0 | 0 | 0.011966 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.235294 | 0 | 0.588235 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
0dab8827e18dc8b1b52098f011d874e5f56af468 | 785 | py | Python | Diena_7_functions/built_in_fun.py | MarisKuz/Python-RTU | 12261d06dc81fa0d98190ca0eb5133d43d517070 | [
"MIT"
] | 8 | 2020-08-31T16:10:54.000Z | 2021-11-24T06:37:37.000Z | Diena_7_functions/built_in_fun.py | MarisKuz/Python-RTU | 12261d06dc81fa0d98190ca0eb5133d43d517070 | [
"MIT"
] | 8 | 2021-06-08T22:30:29.000Z | 2022-03-12T00:48:55.000Z | Diena_7_functions/built_in_fun.py | MarisKuz/Python-RTU | 12261d06dc81fa0d98190ca0eb5133d43d517070 | [
"MIT"
] | 12 | 2020-09-28T17:06:52.000Z | 2022-02-17T12:12:46.000Z | # https://docs.python.org/3/library/functions.html#built-in-functions
my_results = [True, True, 2*2 == 4, True]
print(all(my_results))  # every item in the sequence must be truthy for all() to return True, otherwise False
my_results.append(False)
print(my_results)
print(all(my_results))  # in logic this is the universal quantifier: all of my_results must be truthy for all() to return True
print(any(my_results))  # any() needs just one truthy item inside -- the existential quantifier: one or more items are truthy
print(len(my_results))
my_results.append(9000)
print("max", max(my_results))  # True counts as 1 and False as 0, so 9000 is now the maximum
my_results.append(-30)
print(my_results)
print("min", min(my_results))
print("sum", sum(my_results))  # when summing, booleans count as True == 1 and False == 0
# print(min(my_results)) | 46.176471 | 120 | 0.766879 | 140 | 785 | 4.185714 | 0.471429 | 0.245734 | 0.061433 | 0.05802 | 0.081911 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020319 | 0.122293 | 785 | 17 | 121 | 46.176471 | 0.830189 | 0.555414 | 0 | 0.307692 | 0 | 0 | 0.032258 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.692308 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
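The boolean arithmetic the comments above describe (`True` behaving as 1, `False` as 0) can be checked directly; this small standalone sketch mirrors the calls in the file.

```python
results = [True, True, 2 * 2 == 4, True]
print(all(results))   # True: every item is truthy

results.append(False)
print(all(results))   # False: one falsy item is enough to break all()
print(any(results))   # True: any() only needs a single truthy item

# bool is a subclass of int, so booleans take part in arithmetic:
print(sum([True, False, True]))                 # 2
print(max([True, False]), min([True, False]))   # True False
```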
# --- examples/exampleM2.py (pyrate-build/pyrate-build, Apache-2.0 license) ---
#!/usr/bin/env pyrate
build_output = ['makefile']
ex_d = executable('exampleM2_debug.bin', 'test.cpp foo.cpp', compiler_opts='-O0')
ex_r = executable('exampleM2_release.bin', 'test.cpp foo.cpp', compiler_opts='-O3')
default_targets = ex_r
# --- src/talkofactawords/talkofactawords/scripts/collect_words.py (salayhin/talkofacta, MIT license) ---
"""
Talk of Europe Creative Camp #2 :: Wordcloud project :: collect_words

Compute the set of words that will be considered for extracting significant ones.

Inputs word features extracted using ``compute_features words`` and stored in
<zodb_dir>. Selects a set of words that is common for the five largest countries (by speech counts).
Subtracts from that the stopwords for numerous languages.
Writes output into the ZODB variable root.all_words as a python set object.

Usage: collect_words [-h]

Copyright 2015, Konstantin Tretyakov, Ilya Kuzovkin, Alexander Tkachenko.
License: MIT
"""
from sqlalchemy import func, desc
from docopt import docopt
from unidecode import unidecode
import transaction
import nltk
from talkofactadb.model import Speech, open_db
from talkofactadb.config import get_config
from talkofactawords.zodb import open_zodb
from clint.textui import progress


def main():
    args = docopt(__doc__)
    c = get_config()
    session = open_db()
    zodb_root = open_zodb(read_only=True)
    ids = session.query(Speech.id).all()
    all_words = set()
    for (id,) in progress.bar(ids, label="Progress: ", every=100):
        all_words = all_words.union(zodb_root.features['words'][id].keys())
    print("Word set size:", len(all_words))
    # print("Subtracting stopwords...")
    # nltk.download('stopwords')
    # langs = ['english', 'spanish']
    # all_stopwords = reduce(lambda x, y: x | y, [set(nltk.corpus.stopwords.words(lng)) for lng in langs])
    # all_stopwords = set(map(unidecode, all_stopwords))
    # all_words = all_words - all_stopwords
    # print("Resulting word set size:", len(all_words))
    print("Saving...")
    zodb_root = open_zodb()
    zodb_root.all_words = all_words
    transaction.commit()
    print("Done")
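The accumulation loop in `main` is just a per-document set union. A minimal plain-Python model (with a hypothetical two-speech feature store standing in for the ZODB data) shows the idea:

```python
# Hypothetical per-speech word-count features, keyed by speech id.
features = {
    1: {"europe": 3, "budget": 1},
    2: {"budget": 2, "fisheries": 5},
}

all_words = set()
for speech_id in features:
    # union of the keys, like zodb_root.features['words'][id].keys() above
    all_words = all_words.union(features[speech_id].keys())

print(sorted(all_words))  # ['budget', 'europe', 'fisheries']
```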
# --- app/main/views.py (BerniceWu/SpotifyBot, MIT license) ---
from io import BytesIO
from flask import request, send_file
import telegram

from . import main
from .. import bot
from ..fsm import SpotifyBotMachine

machine = SpotifyBotMachine()


@main.route('/hook', methods=['POST'])
def webhook_handler():
    update = telegram.Update.de_json(request.get_json(force=True), bot)
    machine.advance(update)
    return 'ok'


@main.route('/show-fsm', methods=['GET'])
def show_fsm():
    byte_io = BytesIO()
    machine.graph.draw(byte_io, prog='dot', format='png')
    byte_io.seek(0)
    return send_file(byte_io, attachment_filename='fsm.png', mimetype='image/png')
# --- plotly/validators/layout/smith/imaginaryaxis/__init__.py (plotly.py, MIT license) ---
import sys
if sys.version_info < (3, 7):
    from ._visible import VisibleValidator
    from ._tickwidth import TickwidthValidator
    from ._tickvalssrc import TickvalssrcValidator
    from ._tickvals import TickvalsValidator
    from ._ticksuffix import TicksuffixValidator
    from ._ticks import TicksValidator
    from ._tickprefix import TickprefixValidator
    from ._ticklen import TicklenValidator
    from ._tickformat import TickformatValidator
    from ._tickfont import TickfontValidator
    from ._tickcolor import TickcolorValidator
    from ._showticksuffix import ShowticksuffixValidator
    from ._showtickprefix import ShowtickprefixValidator
    from ._showticklabels import ShowticklabelsValidator
    from ._showline import ShowlineValidator
    from ._showgrid import ShowgridValidator
    from ._linewidth import LinewidthValidator
    from ._linecolor import LinecolorValidator
    from ._layer import LayerValidator
    from ._hoverformat import HoverformatValidator
    from ._gridwidth import GridwidthValidator
    from ._gridcolor import GridcolorValidator
    from ._color import ColorValidator
else:
    from _plotly_utils.importers import relative_import

    __all__, __getattr__, __dir__ = relative_import(
        __name__,
        [],
        [
            "._visible.VisibleValidator",
            "._tickwidth.TickwidthValidator",
            "._tickvalssrc.TickvalssrcValidator",
            "._tickvals.TickvalsValidator",
            "._ticksuffix.TicksuffixValidator",
            "._ticks.TicksValidator",
            "._tickprefix.TickprefixValidator",
            "._ticklen.TicklenValidator",
            "._tickformat.TickformatValidator",
            "._tickfont.TickfontValidator",
            "._tickcolor.TickcolorValidator",
            "._showticksuffix.ShowticksuffixValidator",
            "._showtickprefix.ShowtickprefixValidator",
            "._showticklabels.ShowticklabelsValidator",
            "._showline.ShowlineValidator",
            "._showgrid.ShowgridValidator",
            "._linewidth.LinewidthValidator",
            "._linecolor.LinecolorValidator",
            "._layer.LayerValidator",
            "._hoverformat.HoverformatValidator",
            "._gridwidth.GridwidthValidator",
            "._gridcolor.GridcolorValidator",
            "._color.ColorValidator",
        ],
    )
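On Python 3.7+ the `relative_import` branch relies on PEP 562 module-level `__getattr__` to resolve validators lazily instead of importing all of them up front. A minimal standalone sketch of that mechanism (using a synthetic module and a made-up attribute, not plotly's actual importer):

```python
import sys
import types

# Build a synthetic module whose attributes are resolved lazily.
mod = types.ModuleType("lazy_demo")

def _module_getattr(name):
    # Pretend 'ColorValidator' is expensive to build and only made on demand.
    if name == "ColorValidator":
        return type("ColorValidator", (), {})
    raise AttributeError(name)

mod.__getattr__ = _module_getattr  # PEP 562 hook, honoured on Python >= 3.7
sys.modules["lazy_demo"] = mod

import lazy_demo
print(lazy_demo.ColorValidator.__name__)  # ColorValidator
```

Attribute access that misses the module's `__dict__` falls through to the module-level `__getattr__`, which is exactly how a package this size avoids importing dozens of validator submodules at startup.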
# --- armi/materials/thorium.py (keckler/armi, Apache-2.0 license) ---
# Copyright 2019 TerraPower, LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# cython: profile=False
"""
Thorium Metal
Data is from [#IAEA-TECDOCT-1450]_.
.. [#IAEA-TECDOCT-1450] Thorium fuel cycle -- Potential benefits and challenges, IAEA-TECDOC-1450 (2005).
https://www-pub.iaea.org/mtcd/publications/pdf/te_1450_web.pdf
"""
from armi.utils.units import getTk
from armi.materials.material import Material
class Thorium(Material):
name = "Thorium metal"
def setDefaultMassFracs(self):
self.setMassFrac("TH232", 1.0)
def density(self, Tk=None, Tc=None):
Tk = getTk(Tc, Tk)
return 11.68
def linearExpansion(self, Tk=None, Tc=None):
r"""m/m/K from IAEA TECDOC 1450"""
Tk = getTk(Tc, Tk)
self.checkTempRange(30, 600, Tk, "linear expansionn")
return 11.9e-6
def thermalConductivity(self, Tk=None, Tc=None):
r"""W/m-K from IAEA TE 1450"""
Tk = getTk(Tc, Tk)
return 43.1
def meltingPoint(self):
r"""melting point in K from IAEA TE 1450"""
return 2025.0
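As a usage sketch of the expansion coefficient above: for a linear expansion coefficient α, a bar of length L grows by ΔL = α·L·ΔT. The numbers here (a 1 m bar, a 300 K rise) are illustrative, not from the source.

```python
alpha = 11.9e-6    # 1/K, thorium mean linear expansion (IAEA TECDOC 1450)
length_m = 1.0     # illustrative bar length
delta_t_k = 300.0  # illustrative temperature rise

delta_length = alpha * length_m * delta_t_k
print(round(delta_length * 1000, 3), "mm")  # 3.57 mm
```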
# --- tests/endpoints/test_storage_terminals.py (VorTECHsa/python-sdk, Apache-2.0 license) ---
from tests.testcases import TestCaseUsingMockAPI
from vortexasdk.endpoints.storage_terminals import StorageTerminals
from tests.mock_client import example_storage_terminals
from vortexasdk.endpoints.storage_terminals_result import StorageTerminalResult


class TestStorageTerminals(TestCaseUsingMockAPI):
    st = StorageTerminalResult(example_storage_terminals)

    def test_search(self):
        terminals = StorageTerminals().search().to_df()
        assert len(terminals) > 0

    def test_name_search_term(self):
        terminals = StorageTerminals().search(name=['Military']).to_df()
        assert len(terminals) > 0

    def test_search_ids(self):
        terminals = (
            StorageTerminals()
            .search(ids=['08bbaf7a67ab30036d73b9604b932352a73905e16b8342b27f02ae34941b7db5'])
            .to_list()
        )
        names = [a.name for a in terminals]
        assert 'Military Oil Depot' in names

    def test_to_list(self):
        names = [x.name for x in self.st.to_list()]
        assert names == ['Waypoints', 'South Pars Kangan Site - Phase 13', 'Military Oil Depot']

    def test_check_columns(self):
        terminals = StorageTerminals().search().to_df()
        assert list(terminals.columns) == ['id', 'name', 'lat', 'lon']
# --- OpenGL/raw/GL/VERSION/GL_1_3.py (JE-Chen/je_old_repo, MIT license) ---
'''Autogenerated by xml_generate script, do not edit!'''
from OpenGL import platform as _p, arrays
# Code generation uses this
from OpenGL.raw.GL import _types as _cs
# End users want this...
from OpenGL.raw.GL._types import *
from OpenGL.raw.GL import _errors
from OpenGL.constant import Constant as _C
import ctypes
_EXTENSION_NAME = 'GL_VERSION_GL_1_3'
def _f(function):
    return _p.createFunction(function, _p.PLATFORM.GL, 'GL_VERSION_GL_1_3', error_checker=_errors._error_checker)
GL_ACTIVE_TEXTURE=_C('GL_ACTIVE_TEXTURE',0x84E0)
GL_ADD_SIGNED=_C('GL_ADD_SIGNED',0x8574)
GL_CLAMP_TO_BORDER=_C('GL_CLAMP_TO_BORDER',0x812D)
GL_CLIENT_ACTIVE_TEXTURE=_C('GL_CLIENT_ACTIVE_TEXTURE',0x84E1)
GL_COMBINE=_C('GL_COMBINE',0x8570)
GL_COMBINE_ALPHA=_C('GL_COMBINE_ALPHA',0x8572)
GL_COMBINE_RGB=_C('GL_COMBINE_RGB',0x8571)
GL_COMPRESSED_ALPHA=_C('GL_COMPRESSED_ALPHA',0x84E9)
GL_COMPRESSED_INTENSITY=_C('GL_COMPRESSED_INTENSITY',0x84EC)
GL_COMPRESSED_LUMINANCE=_C('GL_COMPRESSED_LUMINANCE',0x84EA)
GL_COMPRESSED_LUMINANCE_ALPHA=_C('GL_COMPRESSED_LUMINANCE_ALPHA',0x84EB)
GL_COMPRESSED_RGB=_C('GL_COMPRESSED_RGB',0x84ED)
GL_COMPRESSED_RGBA=_C('GL_COMPRESSED_RGBA',0x84EE)
GL_COMPRESSED_TEXTURE_FORMATS=_C('GL_COMPRESSED_TEXTURE_FORMATS',0x86A3)
GL_CONSTANT=_C('GL_CONSTANT',0x8576)
GL_DOT3_RGB=_C('GL_DOT3_RGB',0x86AE)
GL_DOT3_RGBA=_C('GL_DOT3_RGBA',0x86AF)
GL_INTERPOLATE=_C('GL_INTERPOLATE',0x8575)
GL_MAX_CUBE_MAP_TEXTURE_SIZE=_C('GL_MAX_CUBE_MAP_TEXTURE_SIZE',0x851C)
GL_MAX_TEXTURE_UNITS=_C('GL_MAX_TEXTURE_UNITS',0x84E2)
GL_MULTISAMPLE=_C('GL_MULTISAMPLE',0x809D)
GL_MULTISAMPLE_BIT=_C('GL_MULTISAMPLE_BIT',0x20000000)
GL_NORMAL_MAP=_C('GL_NORMAL_MAP',0x8511)
GL_NUM_COMPRESSED_TEXTURE_FORMATS=_C('GL_NUM_COMPRESSED_TEXTURE_FORMATS',0x86A2)
GL_OPERAND0_ALPHA=_C('GL_OPERAND0_ALPHA',0x8598)
GL_OPERAND0_RGB=_C('GL_OPERAND0_RGB',0x8590)
GL_OPERAND1_ALPHA=_C('GL_OPERAND1_ALPHA',0x8599)
GL_OPERAND1_RGB=_C('GL_OPERAND1_RGB',0x8591)
GL_OPERAND2_ALPHA=_C('GL_OPERAND2_ALPHA',0x859A)
GL_OPERAND2_RGB=_C('GL_OPERAND2_RGB',0x8592)
GL_PREVIOUS=_C('GL_PREVIOUS',0x8578)
GL_PRIMARY_COLOR=_C('GL_PRIMARY_COLOR',0x8577)
GL_PROXY_TEXTURE_CUBE_MAP=_C('GL_PROXY_TEXTURE_CUBE_MAP',0x851B)
GL_REFLECTION_MAP=_C('GL_REFLECTION_MAP',0x8512)
GL_RGB_SCALE=_C('GL_RGB_SCALE',0x8573)
GL_SAMPLES=_C('GL_SAMPLES',0x80A9)
GL_SAMPLE_ALPHA_TO_COVERAGE=_C('GL_SAMPLE_ALPHA_TO_COVERAGE',0x809E)
GL_SAMPLE_ALPHA_TO_ONE=_C('GL_SAMPLE_ALPHA_TO_ONE',0x809F)
GL_SAMPLE_BUFFERS=_C('GL_SAMPLE_BUFFERS',0x80A8)
GL_SAMPLE_COVERAGE=_C('GL_SAMPLE_COVERAGE',0x80A0)
GL_SAMPLE_COVERAGE_INVERT=_C('GL_SAMPLE_COVERAGE_INVERT',0x80AB)
GL_SAMPLE_COVERAGE_VALUE=_C('GL_SAMPLE_COVERAGE_VALUE',0x80AA)
GL_SOURCE0_ALPHA=_C('GL_SOURCE0_ALPHA',0x8588)
GL_SOURCE0_RGB=_C('GL_SOURCE0_RGB',0x8580)
GL_SOURCE1_ALPHA=_C('GL_SOURCE1_ALPHA',0x8589)
GL_SOURCE1_RGB=_C('GL_SOURCE1_RGB',0x8581)
GL_SOURCE2_ALPHA=_C('GL_SOURCE2_ALPHA',0x858A)
GL_SOURCE2_RGB=_C('GL_SOURCE2_RGB',0x8582)
GL_SUBTRACT=_C('GL_SUBTRACT',0x84E7)
GL_TEXTURE0=_C('GL_TEXTURE0',0x84C0)
GL_TEXTURE1=_C('GL_TEXTURE1',0x84C1)
GL_TEXTURE10=_C('GL_TEXTURE10',0x84CA)
GL_TEXTURE11=_C('GL_TEXTURE11',0x84CB)
GL_TEXTURE12=_C('GL_TEXTURE12',0x84CC)
GL_TEXTURE13=_C('GL_TEXTURE13',0x84CD)
GL_TEXTURE14=_C('GL_TEXTURE14',0x84CE)
GL_TEXTURE15=_C('GL_TEXTURE15',0x84CF)
GL_TEXTURE16=_C('GL_TEXTURE16',0x84D0)
GL_TEXTURE17=_C('GL_TEXTURE17',0x84D1)
GL_TEXTURE18=_C('GL_TEXTURE18',0x84D2)
GL_TEXTURE19=_C('GL_TEXTURE19',0x84D3)
GL_TEXTURE2=_C('GL_TEXTURE2',0x84C2)
GL_TEXTURE20=_C('GL_TEXTURE20',0x84D4)
GL_TEXTURE21=_C('GL_TEXTURE21',0x84D5)
GL_TEXTURE22=_C('GL_TEXTURE22',0x84D6)
GL_TEXTURE23=_C('GL_TEXTURE23',0x84D7)
GL_TEXTURE24=_C('GL_TEXTURE24',0x84D8)
GL_TEXTURE25=_C('GL_TEXTURE25',0x84D9)
GL_TEXTURE26=_C('GL_TEXTURE26',0x84DA)
GL_TEXTURE27=_C('GL_TEXTURE27',0x84DB)
GL_TEXTURE28=_C('GL_TEXTURE28',0x84DC)
GL_TEXTURE29=_C('GL_TEXTURE29',0x84DD)
GL_TEXTURE3=_C('GL_TEXTURE3',0x84C3)
GL_TEXTURE30=_C('GL_TEXTURE30',0x84DE)
GL_TEXTURE31=_C('GL_TEXTURE31',0x84DF)
GL_TEXTURE4=_C('GL_TEXTURE4',0x84C4)
GL_TEXTURE5=_C('GL_TEXTURE5',0x84C5)
GL_TEXTURE6=_C('GL_TEXTURE6',0x84C6)
GL_TEXTURE7=_C('GL_TEXTURE7',0x84C7)
GL_TEXTURE8=_C('GL_TEXTURE8',0x84C8)
GL_TEXTURE9=_C('GL_TEXTURE9',0x84C9)
GL_TEXTURE_BINDING_CUBE_MAP=_C('GL_TEXTURE_BINDING_CUBE_MAP',0x8514)
GL_TEXTURE_COMPRESSED=_C('GL_TEXTURE_COMPRESSED',0x86A1)
GL_TEXTURE_COMPRESSED_IMAGE_SIZE=_C('GL_TEXTURE_COMPRESSED_IMAGE_SIZE',0x86A0)
GL_TEXTURE_COMPRESSION_HINT=_C('GL_TEXTURE_COMPRESSION_HINT',0x84EF)
GL_TEXTURE_CUBE_MAP=_C('GL_TEXTURE_CUBE_MAP',0x8513)
GL_TEXTURE_CUBE_MAP_NEGATIVE_X=_C('GL_TEXTURE_CUBE_MAP_NEGATIVE_X',0x8516)
GL_TEXTURE_CUBE_MAP_NEGATIVE_Y=_C('GL_TEXTURE_CUBE_MAP_NEGATIVE_Y',0x8518)
GL_TEXTURE_CUBE_MAP_NEGATIVE_Z=_C('GL_TEXTURE_CUBE_MAP_NEGATIVE_Z',0x851A)
GL_TEXTURE_CUBE_MAP_POSITIVE_X=_C('GL_TEXTURE_CUBE_MAP_POSITIVE_X',0x8515)
GL_TEXTURE_CUBE_MAP_POSITIVE_Y=_C('GL_TEXTURE_CUBE_MAP_POSITIVE_Y',0x8517)
GL_TEXTURE_CUBE_MAP_POSITIVE_Z=_C('GL_TEXTURE_CUBE_MAP_POSITIVE_Z',0x8519)
GL_TRANSPOSE_COLOR_MATRIX=_C('GL_TRANSPOSE_COLOR_MATRIX',0x84E6)
GL_TRANSPOSE_MODELVIEW_MATRIX=_C('GL_TRANSPOSE_MODELVIEW_MATRIX',0x84E3)
GL_TRANSPOSE_PROJECTION_MATRIX=_C('GL_TRANSPOSE_PROJECTION_MATRIX',0x84E4)
GL_TRANSPOSE_TEXTURE_MATRIX=_C('GL_TRANSPOSE_TEXTURE_MATRIX',0x84E5)
@_f
@_p.types(None,_cs.GLenum)
def glActiveTexture(texture):pass
@_f
@_p.types(None,_cs.GLenum)
def glClientActiveTexture(texture):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLenum,_cs.GLsizei,_cs.GLint,_cs.GLsizei,ctypes.c_void_p)
def glCompressedTexImage1D(target,level,internalformat,width,border,imageSize,data):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLint,_cs.GLsizei,ctypes.c_void_p)
def glCompressedTexImage2D(target,level,internalformat,width,height,border,imageSize,data):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLsizei,_cs.GLint,_cs.GLsizei,ctypes.c_void_p)
def glCompressedTexImage3D(target,level,internalformat,width,height,depth,border,imageSize,data):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLint,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,ctypes.c_void_p)
def glCompressedTexSubImage1D(target,level,xoffset,width,format,imageSize,data):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLint,_cs.GLint,_cs.GLsizei,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,ctypes.c_void_p)
def glCompressedTexSubImage2D(target,level,xoffset,yoffset,width,height,format,imageSize,data):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLint,_cs.GLint,_cs.GLint,_cs.GLsizei,_cs.GLsizei,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,ctypes.c_void_p)
def glCompressedTexSubImage3D(target,level,xoffset,yoffset,zoffset,width,height,depth,format,imageSize,data):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,ctypes.c_void_p)
def glGetCompressedTexImage(target,level,img):pass
@_f
@_p.types(None,arrays.GLdoubleArray)
def glLoadTransposeMatrixd(m):pass
@_f
@_p.types(None,arrays.GLfloatArray)
def glLoadTransposeMatrixf(m):pass
@_f
@_p.types(None,arrays.GLdoubleArray)
def glMultTransposeMatrixd(m):pass
@_f
@_p.types(None,arrays.GLfloatArray)
def glMultTransposeMatrixf(m):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLdouble)
def glMultiTexCoord1d(target,s):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLdoubleArray)
def glMultiTexCoord1dv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLfloat)
def glMultiTexCoord1f(target,s):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLfloatArray)
def glMultiTexCoord1fv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint)
def glMultiTexCoord1i(target,s):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLintArray)
def glMultiTexCoord1iv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLshort)
def glMultiTexCoord1s(target,s):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLshortArray)
def glMultiTexCoord1sv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLdouble,_cs.GLdouble)
def glMultiTexCoord2d(target,s,t):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLdoubleArray)
def glMultiTexCoord2dv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLfloat,_cs.GLfloat)
def glMultiTexCoord2f(target,s,t):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLfloatArray)
def glMultiTexCoord2fv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLint)
def glMultiTexCoord2i(target,s,t):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLintArray)
def glMultiTexCoord2iv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLshort,_cs.GLshort)
def glMultiTexCoord2s(target,s,t):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLshortArray)
def glMultiTexCoord2sv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLdouble,_cs.GLdouble,_cs.GLdouble)
def glMultiTexCoord3d(target,s,t,r):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLdoubleArray)
def glMultiTexCoord3dv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLfloat,_cs.GLfloat,_cs.GLfloat)
def glMultiTexCoord3f(target,s,t,r):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLfloatArray)
def glMultiTexCoord3fv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLint,_cs.GLint)
def glMultiTexCoord3i(target,s,t,r):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLintArray)
def glMultiTexCoord3iv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLshort,_cs.GLshort,_cs.GLshort)
def glMultiTexCoord3s(target,s,t,r):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLshortArray)
def glMultiTexCoord3sv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLdouble,_cs.GLdouble,_cs.GLdouble,_cs.GLdouble)
def glMultiTexCoord4d(target,s,t,r,q):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLdoubleArray)
def glMultiTexCoord4dv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLfloat,_cs.GLfloat,_cs.GLfloat,_cs.GLfloat)
def glMultiTexCoord4f(target,s,t,r,q):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLfloatArray)
def glMultiTexCoord4fv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint,_cs.GLint,_cs.GLint,_cs.GLint)
def glMultiTexCoord4i(target,s,t,r,q):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLintArray)
def glMultiTexCoord4iv(target,v):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLshort,_cs.GLshort,_cs.GLshort,_cs.GLshort)
def glMultiTexCoord4s(target,s,t,r,q):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLshortArray)
def glMultiTexCoord4sv(target,v):pass
@_f
@_p.types(None,_cs.GLfloat,_cs.GLboolean)
def glSampleCoverage(value,invert):pass
# --- django_uicomponents/templatetags/components.py (koenwoortman/django-uicomponents, MIT license) ---
from django import template
from django.conf import settings

register = template.Library()


@register.inclusion_tag('django_uicomponent.html')
def component(component_name, *args, **kwargs):
    # Avoid shadowing the `template` module imported above.
    template_name = f'{settings.COMPONENTS_DIR}/{component_name}'
    return {'UICOMPONENT_TEMPLATE_NAME': template_name, **kwargs}
# --- testing/vcs/test_vcs_plot_file_var.py (xylar/cdat, BSD-3-Clause license) ---
import os, sys, vcs, cdms2
f = cdms2.open(os.path.join(vcs.sample_data, "clt.nc"))
V = f("clt")
x = vcs.init()
x.plot(V, bg=1)
# --- Nimbus-Controller/s3-erasebucket.py (paulfdoyle/NIMBUS, MIT license) ---
import boto.sqs
import argparse
import urllib
import os
import sys
import signal
import time
import datetime
import socket
import fcntl
import struct
from boto.sqs.message import Message
from subprocess import call
import boto.s3.connection
from boto.s3.connection import S3Connection
from boto.s3.connection import Location
from boto.s3.key import Key

# Credentials must never be hardcoded in source; read them from the environment.
conn = S3Connection(os.environ['AWS_ACCESS_KEY_ID'], os.environ['AWS_SECRET_ACCESS_KEY'])
bucket = conn.get_bucket('nimbus-results')
k = Key(bucket)
for key in bucket.list():
    bucket.delete_key(key.name)
print("Bucket nimbus-results is now empty")
# --- pytealutils/list.py (barnjamin/pyteal-utils, MIT license) ---
from pyteal import (
Len,
Concat,
Extract,
ExtractUint16,
ExtractUint32,
ExtractUint64,
Substring,
Itob,
)
from pyteal import TealType, Expr, Int, ScratchVar, Subroutine, Seq
from enum import Enum
# sort - In place? Spool to stack?
# reduce - (sum, mean, max, min)
# map - Same type only?
# Other list types? allow uvarints? dynamic length byte strings?
# Enum
uint16 = Int(16)
uint32 = Int(32)
uint64 = Int(64)
class List:
    _internal = ScratchVar()

    def __init__(self, size: Int):
        # TODO: Make sure it's in the enum
        self.size = size
        self.byte_size = size / Int(8)

    def set(self, data: TealType.bytes) -> Expr:
        # TODO: Check that length is a factor of size
        return self._internal.store(data)

    def get(self) -> TealType.bytes:
        return self._internal.load()

    def __getitem__(self, idx: TealType.uint64) -> Expr:
        # TODO: Make sure it's not outside the list
        if self.size.value == 16:
            return ExtractUint16(self.get(), idx * self.byte_size)
        elif self.size.value == 32:
            return ExtractUint32(self.get(), idx * self.byte_size)
        else:
            return ExtractUint64(self.get(), idx * self.byte_size)

    def __setitem__(self, idx: TealType.uint64, value: TealType.uint64) -> Expr:
        # Splice the new element in; the suffix must start at the *next*
        # element, (idx + 1) * byte_size, so the old value is replaced
        # rather than shifted right.
        return self.set(
            Concat(
                Substring(self.get(), Int(0), idx * self.byte_size),
                Itob(value),  # TODO: Take only the last `byte_size` bytes?
                Substring(self.get(), (idx + Int(1)) * self.byte_size, Len(self.get())),
            )
        )

    def __delitem__(self, idx: TealType.uint64) -> Expr:
        return self.set(
            Concat(
                Substring(self.get(), Int(0), idx * self.byte_size),
                Substring(self.get(), (idx + Int(1)) * self.byte_size, Len(self.get())),
            )
        )

    def get_bytes(self, idx: TealType.uint64) -> Expr:
        return Substring(
            self.get(), idx * self.byte_size, (idx + Int(1)) * self.byte_size
        )

    def swap(self, idx1: TealType.uint64, idx2: TealType.uint64) -> Expr:
        # Stash both elements in scratch space, then write each one back
        # into the other's slot.
        elem1, elem2 = ScratchVar(), ScratchVar()
        return Seq(
            elem1.store(self.get_bytes(idx1)),
            elem2.store(self.get_bytes(idx2)),
            self.set(
                Concat(
                    Substring(self.get(), Int(0), idx1 * self.byte_size),
                    elem2.load(),
                    Substring(self.get(), (idx1 + Int(1)) * self.byte_size, Len(self.get())),
                )
            ),
            self.set(
                Concat(
                    Substring(self.get(), Int(0), idx2 * self.byte_size),
                    elem1.load(),
                    Substring(self.get(), (idx2 + Int(1)) * self.byte_size, Len(self.get())),
                )
            ),
        )

    def __len__(self) -> TealType.uint64:
        return Len(self.get()) / self.byte_size
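The offset arithmetic above can be checked outside the AVM. The following plain-Python sketch (independent of PyTeal, using `struct` for the 8-byte big-endian encoding that `Itob` produces) mirrors the get/set/delete splicing on a packed list of uint64 values:

```python
import struct

WIDTH = 8  # bytes per uint64 element, mirroring byte_size above

def get_item(buf: bytes, idx: int) -> int:
    # same offset math as ExtractUint64(data, idx * byte_size)
    return struct.unpack_from(">Q", buf, idx * WIDTH)[0]

def set_item(buf: bytes, idx: int, value: int) -> bytes:
    # Concat(prefix, Itob(value), suffix) from __setitem__
    return buf[:idx * WIDTH] + struct.pack(">Q", value) + buf[(idx + 1) * WIDTH:]

def del_item(buf: bytes, idx: int) -> bytes:
    # Concat(prefix, suffix) from __delitem__
    return buf[:idx * WIDTH] + buf[(idx + 1) * WIDTH:]

data = b"".join(struct.pack(">Q", v) for v in (10, 20, 30))
assert get_item(data, 1) == 20
data = set_item(data, 1, 99)
assert get_item(data, 1) == 99
data = del_item(data, 0)
assert get_item(data, 0) == 99 and len(data) == 2 * WIDTH
```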
| 30.011905 | 88 | 0.583499 | 312 | 2,521 | 4.592949 | 0.317308 | 0.063503 | 0.092114 | 0.073273 | 0.281228 | 0.263782 | 0.173064 | 0.10328 | 0.10328 | 0.10328 | 0 | 0.032131 | 0.296311 | 2,521 | 83 | 89 | 30.373494 | 0.775648 | 0.144784 | 0 | 0.101695 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012048 | 0 | 1 | 0.152542 | false | 0 | 0.050847 | 0.101695 | 0.40678 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
218e74715f07bbbc81d5c6177e90ca58f6d327e5 | 1,853 | py | Python | eddietool/commands.py | Tidanium/EDDIE-Tool | d54298891d4082d5d728b7a5b2bb27641a640a10 | [
"MIT"
] | null | null | null | eddietool/commands.py | Tidanium/EDDIE-Tool | d54298891d4082d5d728b7a5b2bb27641a640a10 | [
"MIT"
] | null | null | null | eddietool/commands.py | Tidanium/EDDIE-Tool | d54298891d4082d5d728b7a5b2bb27641a640a10 | [
"MIT"
] | null | null | null | """
File : commands.py
Start Date : 20080513
Refactor Date : 20180514
Description : Eddie-Tool command-line entry points.
$Id: commands.py 953 2018-05-14 04:57:27Z phillips.ryan $
"""
__copyright__ = 'Copyright (c) Ryan Phillips 2018'
__author__ = 'Chris Miles'
__author_email__ = 'miles.chris@gmail.com'
__maintainer__ = 'Ryan Phillips aka Tidanium'
__maintainer_email__ = 'ryan@ryanphillips.org'
__url__ = 'https://github.com/Tidanium/EDDIE-Tool'
__license__ = """
MIT License
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
# todo will work on this more after the rest of the content it relies on
from .version import version as __version__
import sys, os, time, signal, re, threading, asyncio
from .common import utils, log
loop = asyncio.get_event_loop()
class Main:
    def __init__(self):
        self.relpath = os.path.relpath('.', '/')
218ee7d07461a436d649b5c84d49ee03c19ccfbb | 314 | py | Python | sft/errors/config.py | placiflury/gridmonitor-sft | f2a3e4295728b71965838fb59360c63e3d485f5e | [
"BSD-3-Clause"
] | null | null | null | sft/errors/config.py | placiflury/gridmonitor-sft | f2a3e4295728b71965838fb59360c63e3d485f5e | [
"BSD-3-Clause"
] | null | null | null | sft/errors/config.py | placiflury/gridmonitor-sft | f2a3e4295728b71965838fb59360c63e3d485f5e | [
"BSD-3-Clause"
] | null | null | null | """
Container for Configuration related errors.
"""
class ConfigError(Exception):
    """ Generic exception raised by
        configuration errors.
    """
    def __init__(self, expr, msg):
        self.expression = expr
        self.message = msg

    def __str__(self):
        return self.message
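A short usage sketch; the class is re-declared here so the snippet stands alone:

```python
class ConfigError(Exception):
    """Generic exception raised by configuration errors."""
    def __init__(self, expr, msg):
        self.expression = expr
        self.message = msg

    def __str__(self):
        return self.message

# Typical use: attach the offending expression and a readable message.
try:
    raise ConfigError("timeout = -5", "timeout must be non-negative")
except ConfigError as err:
    assert err.expression == "timeout = -5"
    assert str(err) == "timeout must be non-negative"
```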
| 19.625 | 44 | 0.60828 | 31 | 314 | 5.903226 | 0.645161 | 0.120219 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.296178 | 314 | 15 | 45 | 20.933333 | 0.828054 | 0.302548 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0.166667 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
21a67d401df14ceb738af4f6b7ab97841f7fd461 | 10,470 | py | Python | simscale_sdk/models/one_of_custom_fluid_bc_turbulent_kinetic_energy.py | slainesimscale/simscale-python-sdk | db483eeabe558e55d020f5f829a3bf13c9c287a7 | [
"MIT"
] | 8 | 2021-01-22T13:41:03.000Z | 2022-01-03T09:00:10.000Z | simscale_sdk/models/one_of_custom_fluid_bc_turbulent_kinetic_energy.py | slainesimscale/simscale-python-sdk | db483eeabe558e55d020f5f829a3bf13c9c287a7 | [
"MIT"
] | null | null | null | simscale_sdk/models/one_of_custom_fluid_bc_turbulent_kinetic_energy.py | slainesimscale/simscale-python-sdk | db483eeabe558e55d020f5f829a3bf13c9c287a7 | [
"MIT"
] | 3 | 2021-03-18T15:52:52.000Z | 2022-01-03T08:59:30.000Z | # coding: utf-8
"""
SimScale API
The version of the OpenAPI document: 0.0.0
Generated by: https://openapi-generator.tech
"""
import pprint
import re # noqa: F401
import six
from simscale_sdk.configuration import Configuration
class OneOfCustomFluidBCTurbulentKineticEnergy(object):
    """NOTE: This class is auto generated by OpenAPI Generator.
    Ref: https://openapi-generator.tech

    Do not edit the class manually.
    """

    """
    Attributes:
      openapi_types (dict): The key is attribute name
                            and the value is attribute type.
      attribute_map (dict): The key is attribute name
                            and the value is json key in definition.
    """
    openapi_types = {
        'type': 'str',
        'gradient': 'DimensionalAcceleration',
        'value': 'DimensionalTurbulenceKineticEnergy',
        'intensity': 'float',
        'wall_roughness': 'bool',
        'roughness_height': 'DimensionalLength',
        'roughness_constant': 'float'
    }

    attribute_map = {
        'type': 'type',
        'gradient': 'gradient',
        'value': 'value',
        'intensity': 'intensity',
        'wall_roughness': 'wallRoughness',
        'roughness_height': 'roughnessHeight',
        'roughness_constant': 'roughnessConstant'
    }

    discriminator_value_class_map = {
        'SYMMETRY': 'SymmetryTKEBC',
        'FIXED_GRADIENT': 'FixedGradientTKEBC',
        'FIXED_VALUE': 'FixedValueTKEBC',
        'INLET_OUTLET': 'InletOutletTKEBC',
        'ZERO_GRADIENT': 'ZeroGradientTKEBC',
        'TURBULENT_INTENSITY_KINETIC_ENERGY_INLET': 'IntensityKineticEnergyInletTKEBC',
        'WALL_FUNCTION': 'WallFunctionTKEBC',
        'FULL_RESOLUTION': 'FullResolutionTKEBC'
    }

    def __init__(self, type='FULL_RESOLUTION', gradient=None, value=None, intensity=None, wall_roughness=None, roughness_height=None, roughness_constant=None, local_vars_configuration=None):  # noqa: E501
        """OneOfCustomFluidBCTurbulentKineticEnergy - a model defined in OpenAPI"""  # noqa: E501
        if local_vars_configuration is None:
            local_vars_configuration = Configuration()
        self.local_vars_configuration = local_vars_configuration

        self._type = None
        self._gradient = None
        self._value = None
        self._intensity = None
        self._wall_roughness = None
        self._roughness_height = None
        self._roughness_constant = None
        self.discriminator = 'type'

        self.type = type
        if gradient is not None:
            self.gradient = gradient
        if value is not None:
            self.value = value
        if intensity is not None:
            self.intensity = intensity
        if wall_roughness is not None:
            self.wall_roughness = wall_roughness
        if roughness_height is not None:
            self.roughness_height = roughness_height
        if roughness_constant is not None:
            self.roughness_constant = roughness_constant

    @property
    def type(self):
        """Gets the type of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501

        Schema name: FullResolutionTKEBC  # noqa: E501

        :return: The type of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :rtype: str
        """
        return self._type

    @type.setter
    def type(self, type):
        """Sets the type of this OneOfCustomFluidBCTurbulentKineticEnergy.

        Schema name: FullResolutionTKEBC  # noqa: E501

        :param type: The type of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :type: str
        """
        if self.local_vars_configuration.client_side_validation and type is None:  # noqa: E501
            raise ValueError("Invalid value for `type`, must not be `None`")  # noqa: E501

        self._type = type

    @property
    def gradient(self):
        """Gets the gradient of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501

        :return: The gradient of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :rtype: DimensionalAcceleration
        """
        return self._gradient

    @gradient.setter
    def gradient(self, gradient):
        """Sets the gradient of this OneOfCustomFluidBCTurbulentKineticEnergy.

        :param gradient: The gradient of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :type: DimensionalAcceleration
        """
        self._gradient = gradient

    @property
    def value(self):
        """Gets the value of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501

        :return: The value of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :rtype: DimensionalTurbulenceKineticEnergy
        """
        return self._value

    @value.setter
    def value(self, value):
        """Sets the value of this OneOfCustomFluidBCTurbulentKineticEnergy.

        :param value: The value of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :type: DimensionalTurbulenceKineticEnergy
        """
        self._value = value

    @property
    def intensity(self):
        """Gets the intensity of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501

        :return: The intensity of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :rtype: float
        """
        return self._intensity

    @intensity.setter
    def intensity(self, intensity):
        """Sets the intensity of this OneOfCustomFluidBCTurbulentKineticEnergy.

        :param intensity: The intensity of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :type: float
        """
        if (self.local_vars_configuration.client_side_validation and
                intensity is not None and intensity > 1):  # noqa: E501
            raise ValueError("Invalid value for `intensity`, must be a value less than or equal to `1`")  # noqa: E501
        if (self.local_vars_configuration.client_side_validation and
                intensity is not None and intensity < 0):  # noqa: E501
            raise ValueError("Invalid value for `intensity`, must be a value greater than or equal to `0`")  # noqa: E501

        self._intensity = intensity

    @property
    def wall_roughness(self):
        """Gets the wall_roughness of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501

        :return: The wall_roughness of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :rtype: bool
        """
        return self._wall_roughness

    @wall_roughness.setter
    def wall_roughness(self, wall_roughness):
        """Sets the wall_roughness of this OneOfCustomFluidBCTurbulentKineticEnergy.

        :param wall_roughness: The wall_roughness of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :type: bool
        """
        self._wall_roughness = wall_roughness

    @property
    def roughness_height(self):
        """Gets the roughness_height of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501

        :return: The roughness_height of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :rtype: DimensionalLength
        """
        return self._roughness_height

    @roughness_height.setter
    def roughness_height(self, roughness_height):
        """Sets the roughness_height of this OneOfCustomFluidBCTurbulentKineticEnergy.

        :param roughness_height: The roughness_height of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :type: DimensionalLength
        """
        self._roughness_height = roughness_height

    @property
    def roughness_constant(self):
        """Gets the roughness_constant of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501

        :return: The roughness_constant of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :rtype: float
        """
        return self._roughness_constant

    @roughness_constant.setter
    def roughness_constant(self, roughness_constant):
        """Sets the roughness_constant of this OneOfCustomFluidBCTurbulentKineticEnergy.

        :param roughness_constant: The roughness_constant of this OneOfCustomFluidBCTurbulentKineticEnergy.  # noqa: E501
        :type: float
        """
        if (self.local_vars_configuration.client_side_validation and
                roughness_constant is not None and roughness_constant < 0.5):  # noqa: E501
            raise ValueError("Invalid value for `roughness_constant`, must be a value greater than or equal to `0.5`")  # noqa: E501

        self._roughness_constant = roughness_constant

    def get_real_child_model(self, data):
        """Returns the real base class specified by the discriminator"""
        discriminator_key = self.attribute_map[self.discriminator]
        discriminator_value = data[discriminator_key]
        return self.discriminator_value_class_map.get(discriminator_value)

    def to_dict(self):
        """Returns the model properties as a dict"""
        result = {}

        for attr, _ in six.iteritems(self.openapi_types):
            value = getattr(self, attr)
            if isinstance(value, list):
                result[attr] = list(map(
                    lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
                    value
                ))
            elif hasattr(value, "to_dict"):
                result[attr] = value.to_dict()
            elif isinstance(value, dict):
                result[attr] = dict(map(
                    lambda item: (item[0], item[1].to_dict())
                    if hasattr(item[1], "to_dict") else item,
                    value.items()
                ))
            else:
                result[attr] = value

        return result

    def to_str(self):
        """Returns the string representation of the model"""
        return pprint.pformat(self.to_dict())

    def __repr__(self):
        """For `print` and `pprint`"""
        return self.to_str()

    def __eq__(self, other):
        """Returns true if both objects are equal"""
        if not isinstance(other, OneOfCustomFluidBCTurbulentKineticEnergy):
            return False

        return self.to_dict() == other.to_dict()

    def __ne__(self, other):
        """Returns true if both objects are not equal"""
        if not isinstance(other, OneOfCustomFluidBCTurbulentKineticEnergy):
            return True

        return self.to_dict() != other.to_dict()
| 34.440789 | 204 | 0.652818 | 1,040 | 10,470 | 6.409615 | 0.15 | 0.039604 | 0.193219 | 0.157516 | 0.492649 | 0.389589 | 0.354035 | 0.211521 | 0.117912 | 0.088059 | 0 | 0.015246 | 0.267049 | 10,470 | 303 | 205 | 34.554455 | 0.853401 | 0.335053 | 0 | 0.095238 | 1 | 0 | 0.145702 | 0.024284 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.027211 | 0 | 0.29932 | 0.013605 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
21a799e98e051bf45e8a2fec5ebf19281b354609 | 337 | py | Python | src_bfm/Manifest.py | ktobro/uart_bfm_comments | b680cf1e680bfa2ceab53e428cb650283715f61f | [
"Apache-2.0"
] | 13 | 2019-10-29T12:05:55.000Z | 2022-02-03T11:41:17.000Z | src_bfm/Manifest.py | ktobro/uart_bfm_comments | b680cf1e680bfa2ceab53e428cb650283715f61f | [
"Apache-2.0"
] | 4 | 2019-12-24T10:46:15.000Z | 2021-01-14T09:04:36.000Z | src_bfm/Manifest.py | ktobro/uart_bfm_comments | b680cf1e680bfa2ceab53e428cb650283715f61f | [
"Apache-2.0"
] | 3 | 2020-04-06T10:00:56.000Z | 2021-04-28T08:08:43.000Z | files = ['avalon_mm_bfm_pkg.vhd',
    'avalon_st_bfm_pkg.vhd',
    'axilite_bfm_pkg.vhd',
    'axistream_bfm_pkg.vhd',
    'gmii_bfm_pkg.vhd',
    'gpio_bfm_pkg.vhd',
    'i2c_bfm_pkg.vhd',
    'rgmii_bfm_pkg.vhd',
    'sbi_bfm_pkg.vhd',
    'spi_bfm_pkg.vhd',
    'uart_bfm_pkg.vhd',
]
| 25.923077 | 33 | 0.551929 | 47 | 337 | 3.446809 | 0.340426 | 0.407407 | 0.611111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004292 | 0.308605 | 337 | 12 | 34 | 28.083333 | 0.690987 | 0 | 0 | 0 | 0 | 0 | 0.569733 | 0.186944 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
21c41a2ebfbb89ae34523fafb0d0641d301b5d0f | 1,944 | py | Python | web_server/nodes/models.py | ColeBoytinck/cmput404-group-project | 133e118fe8453b13f0d5afdf7b8d625eba9e4086 | [
"MIT"
] | null | null | null | web_server/nodes/models.py | ColeBoytinck/cmput404-group-project | 133e118fe8453b13f0d5afdf7b8d625eba9e4086 | [
"MIT"
] | null | null | null | web_server/nodes/models.py | ColeBoytinck/cmput404-group-project | 133e118fe8453b13f0d5afdf7b8d625eba9e4086 | [
"MIT"
] | null | null | null | from django.db import models
from django.contrib.auth.hashers import make_password
# Create your models here.
#
# http://dsnfof.herokuapp.com/
# from hostname -> Authentication: basic server_username:server_password
# hostname = "dsnfof.herokuapp.com"
# server_username = "their username"
# server_password = "password"
# we send -> to api_location Authentication: api_username:api_password
# api_location = "dsnfof.herokuapp.com/api"
# api_username = "our username"
# api_password = "some password"
# As a server admin, I want to be able to add node to share with #44
class Node(models.Model):
    foreign_server_hostname = models.CharField(
        primary_key=True, max_length=500, unique=True)

    # the credentials this foreign server uses to log into our server
    foreign_server_username = models.CharField(max_length=500, null=False)
    foreign_server_password = models.CharField(max_length=500, null=False)
    foreign_server_api_location = models.CharField(max_length=500, null=False)

    # the credentials our server uses to log into this foreign server
    username_registered_on_foreign_server = models.CharField(
        max_length=500)
    password_registered_on_foreign_server = models.CharField(
        max_length=500)

    # As a server admin, I want to share or not share images with users on other servers. #5
    image_share = models.BooleanField(default=True)
    # As a server admin, I want to share or not share posts with users on other servers. #6
    post_share = models.BooleanField(default=True)

    append_slash = models.BooleanField(default=False)

    def save(self, *args, **kwargs):
        # make_password hashes the password before it is stored
        self.foreign_server_password = make_password(
            self.foreign_server_password)
        # self.password_registered_on_foreign_server = make_password(
        #     self.password_registered_on_foreign_server)
        super(Node, self).save(*args, **kwargs)
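`make_password` returns a salted one-way hash, so the plaintext never reaches the database. The sketch below illustrates the concept with only the standard library; the helper names and encoded format are illustrative, not Django's exact implementation. Note also that hashing inside `save()` means the field is re-hashed on every subsequent save unless guarded.

```python
import hashlib
import os

def make_password_sketch(raw: str, iterations: int = 100_000) -> str:
    # salted PBKDF2, conceptually similar to Django's default hasher
    salt = os.urandom(16).hex()
    digest = hashlib.pbkdf2_hmac("sha256", raw.encode(), salt.encode(), iterations)
    return f"pbkdf2_sha256${iterations}${salt}${digest.hex()}"

def check_password_sketch(raw: str, encoded: str) -> bool:
    # recompute the digest with the stored salt and compare
    _algo, iters, salt, digest = encoded.split("$")
    candidate = hashlib.pbkdf2_hmac("sha256", raw.encode(), salt.encode(), int(iters))
    return candidate.hex() == digest

token = make_password_sketch("s3cret")
assert check_password_sketch("s3cret", token)
assert not check_password_sketch("wrong", token)
```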
| 38.88 | 92 | 0.742798 | 264 | 1,944 | 5.268939 | 0.32197 | 0.11215 | 0.051761 | 0.086269 | 0.457944 | 0.30266 | 0.30266 | 0.196981 | 0.196981 | 0.051761 | 0 | 0.01375 | 0.176955 | 1,944 | 49 | 93 | 39.673469 | 0.855625 | 0.462963 | 0 | 0.105263 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.263158 | 0.105263 | 0 | 0.684211 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
21c577593a96dcd0fbdc5e1b3a4e7c172ce00132 | 182 | py | Python | tests/ut_repytests_loose-testremovefilepy.py | SeattleTestbed/repy_v1 | f40a02e2e398b1ec67fede84b41a264ae7356d2c | [
"MIT"
] | 1 | 2021-08-18T05:58:17.000Z | 2021-08-18T05:58:17.000Z | tests/ut_repytests_loose-testremovefilepy.py | SeattleTestbed/repy_v1 | f40a02e2e398b1ec67fede84b41a264ae7356d2c | [
"MIT"
] | 3 | 2015-11-17T21:01:03.000Z | 2016-07-14T09:08:04.000Z | tests/ut_repytests_loose-testremovefilepy.py | SeattleTestbed/repy_v1 | f40a02e2e398b1ec67fede84b41a264ae7356d2c | [
"MIT"
] | 5 | 2015-07-02T13:29:23.000Z | 2021-09-25T07:48:30.000Z | #pragma repy restrictions.loose
# create a junk.py file
myfo = open("junk.py","w")
print >> myfo, "print 'Hello world'"
myfo.close()
removefile("junk.py") # should be removed...
| 20.222222 | 46 | 0.675824 | 27 | 182 | 4.555556 | 0.740741 | 0.146341 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.148352 | 182 | 8 | 47 | 22.75 | 0.793548 | 0.401099 | 0 | 0 | 0 | 0 | 0.320755 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
21d9b66f9b3ca724a80c3864197a1eb26d149335 | 1,566 | py | Python | photos/models.py | patrickrop-cloud/Gallery | b1a06b0b05472f2f06b752b97a62e23a68b26148 | [
"MIT"
] | null | null | null | photos/models.py | patrickrop-cloud/Gallery | b1a06b0b05472f2f06b752b97a62e23a68b26148 | [
"MIT"
] | null | null | null | photos/models.py | patrickrop-cloud/Gallery | b1a06b0b05472f2f06b752b97a62e23a68b26148 | [
"MIT"
] | null | null | null | from django.db import models
from django.db.models.base import Model
from cloudinary.models import CloudinaryField
# Create your models here.
class Category(models.Model):
    category_name = models.CharField(max_length=100, blank=True, null=True)

    def __str__(self):
        return self.category_name

    def save_categories(self):
        self.save()

    def delete_category(self):
        self.delete()

    @classmethod
    def search_by_category_name(cls, search_term):
        category = cls.objects.filter(category_name__icontains=search_term)
        return category

    @classmethod
    def display_searched(cls):
        category = Image.objects.all()
        return category


class Location(models.Model):
    location_name = models.CharField(max_length=50, blank=False, null=True)

    def __str__(self):
        return self.location_name

    def save_location(self):
        self.save()

    def delete_location(self):
        self.delete()


class Image(models.Model):
    # the plain ImageField was shadowed by the CloudinaryField below,
    # so only the Cloudinary-backed field is kept
    image = CloudinaryField('image')
    name = models.CharField(max_length=100)
    description = models.CharField(max_length=4000)
    category = models.ForeignKey(Category, on_delete=models.CASCADE, null=True)
    location = models.ForeignKey(Location, on_delete=models.CASCADE, null=True)

    def __str__(self):
        return self.name

    def save_image(self):
        self.save()

    def delete_image(self):
        self.delete()

    @classmethod
    def update_image(cls):
        image = cls.objects.get_or_create()
        return image
| 24.46875 | 75 | 0.697318 | 195 | 1,566 | 5.394872 | 0.271795 | 0.045627 | 0.068441 | 0.091255 | 0.326046 | 0.186312 | 0.079848 | 0 | 0 | 0 | 0 | 0.009639 | 0.204981 | 1,566 | 63 | 76 | 24.857143 | 0.835341 | 0.015326 | 0 | 0.302326 | 0 | 0 | 0.003249 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.27907 | false | 0 | 0.069767 | 0.069767 | 0.744186 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
21f41df216cde8c210a51351a6efd374143e433e | 1,767 | py | Python | temp_model_analysis.py | SgtVincent/Robothor-2020---VIPL-ICT | 5eee00c077c07e69120fb8108f574c2339688f34 | [
"Apache-2.0"
] | null | null | null | temp_model_analysis.py | SgtVincent/Robothor-2020---VIPL-ICT | 5eee00c077c07e69120fb8108f574c2339688f34 | [
"Apache-2.0"
] | 1 | 2022-03-14T03:34:49.000Z | 2022-03-14T03:34:49.000Z | temp_model_analysis.py | SgtVincent/Robothor-2020---VIPL-ICT | 5eee00c077c07e69120fb8108f574c2339688f34 | [
"Apache-2.0"
] | null | null | null | from torchsummary import summary
import sys
import os
import torch
import torchvision
from tensorboardX import SummaryWriter
import tensorwatch as tw
# from models.model_io import ModelInput, ModelOptions, ModelOutput
from utils.flag_parser import parse_arguments
import torch.jit as jit
# embed basemodel.py
import torch
import torch.nn as nn
import torch.nn.functional as F
from utils.net_util import norm_col_init, weights_init
from models import MatchModel
if __name__ == '__main__':
    args = parse_arguments()
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    # device = 'cpu'
    model = MatchModel(args).to(device)

    # model_input = ModelInput()
    state = torch.zeros(1, 512, 7, 7).to(device)  # [1,512,7,7]
    hidden = (
        torch.zeros(1, args.hidden_state_sz).to(device),  # [1,512]
        torch.zeros(1, args.hidden_state_sz).to(device),  # [1,512]
    )
    target_class_embedding = torch.zeros(300).to(device)  # [300]
    action_probs = torch.zeros(1, args.action_space).to(device)  # [1, #(ACTION_SPACE)]
    # model_opts = ModelOptions()

    # tw.draw_model(model, ([1, 512, 7, 7],
    #                       [1, args.hidden_state_sz],
    #                       [1, args.hidden_state_sz],
    #                       [1, 300],
    #                       [1, args.action_space]),
    #               'model.png')

    # with SummaryWriter("model_vis", comment="basemodel") as writer:
    #     writer.add_graph(model, (state, hidden[0], hidden[1],
    #                              target_class_embedding, action_probs), verbose=True)

    # summary(model, [(512, 7, 7),
    #                 (args.hidden_state_sz,),
    #                 (args.hidden_state_sz,),
    #                 (300,),
    #                 (args.action_space,)])

    print(model)
| 35.34 | 87 | 0.621958 | 227 | 1,767 | 4.647577 | 0.339207 | 0.045498 | 0.085308 | 0.096682 | 0.1109 | 0.1109 | 0.075829 | 0.075829 | 0.075829 | 0.075829 | 0 | 0.040152 | 0.252971 | 1,767 | 49 | 88 | 36.061224 | 0.759091 | 0.442558 | 0 | 0.153846 | 0 | 0 | 0.015641 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.538462 | 0 | 0.538462 | 0.038462 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
21f6365a45db237ae32f045e9f29fb8259f3541d | 5,605 | py | Python | pyaz/network/watcher/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/network/watcher/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | null | null | null | pyaz/network/watcher/__init__.py | py-az-cli/py-az-cli | 9a7dc44e360c096a5a2f15595353e9dad88a9792 | [
"MIT"
] | 1 | 2022-02-03T09:12:01.000Z | 2022-02-03T09:12:01.000Z | '''
Manage the Azure Network Watcher.
'''
from ... pyaz_utils import _call_az
from . import connection_monitor, flow_log, packet_capture, troubleshooting
def configure(locations, enabled=None, resource_group=None, tags=None):
    '''
    Configure the Network Watcher service for different regions.

    Required Parameters:
    - locations -- Location. Values from: `az account list-locations`. You can configure the default location using `az configure --defaults location=<location>`.

    Optional Parameters:
    - enabled -- None
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
    - tags -- space-separated tags: key[=value] [key[=value] ...]. Use '' to clear existing tags.
    '''
    return _call_az("az network watcher configure", locals())


def list():
    '''
    List Network Watchers.
    '''
    return _call_az("az network watcher list", locals())


def test_ip_flow(direction, local, protocol, remote, vm, nic=None, resource_group=None):
    '''
    Test IP flow to/from a VM given the currently configured network security group rules.

    Required Parameters:
    - direction -- None
    - local -- None
    - protocol -- None
    - remote -- None
    - vm -- Name or ID of the VM to target. If the name of the VM is provided, the --resource-group is required.

    Optional Parameters:
    - nic -- Name or ID of the NIC resource to test. If the VM has multiple NICs and IP forwarding is enabled on any of them, this parameter is required.
    - resource_group -- Name of the resource group the target VM is in.
    '''
    return _call_az("az network watcher test-ip-flow", locals())


def test_connectivity(source_resource, dest_address=None, dest_port=None, dest_resource=None, headers=None, method=None, protocol=None, resource_group=None, source_port=None, valid_status_codes=None):
    '''
    Test if a connection can be established between a Virtual Machine and a given endpoint.

    Required Parameters:
    - source_resource -- None

    Optional Parameters:
    - dest_address -- None
    - dest_port -- None
    - dest_resource -- None
    - headers -- Space-separated list of headers in `KEY=VALUE` format.
    - method -- HTTP method to use.
    - protocol -- Protocol to test on.
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
    - source_port -- None
    - valid_status_codes -- Space-separated list of HTTP status codes considered valid.
    '''
    return _call_az("az network watcher test-connectivity", locals())


def show_next_hop(dest_ip, resource_group, source_ip, vm, nic=None):
    '''
    Get information on the 'next hop' of a VM.

    Required Parameters:
    - dest_ip -- Destination IPv4 address.
    - resource_group -- Name of the resource group the target VM is in.
    - source_ip -- Source IPv4 address.
    - vm -- Name or ID of the VM to target. If the name of the VM is provided, the --resource-group is required.

    Optional Parameters:
    - nic -- Name or ID of the NIC resource to test. If the VM has multiple NICs and IP forwarding is enabled on any of them, this parameter is required.
    '''
    return _call_az("az network watcher show-next-hop", locals())


def show_security_group_view(resource_group, vm):
    '''
    Get detailed security information on a VM for the currently configured network security group.

    Required Parameters:
    - resource_group -- Name of the resource group the target VM is in.
    - vm -- Name or ID of the VM to target. If the name of the VM is provided, the --resource-group is required.
    '''
    return _call_az("az network watcher show-security-group-view", locals())


def show_topology(location=None, resource_group=None, subnet=None, vnet=None):
    '''
    Get the network topology of a resource group, virtual network or subnet.

    Optional Parameters:
    - location -- Location. Values from: `az account list-locations`. You can configure the default location using `az configure --defaults location=<location>`.
    - resource_group -- None
    - subnet -- Name or ID of the subnet to target. If name is used, --vnet NAME must also be supplied.
    - vnet -- Name or ID of the virtual network to target.
    '''
    return _call_az("az network watcher show-topology", locals())


def run_configuration_diagnostic(resource, destination=None, direction=None, parent=None, port=None, protocol=None, queries=None, resource_group=None, resource_type=None, source=None):
    '''
    Run a configuration diagnostic on a target resource.

    Required Parameters:
    - resource -- Name or ID of the target resource to diagnose. If an ID is given, other resource arguments should not be given.

    Optional Parameters:
    - destination -- Traffic destination. Accepted values are '*', IP address/CIDR, or service tag.
    - direction -- Direction of the traffic.
    - parent -- The parent path. (ex: virtualMachineScaleSets/vmss1)
    - port -- Traffic destination port. Accepted values are '*', port number (3389) or port range (80-100).
    - protocol -- Protocol to be verified on.
    - queries -- JSON list of queries to use. Use `@{path}` to load from a file.
    - resource_group -- Name of resource group. You can configure the default group using `az configure --defaults group=<name>`
    - resource_type -- The resource type
    - source -- Traffic source. Accepted values are '*', IP address/CIDR, or service tag.
    '''
    return _call_az("az network watcher run-configuration-diagnostic", locals())
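Each wrapper passes `locals()` to `_call_az`, which presumably turns the Python parameter names into CLI flags for the given base command. A hypothetical sketch of that translation (the real logic lives in `pyaz_utils._call_az`; `build_command` is an invented name for illustration):

```python
def build_command(base: str, params: dict) -> str:
    # hypothetical sketch: map python_name=value to --python-name value,
    # skipping optional parameters that were left as None
    parts = [base]
    for name, value in params.items():
        if value is None:
            continue
        parts.append("--" + name.replace("_", "-"))
        parts.append(str(value))
    return " ".join(parts)

cmd = build_command("az network watcher configure",
                    {"locations": "westus", "enabled": None, "tags": "env=dev"})
assert cmd == "az network watcher configure --locations westus --tags env=dev"
```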
| 43.449612 | 200 | 0.705085 | 781 | 5,605 | 4.974392 | 0.193342 | 0.080309 | 0.02471 | 0.028829 | 0.454311 | 0.444273 | 0.385586 | 0.360875 | 0.360875 | 0.32278 | 0 | 0.002682 | 0.201606 | 5,605 | 128 | 201 | 43.789063 | 0.865475 | 0.676182 | 0 | 0 | 0 | 0 | 0.185034 | 0.035374 | 0 | 0 | 0 | 0 | 0 | 1 | 0.444444 | false | 0 | 0.111111 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
df03e9a7e0789c709836b379b84d710ffe458a07 | 523 | py | Python | purses/bindings.py | pgdr/purses | 1e6073c3639d73f1405149c39448d39d4d29432f | [
"Apache-2.0"
] | null | null | null | purses/bindings.py | pgdr/purses | 1e6073c3639d73f1405149c39448d39d4d29432f | [
"Apache-2.0"
] | 4 | 2018-06-21T09:52:42.000Z | 2018-06-23T08:00:18.000Z | purses/bindings.py | pgdr/purses | 1e6073c3639d73f1405149c39448d39d4d29432f | [
"Apache-2.0"
] | 1 | 2018-06-20T19:14:49.000Z | 2018-06-20T19:14:49.000Z | class binding_(object):
"""Register key bindings with the object binding."""
def __init__(self):
self.bindings = {}
def __call__(self, key):
def register(func):
def decorator(model, nav, io, *args, **kwargs):
res = func(model, nav, io, *args, **kwargs)
return res
self.bindings[key] = decorator
return decorator
return register
# this object works as a decorator for registering key bindings
binding = binding_()
| 24.904762 | 63 | 0.585086 | 58 | 523 | 5.103448 | 0.465517 | 0.074324 | 0.067568 | 0.094595 | 0.135135 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.315488 | 523 | 20 | 64 | 26.15 | 0.826816 | 0.208413 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
df042d5f05a1f20a162a862ebbbe09149ff0ef6d | 2,389 | py | Python | contacts/models.py | hugocorra/bjorncrm | 1304b90c1f7d32942e1b4987c43a2ff9dede22dd | [
"MIT"
] | 2 | 2018-08-27T00:47:18.000Z | 2020-12-11T01:23:59.000Z | contacts/models.py | hugocorra/bjorncrm | 1304b90c1f7d32942e1b4987c43a2ff9dede22dd | [
"MIT"
] | null | null | null | contacts/models.py | hugocorra/bjorncrm | 1304b90c1f7d32942e1b4987c43a2ff9dede22dd | [
"MIT"
] | 1 | 2020-12-11T01:24:02.000Z | 2020-12-11T01:24:02.000Z | from django.db import models
from django.contrib.auth.models import User
class Contato(models.Model):
TIPO_CONTATO = (
('P', 'Pessoal'),
('C', 'Comercial'),
)
nome = models.CharField(max_length=100, verbose_name='Nome')
ocupacao = models.CharField(max_length=50, blank=True, null=True, verbose_name='Ocupação')
tipo = models.CharField(choices=TIPO_CONTATO, max_length=1, verbose_name='Tipo')
instituicao = models.CharField(max_length=50, blank=True, null=True, verbose_name='Instituição')
departamento = models.CharField(max_length=50, blank=True, null=True, verbose_name='Departamento')
notas = models.TextField(blank=True, verbose_name='Notas')
class Endereco(models.Model):
pais = models.CharField(max_length=50, blank=True, null=True, verbose_name='País')
estado = models.CharField(max_length=30, blank=True, null=True, verbose_name='Estado')
cidade = models.CharField(max_length=50, blank=True, null=True, verbose_name='Cidade')
logradouro = models.CharField(max_length=100, blank=True, null=True, verbose_name='Logradouro')
numero = models.CharField(max_length=10, blank=True, null=True, verbose_name='Número')
complemento = models.CharField(max_length=10, blank=True, null=True, verbose_name='Complemento')
cep = models.CharField(max_length=9, blank=True, null=True, verbose_name='CEP')
class Telefone(models.Model):
numero = models.CharField(max_length=20, blank=True, verbose_name='Telefone')
class Email(models.Model):
email = models.EmailField(blank=True, verbose_name="E-mail")
class ContatoEndereco(models.Model):
contato = models.ForeignKey(Contato, on_delete=models.CASCADE)
endereco = models.ForeignKey(Endereco, on_delete=models.CASCADE)
    # tipo = Residencial, Comercial, Outro (i.e. Residential, Commercial, Other)
class ContatoTelefone(models.Model):
contato = models.ForeignKey(Contato, on_delete=models.CASCADE)
telefone = models.ForeignKey(Telefone, on_delete=models.CASCADE)
class ContatoEmail(models.Model):
contato = models.ForeignKey(Contato, on_delete=models.CASCADE)
email = models.ForeignKey(Email, on_delete=models.CASCADE)
# class Notas(models.Model):
# usuario = models.ForeignKey(User, blank=True, null=True, on_delete=models.SET_NULL)
# timestamp = models.DateTimeField(auto_now=False, auto_now_add=False)
# texto = models.TextField(verbose_name='Nota')
| 41.912281 | 102 | 0.744244 | 310 | 2,389 | 5.6 | 0.23871 | 0.101382 | 0.112327 | 0.165899 | 0.455069 | 0.373272 | 0.324885 | 0.324885 | 0.324885 | 0.324885 | 0 | 0.012452 | 0.125994 | 2,389 | 56 | 103 | 42.660714 | 0.818966 | 0.114692 | 0 | 0.088235 | 0 | 0 | 0.05782 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.058824 | 0 | 0.911765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
df090ac617066f65eec5c04547279b1628a1c606 | 151 | py | Python | src/MACROS.py | semihyonet/SudokuSolver | 9f1ad10cc8f982e226b9f814e4bb1f4ec51c688d | [
"MIT"
] | null | null | null | src/MACROS.py | semihyonet/SudokuSolver | 9f1ad10cc8f982e226b9f814e4bb1f4ec51c688d | [
"MIT"
] | null | null | null | src/MACROS.py | semihyonet/SudokuSolver | 9f1ad10cc8f982e226b9f814e4bb1f4ec51c688d | [
"MIT"
] | null | null | null | SIZE = 9
INPUT_LEVEL_DIR = "File.txt"
INPUT_CONSTRAINTS_DIR = "Constraints.txt"
OUTPUT_SOLUTION_DIR = "Solution.txt"
ASSIGNED_VALUE_NUM = 0
| 16.777778 | 42 | 0.735099 | 21 | 151 | 4.904762 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016 | 0.172185 | 151 | 8 | 43 | 18.875 | 0.808 | 0 | 0 | 0 | 0 | 0 | 0.244755 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df0c449bd0dbf4cd6b9d023344b18a34a36a6280 | 727 | py | Python | src/part1_coding/t2_oop/advanced/multi_inh.py | paulbodean88/py-test-automation-course | f7e5fc0416f494356817c27e7b4e212f3884ae07 | [
"MIT"
] | 1 | 2022-03-25T10:19:05.000Z | 2022-03-25T10:19:05.000Z | src/part1_coding/t2_oop/advanced/multi_inh.py | paulbodean88/py-test-automation-course | f7e5fc0416f494356817c27e7b4e212f3884ae07 | [
"MIT"
] | null | null | null | src/part1_coding/t2_oop/advanced/multi_inh.py | paulbodean88/py-test-automation-course | f7e5fc0416f494356817c27e7b4e212f3884ae07 | [
"MIT"
] | null | null | null | class Mother:
@staticmethod
def take_screenshot():
print('I can make a screenshot')
@staticmethod
def receive_email():
print('I can receive email')
class Father:
@staticmethod
def drive_car():
print('I can drive a car ')
@staticmethod
def play_music():
print(f'Play music while driving')
class Grandfather:
@staticmethod
def smoke():
        print('I can smoke')
@staticmethod
def sing():
        print('I can sing')
class Kid(Mother, Father, Grandfather):
def behave(self):
self.take_screenshot()
self.drive_car()
self.smoke()
kid = Kid()
kid.behave()
kid.drive_car()
print(Kid.mro()) | 17.309524 | 42 | 0.607978 | 89 | 727 | 4.88764 | 0.314607 | 0.206897 | 0.082759 | 0.064368 | 0.298851 | 0.298851 | 0.298851 | 0.151724 | 0 | 0 | 0 | 0 | 0.27923 | 727 | 42 | 43 | 17.309524 | 0.830153 | 0 | 0 | 0.333333 | 0 | 0 | 0.173077 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.233333 | false | 0 | 0 | 0 | 0.366667 | 0.233333 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
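The final `print(Kid.mro())` in the row above shows Python's C3 method-resolution order for the multiply-inheriting `Kid`. A reduced sketch of how that order decides which base wins when several bases define the same method:

```python
class Mother:
    def who(self):
        return "Mother"

class Father:
    def who(self):
        return "Father"

class Grandfather:
    def who(self):
        return "Grandfather"

class Kid(Mother, Father, Grandfather):
    pass

# The MRO lists Kid first, then the bases left to right, then object.
mro_names = [cls.__name__ for cls in Kid.mro()]

# Attribute lookup walks the MRO, so Mother.who shadows the others.
resolved = Kid().who()
```

Because `Mother` precedes `Father` and `Grandfather` in the MRO, `Kid().behave()` in the original snippet would likewise pick `Mother.take_screenshot` before any same-named method on the later bases.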
df3c74dcf00873207d7fd1a18cb83806cb3303bd | 3,969 | py | Python | test/readcsv2mysql.py | ellieandallen/my-own-script | c395f889183f4321fc34ff9f326c5974f02ea682 | [
"BSD-2-Clause"
] | null | null | null | test/readcsv2mysql.py | ellieandallen/my-own-script | c395f889183f4321fc34ff9f326c5974f02ea682 | [
"BSD-2-Clause"
] | null | null | null | test/readcsv2mysql.py | ellieandallen/my-own-script | c395f889183f4321fc34ff9f326c5974f02ea682 | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
'''
Created on 2017/04/28
@author: ellie
'''
import pandas as pd
from sqlalchemy import create_engine
class ReadCsv2Mysql:
def __init__(self):
pass
@staticmethod
def read_csv_mysql():
engine = create_engine('mysql+mysqldb://root:@127.0.0.1:3306/db?charset=utf8')
conn = engine.connect()
# df = pd.read_csv('/home/ellie/company.csv', encoding='utf8', error_bad_lines=False, header=None,
# names=['company_name'])
df = pd.read_csv('/home/ellie/result_0925.csv', encoding='utf8')
# df = pd.read_csv('/home/ellie/offlinegps1.csv', encoding='utf8', sep='\t', header=None,
# names=['uid', 'latitude', 'longitude'])
        print(df[:10])
df.to_sql(name='temp', con=conn, chunksize=20, if_exists='replace', index=False)
# df.to_sql(name='location52', con=conn, if_exists='replace', index=False)
conn.close()
if __name__ == '__main__':
a = ReadCsv2Mysql()
a.read_csv_mysql()
    print('.csv is already in mysql')
# Key abbreviations and package imports
#
# Throughout this cheat sheet we use the following abbreviations:
#
# df: any Pandas DataFrame object
# s: any Pandas Series object
# We also need the following imports:
#
# import pandas as pd
# import numpy as np
#
# Importing data
#
# pd.read_csv(filename): import data from a CSV file
# pd.read_table(filename): import data from a delimited text file
# pd.read_excel(filename): import data from an Excel file
# pd.read_sql(query, connection_object): import data from a SQL table/database
# pd.read_json(json_string): import data from a JSON-formatted string
# pd.read_html(url): parse a URL, string or HTML file and extract its tables
# pd.read_clipboard(): read the clipboard contents and pass them to read_table()
# pd.DataFrame(dict): import data from a dict; keys are column names, values are the data
#
# Exporting data
#
# df.to_csv(filename): export data to a CSV file
# df.to_excel(filename): export data to an Excel file
# df.to_sql(table_name, connection_object): export data to a SQL table
# df.to_json(filename): export data to a text file in JSON format
#
# Creating test objects
#
# pd.DataFrame(np.random.rand(20,5)): create a DataFrame of random numbers with 20 rows and 5 columns
# pd.Series(my_list): create a Series from the iterable my_list
# df.index = pd.date_range('1900/1/30', periods=df.shape[0]): add a date index
#
# Viewing and inspecting data
#
# df.head(n): view the first n rows of a DataFrame
# df.tail(n): view the last n rows of a DataFrame
# df.shape: view the number of rows and columns
# df.info(): view the index, data types and memory usage
# df.describe(): view summary statistics for the numeric columns
# s.value_counts(dropna=False): view the unique values and counts of a Series
# df.apply(pd.Series.value_counts): view the unique values and counts of every column of a DataFrame
#
# Selecting data
#
# df[col]: return the column named col as a Series
# df[[col1, col2]]: return multiple columns as a DataFrame
# s.iloc[0]: select data by position
# s.loc['index_one']: select data by index label
# df.iloc[0,:]: return the first row
# df.iloc[0,0]: return the first element of the first column
#
# Data cleaning
#
# df.columns = ['a','b','c']: rename the columns
# pd.isnull(): check a DataFrame for null values, returning a Boolean array
# pd.notnull(): check a DataFrame for non-null values, returning a Boolean array
# df.dropna(): drop every row that contains a null value
# df.dropna(axis=1): drop every column that contains a null value
# df.dropna(axis=1,thresh=n): drop every row with fewer than n non-null values
# df.fillna(x): replace every null value in a DataFrame with x
# s.astype(float): change the data type of a Series to float
# s.replace(1,'one'): replace every value equal to 1 with 'one'
# s.replace([1,3],['one','three']): replace 1 with 'one' and 3 with 'three'
# df.rename(columns=lambda x: x + 1): rename columns in bulk
# df.rename(columns={'old_name': 'new_ name'}): rename selected columns
# df.set_index('column_one'): change the index column
# df.rename(index=lambda x: x + 1): rename the index in bulk
#
# Data wrangling: Filter, Sort and GroupBy
#
# df[df[col] > 0.5]: select the rows where the value of column col is greater than 0.5
# df.sort_values(col1): sort the data by column col1, ascending by default
# df.sort_values(col2, ascending=False): sort the data by column col2 in descending order
# df.sort_values([col1,col2], ascending=[True,False]): sort by col1 ascending, then by col2 descending
# df.groupby(col): return a GroupBy object grouped by column col
# df.groupby([col1,col2]): return a GroupBy object grouped by several columns
# df.groupby(col1)[col2]: return the mean of column col2, grouped by column col1
# df.pivot_table(index=col1, values=[col2,col3], aggfunc=max): create a pivot table grouped by col1, holding the maxima of col2 and col3
# df.groupby(col1).agg(np.mean): return the mean of every column, grouped by column col1
# data.apply(np.mean): apply the function np.mean to every column of a DataFrame
# data.apply(np.max,axis=1): apply the function np.max to every row of a DataFrame
#
# Merging data
#
# df1.append(df2): append the rows of df2 to the end of df1
# pd.concat([df1, df2],axis=1): append the columns of df2 to the end of df1
# df1.join(df2,on=col1,how='inner'): perform a SQL-style join between the columns of df1 and df2
#
# Statistics
#
# df.describe(): view summary statistics for the numeric columns
# df.mean(): return the mean of every column
# df.corr(): return the correlation coefficients between the columns
# df.count(): return the number of non-null values in every column
# df.max(): return the maximum of every column
# df.min(): return the minimum of every column
# df.median(): return the median of every column
| 28.35 | 106 | 0.715042 | 534 | 3,969 | 5.20412 | 0.496255 | 0.021591 | 0.012954 | 0.011875 | 0.039583 | 0.021591 | 0 | 0 | 0 | 0 | 0 | 0.03038 | 0.112623 | 3,969 | 139 | 107 | 28.553957 | 0.75866 | 0.771983 | 0 | 0 | 0 | 0.058824 | 0.167553 | 0.105053 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.058824 | 0.117647 | null | null | 0.117647 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
df3e2cfacd8063f1f9ab38aec6fdf31683db90e8 | 9,111 | py | Python | tests/test_all.py | mar10/tabfix | 99f0490dffa3680d7ff050d0a377e155cc646997 | [
"MIT"
] | 3 | 2015-03-09T14:51:45.000Z | 2022-03-14T03:45:47.000Z | tests/test_all.py | mar10/tabfix | 99f0490dffa3680d7ff050d0a377e155cc646997 | [
"MIT"
] | 2 | 2017-12-30T16:07:51.000Z | 2021-08-13T19:42:12.000Z | tests/test_all.py | mar10/tabfix | 99f0490dffa3680d7ff050d0a377e155cc646997 | [
"MIT"
] | null | null | null | # -*- coding: iso-8859-1 -*-
# (c) 2010-2013 Martin Wendt; see https://github.com/mar10/tabfix
# Licensed under the MIT license: http://www.opensource.org/licenses/mit-license.php
"""
Unit tests for this package.
"""
import tempfile
import filecmp
from zipfile import ZipFile
import unittest
import os
import shutil
import sys
from tabfix import main, cmd_walker
from tabfix.main import read_text_lines, DELIM_CR, DELIM_CRLF, DELIM_LF
#import subprocess
#import StringIO
#IS_PY3 = (sys.version_info[0] >= 3)
USE_FIXED_FOLDER = False
class TestBasic(unittest.TestCase):
"""Basic tests.
Create a temp folder, extract test data there, and CWD to <temp>/test_files:
We have this structure:
<temp root>/
test_files/
sub1/
(some text and other files)
sub2/
(some text and other files)
(some text and other files)
"""
def setUp(self):
#
if USE_FIXED_FOLDER:
self.temp_path = os.path.join(os.path.expanduser("~"), "tabfix_test_")
if not os.path.exists(self.temp_path):
os.mkdir(self.temp_path)
else:
self.temp_path = tempfile.mkdtemp()
data_path = os.path.dirname(__file__)
zf = ZipFile(os.path.join(data_path, "test_files.zip"))
zf.extractall(self.temp_path)
self.prev_cwd = os.getcwd()
os.chdir(os.path.join(self.temp_path, "test_files"))
def tearDown(self):
os.chdir(self.prev_cwd)
shutil.rmtree(self.temp_path)
def test_read_text_lines(self):
stats = { DELIM_CR: 0, DELIM_LF: 0, DELIM_CRLF: 0 }
res = read_text_lines("test_cr.txt", stats)
res = list(res)
self.assertEqual(len(res), 7)
self.assertEqual(type(res[0]), type(b""))
self.assertTrue(res[0].endswith(DELIM_CR))
self.assertEqual(stats, { DELIM_CR: 6, DELIM_LF: 0, DELIM_CRLF: 0 })
stats = { DELIM_CR: 0, DELIM_LF: 0, DELIM_CRLF: 0 }
res = read_text_lines("test_crlf.txt", stats)
res = list(res)
self.assertEqual(len(res), 7)
self.assertEqual(type(res[0]), type(b""))
self.assertTrue(res[0].endswith(DELIM_CRLF))
self.assertEqual(stats, { DELIM_CR: 0, DELIM_LF: 0, DELIM_CRLF: 6 })
stats = { DELIM_CR: 0, DELIM_LF: 0, DELIM_CRLF: 0 }
res = read_text_lines("test_lf.txt", stats)
res = list(res)
self.assertEqual(len(res), 7)
self.assertEqual(type(res[0]), type(b""))
self.assertTrue(res[0].endswith(DELIM_LF))
self.assertEqual(stats, { DELIM_CR: 0, DELIM_LF: 6, DELIM_CRLF: 0 })
stats = { DELIM_CR: 0, DELIM_LF: 0, DELIM_CRLF: 0 }
res = read_text_lines("test_mixed.txt", stats)
res = list(res)
self.assertEqual(len(res), 45)
self.assertEqual(type(res[0]), type(b""))
self.assertTrue(res[0].endswith(DELIM_CRLF))
self.assertEqual(stats, { DELIM_CR: 0, DELIM_LF: 0, DELIM_CRLF: 44 })
def test_spacify_txt_flat(self):
# args = [self.temp_path]
args = ["."]
opts = main.Opts()
opts.backup = True
opts.dry_run = False
opts.ignore_list = None
opts.inputTabSize = None
opts.lineSeparator = None
opts.match_list = ["*.txt"]
opts.recursive = False
opts.tabbify = False
opts.tabSize = 4
opts.target_path = None
opts.verbose = 1
opts.zip_backup = False
data = {}
cmd_walker.process(args, opts, main.fix_tabs, data)
self.assertEqual(data.get("files_processed"), 10)
self.assertEqual(data.get("files_modified"), 5)
self.assertEqual(data.get("lines_modified"), 125)
self.assertTrue(os.path.isfile(os.path.join(self.temp_path, "test_files", "test_mixed.txt.bak")))
# TODO: use difflib
# http://docs.python.org/2/library/difflib.html
self.assertTrue(filecmp.cmp(os.path.join(self.temp_path, "test_files", "test_mixed.txt"),
os.path.join(os.path.dirname(__file__), "test_mixed_expect_spaced.txt")))
# TODO: test directly using shell exec & command line
def test_tabbify_txt_flat(self):
args = ["."]
opts = main.Opts()
opts.backup = True
opts.dry_run = False
opts.ignore_list = None
opts.inputTabSize = None
opts.lineSeparator = None
opts.match_list = ["*.txt"]
opts.recursive = False
opts.tabbify = True
opts.tabSize = 4
opts.target_path = None
opts.verbose = 1
opts.zip_backup = False
data = {}
cmd_walker.process(args, opts, main.fix_tabs, data)
self.assertEqual(data.get("files_processed"), 10)
self.assertEqual(data.get("files_modified"), 8)
self.assertEqual(data.get("lines_modified"), 154)
# created .bak file
self.assertTrue(os.path.isfile(os.path.join(self.temp_path, "test_files", "test_mixed.txt.bak")))
# TODO: use difflib
# http://docs.python.org/2/library/difflib.html
self.assertTrue(filecmp.cmp(os.path.join(self.temp_path, "test_files", "test_mixed.txt"),
os.path.join(os.path.dirname(__file__), "test_mixed_expect_tabbed.txt")))
# TODO: test directly using shell exec & command line
def test_match_all_flat(self):
args = ["."]
opts = main.Opts()
opts.match_list = ["*.*"]
opts.tabbify = False
data = {}
cmd_walker.process(args, opts, main.fix_tabs, data)
# Note: if this fails with '9 != 8', there might be a '.DS_Store'
# in 'test_files.zip':
self.assertEqual(data.get("files_processed"), 14)
self.assertEqual(data.get("files_modified"), 7)
def test_match_all_recursive(self):
args = ["."]
opts = main.Opts()
opts.match_list = ["*.*"]
opts.recursive = True
data = {}
cmd_walker.process(args, opts, main.fix_tabs, data)
self.assertEqual(data.get("files_processed"), 22)
def test_match_all_recursive_ignore(self):
args = ["."]
opts = main.Opts()
opts.ignore_list = ["*.html", "*.js", "sub2"]
opts.match_list = ["*.*"]
opts.recursive = True
opts.verbose = 4
data = {}
cmd_walker.process(args, opts, main.fix_tabs, data)
print(data)
self.assertEqual(data.get("files_processed"), 14)
self.assertEqual(data.get("files_skipped"), 5)
self.assertEqual(data.get("files_ignored"), 4)
self.assertEqual(data.get("dirs_processed"), 2)
self.assertEqual(data.get("dirs_ignored"), 1)
#class TestShell(unittest.TestCase):
# """Basic tests.
#
# Create a temp folder, extract test data there, and CWD to <temp>/test_files:
#
# We have this structure:
# <temp root>/
# test_files/
# sub1/
# (some text and other files)
# sub2/
# (some text and other files)
# (some text and other files)
# """
# def setUp(self):
# #
# if USE_FIXED_FOLDER:
# self.temp_path = os.path.join(os.path.expanduser("~"), "tabfix_test_")
# if not os.path.exists(self.temp_path):
# os.mkdir(self.temp_path)
# else:
# self.temp_path = tempfile.mkdtemp()
# data_path = os.path.dirname(__file__)
# zf = ZipFile(os.path.join(data_path, "test_files.zip"))
# zf.extractall(self.temp_path)
# self.prev_cwd = os.getcwd()
# os.chdir(os.path.join(self.temp_path, "test_files"))
#
# script_folder = os.path.abspath(os.path.join(os.path.dirname(__file__),
# "..", "tabfix"))
# if not script_folder in sys.path:
# sys.path.append(script_folder)
# self.script_path = os.path.abspath(os.path.join(script_folder, "main.py"))
#
# def tearDown(self):
# os.chdir(self.prev_cwd)
# shutil.rmtree(self.temp_path)
#
# def test_cmd_help(self):
## res = subprocess.call(["tabfix", "-h"])
# p = subprocess.Popen(["python", self.script_path, "-h"],
# stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
# out, _err = p.communicate()
# print(out)
# self.assertEqual(p.returncode, 0)
#def test_suite():
# """Called by 'setup.py test'."""
# suite = unittest.TestSuite()
if __name__ == "__main__":
print(sys.version)
unittest.main()
# suite = unittest.TestSuite()
# suite.addTest(TestShell("test_cmd_help"))
## suite.addTest(TestBasic("test_read_text_lines"))
## suite.addTest(TestBasic("test_tabbify_txt_flat"))
# unittest.TextTestRunner().run(suite)
| 35.451362 | 110 | 0.579629 | 1,138 | 9,111 | 4.454306 | 0.18717 | 0.031959 | 0.044979 | 0.060761 | 0.757151 | 0.726376 | 0.689091 | 0.680608 | 0.673703 | 0.651411 | 0 | 0.014464 | 0.286686 | 9,111 | 256 | 111 | 35.589844 | 0.765502 | 0.310943 | 0 | 0.544118 | 0 | 0 | 0.082952 | 0.0095 | 0 | 0 | 0 | 0.003906 | 0.25 | 1 | 0.058824 | false | 0 | 0.066176 | 0 | 0.132353 | 0.014706 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df4f71b3444370011d6a856f4fdded76a3ea60bc | 744 | py | Python | data_structures/binary_trees/check_divide_in_two_halves.py | FatiahBalo/python-ds | 9eb88425822b6da4d7bd673a124c13fbe6f17523 | [
"MIT"
] | 2 | 2021-05-19T02:11:35.000Z | 2021-07-01T03:33:31.000Z | data_structures/binary_trees/check_divide_in_two_halves.py | FatiahBalo/python-ds | 9eb88425822b6da4d7bd673a124c13fbe6f17523 | [
"MIT"
] | null | null | null | data_structures/binary_trees/check_divide_in_two_halves.py | FatiahBalo/python-ds | 9eb88425822b6da4d7bd673a124c13fbe6f17523 | [
"MIT"
] | 1 | 2020-10-06T07:19:46.000Z | 2020-10-06T07:19:46.000Z | # Check if removing an edge of a binary tree can divide
# the tree in two equal halves
# Count the number of nodes, say n. Then traverse the tree
# in bottom up manner and check if n - s = s
class Node:
def __init__(self, val):
self.val = val
self.left = None
self.right = None
def count(root):
if not root:
return 0
return count(root.left) + count(root.right) + 1
def check_util(root, n):
if root == None:
return False
# Check for root
if count(root) == n - count(root):
return True
# Check for all the other nodes
return check_util(root.left, n) or check_util(root.right, n)
def check(root):
    n = count(root)
return check_util(root, n)
| 20.666667 | 64 | 0.615591 | 119 | 744 | 3.781513 | 0.420168 | 0.1 | 0.115556 | 0.062222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003824 | 0.297043 | 744 | 35 | 65 | 21.257143 | 0.856597 | 0.306452 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0 | 0 | 0.611111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
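A runnable sketch of the halves check above (with the `count(rot)` typo corrected to `count(root)`); the two sample trees are invented to show both outcomes:

```python
class Node:
    def __init__(self, val):
        self.val = val
        self.left = None
        self.right = None

def count(root):
    if not root:
        return 0
    return count(root.left) + count(root.right) + 1

def check_util(root, n):
    if root is None:
        return False
    # Removing the edge above `root` leaves count(root) vs n - count(root) nodes.
    if count(root) == n - count(root):
        return True
    return check_util(root.left, n) or check_util(root.right, n)

def check(root):
    return check_util(root, count(root))

# 4 nodes: cutting the edge to the left subtree yields two halves of 2.
even = Node(1)
even.left = Node(2)
even.left.left = Node(3)
even.right = Node(4)

# 3 nodes: no single edge removal can split the tree evenly.
odd = Node(1)
odd.left = Node(2)
odd.right = Node(3)
```

Recomputing `count` at every node makes this O(n²) worst case; caching subtree sizes in one bottom-up pass would bring it to O(n), matching the comment at the top of the file.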
df657e924af128c020a8d31bb21a7ecab16add5d | 11,932 | py | Python | configurator/api_pb2.py | rkojedzinszky/thermo-center | 707ab1eb09424a00f58af88d0bd59f270118fe25 | [
"BSD-3-Clause"
] | 1 | 2020-11-18T14:54:30.000Z | 2020-11-18T14:54:30.000Z | configurator/api_pb2.py | rkojedzinszky/thermo-center | 707ab1eb09424a00f58af88d0bd59f270118fe25 | [
"BSD-3-Clause"
] | 1 | 2020-05-01T03:08:13.000Z | 2020-05-01T03:08:14.000Z | configurator/api_pb2.py | rkojedzinszky/thermo-center | 707ab1eb09424a00f58af88d0bd59f270118fe25 | [
"BSD-3-Clause"
] | 1 | 2020-11-18T17:21:52.000Z | 2020-11-18T17:21:52.000Z | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: configurator/api.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='configurator/api.proto',
package='configurator',
syntax='proto3',
serialized_options=None,
serialized_pb=_b('\n\x16\x63onfigurator/api.proto\x12\x0c\x63onfigurator\"\"\n\x0fRadioCfgRequest\x12\x0f\n\x07\x63luster\x18\x01 \x01(\r\"J\n\x10RadioCfgResponse\x12\x0f\n\x07network\x18\x01 \x01(\r\x12\x14\n\x0cradio_config\x18\x02 \x01(\x0c\x12\x0f\n\x07\x61\x65s_key\x18\x03 \x01(\x0c\"\x17\n\x04Task\x12\x0f\n\x07task_id\x18\x01 \x01(\r\"a\n\x0bTaskDetails\x12\x0f\n\x07task_id\x18\x01 \x01(\r\x12\x11\n\tsensor_id\x18\x02 \x01(\r\x12.\n\x06\x63onfig\x18\x03 \x01(\x0b\x32\x1e.configurator.RadioCfgResponse\"%\n\x12TaskUpdateResponse\x12\x0f\n\x07success\x18\x01 \x01(\x08\"5\n\x13TaskFinishedRequest\x12\x0f\n\x07task_id\x18\x01 \x01(\r\x12\r\n\x05\x65rror\x18\x02 \x01(\t2\xc6\x02\n\x0c\x43onfigurator\x12N\n\x0bGetRadioCfg\x12\x1d.configurator.RadioCfgRequest\x1a\x1e.configurator.RadioCfgResponse\"\x00\x12>\n\x0bTaskAcquire\x12\x12.configurator.Task\x1a\x19.configurator.TaskDetails\"\x00\x12O\n\x15TaskDiscoveryReceived\x12\x12.configurator.Task\x1a .configurator.TaskUpdateResponse\"\x00\x12U\n\x0cTaskFinished\x12!.configurator.TaskFinishedRequest\x1a .configurator.TaskUpdateResponse\"\x00\x62\x06proto3')
)
_RADIOCFGREQUEST = _descriptor.Descriptor(
name='RadioCfgRequest',
full_name='configurator.RadioCfgRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='cluster', full_name='configurator.RadioCfgRequest.cluster', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=40,
serialized_end=74,
)
_RADIOCFGRESPONSE = _descriptor.Descriptor(
name='RadioCfgResponse',
full_name='configurator.RadioCfgResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='network', full_name='configurator.RadioCfgResponse.network', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='radio_config', full_name='configurator.RadioCfgResponse.radio_config', index=1,
number=2, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='aes_key', full_name='configurator.RadioCfgResponse.aes_key', index=2,
number=3, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=_b(""),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=76,
serialized_end=150,
)
_TASK = _descriptor.Descriptor(
name='Task',
full_name='configurator.Task',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='task_id', full_name='configurator.Task.task_id', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=152,
serialized_end=175,
)
_TASKDETAILS = _descriptor.Descriptor(
name='TaskDetails',
full_name='configurator.TaskDetails',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='task_id', full_name='configurator.TaskDetails.task_id', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='sensor_id', full_name='configurator.TaskDetails.sensor_id', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='config', full_name='configurator.TaskDetails.config', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=177,
serialized_end=274,
)
_TASKUPDATERESPONSE = _descriptor.Descriptor(
name='TaskUpdateResponse',
full_name='configurator.TaskUpdateResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='success', full_name='configurator.TaskUpdateResponse.success', index=0,
number=1, type=8, cpp_type=7, label=1,
has_default_value=False, default_value=False,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=276,
serialized_end=313,
)
_TASKFINISHEDREQUEST = _descriptor.Descriptor(
name='TaskFinishedRequest',
full_name='configurator.TaskFinishedRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='task_id', full_name='configurator.TaskFinishedRequest.task_id', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='error', full_name='configurator.TaskFinishedRequest.error', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=315,
serialized_end=368,
)
_TASKDETAILS.fields_by_name['config'].message_type = _RADIOCFGRESPONSE
DESCRIPTOR.message_types_by_name['RadioCfgRequest'] = _RADIOCFGREQUEST
DESCRIPTOR.message_types_by_name['RadioCfgResponse'] = _RADIOCFGRESPONSE
DESCRIPTOR.message_types_by_name['Task'] = _TASK
DESCRIPTOR.message_types_by_name['TaskDetails'] = _TASKDETAILS
DESCRIPTOR.message_types_by_name['TaskUpdateResponse'] = _TASKUPDATERESPONSE
DESCRIPTOR.message_types_by_name['TaskFinishedRequest'] = _TASKFINISHEDREQUEST
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
RadioCfgRequest = _reflection.GeneratedProtocolMessageType('RadioCfgRequest', (_message.Message,), dict(
DESCRIPTOR = _RADIOCFGREQUEST,
__module__ = 'configurator.api_pb2'
# @@protoc_insertion_point(class_scope:configurator.RadioCfgRequest)
))
_sym_db.RegisterMessage(RadioCfgRequest)
RadioCfgResponse = _reflection.GeneratedProtocolMessageType('RadioCfgResponse', (_message.Message,), dict(
DESCRIPTOR = _RADIOCFGRESPONSE,
__module__ = 'configurator.api_pb2'
# @@protoc_insertion_point(class_scope:configurator.RadioCfgResponse)
))
_sym_db.RegisterMessage(RadioCfgResponse)
Task = _reflection.GeneratedProtocolMessageType('Task', (_message.Message,), dict(
DESCRIPTOR = _TASK,
__module__ = 'configurator.api_pb2'
# @@protoc_insertion_point(class_scope:configurator.Task)
))
_sym_db.RegisterMessage(Task)
TaskDetails = _reflection.GeneratedProtocolMessageType('TaskDetails', (_message.Message,), dict(
DESCRIPTOR = _TASKDETAILS,
__module__ = 'configurator.api_pb2'
# @@protoc_insertion_point(class_scope:configurator.TaskDetails)
))
_sym_db.RegisterMessage(TaskDetails)
TaskUpdateResponse = _reflection.GeneratedProtocolMessageType('TaskUpdateResponse', (_message.Message,), dict(
DESCRIPTOR = _TASKUPDATERESPONSE,
__module__ = 'configurator.api_pb2'
# @@protoc_insertion_point(class_scope:configurator.TaskUpdateResponse)
))
_sym_db.RegisterMessage(TaskUpdateResponse)
TaskFinishedRequest = _reflection.GeneratedProtocolMessageType('TaskFinishedRequest', (_message.Message,), dict(
DESCRIPTOR = _TASKFINISHEDREQUEST,
__module__ = 'configurator.api_pb2'
# @@protoc_insertion_point(class_scope:configurator.TaskFinishedRequest)
))
_sym_db.RegisterMessage(TaskFinishedRequest)
_CONFIGURATOR = _descriptor.ServiceDescriptor(
name='Configurator',
full_name='configurator.Configurator',
file=DESCRIPTOR,
index=0,
serialized_options=None,
serialized_start=371,
serialized_end=697,
methods=[
_descriptor.MethodDescriptor(
name='GetRadioCfg',
full_name='configurator.Configurator.GetRadioCfg',
index=0,
containing_service=None,
input_type=_RADIOCFGREQUEST,
output_type=_RADIOCFGRESPONSE,
serialized_options=None,
),
_descriptor.MethodDescriptor(
name='TaskAcquire',
full_name='configurator.Configurator.TaskAcquire',
index=1,
containing_service=None,
input_type=_TASK,
output_type=_TASKDETAILS,
serialized_options=None,
),
_descriptor.MethodDescriptor(
name='TaskDiscoveryReceived',
full_name='configurator.Configurator.TaskDiscoveryReceived',
index=2,
containing_service=None,
input_type=_TASK,
output_type=_TASKUPDATERESPONSE,
serialized_options=None,
),
_descriptor.MethodDescriptor(
name='TaskFinished',
full_name='configurator.Configurator.TaskFinished',
index=3,
containing_service=None,
input_type=_TASKFINISHEDREQUEST,
output_type=_TASKUPDATERESPONSE,
serialized_options=None,
),
])
_sym_db.RegisterServiceDescriptor(_CONFIGURATOR)
DESCRIPTOR.services_by_name['Configurator'] = _CONFIGURATOR
# @@protoc_insertion_point(module_scope)
| 33.8017 | 1,121 | 0.754693 | 1,362 | 11,932 | 6.323054 | 0.142438 | 0.036229 | 0.056085 | 0.020437 | 0.525081 | 0.482234 | 0.44647 | 0.44647 | 0.427427 | 0.424408 | 0 | 0.032944 | 0.124874 | 11,932 | 352 | 1,122 | 33.897727 | 0.791802 | 0.048357 | 0 | 0.618123 | 1 | 0.003236 | 0.216609 | 0.16292 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.016181 | 0 | 0.016181 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
df6ff58b2ab48419ab63f329349ef142871ddfb0 | 1,429 | py | Python | Others/Source/02/2.2/print_test.py | silence0201/Learn-Python | 662da7c0e74221cedb445ba17d5cb1cd3af41c86 | [
"MIT"
] | 1 | 2018-05-30T01:38:23.000Z | 2018-05-30T01:38:23.000Z | Others/Source/02/2.2/print_test.py | silence0201/Learn-Python | 662da7c0e74221cedb445ba17d5cb1cd3af41c86 | [
"MIT"
] | null | null | null | Others/Source/02/2.2/print_test.py | silence0201/Learn-Python | 662da7c0e74221cedb445ba17d5cb1cd3af41c86 | [
"MIT"
] | null | null | null | # coding: utf-8
#########################################################################
# Website: <a href="http://www.crazyit.org">Crazy Java League</a>              #
# author yeeku.H.lee kongyeeku@163.com #
# #
# version 1.0 #
# #
# Copyright (C), 2001-2018, yeeku.H.Lee #
# #
# This program is protected by copyright laws. #
# #
# Program Name: #
# #
# <br>Date: #
#########################################################################
user_name = 'Charlie'
user_age = 8
# Print several variables and strings at once
print("读者名:" , user_name, "年龄:", user_age)
# Print several variables and strings, with a custom separator
print("读者名:" , user_name, "年龄:", user_age, sep='|')
# Set the end parameter so print does not append a newline
print(40, '\t', end="")
print(50, '\t', end="")
print(60, '\t', end="")
f = open("poem.txt", "w")  # open the file for writing
# write two lines of Li Shangyin's poem "Jin Se" to the file
print('沧海月明珠有泪', file=f)
print('蓝田日暖玉生烟', file=f)
f.close() | 49.275862 | 74 | 0.269419 | 94 | 1,429 | 4.031915 | 0.638298 | 0.063325 | 0.047493 | 0.084433 | 0.131926 | 0.131926 | 0.131926 | 0 | 0 | 0 | 0 | 0.030882 | 0.524143 | 1,429 | 29 | 75 | 49.275862 | 0.526471 | 0.643107 | 0 | 0 | 0 | 0 | 0.166124 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.636364 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
df72d7c18dfce4bd3e6c7a50c7d30883281cf0a5 | 1,017 | py | Python | mining/change_detection.py | microsoft/iclr2019-learning-to-represent-edits | e5777d6aa6cdeda500cf076646177c48d1cb4622 | [
"MIT"
] | 8 | 2021-03-15T18:57:18.000Z | 2021-08-23T11:28:22.000Z | mining/change_detection.py | microsoft/iclr2019-learning-to-represent-edits | e5777d6aa6cdeda500cf076646177c48d1cb4622 | [
"MIT"
] | null | null | null | mining/change_detection.py | microsoft/iclr2019-learning-to-represent-edits | e5777d6aa6cdeda500cf076646177c48d1cb4622 | [
"MIT"
] | 4 | 2021-03-27T14:19:09.000Z | 2021-09-13T12:35:31.000Z | # Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
from difflib import SequenceMatcher
from typing import List, Dict, Set, Sequence, Tuple, Union
import re
from collections import namedtuple
from docopt import docopt
from utils.dataloading import load_json_gz
DIFF_OP_RE = re.compile(r'(equal,)?replace(,equal)?')
def detect_contiguous_change(prev_code_chunk:List[str], updated_code_chunk:List[str], num_contiguous_line:int=1):
matcher = SequenceMatcher(a=prev_code_chunk, b=updated_code_chunk)
trans_ops = matcher.get_opcodes()
trans_op_str = ','.join(x[0] for x in trans_ops)
m = DIFF_OP_RE.match(trans_op_str)
if m:
tag, i1, i2, j1, j2 = [x for x in trans_ops if x[0] == 'replace'][0]
if i2 - i1 <= num_contiguous_line and j2 - j1 <= num_contiguous_line:  # j2 - j1 is the width of the replaced span on the updated side
return i1, i2, j1, j2
return None
def is_valid_change(change):
return change['prev_code_chunk'].strip() != change['updated_data'].strip()
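The opcode check in `detect_contiguous_change` can be sketched as a standalone helper. The version below is a hypothetical reimplementation (the names `contiguous_replace`, `prev`, and `updated` are mine, not part of the repo), using only `difflib`; note that a `replace` opcode always has `j2 >= j1`, so the bound worth checking on the updated side is `j2 - j1`.

```python
from difflib import SequenceMatcher

def contiguous_replace(prev, updated, num_contiguous_line=1):
    """Return (i1, i2, j1, j2) if the diff is a single small 'replace'
    block, optionally flanked by 'equal' blocks; otherwise None."""
    ops = SequenceMatcher(a=prev, b=updated).get_opcodes()
    replaces = [(i1, i2, j1, j2) for tag, i1, i2, j1, j2 in ops if tag == 'replace']
    others = [tag for tag, *_ in ops if tag not in ('equal', 'replace')]
    if len(replaces) != 1 or others:
        return None  # insertions/deletions or multiple edits present
    i1, i2, j1, j2 = replaces[0]
    if i2 - i1 <= num_contiguous_line and j2 - j1 <= num_contiguous_line:
        return i1, i2, j1, j2
    return None
```

Unlike the regex-over-opcode-names approach above, this variant inspects the opcode tuples directly, which makes the line-count bounds easy to verify.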
| 30.818182 | 114 | 0.695182 | 153 | 1,017 | 4.392157 | 0.490196 | 0.066964 | 0.058036 | 0.056548 | 0.041667 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019656 | 0.199607 | 1,017 | 32 | 115 | 31.78125 | 0.805897 | 0.066863 | 0 | 0 | 0 | 0 | 0.065646 | 0.027352 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.315789 | 0.052632 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
df74dda7da4d6e3a74d02e4fe9fb0d6cb0e67468 | 2,095 | py | Python | fuzzbench/test_e2e_run.py | rsprabery/fuzzbench | a0fb95b518cb727f58d21c8fe96592c7249a0ec3 | [
"Apache-2.0"
] | null | null | null | fuzzbench/test_e2e_run.py | rsprabery/fuzzbench | a0fb95b518cb727f58d21c8fe96592c7249a0ec3 | [
"Apache-2.0"
] | 3 | 2020-06-04T16:34:04.000Z | 2020-07-29T18:13:14.000Z | fuzzbench/test_e2e_run.py | rsprabery/fuzzbench | a0fb95b518cb727f58d21c8fe96592c7249a0ec3 | [
"Apache-2.0"
] | null | null | null | # Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Checks the result of a test experiment run. Note that this is not a
standalone unit test module, but used as part of our end-to-end integration
test."""
import os
import pytest
import redis
from rq.job import Job
@pytest.fixture(scope='class')
def redis_connection():
"""Returns the default redis server connection."""
return redis.Redis(host='queue-server')
# pylint: disable=no-self-use
@pytest.mark.skipif('E2E_INTEGRATION_TEST' not in os.environ,
reason='Not running end-to-end test.')
@pytest.mark.usefixtures('redis_connection')
class TestEndToEndRunResults:
"""Checks the result of a test experiment run."""
def test_jobs_dependency(self): # pylint: disable=redefined-outer-name
"""Tests that jobs dependency preserves during working."""
assert True
def test_all_jobs_finished_successfully(self, redis_connection): # pylint: disable=redefined-outer-name
"""Tests all jobs finished successully."""
jobs = Job.fetch_many(['base-image'], connection=redis_connection)
for job in jobs:
assert job.get_status() == 'finished'
def test_measurement_jobs_were_started_before_trial_jobs_finished(self):
"""Fake test to be implemented later."""
assert True
def test_db_contains_experiment_results(self):
"""Fake test to be implemented later."""
assert True
def test_experiment_report_is_generated(self):
"""Fake test to be implemented later."""
assert True
| 35.508475 | 108 | 0.718377 | 291 | 2,095 | 5.068729 | 0.494845 | 0.040678 | 0.026441 | 0.034576 | 0.191186 | 0.191186 | 0.142373 | 0.142373 | 0.094915 | 0.066441 | 0 | 0.005316 | 0.191885 | 2,095 | 58 | 109 | 36.12069 | 0.865918 | 0.5179 | 0 | 0.173913 | 0 | 0 | 0.103774 | 0 | 0 | 0 | 0 | 0 | 0.217391 | 1 | 0.26087 | false | 0 | 0.173913 | 0 | 0.521739 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
df894f80118c2885d639cda8632808b0d5239d94 | 11,787 | py | Python | pyoptmat/hardening.py | Argonne-National-Laboratory/pyoptmat | a6e5e8d0b93c77374d4ccbc65a86262eec5df77b | [
"MIT"
] | null | null | null | pyoptmat/hardening.py | Argonne-National-Laboratory/pyoptmat | a6e5e8d0b93c77374d4ccbc65a86262eec5df77b | [
"MIT"
] | 1 | 2022-03-30T22:20:38.000Z | 2022-03-31T15:02:22.000Z | pyoptmat/hardening.py | Argonne-National-Laboratory/pyoptmat | a6e5e8d0b93c77374d4ccbc65a86262eec5df77b | [
"MIT"
] | 2 | 2021-11-16T15:13:54.000Z | 2022-01-06T21:35:42.000Z | """
Modules defining isotropic and kinematic hardening models.
"""
import torch
import torch.nn as nn
from pyoptmat import utility
class HardeningModel(nn.Module):
"""
Superclass for all hardening models. Right now this does nothing, but
could be a basis for future expansion.
"""
def __init__(self):
super().__init__()
class IsotropicHardeningModel(HardeningModel):
"""
Superclass for all isotropic hardening models. Right now this
does nothing but is here in case we need it in the future.
"""
def __init__(self):
super().__init__()
class VoceIsotropicHardeningModel(IsotropicHardeningModel):
"""
Voce isotropic hardening, defined by
.. math::
\\sigma_{iso} = h
\\dot{h} = d (R - h) \\left|\\dot{\\varepsilon}_{in}\\right|
Args:
R: saturated increase/decrease in flow stress
d: parameter controlling the rate of saturation
"""
def __init__(self, R, d):
super().__init__()
self.R = R
self.d = d
def value(self, h):
"""
Map from the vector of internal variables to the isotropic hardening
value
Args:
h: the vector of internal variables for this model
"""
return h[:,0]
def dvalue(self, h):
"""
Derivative of the map with respect to the internal variables
Args:
h: the vector of internal variables for this model
"""
return torch.ones((h.shape[0],1), device = h.device)
@property
def nhist(self):
"""
The number of internal variables: here just 1
"""
return 1
def history_rate(self, s, h, t, ep, T):
"""
The rate evolving the internal variables
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.unsqueeze(self.d(T) * (self.R(T) - h[:,0]) * torch.abs(ep), 1)
def dhistory_rate_dstress(self, s, h, t, ep, T):
"""
The derivative of this history rate with respect to the stress
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return torch.zeros_like(h)
def dhistory_rate_dhistory(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the internal variables
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return torch.unsqueeze(-torch.unsqueeze(self.d(T),-1) *
torch.ones_like(h) * torch.abs(ep)[:,None], 1)
def dhistory_rate_derate(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the inelastic
strain rate
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return torch.unsqueeze(torch.unsqueeze(self.d(T) *
(self.R(T) - h[:,0]) * torch.sign(ep), 1),1)
class KinematicHardeningModel(HardeningModel):
"""
Common superclass for kinematic hardening models
Right now this does nothing, but it's available for future expansion
"""
def __init__(self):
super().__init__()
class NoKinematicHardeningModel(KinematicHardeningModel):
"""
The simplest kinematic hardening model: a constant value of 0
"""
def __init__(self):
super().__init__()
def _setup(self):
"""
Cache the model parameters
"""
pass
@property
def nhist(self):
"""
The number of internal variables, here 0
"""
return 0
def value(self, h):
"""
The map between the vector of internal variables and the kinematic
hardening
Args:
h: vector of internal variables
"""
return torch.zeros(h.shape[0], device = h.device)
def dvalue(self, h):
"""
Derivative of the map to the kinematic hardening with respect to the
vector of internal variables
Args:
h: vector of internal variables
"""
return torch.zeros(h.shape[0],0, device = h.device)
def history_rate(self, s, h, t, ep, T):
"""
The history evolution rate. Here this is an empty vector.
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.empty_like(h)
def dhistory_rate_dstress(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the stress.
Here this is an empty vector.
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return torch.empty_like(h)
def dhistory_rate_dhistory(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the history
Here this is an empty vector.
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return torch.empty(h.shape[0],0,0, device = h.device)
def dhistory_rate_derate(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the inelastic
strain rate.
Here this is an empty vector.
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return torch.empty(h.shape[0],0,1, device = h.device)
class FAKinematicHardeningModel(KinematicHardeningModel):
"""
Frederick and Armstrong hardening, as defined in the (republished) paper:
Frederick, C. and P. Armstrong. "A mathematical representation of the
multiaxial Bauschinger effect." Materials at High Temperatures: 24(1)
pp. 1-26, 2007.
The kinematic hardening is equal to the single internal variable.
The variable evolves as:
.. math::
\\dot{x}=\\frac{2}{3}C\\dot{\\varepsilon}_{in}-gx\\left|\\dot{\\varepsilon}_{in}\\right|
Args:
C: kinematic hardening parameter
g: recovery parameter
"""
def __init__(self, C, g):
super().__init__()
self.C = C
self.g = g
def value(self, h):
"""
The map from the internal variables to the kinematic hardening
Args:
h: vector of internal variables
"""
return h[:,0]
def dvalue(self, h):
"""
Derivative of the map to the kinematic hardening with respect to the
vector of internal variables
Args:
h: vector of internal variables
"""
return torch.ones((h.shape[0],1), device = h.device)
@property
def nhist(self):
"""
The number of internal variables, here just 1
"""
return 1
def history_rate(self, s, h, t, ep, T):
"""
The evolution rate for the internal variables
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return torch.unsqueeze(
2.0/3 * self.C(T) * ep - self.g(T) * h[:,0] * torch.abs(ep), 1)
def dhistory_rate_dstress(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the stress
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.zeros_like(h)
def dhistory_rate_dhistory(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the history
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.unsqueeze(-self.g(T)[...,None] * torch.abs(ep)[:,None], 1)
def dhistory_rate_derate(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the inelastic
strain rate.
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.unsqueeze(torch.unsqueeze(2.0/3 * self.C(T) -
self.g(T) * h[:,0] * torch.sign(ep), 1), 1)
class ChabocheHardeningModel(KinematicHardeningModel):
"""
Chaboche kinematic hardening, as defined by
Chaboche, J. and D. Nouailhas. "A unified constitutive model for
cyclic viscoplasticity and its applications to various stainless steels."
Journal of Engineering Materials and Technology: 111, pp. 424-430, 1989.
The model maintains n backstresses and sums them to provide the
total kinematic hardening
.. math::
\\sigma_{kin}=\\sum_{i=1}^{n_{kin}}x_{i}
Each individual backstress evolves per the Frederick-Armstrong model
.. math::
\\dot{x}_{i}=\\frac{2}{3}C_{i}\\dot{\\varepsilon}_{in}-g_{i}x_{i}\\left|\\dot{\\varepsilon}_{in}\\right|
Args:
C: *vector* of hardening coefficients
g: *vector* of recovery coefficients
"""
def __init__(self, C, g):
super().__init__()
self.C = C
self.g = g
self.nback = self.C.shape[-1]
def value(self, h):
"""
The map between the internal variables and the kinematic hardening
Here :math:`\\sigma_{kin}=\\sum_{i=1}^{n_{kin}}x_{i}`
Args:
h: vector of internal variables
"""
return torch.sum(h, 1)
def dvalue(self, h):
"""
Derivative of the map between the internal variables and the
kinematic hardening with respect to the internal variables
Args:
h: vector of internal variables
"""
return torch.ones((h.shape[0],self.nback), device = h.device)
@property
def nhist(self):
"""
Number of history variables, equal to the number of backstresses
"""
return self.nback
def history_rate(self, s, h, t, ep, T):
"""
The evolution rate for the internal variables
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: temperature
"""
return (self.C(T)[None,...] * ep[:,None] - self.g(T)[None,...] * h *
torch.abs(ep)[:,None]).reshape(h.shape)
def dhistory_rate_dstress(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to stress
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.zeros_like(h)
def dhistory_rate_dhistory(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the history
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.diag_embed(-self.g(T)[None,...] *
torch.abs(ep)[:,None]).reshape(h.shape+h.shape[1:])
def dhistory_rate_derate(self, s, h, t, ep, T):
"""
The derivative of the history rate with respect to the inelastic strain
rate
Args:
s: stress
h: history
t: time
ep: the inelastic strain rate
T: the temperature
"""
return torch.unsqueeze(self.C(T)[None,...] * torch.ones_like(ep)[:,None] -
self.g(T)[None,:] * h * torch.sign(ep)[:,None],-1).reshape(h.shape + (1,))
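The rate equation in `VoceIsotropicHardeningModel`'s docstring, `hdot = d * (R - h) * |epdot|`, can be checked with a plain forward-Euler sketch. This scalar version is illustrative only (the function `integrate_voce` and the sample parameters are my own, not part of pyoptmat); it shows `h` relaxing monotonically toward the saturation value `R`.

```python
def integrate_voce(R, d, epdot, dt, steps, h0=0.0):
    """Forward-Euler integration of the Voce law hdot = d * (R - h) * |epdot|."""
    h = h0
    for _ in range(steps):
        h += d * (R - h) * abs(epdot) * dt  # explicit Euler update
    return h
```

For a constant strain rate the closed form is `h = R * (1 - (1 - d*dt*|epdot|)**n)`, so after enough steps the numerical solution sits arbitrarily close to `R`.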
| 25.792123 | 110 | 0.572325 | 1,512 | 11,787 | 4.386905 | 0.130291 | 0.061511 | 0.054274 | 0.066335 | 0.719584 | 0.697573 | 0.677974 | 0.658978 | 0.600633 | 0.581034 | 0 | 0.009542 | 0.324256 | 11,787 | 456 | 111 | 25.848684 | 0.823227 | 0.542547 | 0 | 0.6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.36 | false | 0.01 | 0.03 | 0 | 0.74 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
df91e6bda4648b2722d7bc5892fa58324c2f1fab | 3,704 | py | Python | SDM/rules/Rule.py | jalilm/SDN-Monitoring | 4ba8dd0f0ed5e44c0e803713d6c82ee2c815c7e4 | [
"Apache-2.0"
] | 1 | 2017-01-02T12:05:21.000Z | 2017-01-02T12:05:21.000Z | SDM/rules/Rule.py | jalilm/SDN-Monitoring | 4ba8dd0f0ed5e44c0e803713d6c82ee2c815c7e4 | [
"Apache-2.0"
] | null | null | null | SDM/rules/Rule.py | jalilm/SDN-Monitoring | 4ba8dd0f0ed5e44c0e803713d6c82ee2c815c7e4 | [
"Apache-2.0"
] | null | null | null | from SDM.util import get_dirs, get_params
class Rule(object):
"""
A class that represents a rule in the switch table.
"""
def __init__(self, datapath, table_id=0, priority=0, father_rule=None):
self.datapath = datapath
self.table_id = table_id
self.priority = priority
self.father_rule = father_rule
self.match_args = {}
self.match = self.datapath.ofproto_parser.OFPMatch(**self.match_args)
self.params = get_params(get_dirs())
def __repr__(self):
return "Rule(" + repr(self.datapath) + ", " + repr(self.table_id) + \
", " + repr(self.priority) + ")"
def __str__(self):
return "Rule"
def __hash__(self):
return self.__repr__().__hash__()
def __ne__(self, other):
return not self.__eq__(other)
def __eq__(self, other):
return type(other) == type(self) and hash(self) == hash(other)
def get_match(self):
"""
Used as C++ style copy constructor.
Each rule should define its partial/full Copy Constructor of OFPMatch,
due to bug that prohibits more than serialization of such object
thus preventing saving the matches aside.
"""
copy_match = self.datapath.ofproto_parser.OFPMatch(**self.match_args)
return copy_match
def add_match_arg(self, key, value):
self.match_args[key] = value
self.update_match()
def update_match(self):
self.match = self.datapath.ofproto_parser.OFPMatch(**self.match_args)
def get_finer_rules(self):
return self
def get_coarse_rule(self):
return self.father_rule
def get_paired_rule(self):
return self
def next_table_id(self):
return self.table_id + 1
def remove_flow(self):
mod = self.datapath.ofproto_parser.OFPFlowMod(
datapath=self.datapath, command=self.datapath.ofproto.OFPFC_DELETE, table_id=self.table_id,
priority=self.priority, match=self.get_match())
self.datapath.send_msg(mod)
def add_flow(self, inst):
mod = self.datapath.ofproto_parser.OFPFlowMod(datapath=self.datapath, table_id=self.table_id,
priority=self.priority,
match=self.match, instructions=inst)
self.datapath.send_msg(mod)
def add_flow_and_apply_actions(self, actions):
ofproto = self.datapath.ofproto
parser = self.datapath.ofproto_parser
inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS, actions)]
self.add_flow(inst)
def add_flow_and_goto_next_table(self, actions):
ofproto = self.datapath.ofproto
parser = self.datapath.ofproto_parser
inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS, actions),
parser.OFPInstructionGotoTable(self.next_table_id())]
self.add_flow(inst)
def add_flow_and_send_to_meter(self, meter_id, actions):
ofproto = self.datapath.ofproto
parser = self.datapath.ofproto_parser
inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS, actions),
parser.OFPInstructionMeter(meter_id)]
self.add_flow(inst)
def add_meter_dscp(self, meter_id):
ofproto = self.datapath.ofproto
parser = self.datapath.ofproto_parser
bands = [parser.OFPMeterBandDscpRemark(1, 1, 0, None, None)]
meter_mod = parser.OFPMeterMod(self.datapath, meter_id=meter_id, bands=bands,
flags=ofproto.OFPMF_PKTPS | ofproto.OFPMF_STATS)
self.datapath.send_msg(meter_mod)
| 34.943396 | 103 | 0.642819 | 449 | 3,704 | 5.024499 | 0.233853 | 0.12234 | 0.117908 | 0.14406 | 0.430408 | 0.430408 | 0.430408 | 0.419326 | 0.366135 | 0.266844 | 0 | 0.002195 | 0.261879 | 3,704 | 105 | 104 | 35.27619 | 0.82297 | 0.071544 | 0 | 0.267606 | 0 | 0 | 0.004147 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.267606 | false | 0 | 0.014085 | 0.126761 | 0.43662 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
10cd4ead88a8132804f1850b27b81a14eabdbbac | 1,176 | py | Python | bindings-examples/python-pyv8/Example.py | cbarsony/SCION-CORE | d7673342b18a91a4bf743301e0fcc42d1ce4415c | [
"Apache-2.0"
] | null | null | null | bindings-examples/python-pyv8/Example.py | cbarsony/SCION-CORE | d7673342b18a91a4bf743301e0fcc42d1ce4415c | [
"Apache-2.0"
] | null | null | null | bindings-examples/python-pyv8/Example.py | cbarsony/SCION-CORE | d7673342b18a91a4bf743301e0fcc42d1ce4415c | [
"Apache-2.0"
] | null | null | null | import PyV8
import json
#define an inner class
#TODO: pass through setters and getters to outer class definition?
class Global(PyV8.JSClass): # define a compatible javascript class
def hello(self,event): # define a method
print "Hello World"
sm = {
"id" : "foo",
"states" : [
{
"id" : "bar",
"onEntry" : "hello", #this string will get resolved against the global object
"transitions" : [
{
"event" : "t",
"target" : "bat"
}
]
},
{
"id" : "bat"
}
]
}
ctxt = PyV8.JSContext(Global()) # create a context with an implicit global object
ctxt.enter() # enter the context (also support with statement)
#import SCION into js context
f = open('../../lib/scion.js')
s = f.read()
f.close()
ctxt.eval(s)
sms = json.dumps(sm)
print sms
#instantiate a new Statechart
sc = ctxt.eval("new scion.Statechart(" + sms + ")")
initialConfig = sc.start()
print initialConfig
nextConfig = sc.gen('t')
print nextConfig
| 23.058824 | 96 | 0.52466 | 128 | 1,176 | 4.820313 | 0.59375 | 0.02269 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003984 | 0.359694 | 1,176 | 50 | 97 | 23.52 | 0.815405 | 0.294218 | 0 | 0 | 0 | 0 | 0.135201 | 0 | 0 | 0 | 0 | 0.02 | 0 | 0 | null | null | 0 | 0.055556 | null | null | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
10ce2c7f231ec308a097f19b443b507d0f483eb6 | 4,788 | py | Python | obj/WSPRInterfaceObjects.py | elpenor23/ZachTek-WSPR | d4b9532bd1e0eb9a688fc9f05e07fbb8aed225d5 | [
"MIT"
] | 2 | 2021-03-02T10:14:08.000Z | 2021-11-22T21:59:00.000Z | obj/WSPRInterfaceObjects.py | elpenor23/ZachTek-WSPR | d4b9532bd1e0eb9a688fc9f05e07fbb8aed225d5 | [
"MIT"
] | 11 | 2019-11-05T22:28:47.000Z | 2020-08-25T13:43:50.000Z | obj/WSPRInterfaceObjects.py | elpenor23/ZachTek-WSPR | d4b9532bd1e0eb9a688fc9f05e07fbb8aed225d5 | [
"MIT"
] | 1 | 2021-11-22T22:00:49.000Z | 2021-11-22T22:00:49.000Z | #!/usr/bin/python3
#This object is used by the GUI to get and hold information
#from the WSPR device that it needs to display.
#all calls to the WSPR device should come through here.
#This will allow us to swap out the manager if we ever want to
#connect this to a different WSPR device
from lib.WSPRInterfaceManager import WSPRInterfaceManager
from obj.ErrorObjects import logError, ErrorLevel
from obj.ConfigurationObjects import ConfigObject
import enum
import os
import time
class Mode(enum.Enum):
SignalGenerator = 1
WSPRMode = 2
Idle = 3
#main object
class WSPRInterfaceObject:
def __init__(self, configurationFile):
logError(ErrorLevel.LOW, "*********** START UP *****************")
#Get config object using passed in config file
self.config = ConfigObject(configurationFile)
#Use the interface manager for all interaction with the actual WSPR device
self.WSPRInterfaceManager = WSPRInterfaceManager()
#Object properties we need to take care of
self._bands = []
self._callsign = ""
self._currentMode = Mode
self._startupMode = Mode
self.power = ""
self.generatorfrequency = ""
self._port = None
self.bands = self.config.bands
self.gpsposition = "0000"
self.gpstime = ""
self.transmitStatus = False
#DEBUG: Printing config so we know what is in there
#self.config.print()
return
def print(self):
print("Port:" + self._port)
print("Bands:")
print(self._bands)
print("Callsign: " + self._callsign)
#print("Startup Mode: " + self.startUpMode)
#print("Live Mode:" + self.liveMode)
#print("Live Signal Mode Frequency:" + self.liveSignalModeFrequency)
#print("GPS Status:" + self.GPSStatus)
#print("WSPR Time:" + self.wsprTime)
#########################################
# Properties
########################################
#Port getters and setters
def get_port(self):
return self._port
def set_port(self,p):
self._port = p
def del_port(self):
self._port = None
port = property(get_port, set_port, del_port, "This is documentation")
#Bands getters and setters
def get_bands(self):
return self._bands
def set_bands(self, b):
self._bands = b
def del_bands(self):
#disable all bands
for band in self.bands:
band[1] = "D"
bands = property(get_bands, set_bands, del_bands)
#Callsign getters and setters
def get_callsign(self):
return self._callsign
def set_callsign(self, c):
self._callsign = c
def del_callsign(self):
self._callsign = ""
callsign = property(get_callsign, set_callsign, del_callsign)
#Startup Mode getters and setters
def get_startupmode(self):
return self._startupMode
def set_startupMode(self, smode):
self._startupMode = smode
startupMode = property(get_startupmode, set_startupMode)
#current Mode getters and setters
def get_currentmode(self):
return self._currentMode
def set_currentMode(self, cmode):
self._currentMode = cmode
currentMode = property(get_currentmode, set_currentMode)
#Port List getter
def get_portList(self):
ports = self.WSPRInterfaceManager.GetPortList()
return ports
allPorts = property(get_portList)
############################
#END Properties
############################
############################
#Start Methods
############################
def DetectPort(self):
#print(self.allPorts)
return self.WSPRInterfaceManager.detectTransmitter(self.allPorts,
self.config.checkSecurity,
self.config.securityErrorMessage,
self.config.deviceconstants.commands.responce.deviceinfo,
self.config.readDeviceTimeout,
self.config.deviceconstants.commands.commandEndChars)
def ReadData(self):
data = self.WSPRInterfaceManager.readSerialPort(self.port)
return data
def WriteCommand(self, commandType, command, value = ""):
self.WSPRInterfaceManager.sendCommand(self.port, self.config.deviceconstants.commands, commandType, command, self.config.deviceconstants.waitBetweenCommandsInSeconds, value)
return
###########################
#END Methods
########################### | 32.134228 | 181 | 0.584378 | 476 | 4,788 | 5.768908 | 0.334034 | 0.036417 | 0.030954 | 0.036417 | 0.044792 | 0.019665 | 0 | 0 | 0 | 0 | 0 | 0.002637 | 0.287176 | 4,788 | 149 | 182 | 32.134228 | 0.801934 | 0.203634 | 0 | 0.074074 | 0 | 0 | 0.024025 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.234568 | false | 0 | 0.074074 | 0.074074 | 0.567901 | 0.061728 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
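WSPRInterfaceObjects.py above builds its attributes with the classic `property(fget, fset, fdel, doc)` call rather than decorators. A minimal standalone sketch of that pattern (class `Radio` and the port string are hypothetical, for illustration only):

```python
class Radio(object):
    def __init__(self):
        self._port = None

    def get_port(self):
        return self._port

    def set_port(self, p):
        self._port = p

    def del_port(self):
        self._port = None  # 'del r.port' resets rather than removes the attribute

    # Same shape as 'port = property(get_port, set_port, del_port, ...)' above
    port = property(get_port, set_port, del_port, "Serial port in use")
```

With this form the getter/setter functions stay reachable by name (`r.get_port()`), which the equivalent `@property` decorator syntax would shadow.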
10ce8f52e0bc6f6fa2cc8a3d2a85e5f36ce35c91 | 2,222 | py | Python | msgvis/apps/enhance/management/commands/remove_stopwords.py | hds-lab/textvis-drg | bfb136b6105df84fb6c1c89cc595bf9e9f22c5fe | [
"MIT"
] | 10 | 2015-12-04T07:43:11.000Z | 2021-01-23T00:44:56.000Z | msgvis/apps/enhance/management/commands/remove_stopwords.py | hds-lab/textvis-drg | bfb136b6105df84fb6c1c89cc595bf9e9f22c5fe | [
"MIT"
] | 200 | 2015-02-11T05:41:57.000Z | 2015-11-13T03:47:25.000Z | msgvis/apps/enhance/management/commands/remove_stopwords.py | hds-lab/textvis-drg | bfb136b6105df84fb6c1c89cc595bf9e9f22c5fe | [
"MIT"
] | 6 | 2015-10-02T18:01:09.000Z | 2021-01-23T00:44:58.000Z | from django.core.management.base import BaseCommand, make_option, CommandError
from msgvis.apps.corpus import models as corpus_models
import msgvis.apps.enhance.models as enhance_models
from msgvis.apps.groups import models as groups_models
from msgvis.apps.datatable import models as datatable_models
from msgvis.apps.dimensions import registry
from msgvis.apps.corpus import utils
from django.db.models import Q
import operator
import pdb
def derive_queryset(keywords_text):
pdb.set_trace()
return corpus_models.Dataset.objects.get(id=1).get_advanced_search_results(keywords_text, corpus_models.MessageType.objects.all())
class Command(BaseCommand):
help = "Enter live test environment."
def handle(self, *arguments, **options):
'''
soup = enhance_models.TweetWord.objects.get(original_text='soup')
ladies = enhance_models.TweetWord.objects.get(original_text='ladies')
food = enhance_models.TweetWord.objects.get(original_text='food')
jobs = enhance_models.TweetWord.objects.get(original_text='jobs')
or_soup = utils.levels_or("tweet_words__id", map(lambda x: x.id, soup.related_words))
or_ladies = utils.levels_or("tweet_words__id", map(lambda x: x.id, ladies.related_words))
or_food = utils.levels_or("tweet_words__id", map(lambda x: x.id, food.related_words))
or_jobs = utils.levels_or("tweet_words__id", map(lambda x: x.id, jobs.related_words))
soup_and_ladies = reduce(operator.and_, [or_soup, or_ladies])
soup_and_ladies_or_food = reduce(operator.or_, [soup_and_ladies, or_food])
queryset = corpus_models.Dataset.objects.get(id=1).message_set.all()
queryset = queryset.filter(soup_and_ladies_or_food)
final = reduce(operator.and_, [soup_and_ladies_or_food, reduce(operator.not_, [or_jobs])]) # this does not work
'''
#import nltk
#from nltk.corpus import stopwords
#stopwords_list = stopwords.words('english')
stopwords_list = ["i'm", "&", "via", "~", "+", "-", "i'll", "he's"]
for stopword in stopwords_list:
queryset = enhance_models.TweetWord.objects.filter(text=stopword)
print "stopword = %s, count = %d" %(stopword, queryset.count())
queryset.delete()
pdb.set_trace()
| 45.346939 | 134 | 0.741224 | 314 | 2,222 | 5 | 0.299363 | 0.057962 | 0.044586 | 0.092357 | 0.349045 | 0.29172 | 0.29172 | 0.096815 | 0.096815 | 0.096815 | 0 | 0.001045 | 0.138614 | 2,222 | 48 | 135 | 46.291667 | 0.819227 | 0.039154 | 0 | 0.090909 | 0 | 0 | 0.063792 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.454545 | null | null | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
# src/visualize.py (KevinSun127/pt-engine, MIT License)
import numpy as np
import pyvista as pv
import pandas as pd
SAVE_FILE = "resources/SAVE_PTS/save.pt.csv"
pts = np.genfromtxt(SAVE_FILE,delimiter=",")
pv.PolyData(pts).plot()
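visualize.py expects a comma-delimited point cloud at `resources/SAVE_PTS/save.pt.csv`. A self-contained sketch of producing and reloading such a file (using a temporary directory rather than the repo path, and skipping the `pyvista` plot call, which needs a display):

```python
import os
import tempfile

import numpy as np

# Write a tiny 3-D point cloud in the same comma-delimited layout
# that visualize.py reads, then load it back with genfromtxt.
pts_out = np.array([[0.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.5]])

with tempfile.TemporaryDirectory() as tmpdir:
    path = os.path.join(tmpdir, "save.pt.csv")
    np.savetxt(path, pts_out, delimiter=",")
    pts_in = np.genfromtxt(path, delimiter=",")

print(pts_in.shape)  # (3, 3)
# With pyvista installed, pv.PolyData(pts_in).plot() renders the cloud.
```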
# pysnmp-with-texts/DHCP-SERVER-MIB.py (mibs.snmplabs.com, Apache-2.0)
#
# PySNMP MIB module DHCP-SERVER-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///home/tin/Dev/mibs.snmplabs.com/asn1/DHCP-SERVER-MIB
# Produced by pysmi-0.3.4 at Fri Jan 31 21:33:35 2020
# On host bier platform Linux version 5.4.0-3-amd64 by user tin
# Using Python version 3.7.6 (default, Jan 19 2020, 22:34:52)
#
ObjectIdentifier, Integer, OctetString = mibBuilder.importSymbols("ASN1", "ObjectIdentifier", "Integer", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
SingleValueConstraint, ValueRangeConstraint, ConstraintsUnion, ValueSizeConstraint, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "SingleValueConstraint", "ValueRangeConstraint", "ConstraintsUnion", "ValueSizeConstraint", "ConstraintsIntersection")
NotificationGroup, ModuleCompliance, ObjectGroup = mibBuilder.importSymbols("SNMPv2-CONF", "NotificationGroup", "ModuleCompliance", "ObjectGroup")
Integer32, Counter64, iso, Bits, Unsigned32, IpAddress, MibScalar, MibTable, MibTableRow, MibTableColumn, NotificationType, ObjectIdentity, Counter32, Gauge32, MibIdentifier, TimeTicks, enterprises, ModuleIdentity = mibBuilder.importSymbols("SNMPv2-SMI", "Integer32", "Counter64", "iso", "Bits", "Unsigned32", "IpAddress", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "NotificationType", "ObjectIdentity", "Counter32", "Gauge32", "MibIdentifier", "TimeTicks", "enterprises", "ModuleIdentity")
RowStatus, TruthValue, TextualConvention, DisplayString, DateAndTime = mibBuilder.importSymbols("SNMPv2-TC", "RowStatus", "TruthValue", "TextualConvention", "DisplayString", "DateAndTime")
lucent = MibIdentifier((1, 3, 6, 1, 4, 1, 1751))
products = MibIdentifier((1, 3, 6, 1, 4, 1, 1751, 1))
mibs = MibIdentifier((1, 3, 6, 1, 4, 1, 1751, 2))
ipspg = MibIdentifier((1, 3, 6, 1, 4, 1, 1751, 1, 48))
ipspgServices = MibIdentifier((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1))
ipspgDHCP = MibIdentifier((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1))
ipspgDNS = MibIdentifier((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 2))
ipspgTrap = MibIdentifier((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2))
dhcpServMib = ModuleIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1))
if mibBuilder.loadTexts: dhcpServMib.setLastUpdated('0606220830Z')
if mibBuilder.loadTexts: dhcpServMib.setOrganization('Lucent Technologies')
if mibBuilder.loadTexts: dhcpServMib.setContactInfo(' James Offutt Postal: Lucent Technologies 400 Lapp Road Malvern, PA 19355 USA Tel: +1 610-722-7900 Fax: +1 610-725-8559')
if mibBuilder.loadTexts: dhcpServMib.setDescription('The Vendor Specific MIB module for entities implementing the server side of the Bootstrap Protocol (BOOTP) and the Dynamic Host Configuration protocol (DHCP) for Internet Protocol version 4 (IPv4).')
dhcpServMibTraps = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0))
if mibBuilder.loadTexts: dhcpServMibTraps.setStatus('current')
if mibBuilder.loadTexts: dhcpServMibTraps.setDescription('DHCP Server MIB traps.')
dhcpServMibObjects = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1))
if mibBuilder.loadTexts: dhcpServMibObjects.setStatus('current')
if mibBuilder.loadTexts: dhcpServMibObjects.setDescription('DHCP Server MIB objects are all defined in this branch.')
dhcpServSystem = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 1))
if mibBuilder.loadTexts: dhcpServSystem.setStatus('current')
if mibBuilder.loadTexts: dhcpServSystem.setDescription('Group of objects that are related to the overall system.')
dhcpServSubnetCounters = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 2))
if mibBuilder.loadTexts: dhcpServSubnetCounters.setStatus('current')
if mibBuilder.loadTexts: dhcpServSubnetCounters.setDescription('Group of objects that count various subnet data values.')
dhcpServBootpCounters = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 3))
if mibBuilder.loadTexts: dhcpServBootpCounters.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpCounters.setDescription('Group of objects that count various BOOTP events.')
dhcpServDhcpCounters = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4))
if mibBuilder.loadTexts: dhcpServDhcpCounters.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCounters.setDescription('Group of objects that count various DHCP Statistics.')
dhcpServBootpStatistics = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 5))
if mibBuilder.loadTexts: dhcpServBootpStatistics.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpStatistics.setDescription('Group of objects that measure various BOOTP statistics.')
dhcpServDhcpStatistics = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 6))
if mibBuilder.loadTexts: dhcpServDhcpStatistics.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpStatistics.setDescription('Group of objects that measure various DHCP statistics.')
dhcpServConfiguration = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7))
if mibBuilder.loadTexts: dhcpServConfiguration.setStatus('current')
if mibBuilder.loadTexts: dhcpServConfiguration.setDescription('Objects that contain pre-configured and dynamic configuration information.')
dhcpServFailover = ObjectIdentity((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8))
if mibBuilder.loadTexts: dhcpServFailover.setStatus('current')
if mibBuilder.loadTexts: dhcpServFailover.setDescription('Objects that contain partner server info.')
class DhcpServTimeInterval(TextualConvention, Gauge32):
description = 'The number of milli-seconds that has elapsed since some epoch. Systems that cannot measure events to the milli-second resolution SHOULD round this value to the next available resolution that the system supports.'
status = 'current'
dhcpServSystemDescr = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 1, 1), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServSystemDescr.setStatus('current')
if mibBuilder.loadTexts: dhcpServSystemDescr.setDescription('A textual description of the server. This value should include the full name and version identification of the server. This string MUST contain only printable NVT ASCII characters.')
dhcpServSystemStatus = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))).clone(namedValues=NamedValues(("starting", 0), ("running", 1), ("stopping", 2), ("stopped", 3), ("reload", 4)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServSystemStatus.setStatus('current')
if mibBuilder.loadTexts: dhcpServSystemStatus.setDescription('DHCP server system status.')
dhcpServSystemUpTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 1, 3), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServSystemUpTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServSystemUpTime.setDescription('If the server has a persistent state (e.g., a process), this value will be the seconds elapsed since it started. For software without persistent state, this value will be zero.')
dhcpServSystemResetTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 1, 4), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServSystemResetTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServSystemResetTime.setDescription("If the server has a persistent state (e.g., a process) and supports a `reset' operation (e.g., can be told to re-read configuration files), this value will be the seconds elapsed since the last time the name server was `reset.' For software that does not have persistence or does not support a `reset' operation, this value will be zero.")
dhcpServCountUsedSubnets = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 2, 1), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServCountUsedSubnets.setStatus('current')
if mibBuilder.loadTexts: dhcpServCountUsedSubnets.setDescription('The number of subnets managed by the server (i.e. configured), from which the server has issued at least one lease.')
dhcpServCountUnusedSubnets = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 2, 2), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServCountUnusedSubnets.setStatus('current')
if mibBuilder.loadTexts: dhcpServCountUnusedSubnets.setDescription('The number of subnets managed by the server (i.e. configured), from which the server has issued no leases.')
dhcpServCountFullSubnets = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 2, 3), Integer32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServCountFullSubnets.setStatus('current')
if mibBuilder.loadTexts: dhcpServCountFullSubnets.setDescription('The number of subnets managed by the server (i.e. configured), in which the address pools have been exhausted.')
dhcpServBootpCountRequests = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 3, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpCountRequests.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpCountRequests.setDescription('The number of packets received that contain a Message Type of 1 (BOOTREQUEST) in the first octet and do not contain option number 53 (DHCP Message Type) in the options.')
dhcpServBootpCountInvalids = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 3, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpCountInvalids.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpCountInvalids.setDescription('The number of packets received that do not contain a Message Type of 1 (BOOTREQUEST) in the first octet or are not valid BOOTP packets (e.g.: too short, invalid field in packet header).')
dhcpServBootpCountReplies = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 3, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpCountReplies.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpCountReplies.setDescription('The number of packets sent that contain a Message Type of 1 (BOOTREQUEST) in the first octet and do not contain option number 53 (DHCP Message Type) in the options.')
dhcpServBootpCountDroppedUnknownClients = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 3, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpCountDroppedUnknownClients.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpCountDroppedUnknownClients.setDescription('The number of BOOTP packets dropped due to the server not recognizing or not providing service to the hardware address received in the incoming packet.')
dhcpServBootpCountDroppedNotServingSubnet = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 3, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpCountDroppedNotServingSubnet.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpCountDroppedNotServingSubnet.setDescription('The number of BOOTP packets dropped due to the server not being configured or not otherwise able to serve addresses on the subnet from which this message was received.')
dhcpServDhcpCountDiscovers = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 1), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountDiscovers.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountDiscovers.setDescription('The number of DHCPDISCOVER (option 53 with value 1) packets received.')
dhcpServDhcpCountRequests = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountRequests.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountRequests.setDescription('The number of DHCPREQUEST (option 53 with value 3) packets received.')
dhcpServDhcpCountReleases = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 3), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountReleases.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountReleases.setDescription('The number of DHCPRELEASE (option 53 with value 7) packets received.')
dhcpServDhcpCountDeclines = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountDeclines.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountDeclines.setDescription('The number of DHCPDECLINE (option 53 with value 4) packets received.')
dhcpServDhcpCountInforms = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 5), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountInforms.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountInforms.setDescription('The number of DHCPINFORM (option 53 with value 8) packets received.')
dhcpServDhcpCountInvalids = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 6), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountInvalids.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountInvalids.setDescription('The number of DHCP packets received whose DHCP message type (i.e.: option number 53) is not understood or handled by the server.')
dhcpServDhcpCountOffers = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 7), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountOffers.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountOffers.setDescription('The number of DHCPOFFER (option 53 with value 2) packets sent.')
dhcpServDhcpCountAcks = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountAcks.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountAcks.setDescription('The number of DHCPACK (option 53 with value 5) packets sent.')
dhcpServDhcpCountNacks = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 9), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountNacks.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountNacks.setDescription('The number of DHCPNACK (option 53 with value 6) packets sent.')
dhcpServDhcpCountDroppedUnknownClient = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 10), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountDroppedUnknownClient.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountDroppedUnknownClient.setDescription('The number of DHCP packets dropped due to the server not recognizing or not providing service to the client-id and/or hardware address received in the incoming packet.')
dhcpServDhcpCountDroppedNotServingSubnet = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 4, 11), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpCountDroppedNotServingSubnet.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpCountDroppedNotServingSubnet.setDescription('The number of DHCP packets dropped due to the server not being configured or not otherwise able to serve addresses on the subnet from which this message was received.')
dhcpServBootpStatMinArrivalInterval = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 5, 1), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpStatMinArrivalInterval.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpStatMinArrivalInterval.setDescription('The minimum amount of time between receiving two BOOTP messages. A message is received at the server when the server is able to begin processing the message. This typically occurs immediately after the message is read into server memory. If no messages have been received, then this object contains a zero value.')
dhcpServBootpStatMaxArrivalInterval = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 5, 2), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpStatMaxArrivalInterval.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpStatMaxArrivalInterval.setDescription('The maximum amount of time between receiving two BOOTP messages. A message is received at the server when the server is able to begin processing the message. This typically occurs immediately after the message is read into server memory. If no messages have been received, then this object contains a zero value.')
dhcpServBootpStatLastArrivalTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 5, 3), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpStatLastArrivalTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpStatLastArrivalTime.setDescription('The number of seconds since the last valid BOOTP message was received by the server. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServBootpStatMinResponseTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 5, 4), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpStatMinResponseTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpStatMinResponseTime.setDescription('The smallest time interval measured as the difference between the arrival of a BOOTP message at the server and the successful transmission of the response to that message. A message is received at the server when the server is able to begin processing the message. A message is transmitted after the server has no further use for the message. Note that the operating system may still have the message queued internally. The operating system queue time is not to be considered as part of the response time. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServBootpStatMaxResponseTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 5, 5), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpStatMaxResponseTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpStatMaxResponseTime.setDescription('The largest time interval measured as the difference between the arrival of a BOOTP message at the server and the successful transmission of the response to that message. A message is received at the server when the server is able to begin processing the message. A message is transmitted after the server has no further use for the message. Note that the operating system may still have the message queued internally. The operating system queue time is not to be considered as part of the response time. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServBootpStatSumResponseTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 5, 6), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServBootpStatSumResponseTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServBootpStatSumResponseTime.setDescription('The sum of the response time intervals in milli-seconds where a response time interval is measured as the difference between the arrival of a BOOTP message at the server and the successful transmission of the response to that message. A message is received at the server when the server is able to begin processing the message. A message is transmitted after the server has no further use for the message. Note that the operating system may still have the message queued internally. The operating system queue time is not to be considered as part of the response time. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServDhcpStatMinArrivalInterval = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 6, 1), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpStatMinArrivalInterval.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpStatMinArrivalInterval.setDescription('The minimum amount of time between receiving two DHCP messages. A message is received at the server when the server is able to begin processing the message. This typically occurs immediately after the message is read into server memory. If no messages have been received, then this object contains a zero value.')
dhcpServDhcpStatMaxArrivalInterval = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 6, 2), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpStatMaxArrivalInterval.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpStatMaxArrivalInterval.setDescription('The maximum amount of time between receiving two DHCP messages. A message is received at the server when the server is able to begin processing the message. This typically occurs immediately after the message is read into server memory. If no messages have been received, then this object contains a zero value.')
dhcpServDhcpStatLastArrivalTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 6, 3), TimeTicks()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpStatLastArrivalTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpStatLastArrivalTime.setDescription('The number of seconds since the last valid DHCP message was received by the server. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServDhcpStatMinResponseTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 6, 4), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpStatMinResponseTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpStatMinResponseTime.setDescription('The smallest time interval measured as the difference between the arrival of a DHCP message at the server and the successful transmission of the response to that message. A message is received at the server when the server is able to begin processing the message. A message is transmitted after the server has no further use for the message. Note that the operating system may still have the message queued internally. The operating system queue time is not to be considered as part of the response time. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServDhcpStatMaxResponseTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 6, 5), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpStatMaxResponseTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpStatMaxResponseTime.setDescription('The largest time interval measured as the difference between the arrival of a DHCP message at the server and the successful transmission of the response to that message. A message is received at the server when the server is able to begin processing the message. A message is transmitted after the server has no further use for the message. Note that the operating system may still have the message queued internally. The operating system queue time is not to be considered as part of the response time. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServDhcpStatSumResponseTime = MibScalar((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 6, 6), DhcpServTimeInterval()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServDhcpStatSumResponseTime.setStatus('current')
if mibBuilder.loadTexts: dhcpServDhcpStatSumResponseTime.setDescription('The sum of the response time intervals in milli-seconds where a response time interval is measured as the difference between the arrival of a DHCP message at the server and the successful transmission of the response to that message. A message is received at the server when the server is able to begin processing the message. A message is transmitted after the server has no further use for the message. Note that the operating system may still have the message queued internally. The operating system queue time is not to be considered as part of the response time. Invalid messages do not cause this value to change. If no valid messages have been received, then this object contains a zero value.')
dhcpServRangeTable = MibTable((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2), )
if mibBuilder.loadTexts: dhcpServRangeTable.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeTable.setDescription('A list of ranges that are configured on this server.')
dhcpServRangeEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1), ).setIndexNames((0, "DHCP-SERVER-MIB", "dhcpServRangeSubnetAddr"), (0, "DHCP-SERVER-MIB", "dhcpServRangeStart"))
if mibBuilder.loadTexts: dhcpServRangeEntry.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeEntry.setDescription('A logical row in the serverRangeTable.')
dhcpServRangeSubnetAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 1), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeSubnetAddr.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeSubnetAddr.setDescription('The IP address defining a subnet')
dhcpServRangeSubnetMask = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 2), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeSubnetMask.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeSubnetMask.setDescription('The subnet mask (DHCP option 1) provided to any client offered an address from this range.')
dhcpServRangeStart = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 3), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeStart.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeStart.setDescription('The IP address of the first address in the range; an index of this conceptual table.')
dhcpServRangeEnd = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 4), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeEnd.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeEnd.setDescription('The IP address of the last address in the range. The value of range end must be greater than or equal to the value of range start.')
dhcpServRangeInUse = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 5), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeInUse.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeInUse.setDescription('The number of addresses in this range that are currently in use. This number includes those addresses whose lease has not expired and addresses which have been reserved (either by the server or through configuration).')
dhcpServRangeOutstandingOffers = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 6), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeOutstandingOffers.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeOutstandingOffers.setDescription('The number of outstanding DHCPOFFER messages for this range is reported with this value. An offer is outstanding if the server has sent a DHCPOFFER message to a client, but has not yet received a DHCPREQUEST message from the client nor has the server-specific timeout (limiting the time in which a client can respond to the offer message) for the offer message expired.')
dhcpServRangeUnavailable = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 7), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeUnavailable.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeUnavailable.setDescription('The number of DHCP server IP addresses that are unavailable in this subnet.')
dhcpServRangeType = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 8), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("manBootp", 1), ("autoBootp", 2), ("manDhcp", 3), ("autoDhcp", 4), ("dynamicDhcp", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeType.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeType.setDescription('DHCP server client lease type.')
dhcpServRangeUnused = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 7, 2, 1, 9), Gauge32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServRangeUnused.setStatus('current')
if mibBuilder.loadTexts: dhcpServRangeUnused.setDescription('The number of addresses in this range that are currently unused, i.e. neither leased with an unexpired lease nor reserved (either by the server or through configuration).')
dhcpServFailoverTable = MibTable((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8, 1), )
if mibBuilder.loadTexts: dhcpServFailoverTable.setStatus('current')
if mibBuilder.loadTexts: dhcpServFailoverTable.setDescription('A list of partner servers.')
dhcpServFailoverEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8, 1, 1), ).setIndexNames((0, "DHCP-SERVER-MIB", "dhcpServFailoverPartnerAddr"))
if mibBuilder.loadTexts: dhcpServFailoverEntry.setStatus('current')
if mibBuilder.loadTexts: dhcpServFailoverEntry.setDescription('A logical row in the serverFailoverTable.')
dhcpServFailoverPartnerAddr = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8, 1, 1, 1), IpAddress()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServFailoverPartnerAddr.setStatus('current')
if mibBuilder.loadTexts: dhcpServFailoverPartnerAddr.setDescription('The IP address defining a partner server')
dhcpServFailoverPartnerType = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8, 1, 1, 2), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3))).clone(namedValues=NamedValues(("primary", 1), ("failover", 2), ("unconfigured", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServFailoverPartnerType.setStatus('current')
if mibBuilder.loadTexts: dhcpServFailoverPartnerType.setDescription('DHCP server failover partner type.')
dhcpServFailoverPartnerStatus = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3))).clone(namedValues=NamedValues(("unknown", 0), ("syncing", 1), ("active", 2), ("inactive", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServFailoverPartnerStatus.setStatus('current')
if mibBuilder.loadTexts: dhcpServFailoverPartnerStatus.setDescription('The DHCP server failover partner status.')
dhcpServFailoverPartnerPolltime = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8, 1, 1, 4), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: dhcpServFailoverPartnerPolltime.setStatus('current')
if mibBuilder.loadTexts: dhcpServFailoverPartnerPolltime.setDescription('The last time there was a successful communication with the partner server. This value is local time in seconds since some epoch.')
ipspgDhcpTrapTable = MibTable((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1), )
if mibBuilder.loadTexts: ipspgDhcpTrapTable.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrapTable.setDescription("The agent's table of IPSPG alarm information.")
ipspgDhcpTrapEntry = MibTableRow((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1), ).setIndexNames((0, "DHCP-SERVER-MIB", "ipspgDhcpTrIndex"))
if mibBuilder.loadTexts: ipspgDhcpTrapEntry.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrapEntry.setDescription('Information about the last alarm trap generated by the agent.')
ipspgDhcpTrIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 1), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 31))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrIndex.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrIndex.setDescription('Index into the table of IPSPG alarm traps.')
ipspgDhcpTrSequence = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 2), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrSequence.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrSequence.setDescription('Counter of the number of IPSPG alarm traps generated since the agent was last initialized.')
ipspgDhcpTrId = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 3), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 3))).clone(namedValues=NamedValues(("monitor", 1), ("analyzer", 3)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrId.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrId.setDescription('The application which generated this IPSPG alarm.')
ipspgDhcpTrText = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 4), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 80))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrText.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrText.setDescription('An ASCII string describing the IPSPG alarm condition/cause.')
ipspgDhcpTrPriority = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 5), Integer32().subtype(subtypeSpec=ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))).clone(namedValues=NamedValues(("inform", 1), ("warning", 2), ("minor", 3), ("major", 4), ("critical", 5)))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrPriority.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrPriority.setDescription('The priority level as set on the agent for this Class and Type of trap.')
ipspgDhcpTrClass = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 6), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrClass.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrClass.setDescription('The Class number of the described IPSPG alarm.')
ipspgDhcpTrType = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 7), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrType.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrType.setDescription('The type number of the described IPSPG alarm.')
ipspgDhcpTrTime = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 8), Counter32()).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrTime.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrTime.setDescription('The time that the condition or event occurred which caused generation of this alarm. This value is given in seconds since 00:00:00 Greenwich mean time (GMT) January 1, 1970.')
ipspgDhcpTrSuspect = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 9), DisplayString().subtype(subtypeSpec=ValueSizeConstraint(0, 32))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrSuspect.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrSuspect.setDescription('An ASCII string describing the host which caused the IPSPG alarm.')
ipspgDhcpTrDiagId = MibTableColumn((1, 3, 6, 1, 4, 1, 1751, 1, 48, 2, 1, 1, 10), Integer32().subtype(subtypeSpec=ValueRangeConstraint(1, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: ipspgDhcpTrDiagId.setStatus('current')
if mibBuilder.loadTexts: ipspgDhcpTrDiagId.setDescription('An integer describing the diagnosis which triggered this IPSPG alarm.')
dhcpServerStarted = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 1)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerStarted.setStatus('current')
if mibBuilder.loadTexts: dhcpServerStarted.setDescription('The monitor has determined that the DHCP server has been started.')
dhcpServerStopped = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 2)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerStopped.setStatus('current')
if mibBuilder.loadTexts: dhcpServerStopped.setDescription('The monitor has determined that the DHCP server has been stopped.')
dhcpServerReload = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 3)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerReload.setStatus('current')
if mibBuilder.loadTexts: dhcpServerReload.setDescription('The monitor has determined that the DHCP server has been reloaded.')
dhcpServerSubnetDepleted = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 4)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerSubnetDepleted.setStatus('current')
if mibBuilder.loadTexts: dhcpServerSubnetDepleted.setDescription('The monitor has determined that the DHCP server has run out of addresses in a subnet.')
dhcpServerBadPacket = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 5)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerBadPacket.setStatus('current')
if mibBuilder.loadTexts: dhcpServerBadPacket.setDescription('The monitor has determined that the DHCP server has received a bad DHCP or Bootp packet.')
dhcpServerFailoverActive = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 6)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerFailoverActive.setStatus('current')
if mibBuilder.loadTexts: dhcpServerFailoverActive.setDescription('This trap is issued by the secondary server. It indicates a primary partner server is down and its scopes are now being served by this failover server.')
dhcpServerFailoverReturnedControl = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 7)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerFailoverReturnedControl.setStatus('current')
if mibBuilder.loadTexts: dhcpServerFailoverReturnedControl.setDescription('This trap is issued by the secondary server. It indicates that the failover server has returned control to its primary partner.')
dhcpServerSubnetThresholdExceeded = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 8)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerSubnetThresholdExceeded.setStatus('current')
if mibBuilder.loadTexts: dhcpServerSubnetThresholdExceeded.setDescription('This trap is issued when a subnet threshold is exceeded.')
dhcpServerSubnetThresholdDescent = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 9)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerSubnetThresholdDescent.setStatus('current')
if mibBuilder.loadTexts: dhcpServerSubnetThresholdDescent.setDescription("This trap is issued when a subnet's unavailable-lease percentage falls below the descent threshold value.")
dhcpServerDropUnknownClient = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 10)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerDropUnknownClient.setStatus('current')
if mibBuilder.loadTexts: dhcpServerDropUnknownClient.setDescription('This trap is issued when the server drops a client message because the client MAC address is either in a MAC exclusion pool or is not in an inclusion pool.')
dhcpServerPingResponseReceived = NotificationType((1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 0, 11)).setObjects(("DHCP-SERVER-MIB", "ipspgDhcpTrSequence"), ("DHCP-SERVER-MIB", "ipspgDhcpTrId"), ("DHCP-SERVER-MIB", "ipspgDhcpTrText"), ("DHCP-SERVER-MIB", "ipspgDhcpTrPriority"), ("DHCP-SERVER-MIB", "ipspgDhcpTrClass"), ("DHCP-SERVER-MIB", "ipspgDhcpTrType"), ("DHCP-SERVER-MIB", "ipspgDhcpTrTime"), ("DHCP-SERVER-MIB", "ipspgDhcpTrSuspect"), ("DHCP-SERVER-MIB", "ipspgDhcpTrDiagId"))
if mibBuilder.loadTexts: dhcpServerPingResponseReceived.setStatus('current')
if mibBuilder.loadTexts: dhcpServerPingResponseReceived.setDescription('This trap is issued when the server receives a ping response.')
mibBuilder.exportSymbols("DHCP-SERVER-MIB", dhcpServDhcpStatMaxArrivalInterval=dhcpServDhcpStatMaxArrivalInterval, dhcpServDhcpStatMinResponseTime=dhcpServDhcpStatMinResponseTime, dhcpServDhcpStatMaxResponseTime=dhcpServDhcpStatMaxResponseTime, dhcpServDhcpStatistics=dhcpServDhcpStatistics, dhcpServDhcpCountDeclines=dhcpServDhcpCountDeclines, dhcpServBootpCounters=dhcpServBootpCounters, dhcpServBootpCountInvalids=dhcpServBootpCountInvalids, dhcpServBootpStatMaxResponseTime=dhcpServBootpStatMaxResponseTime, ipspgDhcpTrText=ipspgDhcpTrText, dhcpServFailover=dhcpServFailover, dhcpServSystem=dhcpServSystem, dhcpServMibTraps=dhcpServMibTraps, DhcpServTimeInterval=DhcpServTimeInterval, PYSNMP_MODULE_ID=dhcpServMib, dhcpServDhcpStatMinArrivalInterval=dhcpServDhcpStatMinArrivalInterval, ipspgDhcpTrClass=ipspgDhcpTrClass, dhcpServMib=dhcpServMib, dhcpServBootpStatMaxArrivalInterval=dhcpServBootpStatMaxArrivalInterval, dhcpServRangeEntry=dhcpServRangeEntry, dhcpServRangeInUse=dhcpServRangeInUse, dhcpServDhcpCountDiscovers=dhcpServDhcpCountDiscovers, dhcpServBootpCountDroppedUnknownClients=dhcpServBootpCountDroppedUnknownClients, dhcpServBootpCountReplies=dhcpServBootpCountReplies, ipspgDhcpTrPriority=ipspgDhcpTrPriority, dhcpServerFailoverReturnedControl=dhcpServerFailoverReturnedControl, dhcpServDhcpCountAcks=dhcpServDhcpCountAcks, ipspgDhcpTrTime=ipspgDhcpTrTime, dhcpServFailoverPartnerType=dhcpServFailoverPartnerType, dhcpServConfiguration=dhcpServConfiguration, dhcpServRangeEnd=dhcpServRangeEnd, dhcpServCountUsedSubnets=dhcpServCountUsedSubnets, dhcpServerStarted=dhcpServerStarted, ipspgDhcpTrSuspect=ipspgDhcpTrSuspect, dhcpServBootpCountDroppedNotServingSubnet=dhcpServBootpCountDroppedNotServingSubnet, dhcpServDhcpCountReleases=dhcpServDhcpCountReleases, ipspgDhcpTrapEntry=ipspgDhcpTrapEntry, dhcpServDhcpCountDroppedUnknownClient=dhcpServDhcpCountDroppedUnknownClient, dhcpServRangeOutstandingOffers=dhcpServRangeOutstandingOffers, 
dhcpServDhcpCountNacks=dhcpServDhcpCountNacks, dhcpServerDropUnknownClient=dhcpServerDropUnknownClient, lucent=lucent, mibs=mibs, ipspgDhcpTrapTable=ipspgDhcpTrapTable, dhcpServRangeUnavailable=dhcpServRangeUnavailable, dhcpServerReload=dhcpServerReload, dhcpServRangeUnused=dhcpServRangeUnused, dhcpServCountFullSubnets=dhcpServCountFullSubnets, dhcpServRangeSubnetMask=dhcpServRangeSubnetMask, dhcpServDhcpCountDroppedNotServingSubnet=dhcpServDhcpCountDroppedNotServingSubnet, dhcpServRangeType=dhcpServRangeType, dhcpServRangeTable=dhcpServRangeTable, dhcpServSystemDescr=dhcpServSystemDescr, ipspgDhcpTrSequence=ipspgDhcpTrSequence, dhcpServerFailoverActive=dhcpServerFailoverActive, dhcpServerStopped=dhcpServerStopped, ipspgTrap=ipspgTrap, ipspgDhcpTrId=ipspgDhcpTrId, dhcpServerBadPacket=dhcpServerBadPacket, ipspgServices=ipspgServices, ipspgDhcpTrType=ipspgDhcpTrType, dhcpServDhcpCountRequests=dhcpServDhcpCountRequests, dhcpServDhcpCountInforms=dhcpServDhcpCountInforms, dhcpServDhcpCountOffers=dhcpServDhcpCountOffers, products=products, dhcpServerSubnetThresholdExceeded=dhcpServerSubnetThresholdExceeded, ipspgDhcpTrIndex=ipspgDhcpTrIndex, dhcpServerPingResponseReceived=dhcpServerPingResponseReceived, dhcpServRangeStart=dhcpServRangeStart, dhcpServSubnetCounters=dhcpServSubnetCounters, dhcpServFailoverPartnerAddr=dhcpServFailoverPartnerAddr, dhcpServDhcpStatLastArrivalTime=dhcpServDhcpStatLastArrivalTime, ipspgDNS=ipspgDNS, dhcpServerSubnetDepleted=dhcpServerSubnetDepleted, ipspg=ipspg, dhcpServBootpCountRequests=dhcpServBootpCountRequests, dhcpServFailoverPartnerPolltime=dhcpServFailoverPartnerPolltime, dhcpServDhcpStatSumResponseTime=dhcpServDhcpStatSumResponseTime, dhcpServSystemUpTime=dhcpServSystemUpTime, dhcpServBootpStatMinResponseTime=dhcpServBootpStatMinResponseTime, dhcpServBootpStatSumResponseTime=dhcpServBootpStatSumResponseTime, dhcpServFailoverEntry=dhcpServFailoverEntry, ipspgDhcpTrDiagId=ipspgDhcpTrDiagId, ipspgDHCP=ipspgDHCP, 
dhcpServCountUnusedSubnets=dhcpServCountUnusedSubnets, dhcpServFailoverPartnerStatus=dhcpServFailoverPartnerStatus, dhcpServMibObjects=dhcpServMibObjects, dhcpServDhcpCounters=dhcpServDhcpCounters, dhcpServerSubnetThresholdDescent=dhcpServerSubnetThresholdDescent, dhcpServBootpStatMinArrivalInterval=dhcpServBootpStatMinArrivalInterval, dhcpServSystemResetTime=dhcpServSystemResetTime, dhcpServFailoverTable=dhcpServFailoverTable, dhcpServBootpStatistics=dhcpServBootpStatistics, dhcpServDhcpCountInvalids=dhcpServDhcpCountInvalids, dhcpServSystemStatus=dhcpServSystemStatus, dhcpServBootpStatLastArrivalTime=dhcpServBootpStatLastArrivalTime, dhcpServRangeSubnetAddr=dhcpServRangeSubnetAddr)
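The column definitions above declare enumerated integers via NamedValues (e.g. dhcpServFailoverPartnerStatus maps unknown(0), syncing(1), active(2), inactive(3)). A minimal sketch of decoding such a raw value into its symbolic name, written in plain Python so it runs without pysnmp or a live SNMP agent (the helper name is illustrative, not part of this MIB module):

```python
# OID of dhcpServFailoverPartnerStatus as defined in this MIB.
PARTNER_STATUS_OID = (1, 3, 6, 1, 4, 1, 1751, 1, 48, 1, 1, 1, 1, 8, 1, 1, 3)

# NamedValues from the column definition above:
# unknown(0), syncing(1), active(2), inactive(3).
PARTNER_STATUS_NAMES = {0: "unknown", 1: "syncing", 2: "active", 3: "inactive"}

def decode_partner_status(value):
    """Return the symbolic name for a raw partner-status integer.

    Values outside the enumeration are reported as 'unknown', mirroring
    how a manager would treat an out-of-range reading.
    """
    return PARTNER_STATUS_NAMES.get(value, "unknown")

print(decode_partner_status(2))  # active
```

In a real manager the integer would come from an SNMP GET/WALK of the OID above; only the final decoding step is shown here.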
# File: test/test_types_settings.py (repo: hi-artem/twistlock-py, license: RSA-MD)

# coding: utf-8
"""
Prisma Cloud Compute API
No description provided (generated by Openapi Generator https://github.com/openapitools/openapi-generator) # noqa: E501
The version of the OpenAPI document: 21.04.439
Generated by: https://openapi-generator.tech
"""
from __future__ import absolute_import
import unittest
import datetime
import openapi_client
from openapi_client.models.types_settings import TypesSettings # noqa: E501
from openapi_client.rest import ApiException
class TestTypesSettings(unittest.TestCase):
"""TypesSettings unit test stubs"""
def setUp(self):
pass
def tearDown(self):
pass
def make_instance(self, include_optional):
"""Test TypesSettings
include_optional is a boolean: when False only required
params are included, when True both required and
optional params are included."""
# model = openapi_client.models.types_settings.TypesSettings() # noqa: E501
if include_optional:
return TypesSettings(
access_ca_cert = '',
address = '',
alerts = openapi_client.models.api/alert_settings.api.AlertSettings(
aggregation_period_ms = 56,
security_advisor_webhook = '', ),
cert_settings = openapi_client.models.types/cert_settings.types.CertSettings(
ca_expiration = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
console_san = [
''
], ),
certificate_period_days = 56,
check_revocation = True,
code_repo_settings = openapi_client.models.shared/code_repo_settings.shared.CodeRepoSettings(
specifications = [
openapi_client.models.shared/code_repo_specification.shared.CodeRepoSpecification(
credential_id = '',
excluded_manifest_paths = [
''
],
explicit_manifest_names = [
''
],
public_only = True,
repositories = [
''
],
target_python_version = '',
type = '[\"github\",\"CI\"]', )
],
webhook_url_suffix = '', ),
communication_port = 56,
console_ca_cert = '',
console_custom_cert = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
console_names = [
''
],
custom_endpoint = '',
custom_endpoint_ca_cert = '',
custom_endpoint_credential_id = '',
custom_endpoint_enabled = True,
custom_labels = openapi_client.models.shared/custom_labels_settings.shared.CustomLabelsSettings(
labels = [
''
], ),
defender_settings = openapi_client.models.defender/settings.defender.Settings(
admission_control_enabled = True,
admission_control_webhook_suffix = '',
automatic_upgrade = True,
disconnect_period_days = 56,
host_custom_compliance_enabled = True,
listening_port = 56, ),
enabled = True,
forensic = openapi_client.models.shared/forensic_settings.shared.ForensicSettings(
collect_network_firewall = True,
collect_network_snapshot = True,
container_disk_usage_mb = 56,
enabled = True,
host_disk_usage_mb = 56,
incident_snapshots_cap = 56, ),
has_admin = True,
host_auto_deploy = [
openapi_client.models.shared/host_auto_deploy_specification.shared.HostAutoDeploySpecification(
aws_region_type = '[\"regular\",\"gov\",\"china\",\"all\"]',
collections = [
openapi_client.models.collection/collection.collection.Collection(
account_ids = [
''
],
app_ids = [
''
],
clusters = [
''
],
code_repos = [
''
],
color = '',
containers = [
''
],
description = '',
functions = [
''
],
hosts = [
''
],
images = [
''
],
labels = [
''
],
modified = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
name = '',
namespaces = [
''
],
owner = '',
prisma = True,
system = True, )
],
console_hostname = '',
credential_id = '',
last_modified = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
name = '', )
],
hpkp = openapi_client.models.types/hpkp_settings.types.HPKPSettings(
certs = '',
enabled = True,
fingerprints = [
''
], ),
identity_settings = openapi_client.models.identity/settings.identity.Settings(
ldap = openapi_client.models.identity/ldap_settings.identity.LdapSettings(
account_password = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
account_upn = '',
ca_cert = '',
enabled = True,
group_search_base = '',
search_base = '',
type = '',
url = '',
user_search_base = '',
user_search_identifier = '', ),
oauth = openapi_client.models.identity/provider_settings.identity.ProviderSettings(
auth_url = '',
cert = '',
client_id = '',
client_secret = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
enabled = True,
group_claim = '',
group_scope = '',
open_id_issues_url = '',
openshift_base_url = '',
provider_alias = '',
provider_name = '[\"github\",\"openshift\"]',
token_url = '', ),
openid = openapi_client.models.identity/provider_settings.identity.ProviderSettings(
auth_url = '',
cert = '',
client_id = '',
enabled = True,
group_claim = '',
group_scope = '',
open_id_issues_url = '',
openshift_base_url = '',
provider_alias = '',
token_url = '', ),
saml = openapi_client.models.identity/saml_settings.identity.SamlSettings(
app_id = '',
app_secret = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
audience = '',
cert = '',
console_url = '',
enabled = True,
issuer = '',
provider_alias = '',
skip_authn_context = True,
tenant_id = '',
type = '[\"okta\",\"gsuite\",\"ping\",\"shibboleth\",\"azure\",\"adfs\"]',
url = '', ), ),
kubernetes_audit = openapi_client.models.shared/kubernetes_audit_settings.shared.KubernetesAuditSettings(
credential_id = '',
deployment_type = '[\"default\",\"gke\"]',
last_polling_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
project_ids = [
''
],
stackdriver_filter = '',
webhook_url_suffix = '', ),
ldap_enabled = True,
license_key = '',
logging = openapi_client.models.shared/logging_settings.shared.LoggingSettings(
console_address = '',
enable_metrics_collection = True,
include_runtime_link = True,
stdout = openapi_client.models.shared/logger_setting.shared.LoggerSetting(
all_proc_events = True,
enabled = True,
verbose_scan = True, ),
syslog = openapi_client.models.shared/syslog_settings.shared.SyslogSettings(
addr = '',
all_proc_events = True,
enabled = True,
id = '',
verbose_scan = True, ), ),
logon = openapi_client.models.types/logon_settings.types.LogonSettings(
basic_auth_disabled = True,
include_tls = True,
session_timeout_sec = 56,
strong_password = True,
use_support_credentials = True, ),
oauth_enabled = True,
oidc_enabled = True,
projects = openapi_client.models.api/project_settings.api.ProjectSettings(
master = True,
redirect_url = '', ),
proxy = openapi_client.models.common/proxy_settings.common.ProxySettings(
ca = '',
http_proxy = '',
no_proxy = '',
password = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
user = '', ),
registry = openapi_client.models.shared/registry_settings.shared.RegistrySettings(
harbor_scanner_url_suffix = '',
specifications = [
openapi_client.models.shared/registry_specification.shared.RegistrySpecification(
cap = 56,
collections = [
''
],
credential = openapi_client.models.cred/credential.cred.Credential(
_id = '',
account_guid = '',
account_id = '',
api_token = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
ca_cert = '',
description = '',
external = True,
last_modified = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
owner = '',
role_arn = '',
secret = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
tokens = openapi_client.models.cred/temporary_token.cred.TemporaryToken(
aws_access_key_id = '',
aws_secret_access_key = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
duration = 56,
expiration_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
token = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ), ),
type = '[\"aws\",\"azure\",\"gcp\",\"ibmCloud\",\"apiToken\",\"githubToken\",\"basic\",\"dtr\",\"kubeconfig\",\"certificate\"]',
use_aws_role = True, ),
credential_id = '',
excluded_repositories = [
''
],
excluded_tags = [
''
],
jfrog_repo_types = [
'[\"local\",\"remote\",\"virtual\"]'
],
namespace = '',
os = '[\"linux\",\"windows\"]',
registry = '',
repository = '',
scanners = 56,
tag = '',
version = '',
version_pattern = '', )
],
webhook_url_suffix = '', ),
saml_enabled = True,
scan = openapi_client.models.shared/scan_settings.shared.ScanSettings(
cloud_platforms_scan_period_ms = 56,
code_repos_scan_period_ms = 56,
containers_scan_period_ms = 56,
extract_archive = True,
images_scan_period_ms = 56,
include_js_dependencies = True,
registry_scan_period_ms = 56,
registry_scan_retention_days = 56,
scan_running_images = True,
serverless_scan_period_ms = 56,
show_infra_containers = True,
show_negligible_vulnerabilities = True,
system_scan_period_ms = 56,
tas_droplets_scan_period_ms = 56,
vm_scan_period_ms = 56, ),
secrets_stores = openapi_client.models.shared/secrets_stores.shared.SecretsStores(
refresh_period_hours = 56,
secrets_stores = [
openapi_client.models.shared/secrets_store.shared.SecretsStore(
app_id = '',
ca_cert = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
client_cert = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
credential_id = '',
name = '',
region = '',
type = '[\"hashicorp\",\"hashicorp010\",\"cyberark\",\"awsParameterStore\",\"awsSecretsManager\",\"azure\"]',
url = '', )
], ),
secured_console_port = 56,
serverless_auto_deploy = [
openapi_client.models.shared/serverless_auto_deploy_specification.shared.ServerlessAutoDeploySpecification(
collections = [
openapi_client.models.collection/collection.collection.Collection(
account_ids = [
''
],
app_ids = [
''
],
clusters = [
''
],
code_repos = [
''
],
color = '',
containers = [
''
],
description = '',
functions = [
''
],
hosts = [
''
],
images = [
''
],
labels = [
''
],
modified = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
name = '',
namespaces = [
''
],
owner = '',
prisma = True,
system = True, )
],
console_addr = '',
credential_id = '',
last_modified = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
name = '',
region = '',
runtimes = [
''
], )
],
serverless_scan = [
openapi_client.models.shared/serverless_scan_specification.shared.ServerlessScanSpecification(
aws_region_type = '[\"regular\",\"gov\",\"china\",\"all\"]',
cap = 56,
credential = openapi_client.models.cred/credential.cred.Credential(
_id = '',
account_guid = '',
account_id = '',
api_token = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
ca_cert = '',
description = '',
external = True,
last_modified = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
owner = '',
role_arn = '',
secret = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
tokens = openapi_client.models.cred/temporary_token.cred.TemporaryToken(
aws_access_key_id = '',
aws_secret_access_key = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ),
duration = 56,
expiration_time = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
token = openapi_client.models.common/secret.common.Secret(
encrypted = '',
plain = '', ), ),
type = '[\"aws\",\"azure\",\"gcp\",\"ibmCloud\",\"apiToken\",\"githubToken\",\"basic\",\"dtr\",\"kubeconfig\",\"certificate\"]',
use_aws_role = True, ),
credential_id = '',
provider = '[\"aws\",\"azure\",\"gcp\",\"alibaba\",\"others\"]',
scan_all_versions = True,
scan_layers = True, )
],
tas_droplets = [
openapi_client.models.shared/tas_droplet_specification.shared.TASDropletSpecification(
cap = 56,
cloud_controller_address = '',
hostname = '',
pattern = '', )
],
telemetry = openapi_client.models.types/telemetry_settings.types.TelemetrySettings(
enabled = True, ),
token = '',
trusted_certs = [
openapi_client.models.shared/trusted_cert_signature.shared.TrustedCertSignature(
cn = '',
issuer = '',
not_after1 = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
not_before1 = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
raw = '', )
],
trusted_certs_enabled = True,
upload_disabled = True,
version = '',
vms = [
openapi_client.models.shared/vm_specification.shared.VMSpecification(
cap = 56,
collections = [
openapi_client.models.collection/collection.collection.Collection(
account_ids = [
''
],
app_ids = [
''
],
clusters = [
''
],
code_repos = [
''
],
color = '',
containers = [
''
],
description = '',
functions = [
''
],
hosts = [
''
],
images = [
''
],
labels = [
''
],
modified = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
name = '',
namespaces = [
''
],
owner = '',
prisma = True,
system = True, )
],
console_addr = '',
credential_id = '',
excluded_images = [
''
],
region = '',
scanners = 56,
version = '', )
],
wild_fire_settings = openapi_client.models.shared/wild_fire_settings.shared.WildFireSettings(
api_key = '',
api_key_expiration = datetime.datetime.strptime('2013-10-20 19:20:30.00', '%Y-%m-%d %H:%M:%S.%f'),
last_error = '',
policy = openapi_client.models.shared/wild_fire_policy.shared.WildFirePolicy(
compliance_enabled = True,
grayware_as_malware = True,
region = '',
runtime_enabled = True,
upload_enabled = True, ), ),
windows_feed_enabled = True
)
else:
return TypesSettings(
)
def testTypesSettings(self):
"""Test TypesSettings"""
inst_req_only = self.make_instance(include_optional=False)
inst_req_and_optional = self.make_instance(include_optional=True)
if __name__ == '__main__':
unittest.main()
# File: udeer/property_management/doctype/property_unit/property_unit_dashboard.py
# (repo: mhbu50/udeer, license: MIT)

from frappe import _
def get_data():
return {
'fieldname':
'property_unit',
'transactions': [{
'label': _('Leases'),
'items': ['Lease']
}, {
'label': _('Expenses'),
'items': ["Property Expense"]
},{
'label': _('Maintenance'),
'items': ["Maintenance Schedule","Maintenance Visit"]
}]
}
| 21.842105 | 65 | 0.436145 | 27 | 415 | 6.481481 | 0.740741 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.390361 | 415 | 18 | 66 | 23.055556 | 0.6917 | 0 | 0 | 0.125 | 0 | 0 | 0.354217 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | true | 0 | 0.0625 | 0.0625 | 0.1875 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
10e823961657f7d461d363a3425b0fd09ebb1d3e | 3,460 | py | Python | app/api/v2/models/usermodels.py | jaystaks/Politico | baef7c079996f26f7379e54faf9dbcfa0494aa4f | [
"MIT"
] | 1 | 2019-03-24T18:13:11.000Z | 2019-03-24T18:13:11.000Z | app/api/v2/models/usermodels.py | jaystaks/Politico | baef7c079996f26f7379e54faf9dbcfa0494aa4f | [
"MIT"
] | 34 | 2019-02-07T20:54:29.000Z | 2022-01-21T19:42:27.000Z | app/api/v2/models/usermodels.py | jaystaks/Politico | baef7c079996f26f7379e54faf9dbcfa0494aa4f | [
"MIT"
] | 2 | 2019-02-07T04:20:53.000Z | 2019-02-07T20:14:29.000Z | import datetime
import json
from werkzeug.security import generate_password_hash, check_password_hash
from app.api.db.database import Database
class Users():
def __init__(self,
firstname=None,
lastname=None,
othername=None,
email=None,
password=None,
phonenumber=None,
passporturl=None,
):
self.firstname = firstname
self.lastname = lastname
self.othername = othername
self.email = email
self.password = generate_password_hash(password)
self.phonenumber = phonenumber
self.passporturl = passporturl
def registerUser(self):
user = Database().execute_query(
''' INSERT INTO users(firstname,lastname,othername,email,password,phonenumber,passporturl)\
VALUES('{}','{}','{}','{}','{}','{}','{}')\
RETURNING firstname,lastname,othername,email,password,phonenumber,passporturl''' \
.format(self.firstname, self.lastname, self.othername, self.email, self.password, self.phonenumber,
self.passporturl))
return user
@staticmethod
def fetchUsers():
user = Database().execute_query(''' SELECT * FROM users''')
return json.dumps(user, default=str)
def fetchEmail(self):
# NOTE: string-built SQL is vulnerable to SQL injection; prefer a parameterized query
user = Database().execute_query("SELECT * FROM users WHERE email= '" + str(self.email) + "'", True)
return user
def fetchPhonenumber(self):
user = Database().execute_query(''' SELECT * FROM users WHERE phonenumber=''' + str(self.phonenumber))
return user
def fetchPassporturl(self):
user = Database().execute_query(''' SELECT * FROM users WHERE passporturl=''' + str(self.passporturl))
return user
def generate_hash(self):
self.password_hash = generate_password_hash(self.password)
def check_hash(self, password):
return check_password_hash(self.password_hash, password)
def update(self, data):
db = Database()
for key, item in data.items():
if key == 'password':
# generate_hash() returns None, so hash the incoming value directly instead of storing plain text
item = generate_password_hash(item)
setattr(self, key, item)
self.modified_at = datetime.datetime.utcnow()
db.conn.commit()
def delete(self):
db = Database()
db.conn.delete(self)
db.conn.commit()
''' @staticmethod
def get_all_users():
return User.query.all()
@staticmethod
def get_one_user(id):
return User.query.get(id)
def auth_token_encode(self, id):
try:
payload = {
'exp': datetime.datetime.utcnow() + datetime.timedelta(days=1, seconds=0),
'iat': datetime.datetime.utcnow(),
'sub': id
}
return jwt.encode(
payload,
os.getenv('JWT_SECRET_KEY'),
algorithm='HS256'
)
except Exception as e:
return e
@classmethod
def decode_auth_token(cls, auth_token):
try:
payload = jwt.decode(auth_token, os.getenv('JWT_SECRET_KEY'))
return payload['sub']
except jwt.ExpiredSignatureError:
return 'Signature expired! Please log in again.'
except jwt.InvalidTokenError:
return 'Invalid token! Please log in again.'''
| 32.336449 | 115 | 0.57948 | 348 | 3,460 | 5.649425 | 0.287356 | 0.042726 | 0.048321 | 0.061038 | 0.189725 | 0.155137 | 0.155137 | 0.073245 | 0.073245 | 0 | 0 | 0.002096 | 0.310694 | 3,460 | 106 | 116 | 32.641509 | 0.822222 | 0 | 0 | 0.142857 | 1 | 0 | 0.063772 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.178571 | false | 0.25 | 0.071429 | 0.017857 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
10f1f7311325ff8b922534481812f358ca7b3651 | 235 | py | Python | PMIa/2015/ANDROS_D_A/task_5_2.py | YukkaSarasti/pythonintask | eadf4245abb65f4400a3bae30a4256b4658e009c | [
"Apache-2.0"
] | null | null | null | PMIa/2015/ANDROS_D_A/task_5_2.py | YukkaSarasti/pythonintask | eadf4245abb65f4400a3bae30a4256b4658e009c | [
"Apache-2.0"
] | null | null | null | PMIa/2015/ANDROS_D_A/task_5_2.py | YukkaSarasti/pythonintask | eadf4245abb65f4400a3bae30a4256b4658e009c | [
"Apache-2.0"
] | null | null | null | # Задача 5.Вариант 2.
# Напишите программу, которая бы при запуске случайным образом отображала имя одного из трех поросят.
# Andros D.A.
# 14.04.2016
import random
pigs=["Naf-Naf","Nuf-Nuf","Nif-Nif"]
p=random.choice(pigs)
print(p)
| 23.5 | 101 | 0.731915 | 40 | 235 | 4.325 | 0.85 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04878 | 0.12766 | 235 | 9 | 102 | 26.111111 | 0.790244 | 0.604255 | 0 | 0 | 0 | 0 | 0.241379 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.25 | null | null | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
10f52628f14c02cae8ea5f324f92c15fd3474b5c | 179 | py | Python | src/parruc/violareggiocalabria/__init__.py | parruc/parruc.violareggiocalabria | 258d966ccf35dfe76be821f0208fb9077a585ac3 | [
"MIT"
] | null | null | null | src/parruc/violareggiocalabria/__init__.py | parruc/parruc.violareggiocalabria | 258d966ccf35dfe76be821f0208fb9077a585ac3 | [
"MIT"
] | null | null | null | src/parruc/violareggiocalabria/__init__.py | parruc/parruc.violareggiocalabria | 258d966ccf35dfe76be821f0208fb9077a585ac3 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""Init and utils."""
from zope.i18nmessageid import MessageFactory
import monkey
_ = MessageFactory('parruc.violareggiocalabria')
monkey # pyflakes
| 14.916667 | 48 | 0.72067 | 18 | 179 | 7.111111 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019608 | 0.145251 | 179 | 11 | 49 | 16.272727 | 0.816993 | 0.26257 | 0 | 0 | 0 | 0 | 0.208 | 0.208 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
10fb1aa64022cab2ceb1cb52b11e3dc92a428302 | 778 | py | Python | oops_fhir/r4/value_set/v3_substitution_condition.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | oops_fhir/r4/value_set/v3_substitution_condition.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | oops_fhir/r4/value_set/v3_substitution_condition.py | Mikuana/oops_fhir | 77963315d123756b7d21ae881f433778096a1d25 | [
"MIT"
] | null | null | null | from pathlib import Path
from fhir.resources.valueset import ValueSet as _ValueSet
from oops_fhir.utils import ValueSet
from oops_fhir.r4.code_system.v3_substitution_condition import (
v3SubstitutionCondition as v3SubstitutionCondition_,
)
__all__ = ["v3SubstitutionCondition"]
_resource = _ValueSet.parse_file(Path(__file__).with_suffix(".json"))
class v3SubstitutionCondition(v3SubstitutionCondition_):
"""
v3 Code System SubstitutionCondition
Identifies what sort of change is permitted or has occurred between the
item that was ordered/requested and the one that was/will be provided.
Status: active - Version: 2018-08-12
http://terminology.hl7.org/ValueSet/v3-SubstitutionCondition
"""
class Meta:
resource = _resource
| 24.3125 | 76 | 0.772494 | 91 | 778 | 6.373626 | 0.648352 | 0.048276 | 0.055172 | 0.068966 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.027523 | 0.159383 | 778 | 31 | 77 | 25.096774 | 0.859327 | 0.376607 | 0 | 0 | 0 | 0 | 0.059701 | 0.049041 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.363636 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
10fe086d26722c41009fa9de21c9182b30573a05 | 146 | py | Python | BookCab/apps.py | AtmegaBuzz/osmd | 68f9e3a0979139bd2a4ffb811b9d3062f9263f0e | [
"MIT"
] | 1 | 2022-02-09T06:17:33.000Z | 2022-02-09T06:17:33.000Z | BookCab/apps.py | AtmegaBuzz/osmd | 68f9e3a0979139bd2a4ffb811b9d3062f9263f0e | [
"MIT"
] | null | null | null | BookCab/apps.py | AtmegaBuzz/osmd | 68f9e3a0979139bd2a4ffb811b9d3062f9263f0e | [
"MIT"
] | null | null | null | from django.apps import AppConfig
class BookcabConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'BookCab'
| 20.857143 | 56 | 0.760274 | 17 | 146 | 6.411765 | 0.882353 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.150685 | 146 | 6 | 57 | 24.333333 | 0.879032 | 0 | 0 | 0 | 0 | 0 | 0.246575 | 0.19863 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
10ff688161d965820c56a570d8782ce1f5cedee2 | 2,221 | py | Python | irrigator_pro/uga/models.py | warnes/irrigatorpro | 4838f8832bdbf87f394a0298adc5dabfc26e82e8 | [
"MIT"
] | null | null | null | irrigator_pro/uga/models.py | warnes/irrigatorpro | 4838f8832bdbf87f394a0298adc5dabfc26e82e8 | [
"MIT"
] | null | null | null | irrigator_pro/uga/models.py | warnes/irrigatorpro | 4838f8832bdbf87f394a0298adc5dabfc26e82e8 | [
"MIT"
] | null | null | null | from django.db import models
from farms.models import *
class UGAProbeData(models.Model):
id = models.AutoField (db_column="data_id", primary_key=True)
datetime = models.DateTimeField(db_column="dt" )
field_code = models.IntegerField (db_column="fieldid" )
node_code = models.IntegerField (db_column="nodeid" )
radio_id = models.CharField (db_column="netaddr", max_length=10 )
battery_voltage = models.FloatField (db_column="batt" )
battery_percent = models.FloatField (db_column="battlife" )
soil_potential_8 = models.FloatField (db_column="sm1" )
soil_potential_16 = models.FloatField (db_column="sm2" )
soil_potential_24 = models.FloatField (db_column="sm3" )
circuit_board_temp = models.FloatField (db_column="boardtemp" )
thermocouple_1_temp = models.FloatField (db_column="temp1" )
thermocouple_2_temp = models.FloatField (db_column="temp2" )
minutes_awake = models.IntegerField (db_column="awake" )
__database__ = "ugatifton"
class Meta:
managed = False
db_table = 'fields"."data' # hack to access 'data' table within schema 'fields'
def __unicode__(self):
return u"RadioID '%s' at '%s': (%f, %f, %f) (%d, %d)" % (self.radio_id,
self.datetime,
self.soil_potential_8,
self.soil_potential_16,
self.soil_potential_24,
self.thermocouple_1_temp,
self.thermocouple_2_temp)
def get_all_radio_ids():
"""
Return a list of all available probe radio_ids
"""
recs = UGAProbeData.objects.values('radio_id').distinct()
radio_ids = [ r['radio_id'] for r in recs]
radio_ids.sort()
return radio_ids
| 52.880952 | 89 | 0.517335 | 215 | 2,221 | 5.046512 | 0.437209 | 0.103226 | 0.132719 | 0.176959 | 0.132719 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015637 | 0.395317 | 2,221 | 41 | 90 | 54.170732 | 0.792256 | 0.044124 | 0 | 0 | 0 | 0.029412 | 0.073564 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.058824 | 0.029412 | 0.676471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
80062e39fbf3efe6efc5c12dd7d9a029860020f0 | 163 | py | Python | representations/sha256.py | Alevsk/stringTransformer | 5dcf4b2f208de25ba58d10bd56f3360ca00c543e | [
"MIT"
] | null | null | null | representations/sha256.py | Alevsk/stringTransformer | 5dcf4b2f208de25ba58d10bd56f3360ca00c543e | [
"MIT"
] | null | null | null | representations/sha256.py | Alevsk/stringTransformer | 5dcf4b2f208de25ba58d10bd56f3360ca00c543e | [
"MIT"
] | null | null | null | #!/usr/bin/env python -B
import hashlib
class sha256:
def transform(self,input,params={}):
# hashlib.sha256 requires bytes in Python 3; encode str input first
hash_object = hashlib.sha256(input.encode('utf-8') if isinstance(input, str) else input)
return hash_object.hexdigest() | 23.285714 | 37 | 0.748466 | 23 | 163 | 5.217391 | 0.782609 | 0.166667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.041667 | 0.116564 | 163 | 7 | 38 | 23.285714 | 0.791667 | 0.141104 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.8 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
800f03cfb9c18841905435493fa69fee36ac8cc7 | 450 | py | Python | src/days/day.py | cajones314/avocd2019 | 268e03c5d1bb5b3e14459b831916bb7846f40def | [
"MIT"
] | null | null | null | src/days/day.py | cajones314/avocd2019 | 268e03c5d1bb5b3e14459b831916bb7846f40def | [
"MIT"
] | null | null | null | src/days/day.py | cajones314/avocd2019 | 268e03c5d1bb5b3e14459b831916bb7846f40def | [
"MIT"
] | null | null | null | from io import IOBase
"""Base class for the individual day puzzles"""
class Day:
def __init__(self, input_stream: IOBase):
self._input_stream = input_stream
def _puzzle1(self):
raise NotImplementedError
def _puzzle2(self):
raise NotImplementedError
def run(self, puzzle: int):
if puzzle == 1:
return self._puzzle1()
elif puzzle == 2:
return self._puzzle2()
else:
raise NotImplementedError
| 20.454545 | 47 | 0.675556 | 54 | 450 | 5.407407 | 0.537037 | 0.113014 | 0.10274 | 0.212329 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017544 | 0.24 | 450 | 21 | 48 | 21.428571 | 0.836257 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.266667 | false | 0 | 0.066667 | 0 | 0.533333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
801364966c4fecc5e79e8b9f7270e310b7be7b35 | 17,886 | py | Python | ci-scripts/generate_spgwc_config_script.py | Musrela/Openair-epc | 61d8954c670a882decac7c2f11443a191ec6f7a3 | [
"Apache-2.0"
] | 2 | 2021-01-07T20:33:45.000Z | 2021-01-14T19:14:04.000Z | ci-scripts/generate_spgwc_config_script.py | Musrela/Openair-epc | 61d8954c670a882decac7c2f11443a191ec6f7a3 | [
"Apache-2.0"
] | null | null | null | ci-scripts/generate_spgwc_config_script.py | Musrela/Openair-epc | 61d8954c670a882decac7c2f11443a191ec6f7a3 | [
"Apache-2.0"
] | 1 | 2021-11-16T10:16:42.000Z | 2021-11-16T10:16:42.000Z | #/*
# * Licensed to the OpenAirInterface (OAI) Software Alliance under one or more
# * contributor license agreements. See the NOTICE file distributed with
# * this work for additional information regarding copyright ownership.
# * The OpenAirInterface Software Alliance licenses this file to You under
# * the OAI Public License, Version 1.1 (the "License"); you may not use this file
# * except in compliance with the License.
# * You may obtain a copy of the License at
# *
# * http://www.openairinterface.org/?page_id=698
# *
# * Unless required by applicable law or agreed to in writing, software
# * distributed under the License is distributed on an "AS IS" BASIS,
# * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# * See the License for the specific language governing permissions and
# * limitations under the License.
# *-------------------------------------------------------------------------------
# * For more information about the OpenAirInterface (OAI) Software Alliance:
# * contact@openairinterface.org
# */
#---------------------------------------------------------------------
import os
import re
import sys
import ipaddress
class spgwcConfigGen():
def __init__(self):
self.kind = ''
self.s11c_name = ''
self.sxc_name = ''
self.prefix = ''
self.fromDockerFile = False
self.apn_list = ''
self.pdn_list = ''
self.dns1 = ''
self.dns2 = ''
self.s5p5 = 'auto'
def GenerateSpgwcConfigurer(self):
apns = self.apn_list.split();
pdns = self.pdn_list.split();
conf_file = open('./spgw_c.conf', 'w')
conf_file.write('# generated by generate_spgwc_config_script.py\n')
conf_file.write('S-GW =\n')
conf_file.write('{\n')
conf_file.write(' INSTANCE = 0; # 0 is the default\n')
conf_file.write(' PID_DIRECTORY = "/var/run"; # /var/run is the default\n')
conf_file.write(' #ITTI_TASKS :\n')
conf_file.write(' #{\n')
conf_file.write(' #ITTI_TIMER_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 85;\n')
conf_file.write(' #};\n')
conf_file.write(' #S11_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #S5S8_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #SX_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #SGW_APP_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #ASYNC_CMD_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #};\n')
conf_file.write(' INTERFACES :\n')
conf_file.write(' {\n')
conf_file.write(' S11_CP :\n')
conf_file.write(' {\n')
conf_file.write(' # S-GW binded interface for S11 communication (GTPV2-C), if none selected the ITTI message interface is used\n')
conf_file.write(' INTERFACE_NAME = "'+self.s11c_name+'"; # STRING, interface name, YOUR NETWORK CONFIG HERE\n')
conf_file.write(' IPV4_ADDRESS = "read"; # STRING, CIDR or "read" to let app read interface configured IP address, YOUR NETWORK CONFIG HERE\n')
conf_file.write(' #PORT = 2123; # INTEGER, port number, PREFER NOT CHANGE UNLESS YOU KNOW WHAT YOU ARE DOING\n')
conf_file.write(' #SCHED_PARAMS : # SCHEADULING PARAMS OF THE LOOPING RECEIVER THREAD BOUND TO THIS INTERFACE/PROTOCOL\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 95;\n')
conf_file.write(' #};\n')
conf_file.write(' };\n')
conf_file.write(' S5_S8_CP :\n')
conf_file.write(' {\n')
conf_file.write(' # S-GW binded interface for S5 or S8 communication\n')
if self.s5p5 == 'auto':
conf_file.write(' INTERFACE_NAME = "lo"; # STRING, interface name\n')
conf_file.write(' IPV4_ADDRESS = "127.0.8.1/8"; # STRING, CIDR or "read" to let app read interface configured IP address\n')
else:
conf_file.write(' INTERFACE_NAME = "lo:s5c"; # STRING, interface name\n')
conf_file.write(' IPV4_ADDRESS = "read"; # STRING, CIDR or "read" to let app read interface configured IP address\n')
conf_file.write(' #PORT = 2123; # INTEGER, port number, PREFER NOT CHANGE UNLESS YOU KNOW WHAT YOU ARE DOING\n')
conf_file.write(' #SCHED_PARAMS : # SCHEADULING PARAMS OF THE LOOPING RECEIVER THREAD BOUND TO THIS INTERFACE/PROTOCOL\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 95;\n')
conf_file.write(' #};\n')
conf_file.write(' };\n')
conf_file.write(' };\n')
conf_file.write('};\n')
conf_file.write('P-GW =\n')
conf_file.write('{\n')
conf_file.write(' INSTANCE = 0; # 0 is the default\n')
conf_file.write(' PID_DIRECTORY = "/var/run"; # /var/run is the default\n')
conf_file.write(' #ITTI_TASKS :\n')
conf_file.write(' #{\n')
conf_file.write(' #ITTI_TIMER_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 85;\n')
conf_file.write(' #};\n')
conf_file.write(' #S11_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #S5S8_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #SX_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #PGW_APP_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #ASYNC_CMD_SCHED_PARAMS :\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 84;\n')
conf_file.write(' #};\n')
conf_file.write(' #};\n')
conf_file.write(' INTERFACES :\n')
conf_file.write(' {\n')
conf_file.write(' S5_S8_CP :\n')
conf_file.write(' {\n')
conf_file.write(' # P-GW binded interface for S5 or S8 communication\n')
if self.s5p5 == 'auto':
conf_file.write(' INTERFACE_NAME = "lo"; # STRING, interface name\n')
conf_file.write(' IPV4_ADDRESS = "127.0.8.2/8"; # STRING, CIDR or "read" to let app read interface configured IP address\n')
else:
conf_file.write(' INTERFACE_NAME = "lo:p5c"; # STRING, interface name\n')
conf_file.write(' IPV4_ADDRESS = "read"; # STRING, CIDR or "read" to let app read interface configured IP address\n')
conf_file.write(' #PORT = 2123;\n')
conf_file.write(' #SCHED_PARAMS : # SCHEADULING PARAMS OF THE LOOPING RECEIVER THREAD BOUND TO THIS INTERFACE/PROTOCOL\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 95;\n')
conf_file.write(' #};\n')
conf_file.write(' };\n')
conf_file.write(' SX :\n')
conf_file.write(' {\n')
conf_file.write(' # P-GW binded interface for SX communication\n')
conf_file.write(' INTERFACE_NAME = "'+self.sxc_name+'"; # STRING, interface name\n')
conf_file.write(' IPV4_ADDRESS = "read"; # STRING, CIDR or "read" to let app read interface configured IP address\n')
conf_file.write(' #PORT = 8805;\n')
conf_file.write(' #SCHED_PARAMS : # SCHEADULING PARAMS OF THE LOOPING RECEIVER THREAD BOUND TO THIS INTERFACE/PROTOCOL\n')
conf_file.write(' #{\n')
conf_file.write(' #CPU_ID = 1;\n')
conf_file.write(' #SCHED_POLICY = "SCHED_FIFO"; # Values in { SCHED_OTHER, SCHED_IDLE, SCHED_BATCH, SCHED_FIFO, SCHED_RR }\n')
conf_file.write(' #SCHED_PRIORITY = 95;\n')
conf_file.write(' #};\n')
conf_file.write(' };\n')
conf_file.write(' };\n')
conf_file.write(' # Pool of UE assigned IP addresses\n')
conf_file.write(' # Do not make IP pools overlap\n')
conf_file.write(' # first IPv4 address X.Y.Z.1 is reserved for GTP network device on SPGW\n')
conf_file.write(' # Normally no more than 96 pools allowed, but for non OVS GTP solution, only one pool allowed (TODO).\n')
conf_file.write(' IP_ADDRESS_POOL :\n')
conf_file.write(' {\n')
conf_file.write(' IPV4_LIST = (\n')
for pdn in pdns[ 0:len(pdns)-1 ]:
hosts = list(ipaddress.ip_network(pdn).hosts())
conf_file.write(' {RANGE = "'+str(hosts[1])+' - '+str(hosts[-2])+'";}, # STRING, IPv4 RANGE IP_start - IP_end, YOUR NETWORK CONFIG HERE.\n')
pdn = pdns[len(pdns) - 1]
hosts = list(ipaddress.ip_network(pdn).hosts())
conf_file.write(' {RANGE = "'+str(hosts[1])+' - '+str(hosts[-2])+'";} # STRING, IPv4 RANGE IP_start - IP_end, YOUR NETWORK CONFIG HERE.\n')
conf_file.write(' );\n')
conf_file.write(' IPV6_LIST = (\n')
conf_file.write(' {PREFIX = "2001:1:2::/64";}, # STRING, IPv6 prefix, YOUR NETWORK CONFIG HERE.\n')
conf_file.write(' {PREFIX = "3001:1:2::/64";}, # STRING, IPv6 prefix, YOUR NETWORK CONFIG HERE.\n')
conf_file.write(' {PREFIX = "4001:1:2::/64";} # STRING, IPv6 prefix, YOUR NETWORK CONFIG HERE.\n')
conf_file.write(' );\n')
conf_file.write(' };\n')
conf_file.write(' APN_LIST = (\n')
conf_file.write(' # IPV4_POOL, IPV6_POOL are index in IPV4_LIST, IPV6_LIST, PDN_TYPE choice in {IPv4, IPv6, IPv4v6}\n')
i = 0
for apn in apns[ 0:len(apns)-1 ]:
conf_file.write(' {APN_NI = "'+apn+'"; PDN_TYPE = "IPv4"; IPV4_POOL = '+str(i)+'; IPV6_POOL = -1},\n')
i += 1
apn = apns[len(apns) - 1]
conf_file.write(' {APN_NI = "'+apn+'"; PDN_TYPE = "IPv4"; IPV4_POOL = '+str(i)+'; IPV6_POOL = -1}\n')
conf_file.write(' );\n')
conf_file.write(' # DNS address communicated to UEs\n')
conf_file.write(' DEFAULT_DNS_IPV4_ADDRESS = "'+self.dns1+'"; # YOUR NETWORK CONFIG HERE\n')
conf_file.write(' DEFAULT_DNS_SEC_IPV4_ADDRESS = "'+self.dns2+'"; # YOUR NETWORK CONFIG HERE\n')
conf_file.write(' DEFAULT_DNS_IPV6_ADDRESS = "2001:4860:4860::8888"; # FFU\n')
conf_file.write(' DEFAULT_DNS_SEC_IPV6_ADDRESS = "2001:4860:4860::8844"; # FFU\n')
conf_file.write(' # Non standard feature, normally should be set to "no", but you may need to set to yes for UE that do not explicitly request a PDN address through NAS signalling\n')
conf_file.write(' FORCE_PUSH_PROTOCOL_CONFIGURATION_OPTIONS = "no"; # STRING, {"yes", "no"}.\n')
conf_file.write(' PCEF :\n')
conf_file.write(' {\n')
conf_file.write(' # Waiting for HSS APN-AMBR IE ...\n')
conf_file.write(' APN_AMBR_UL = 500000; # Maximum UL bandwidth that can be used by non guaranteed bit rate traffic in Kbits/seconds.\n')
conf_file.write(' APN_AMBR_DL = 500000; # Maximum DL bandwidth that can be used by non guaranteed bit rate traffic in Kbits/seconds.\n')
conf_file.write(' };\n')
conf_file.write('};\n')
conf_file.close()
if self.s5p5 == 'prod':
spgwcFile = open('./spgwc-cfg.sh', 'w')
spgwcFile.write('#!/bin/bash\n')
spgwcFile.write('\n')
spgwcFile.write('ifconfig lo:s5c 127.0.0.15 up\n')
spgwcFile.write('echo "ifconfig lo:s5c 127.0.0.15 up --> OK"\n')
spgwcFile.write('ifconfig lo:p5c 127.0.0.16 up\n')
spgwcFile.write('echo "ifconfig lo:p5c 127.0.0.16 up --> OK"\n')
spgwcFile.write('exit 0\n')
spgwcFile.close()
#-----------------------------------------------------------
# Usage()
#-----------------------------------------------------------
def Usage():
print('----------------------------------------------------------------------------------------------------------------------')
print('generate_spgwc_config_script.py')
print(' Prepare a bash script to be run in the workspace where SPGW-C is being built.')
print(' That bash script will copy configuration template files and adapt to your configuration.')
print('----------------------------------------------------------------------------------------------------------------------')
print('Usage: python3 generate_spgwc_config_script.py [options]')
print(' --help Show this help.')
print('------------------------------------------------------------------------------------------------- SPGW-C Options -----')
print(' --kind=SPGW-C')
print(' --s11c=[SPGW-C S11 Interface Name]')
print(' --sxc=[SPGW-C SX Interface Name]')
print(' --apn_list=["APNs"]')
print(' --pdn_list=["PDNs"]')
print(' --prefix=["Prefix for configuration files"]')
print(' --from_docker_file')
argvs = sys.argv
argc = len(argvs)
cwd = os.getcwd()
mySpgwcCfg = spgwcConfigGen()
while len(argvs) > 1:
myArgv = argvs.pop(1)
if re.match('^\-\-help$', myArgv, re.IGNORECASE):
Usage()
sys.exit(0)
elif re.match('^\-\-kind=(.+)$', myArgv, re.IGNORECASE):
matchReg = re.match('^\-\-kind=(.+)$', myArgv, re.IGNORECASE)
mySpgwcCfg.kind = matchReg.group(1)
    elif re.match('^\-\-s11c=(.+)$', myArgv, re.IGNORECASE):
        matchReg = re.match('^\-\-s11c=(.+)$', myArgv, re.IGNORECASE)
        mySpgwcCfg.s11c_name = matchReg.group(1)
    elif re.match('^\-\-sxc=(.+)$', myArgv, re.IGNORECASE):
        matchReg = re.match('^\-\-sxc=(.+)$', myArgv, re.IGNORECASE)
        mySpgwcCfg.sxc_name = matchReg.group(1)
    elif re.match('^\-\-dns1=(.+)$', myArgv, re.IGNORECASE):
        matchReg = re.match('^\-\-dns1=(.+)$', myArgv, re.IGNORECASE)
        mySpgwcCfg.dns1 = matchReg.group(1)
    elif re.match('^\-\-dns2=(.+)$', myArgv, re.IGNORECASE):
        matchReg = re.match('^\-\-dns2=(.+)$', myArgv, re.IGNORECASE)
        mySpgwcCfg.dns2 = matchReg.group(1)
    elif re.match('^\-\-apn_list=(.+)$', myArgv, re.IGNORECASE):
        matchReg = re.match('^\-\-apn_list=(.+)$', myArgv, re.IGNORECASE)
        mySpgwcCfg.apn_list = str(matchReg.group(1))
    elif re.match('^\-\-pdn_list=(.+)$', myArgv, re.IGNORECASE):
        matchReg = re.match('^\-\-pdn_list=(.+)$', myArgv, re.IGNORECASE)
        mySpgwcCfg.pdn_list = str(matchReg.group(1))
    elif re.match('^\-\-prefix=(.+)$', myArgv, re.IGNORECASE):
        matchReg = re.match('^\-\-prefix=(.+)$', myArgv, re.IGNORECASE)
        mySpgwcCfg.prefix = matchReg.group(1)
    elif re.match('^\-\-from_docker_file', myArgv, re.IGNORECASE):
        mySpgwcCfg.fromDockerFile = True
    elif re.match('^\-\-s5p5_production', myArgv, re.IGNORECASE):
        mySpgwcCfg.s5p5 = 'prod'
    else:
        Usage()
        sys.exit('Invalid Parameter: ' + myArgv)

if mySpgwcCfg.kind == '':
    Usage()
    sys.exit('missing kind parameter')

if mySpgwcCfg.kind == 'SPGW-C':
    if mySpgwcCfg.s11c_name == '':
        Usage()
        sys.exit('missing S11 Interface Name on SPGW-C container')
    elif mySpgwcCfg.sxc_name == '':
        Usage()
        sys.exit('missing SX Interface Name on SPGW-C container')
    elif mySpgwcCfg.apn_list == '':
        Usage()
        sys.exit('missing apn_list')
    elif mySpgwcCfg.pdn_list == '':
        Usage()
        sys.exit('missing pdn_list')
    elif mySpgwcCfg.dns1 == '':
        Usage()
        sys.exit('missing primary DNS')
    elif mySpgwcCfg.dns2 == '':
        Usage()
        sys.exit('missing secondary DNS')
    elif mySpgwcCfg.prefix == '':
        Usage()
        sys.exit('missing prefix')
    else:
        mySpgwcCfg.GenerateSpgwcConfigurer()
        sys.exit(0)
else:
    Usage()
    sys.exit('invalid kind parameter')
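The option loop above calls `re.match` twice per branch (once to test, once to capture). A compact, hedged sketch of the same `--key=value` parsing driven by a table of option names — this is an illustrative stand-in, not part of the original tool, and `parse_option`/`OPTIONS` are names invented for the example:

```python
import re

# Map long-option names to the config attribute they populate (illustrative subset).
OPTIONS = {'s11c': 's11c_name', 'sxc': 'sxc_name', 'dns1': 'dns1', 'dns2': 'dns2'}

def parse_option(arg, cfg):
    """Parse one '--key=value' argument into the cfg dict; return True if it matched."""
    for key, attr in OPTIONS.items():
        m = re.match(r'^--%s=(.+)$' % key, arg, re.IGNORECASE)
        if m:
            # Match once and reuse the captured group, instead of matching twice.
            cfg[attr] = m.group(1)
            return True
    return False

cfg = {}
assert parse_option('--dns1=8.8.8.8', cfg)
assert cfg['dns1'] == '8.8.8.8'
assert not parse_option('--bogus', cfg)
```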
| 50.383099 | 188 | 0.626188 | 2,516 | 17,886 | 4.262321 | 0.133943 | 0.1395 | 0.224263 | 0.229765 | 0.72044 | 0.675401 | 0.626632 | 0.596326 | 0.562477 | 0.56201 | 0 | 0.021288 | 0.172705 | 17,886 | 354 | 189 | 50.525424 | 0.703453 | 0.068266 | 0 | 0.514286 | 0 | 0.107937 | 0.552311 | 0.046632 | 0 | 0 | 0 | 0 | 0 | 1 | 0.009524 | false | 0 | 0.012698 | 0 | 0.025397 | 0.047619 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
80143aaf7cbddc41feaad03b36e7ea708858e468 | 6,536 | py | Python | fmriprepgr/_html_snippets.py | JessyD/fmriprep-group-report | 5468fa8cee8421d8139c02e5be5faa5ef7fe7ed1 | [
"CC0-1.0"
] | 2 | 2022-02-17T21:18:06.000Z | 2022-02-19T15:07:58.000Z | fmriprepgr/_html_snippets.py | JessyD/fmriprep-group-report | 5468fa8cee8421d8139c02e5be5faa5ef7fe7ed1 | [
"CC0-1.0"
] | 4 | 2022-02-17T15:51:06.000Z | 2022-02-23T14:44:01.000Z | fmriprepgr/_html_snippets.py | JessyD/fmriprep-group-report | 5468fa8cee8421d8139c02e5be5faa5ef7fe7ed1 | [
"CC0-1.0"
] | 2 | 2021-12-16T17:43:53.000Z | 2022-02-18T19:25:55.000Z | html_head = r"""<?xml version="1.0" encoding="utf-8" ?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
<meta name="generator" content="Docutils 0.12: http://docutils.sourceforge.net/" />
<title></title>
<script src="https://code.jquery.com/jquery-3.3.1.slim.min.js" integrity="sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo" crossorigin="anonymous"></script>
<script src="https://stackpath.bootstrapcdn.com/bootstrap/4.1.3/js/bootstrap.min.js" integrity="sha384-ChfqqxuZUCnJSK3+MXmPNIyE6ZbWh2IMqE241rYiqJxyMiZ6OW/JmZQ5stwEULTy" crossorigin="anonymous"></script>
<script type="text/javascript">
var subjs = []
function updateCounts(){
    var counts = {report:{"-1":0, "1":0, "0":0}}
    subjs.forEach(function(val, idx, arr){
        counts.report[val.report] += 1;
    })
    $("#nrpass").text(counts.report["1"])
    $("#nrfail").text(counts.report["0"])
    $("#nrtodo").text(counts.report["-1"])
}
function qc_update(run_id, stage, value) {
    if (stage == 'report') {
        subjs[run_id][stage] = parseInt(value)
        updateCounts();
    }
    else {
        subjs[run_id][stage] = value
    }
}
function update_all(stage, value) {
    subjs.forEach( subj => {subj[stage]=value})
}
function get_csv(items) {
    // https://stackoverflow.com/questions/44396943/generate-a-csv-file-from-a-javascript-array-of-objects
    let csv = ''
    // Loop the array of objects
    for(let row = 0; row < items.length; row++){
        let keysAmount = Object.keys(items[row]).length
        let keysCounter = 0
        // If this is the first row, generate the headings
        if(row === 0){
            // Loop each property of the object
            for(let key in items[row]){
                // This is to not add a comma at the last cell
                // The '\r\n' adds a new line
                csv += key + (keysCounter+1 < keysAmount ? '\t' : '\r\n' )
                keysCounter++
            }
            let keysCounterb = 0
            for(let key in items[row]){
                csv += items[row][key] + (keysCounterb+1 < keysAmount ? '\t' : '\r\n' )
                keysCounterb++
            }
        }else{
            for(let key in items[row]){
                csv += items[row][key] + (keysCounter+1 < keysAmount ? '\t' : '\r\n' )
                keysCounter++
            }
        }
        keysCounter = 0
    }
    // Once we are done looping, download the .csv by creating a link
    // if a link has already been created, update it
    if (document.querySelector('#download-csv') == null){
        let link = document.createElement('a')
        link.id = 'download-csv'
        link.setAttribute('href', 'data:text/plain;charset=utf-8,' + encodeURIComponent(csv));
        link.setAttribute('download', 'thistextissolongitmustabsolutelybeuniqueright');
        document.body.appendChild(link)
    } else {
        let link = document.querySelector('#download-csv')
        link.setAttribute('href', 'data:text/plain;charset=utf-8,' + encodeURIComponent(csv));
        link.setAttribute('download', 'thistextissolongitmustabsolutelybeuniqueright');
    }
    document.querySelector('#download-csv').click()
}
function parse_id(idstr) {
    return idstr.split('_')[0].split('-')[1]
}
var observer = new IntersectionObserver(function(entries, observer) {
    entries.forEach(entry => {
        eid = parse_id(entry.target.id)
        if (entry['intersectionRatio'] == 1 && subjs[eid]['been_on_screen'] == false) {
            subjs[eid]['been_on_screen'] = true
        }
        else if (entry['intersectionRatio'] == 0 && subjs[eid]['been_on_screen'] == true && subjs[eid]['report'] == -1) {
            subjs[eid]['report'] = 1
            observer.unobserve(entry.target)
            updateCounts();
            radioid = 'inlineRadio' + eid
            document.querySelectorAll('[name=' + radioid + ']')[0].checked = true
        }
        /* Here's where we deal with every intersection */
    });
}
, {root:document.querySelector('#scrollArea'), threshold:[0,1]});
</script>
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.1.3/css/bootstrap.min.css" integrity="sha384-MCw98/SFnGE8fJT3GXwEOngsV7Zt27NXFoaoApmYm81iuXoPkFOJwJ8ERdknLPMO" crossorigin="anonymous">
<style type="text/css">
.sub-report-title {}
.run-title {}
h1 { padding-top: 35px; }
h2 { padding-top: 20px; }
h3 { padding-top: 15px; }
.elem-desc {}
.elem-caption {
    margin-top: 15px;
    margin-bottom: 0;
}
.elem-filename {}
div.elem-image {
    width: 100%;
    page-break-before: always;
}
.elem-image object.svg-reportlet {
    width: 100%;
    padding-bottom: 5px;
}
body {
    padding: 65px 10px 10px;
}
.boiler-html {
    font-family: "Bitstream Charter", "Georgia", Times;
    margin: 20px 25px;
    padding: 10px;
    background-color: #F8F9FA;
}
div#boilerplate pre {
    margin: 20px 25px;
    padding: 10px;
    background-color: #F8F9FA;
}
</style>
</head>
<body>"""
html_foot = """<script type="text/javascript">
function toggle(id) {
    var element = document.getElementById(id);
    if (element.style.display == 'block')
        element.style.display = 'none';
    else
        element.style.display = 'block';
}
</script>
<script>
updateCounts();
document.querySelectorAll('[id^="id"]').forEach(img => {observer.observe(img)})
</script>
</body>
</html>"""
reviewer_initials = """
<p> Initials: <input type="text" id="initials_box" oninput="update_all('rater', this.value)"></p>
"""
nav= """<nav class="navbar fixed-top navbar-expand-lg navbar-light bg-light">
<div class="navbar-header">
Ratings: <span id="nrpass" class="badge badge-success">0</span> <span id="nrfail" class="badge badge-danger">0</span> <span id="nrtodo" class="badge badge-warning">0</span>
</div>
<div class="navbar-text">
<button type="button" class="btn btn-info btn-sm" id="csv_download" onclick="get_csv(subjs)">Download TSV</button>
</div>
</nav>"""
def _generate_html_head(dl_file_name):
    """
    Generate an HTML head block where the name of the downloaded file is set appropriately.

    Parameters
    ----------
    dl_file_name : str

    Returns
    -------
    str
    """
    return html_head.replace('thistextissolongitmustabsolutelybeuniqueright', dl_file_name)
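The head template embeds a deliberately unique sentinel string as the download filename and swaps it for the real name at render time. A minimal self-contained sketch of that pattern (the template, sentinel constant, and `fill_download_name` here are illustrative, not the module's API):

```python
SENTINEL = 'thistextissolongitmustabsolutelybeuniqueright'
TEMPLATE = "<a download='" + SENTINEL + "'>save</a>"

def fill_download_name(template, dl_file_name):
    # str.replace swaps every occurrence of the sentinel token for the real name.
    return template.replace(SENTINEL, dl_file_name)

html = fill_download_name(TEMPLATE, 'group_report.tsv')
assert 'group_report.tsv' in html
assert SENTINEL not in html
```

Using a long, improbable token avoids accidentally rewriting legitimate text elsewhere in the template.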
| 31.882927 | 211 | 0.621787 | 776 | 6,536 | 5.195876 | 0.365979 | 0.010417 | 0.018849 | 0.008185 | 0.179067 | 0.170635 | 0.154018 | 0.154018 | 0.092262 | 0.092262 | 0 | 0.02866 | 0.209914 | 6,536 | 204 | 212 | 32.039216 | 0.75213 | 0.022797 | 1 | 0.170732 | 0 | 0.103659 | 0.972585 | 0.293525 | 0 | 0 | 0 | 0 | 0 | 1 | 0.006098 | false | 0.012195 | 0 | 0 | 0.018293 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8014b101a195eed6168c6b8be988c1c906657f4a | 264 | py | Python | pyhiveapi/apyhiveapi/helper/map.py | ms32035/Pyhiveapi | c84389aa8118acd006a4b228e58b6a966e49e7dc | [
"MIT"
] | 10 | 2020-10-11T20:50:36.000Z | 2021-05-01T16:11:19.000Z | pyhiveapi/apyhiveapi/helper/map.py | ms32035/Pyhiveapi | c84389aa8118acd006a4b228e58b6a966e49e7dc | [
"MIT"
] | 11 | 2020-10-27T19:34:12.000Z | 2021-03-11T22:30:13.000Z | pyhiveapi/apyhiveapi/helper/map.py | ms32035/Pyhiveapi | c84389aa8118acd006a4b228e58b6a966e49e7dc | [
"MIT"
] | 8 | 2020-10-05T18:55:41.000Z | 2021-03-04T23:45:05.000Z | """Dot notation for dictionary."""
class Map(dict):
    """dot.notation access to dictionary attributes.

    Args:
        dict (dict): dictionary to map.
    """

    __getattr__ = dict.get
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__
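A short usage sketch of the class above (the example values are illustrative). One caveat worth noting: because `__getattr__` is bound to `dict.get`, reading a missing attribute returns `None` instead of raising `AttributeError`:

```python
class Map(dict):
    """dot.notation access to dictionary attributes (redefined here so the snippet is self-contained)."""
    __getattr__ = dict.get
    __setattr__ = dict.__setitem__
    __delattr__ = dict.__delitem__

m = Map({'host': 'localhost', 'port': 5000})
assert m.host == 'localhost'   # attribute read -> dict.get
m.port = 8080                  # attribute write -> dict.__setitem__
assert m['port'] == 8080
del m.host                     # attribute delete -> dict.__delitem__
assert m.host is None          # dict.get on a missing key returns None
```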
| 18.857143 | 52 | 0.651515 | 28 | 264 | 5.428571 | 0.607143 | 0.144737 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.238636 | 264 | 13 | 53 | 20.307692 | 0.756219 | 0.443182 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
801c0227663c2e2af4cd8891a4d3e4777f6df5cd | 219 | py | Python | app.py | fdoliveira/infosystem-seed | d6118cb431c9a8350f69b0f317d6e747e0fc3175 | [
"MIT"
] | 1 | 2018-10-03T02:12:26.000Z | 2018-10-03T02:12:26.000Z | app.py | fdoliveira/infosystem-seed | d6118cb431c9a8350f69b0f317d6e747e0fc3175 | [
"MIT"
] | null | null | null | app.py | fdoliveira/infosystem-seed | d6118cb431c9a8350f69b0f317d6e747e0fc3175 | [
"MIT"
] | 1 | 2018-09-02T18:52:59.000Z | 2018-09-02T18:52:59.000Z | import os
import app
if __name__ == '__main__':
    app_host = os.environ.get('HOST', '0.0.0.0')
    app_port = int(os.environ.get('PORT', 5000))
    system = app.System()
    system.run(host=app_host, port=app_port)
| 21.9 | 48 | 0.648402 | 36 | 219 | 3.611111 | 0.416667 | 0.046154 | 0.184615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044444 | 0.178082 | 219 | 9 | 49 | 24.333333 | 0.677778 | 0 | 0 | 0 | 0 | 0 | 0.105023 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
80264b76ab3a1749df01417e4a5ec36980427147 | 688 | py | Python | sc/pyics/__init__.py | AlexWorldD/ForestFire | 855ce20fc9a51d55457773437e7135b3cce89efe | [
"MIT"
] | null | null | null | sc/pyics/__init__.py | AlexWorldD/ForestFire | 855ce20fc9a51d55457773437e7135b3cce89efe | [
"MIT"
] | null | null | null | sc/pyics/__init__.py | AlexWorldD/ForestFire | 855ce20fc9a51d55457773437e7135b3cce89efe | [
"MIT"
] | null | null | null | """
PyICS
Framework for the UvA bachelor course `Introduction Computational Science'.
Contains functionality for creating, visualizing and testing simulations.
See documentation of `Model' and `paramsweep' for more details.
"""
# Placeholder file to make the `pyics' directory a python module.
# Additionally, we expose the Model and paramsweep as direct members of this
# module, so scripts can do
# >>> import pyics.Model
# >>> import pyics.paramsweep
# etc.
from .pycx_gui import GUI
from .model import Model
from .paramsweep import paramsweep
# When importing everything (from pyics import *) limit it to useful stuff.
__all__ = ['GUI', 'Model', 'paramsweep']
| 31.272727 | 79 | 0.741279 | 89 | 688 | 5.674157 | 0.629213 | 0.031683 | 0.071287 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178779 | 688 | 21 | 80 | 32.761905 | 0.893805 | 0.755814 | 0 | 0 | 0 | 0 | 0.130435 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.75 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
8030cc4b714b9df361bd06842c747c491334c0f1 | 323 | py | Python | engine.py | anazalea/zestlife | 9af668b4d661bcdd5f8b579909ed05043570b3b0 | [
"MIT"
] | null | null | null | engine.py | anazalea/zestlife | 9af668b4d661bcdd5f8b579909ed05043570b3b0 | [
"MIT"
] | null | null | null | engine.py | anazalea/zestlife | 9af668b4d661bcdd5f8b579909ed05043570b3b0 | [
"MIT"
] | null | null | null | from entities.environment import GameEnvironment
class Engine():
    def draw_background():
        """draw any non-interactive object"""
        pass

    def tick(e: GameEnvironment):
        """Update the state of every entity in the game"""
        for o in e.game_objects:
            o.update()
            o.move()
| 20.1875 | 55 | 0.594427 | 38 | 323 | 5 | 0.736842 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.306502 | 323 | 15 | 56 | 21.533333 | 0.848214 | 0.226006 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.125 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
80315efd712342b5dbf4eebf9e1c95dd9ccf148e | 5,957 | py | Python | python/ManaBus.py | harryiliffe/Bus-Sniffer | 2b5b93410a5ef2c8f59294bf89010a56d1cd5d4b | [
"MIT"
] | null | null | null | python/ManaBus.py | harryiliffe/Bus-Sniffer | 2b5b93410a5ef2c8f59294bf89010a56d1cd5d4b | [
"MIT"
] | null | null | null | python/ManaBus.py | harryiliffe/Bus-Sniffer | 2b5b93410a5ef2c8f59294bf89010a56d1cd5d4b | [
"MIT"
] | null | null | null | import re, os.path, mechanize, csv
from bs4 import BeautifulSoup
from datetime import date,timedelta
from collections import deque
import requests
locationTo = "Palmerston North"
locationFrom = "Wellington - Central"
locationDict = {'Hastings': '218', 'Wairakei': '225', 'Auckland': '191', 'Kaiwaka': '310', 'Whitianga': '292', 'Whenuakite': '548', 'Oakleigh': '308', 'Bay View': '704', 'Whangarei - Kamo': '554', 'Tairua': '293', 'Akoranga ( North Shore Auckland )': '553', 'Mangatawhiri': '431', 'Te Pohue': '695', 'Towai': '560', 'Pokeno': '1000163', 'Wellsford': '307', 'Paeroa': '428', 'Whangarei': '309', 'Ngaruawahia': '228', 'Hamilton - Central': '193', 'Ngatea': '429', 'Albany ( North Shore Auckland )': '304', 'Hamilton - Waikato University': '449', 'Manukau ( Auckland )': '192', 'Maramarua': '430', 'Warkworth': '306', 'Bulls': '299', 'Mount Maunganui ( Tauranga )': '291', 'Napier': '220', 'Karapiro': '241', 'Auckland Airport - Int Terminal': '536', 'Taihape': '234', 'Massey University (Palmerston Nth)': '256', 'Tauranga (Central)': '229', 'Matamata': '231', 'Reporoa': '745', 'Hot Water Beach': '551', 'Palmerston North': '240', 'Auckland Airport - Domestic Terminal': '534', 'Kawakawa': '562', 'Paihia - Bay of Islands': '558', 'Clive': '246', 'Rotorua': '194', 'Hikuai': '300', 'Taupo': '224', 'Opua Hill': '631', 'Thames': '294', 'Morrinsville-Tatuanui': '697', 'Waihi': '301', 'Brynderwyn': '311', 'Ruakaka': '313', 'Wellington - Central': '196', 'Bayfair ( Tauranga )': '290', 'Coroglen': '552', 'Morrinsville': '696', 'Hikurangi': '559', 'Tirau': '216', 'Cambridge': '213', 'Te Puke': '284', 'Hahei': '549', 'Mercer': '435', 'Porirua': '245', 'Waipu': '312', 'Te Puna': '425', 'Levin': '243', 'Whitianga - Buffalo Beach Road': '734', 'Katikati': '426', 'Rangiriri': '433', 'Cathedral Cove': '550', 'Kopu': '423', 'Bombay': '215', 'Turangi': '237', 'Tauranga - Bethlehem': '424'}
travelDateStart = date(2017,10,20)
travelDateEnd = date(2018,9,7)
filename = "csv/MANA:%s - %s.csv" % (locationFrom.replace("/","_"), locationTo.replace("/","_"))
filenameOfInterest = "csv/MANA:%s - %s - CHEAPAF.csv" % (locationFrom.replace("/","_"), locationTo.replace("/","_"))
startFresh = False
oneday = timedelta(1)
dayCount = 0
days = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"]
busIndex = []
busIndexNum = 0
busOfInterest = []
dateFinderURL = "https://www.manabus.com/api/search/available-dates/?originId=%s&destinationId=%s" % (locationDict[locationFrom], locationDict[locationTo])
dateFinderResponse = requests.get(dateFinderURL).json()["turnDates"]
dateDict = {}
for i in range(0, len(dateFinderResponse)):
    dateDict[str(dateFinderResponse[i]["date"])] = str(dateFinderResponse[i]["dateAsLong"])
def Format(dateIn):
    return "%d-%s-%s" % (dateIn.year, str(dateIn.month).zfill(2), str(dateIn.day).zfill(2))

def ParseDate(dateIn):
    year, month, day = re.split("-", dateIn)
    return date(int(year), int(month), int(day))
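The two helpers above are inverses: one renders a `datetime.date` as a zero-padded `YYYY-MM-DD` string (so it can be compared against the site's date keys), the other reads it back. A self-contained sketch of the round trip — written in Python 3 syntax for illustration, since the script itself is Python 2, and using the hypothetical names `fmt`/`parse`:

```python
import re
from datetime import date

def fmt(d):
    # Zero-pad month and day so the string matches the API's date keys.
    return "%d-%s-%s" % (d.year, str(d.month).zfill(2), str(d.day).zfill(2))

def parse(s):
    year, month, day = re.split("-", s)
    return date(int(year), int(month), int(day))

assert fmt(date(2017, 10, 20)) == "2017-10-20"
assert parse("2017-10-20") == date(2017, 10, 20)
assert parse(fmt(date(2018, 9, 7))) == date(2018, 9, 7)
```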
class Bus:
    def __init__(self, index, date, weekday, departs, arrives, price):
        self.index = index
        self.date = date
        self.weekday = weekday
        self.departure = departs
        self.arrival = arrives
        self.price = price
        f = open(filename, 'a')
        try:
            writer = csv.writer(f)
            writer.writerow((self.index, self.date, days[self.weekday], self.departure, self.arrival, self.price))
        finally:
            f.close()

    def describe(self):
        print "Index: %d, Date: %s, Weekday: %s, Departure Time: %s, Arrival Time: %s, Price: %s" % (self.index, self.date, days[self.weekday], self.departure, self.arrival, self.price)

    def cost(self):
        if self.price != "SOLD OUT":
            return self.price
        else:
            return 1000

    def ofInterest(self):
        f = open(filenameOfInterest, 'a')
        try:
            writer = csv.writer(f)
            writer.writerow((self.index, self.date, days[self.weekday], self.departure, self.arrival, self.price))
        finally:
            f.close()
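Each `Bus` logs itself by appending one row to a CSV file via `csv.writer`. A minimal Python 3 sketch of that append-a-row pattern, using an in-memory buffer instead of a file so it is self-contained (the column names are illustrative):

```python
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(("Index", "Date", "Price"))      # header row
writer.writerow((0, "2017-10-20", 19.0))          # one record, mixed types are stringified

lines = buf.getvalue().splitlines()
assert lines[0] == "Index,Date,Price"
assert lines[1] == "0,2017-10-20,19.0"
```

With a real file, opening in append mode (`'a'`) as the class does lets each new record accumulate below the existing rows.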
if not os.path.isfile(filename) or startFresh:
    f = open(filename, 'w')
    try:
        writer = csv.writer(f)
        writer.writerow(("Index", "Date", "Weekday", "Departure Time", "Arrival Time", "Price"))
    finally:
        f.close()
    f = open(filenameOfInterest, 'w')
    try:
        writer = csv.writer(f)
        writer.writerow(("Index", "Date", "Weekday", "Departure Time", "Arrival Time", "Price"))
    finally:
        f.close()
elif not startFresh:
    with open(filename, 'r') as f:
        try:
            lastrow = deque(csv.reader(f), 1)[0]
        except IndexError:  # empty file
            lastrow = None
    travelDateStart = ParseDate(lastrow[1]) + oneday
else:
    print "couldn't find file :("
    quit()
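The resume branch above grabs the last CSV row with `deque(csv.reader(f), 1)`: a one-element `deque` keeps only the final item while streaming through the file, so the whole file never has to be held in memory. A self-contained sketch (in-memory buffer standing in for the file):

```python
import csv
import io
from collections import deque

data = io.StringIO("a,1\nb,2\nc,3\n")
# maxlen=1 means the deque discards everything but the most recent row.
lastrow = deque(csv.reader(data), 1)[0]
assert lastrow == ['c', '3']
```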
while True:
    while True:
        searchDate = travelDateStart + (oneday * dayCount)
        searchDateFormated = Format(searchDate)
        try:
            dateDict[searchDateFormated]
            break
        except KeyError:
            dayCount += 1
    url = "https://www.manabus.com/api/search/do-search/?lio="+locationDict[locationFrom]+"&ldo="+locationFrom.replace(" ", "%20")+"&lid="+locationDict[locationTo]+"&ldd="+locationTo.replace(" ", "%20")+"&sdd="+dateDict[searchDateFormated]+"&srd=&srd=Return%3A+No+Return&nop=1&pc=&oneDaySearch=true&website=Mana"
    response = requests.post(url)
    busData = response.json()[0]["searchResultsTableData"]
    for i in range(0, len(busData)):
        departureTime = busData[i][1]["departureTimeDisplay"]
        arrivalTime = busData[i][1]["arrivalTimeDisplay"]
        cost = busData[i][1]["tripPrice"]
        busIndex.append(Bus(busIndexNum, searchDate, searchDate.weekday(), departureTime, arrivalTime, cost))
        busIndex[busIndexNum].describe()
        if busIndex[busIndexNum].cost() < 3:
            print "Found Cheap Bus"
            busOfInterest.append(busIndexNum)
            busIndex[busIndexNum].ofInterest()
        busIndexNum += 1
    dayCount += 1
# placeIDs = "https://www.manabus.com/api/search/travel-locations/origin/"
# response = requests.get(placeIDs)
# print response.json()[0]["name"]
# dict = {}
# for i in range(0,74):
# dict[str(response.json()[i]["name"])] = str(response.json()[i]["id"])
# print dict
| 43.481752 | 1,685 | 0.660903 | 719 | 5,957 | 5.464534 | 0.467316 | 0.013744 | 0.013235 | 0.018325 | 0.171545 | 0.148638 | 0.120387 | 0.120387 | 0.120387 | 0.120387 | 0 | 0.052519 | 0.136982 | 5,957 | 136 | 1,686 | 43.801471 | 0.711729 | 0.044318 | 0 | 0.269231 | 0 | 0.028846 | 0.313688 | 0.01988 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.048077 | null | null | 0.028846 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
803e86f1b987fb6f38eb73922974f7b3d662a1a1 | 442 | py | Python | tests/test_Categories.py | TestowanieAutomatyczneUG/laboratorium-8-cati97 | b02a6af36bcb602dc909b50918aa9c99c01ef16c | [
"MIT"
] | null | null | null | tests/test_Categories.py | TestowanieAutomatyczneUG/laboratorium-8-cati97 | b02a6af36bcb602dc909b50918aa9c99c01ef16c | [
"MIT"
] | null | null | null | tests/test_Categories.py | TestowanieAutomatyczneUG/laboratorium-8-cati97 | b02a6af36bcb602dc909b50918aa9c99c01ef16c | [
"MIT"
] | null | null | null | import unittest
from src.sample.Categories import *
class TestMealDetails(unittest.TestCase):
    def setUp(self) -> None:
        self.search = Categories()

    def test_get_successful_status_code_200(self):
        self.assertEqual(self.search.get_status_code(), 200)

    def test_check_len_of_categories_correct(self):
        self.assertEqual(self.search.get_len_of_categories(), 14)


if __name__ == '__main__':
    unittest.main()
| 23.263158 | 65 | 0.726244 | 56 | 442 | 5.321429 | 0.517857 | 0.100671 | 0.087248 | 0.154362 | 0.214765 | 0.214765 | 0 | 0 | 0 | 0 | 0 | 0.021798 | 0.169683 | 442 | 18 | 66 | 24.555556 | 0.790191 | 0 | 0 | 0 | 0 | 0 | 0.0181 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 1 | 0.272727 | false | 0 | 0.181818 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
80441219a98633d911f4981c66a7e34f39b0810b | 1,720 | py | Python | tools/cal_ap.py | tianyaoZhang/DeeCamp | d731a80e8a30c78e52295bc53525e14a8c65af1c | [
"Apache-2.0"
] | null | null | null | tools/cal_ap.py | tianyaoZhang/DeeCamp | d731a80e8a30c78e52295bc53525e14a8c65af1c | [
"Apache-2.0"
] | null | null | null | tools/cal_ap.py | tianyaoZhang/DeeCamp | d731a80e8a30c78e52295bc53525e14a8c65af1c | [
"Apache-2.0"
] | null | null | null | import argparse
import os
import mmcv
import torch
from mmcv import Config, DictAction
from mmcv.parallel import MMDataParallel, MMDistributedDataParallel
from mmcv.runner import get_dist_info, init_dist, load_checkpoint
from tools.fuse_conv_bn import fuse_module
from mmdet.apis import multi_gpu_test, single_gpu_test
from mmdet.core import wrap_fp16_model
from mmdet.datasets import build_dataloader, build_dataset
from mmdet.models import build_detector
'''
export PYTHONPATH=${PWD}:$PYTHONPATH
python3 tools/cal_ap.py tools/cal_ap_config.py --user_json submits/atss_r50_fpn_ms12_results.bbox.json --eval bbox
'''
def parse_args():
    parser = argparse.ArgumentParser(
        description='MMDet test (and eval) a model')
    parser.add_argument('config', help='test config file path')
    parser.add_argument('--user_json', help='user json')
    parser.add_argument(
        '--eval',
        type=str,
        nargs='+',
        help='evaluation metrics, which depends on the dataset, e.g., "bbox",'
        ' "segm", "proposal" for COCO, and "mAP", "recall" for PASCAL VOC')
    args = parser.parse_args()
    return args


def main():
    args = parse_args()
    cfg = Config.fromfile(args.config)
    # set cudnn_benchmark
    cfg.data.test.test_mode = True
    # build the dataloader
    # TODO: support multiple images per gpu (only minor changes are needed)
    dataset = build_dataset(cfg.data.test)
    print(args.user_json)
    # with open(args.user_json, 'r') as f:
    #     outputs = mmcv.load(args.user_json)
    outputs = args.user_json
    kwargs = {}
    res = dataset.evaluate_json(outputs, args.eval, **kwargs)
    print(res)
    print(res['bbox_mAP'])


if __name__ == '__main__':
    main()
| 28.196721 | 114 | 0.711047 | 239 | 1,720 | 4.916318 | 0.502092 | 0.04766 | 0.040851 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004979 | 0.182558 | 1,720 | 60 | 115 | 28.666667 | 0.830725 | 0.105233 | 0 | 0 | 0 | 0 | 0.164483 | 0 | 0 | 0 | 0 | 0.016667 | 0 | 1 | 0.052632 | false | 0 | 0.315789 | 0 | 0.394737 | 0.078947 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
80481db0799d9f78b2d4c3805430c0925d5a0af7 | 2,718 | py | Python | skypy/pipeline/_items.py | sibirrer/skypy | 40301a6cc96e69e22b52b6cf3a68d30505640023 | [
"BSD-3-Clause"
] | null | null | null | skypy/pipeline/_items.py | sibirrer/skypy | 40301a6cc96e69e22b52b6cf3a68d30505640023 | [
"BSD-3-Clause"
] | null | null | null | skypy/pipeline/_items.py | sibirrer/skypy | 40301a6cc96e69e22b52b6cf3a68d30505640023 | [
"BSD-3-Clause"
] | null | null | null | '''item types in the pipeline'''
from collections.abc import Sequence, Mapping
import inspect


class Item:
    '''base class for items in the pipeline'''

    def infer(self, context):
        '''infer additional properties from context'''
        pass

    def depend(self, pipeline):
        '''return list of dependencies'''
        return []

    def evaluate(self, pipeline):
        '''return computed value of item'''
        return None


class Ref(Item):
    '''reference to another item'''

    def __init__(self, ref):
        self.ref = ref

    def depend(self, pipeline):
        return [self.ref]

    def evaluate(self, pipeline):
        return pipeline[self.ref]


class Call(Item):
    '''item that calls a function'''

    def __init__(self, function, args=[], kwargs={}):
        '''initialise the call'''
        if not callable(function):
            raise TypeError('function is not callable')
        if not isinstance(args, Sequence):
            raise TypeError('args is not a sequence')
        if not isinstance(kwargs, Mapping):
            raise TypeError('kwargs is not a mapping')
        self.function = function
        self.args = args
        self.kwargs = kwargs

    def infer(self, context):
        '''infer missing function args and kwargs from context'''
        try:
            # inspect the function
            sig = inspect.signature(self.function)
        except ValueError:
            # not all functions can be inspected
            sig = None
        if sig is not None:
            # inspect the function call for the given args and kwargs
            given = sig.bind_partial(*self.args, **self.kwargs)
            # now go through parameters one by one:
            # - check if the parameter has an argument given
            # - if not, check if the parameter has a default argument
            # - if not, check if the argument can be inferred from context
            for name, par in sig.parameters.items():
                if name in given.arguments:
                    pass
                elif par.default is not par.empty:
                    pass
                elif name in context:
                    given.arguments[name] = context[name]
            # augment args and kwargs
            self.args = given.args
            self.kwargs = given.kwargs

    def depend(self, pipeline):
        '''return a list of dependencies of the call'''
        return pipeline.depend(self.args) + pipeline.depend(self.kwargs)

    def evaluate(self, pipeline):
        '''execute the call in the given pipeline'''
        args = pipeline.evaluate(self.args)
        kwargs = pipeline.evaluate(self.kwargs)
        return self.function(*args, **kwargs)
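`Call.infer` fills in missing call arguments by matching parameter names against a context mapping, using `inspect.signature` and `bind_partial`. A minimal standalone sketch of that technique — `infer_args` and `area` are illustrative names, not part of the skypy API:

```python
import inspect

def infer_args(function, args, kwargs, context):
    """Return (args, kwargs) with missing required parameters filled from context."""
    sig = inspect.signature(function)
    given = sig.bind_partial(*args, **kwargs)
    for name, par in sig.parameters.items():
        if name in given.arguments:
            continue                     # explicitly supplied by the caller
        if par.default is not par.empty:
            continue                     # has a default, leave it alone
        if name in context:
            given.arguments[name] = context[name]
    return given.args, given.kwargs

def area(width, height, scale=1):
    return width * height * scale

args, kwargs = infer_args(area, (3,), {}, {'height': 4, 'scale': 99})
# 'height' is inferred from context; 'scale' keeps its default, so 3 * 4 * 1 == 12
assert area(*args, **kwargs) == 12
```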
| 29.543478 | 74 | 0.586093 | 320 | 2,718 | 4.95 | 0.271875 | 0.045455 | 0.056818 | 0.039773 | 0.15846 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.323767 | 2,718 | 91 | 75 | 29.868132 | 0.861806 | 0.260118 | 0 | 0.22449 | 0 | 0 | 0.035421 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.204082 | false | 0.061224 | 0.040816 | 0.040816 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
804e16971204fb384ef30402b24aa3f1131dbf15 | 73 | py | Python | posthub/__init__.py | pyflow/posthub | a05103721446e914d478d5adb388e371144ff149 | [
"MIT"
] | 1 | 2022-03-22T14:27:17.000Z | 2022-03-22T14:27:17.000Z | posthub/__init__.py | pyflow/posthub | a05103721446e914d478d5adb388e371144ff149 | [
"MIT"
] | null | null | null | posthub/__init__.py | pyflow/posthub | a05103721446e914d478d5adb388e371144ff149 | [
"MIT"
] | null | null | null | __version__ = '0.1.0'
# __slots__ = ['locker', 'pubsub', 'queue', 'cli'] | 24.333333 | 50 | 0.589041 | 9 | 73 | 3.888889 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.047619 | 0.136986 | 73 | 3 | 50 | 24.333333 | 0.507937 | 0.657534 | 0 | 0 | 0 | 0 | 0.208333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8051a4e87a5c5f4b02ca96a89e640a21ee59b741 | 5,211 | py | Python | src/hub/dataload/sources/unii/unii_upload.py | biothings/mychem.info | 3d0f66ae9c1e9e6fa78a32868f440d162660e2aa | [
"Apache-2.0"
] | 10 | 2017-07-24T11:45:27.000Z | 2022-02-14T13:42:36.000Z | src/hub/dataload/sources/unii/unii_upload.py | biothings/mychem.info | 3d0f66ae9c1e9e6fa78a32868f440d162660e2aa | [
"Apache-2.0"
] | 92 | 2017-06-22T16:49:20.000Z | 2022-03-24T20:50:01.000Z | src/hub/dataload/sources/unii/unii_upload.py | biothings/mychem.info | 3d0f66ae9c1e9e6fa78a32868f440d162660e2aa | [
"Apache-2.0"
] | 11 | 2017-06-12T18:31:35.000Z | 2022-01-31T02:56:52.000Z | import os
import glob
from .unii_parser import load_data
from hub.dataload.uploader import BaseDrugUploader
import biothings.hub.dataload.storage as storage
from biothings.hub.datatransform import DataTransformMDB
from hub.datatransform.keylookup import MyChemKeyLookup
SRC_META = {
    "url": 'https://fdasis.nlm.nih.gov/srs/',
    "license": "public domain",
    "license_url": "https://www.nlm.nih.gov/copyright.html",
    "license_url_short": "http://bit.ly/2Pg8Oo9"
}


class UniiUploader(BaseDrugUploader):

    name = "unii"
    storage_class = storage.IgnoreDuplicatedStorage
    __metadata__ = {"src_meta": SRC_META}
    keylookup = MyChemKeyLookup([('inchikey', 'unii.inchikey'),
                                 ('pubchem', 'unii.pubchem'),
                                 ('unii', 'unii.unii')],
                                copy_from_doc=True,
                                )

    def load_data(self, data_folder):
        self.logger.info("Load data from '%s'" % data_folder)
        record_files = glob.glob(os.path.join(data_folder, "*Records*.txt"))
        assert len(record_files) == 1, "Expecting one record.txt file, got %s" % repr(record_files)
        input_file = record_files.pop()
        assert os.path.exists(input_file), "Can't find input file '%s'" % input_file
        # disable keylookup - unii is a base collection used for drugname lookup
        # and should be loaded first, (keylookup commented out)
        # return self.keylookup(load_data)(input_file)
        return load_data(input_file)

    def post_update_data(self, *args, **kwargs):
        for field in ("unii.unii", "unii.preferred_term"):
            self.logger.info("Indexing '%s'" % field)
            self.collection.create_index(field, background=True)

    @classmethod
    def get_mapping(klass):
        mapping = {
            "unii": {
                "properties": {
                    "unii": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                        'copy_to': ['all'],
                    },
                    "preferred_term": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "registry_number": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "ec": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "ncit": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "rxcui": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "itis": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "ncbi": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "plants": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "grin": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "inn_id": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "molecular_formula": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "inchikey": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "smiles": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "unii_type": {
                        "type": "text"
                    },
                    "pubchem": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    },
                    "mpns": {
                        "normalizer": "keyword_lowercase_normalizer",
                        "type": "keyword",
                    }
                }
            }
        }
        return mapping
| 41.031496 | 99 | 0.407024 | 344 | 5,211 | 5.953488 | 0.380814 | 0.132813 | 0.203125 | 0.28125 | 0.367188 | 0.367188 | 0 | 0 | 0 | 0 | 0 | 0.001507 | 0.490693 | 5,211 | 126 | 100 | 41.357143 | 0.770158 | 0.032431 | 0 | 0.309091 | 0 | 0 | 0.253524 | 0.088942 | 0 | 0 | 0 | 0 | 0.018182 | 1 | 0.027273 | false | 0 | 0.063636 | 0 | 0.154545 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
8052d674e66be686aec775cbd19b9b983c3e15fa | 2,440 | py | Python | tests/test_step.py | FeetMe/sopic | 419bb2a84a8c41f3d09cd7a1a26ecf6e71e735d3 | [
"MIT"
] | 3 | 2019-05-29T13:51:06.000Z | 2020-08-02T16:18:15.000Z | tests/test_step.py | FeetMe/sopic | 419bb2a84a8c41f3d09cd7a1a26ecf6e71e735d3 | [
"MIT"
] | null | null | null | tests/test_step.py | FeetMe/sopic | 419bb2a84a8c41f3d09cd7a1a26ecf6e71e735d3 | [
"MIT"
] | null | null | null | from sopic import Step
class DummyLogger:
def debug(self, x):
pass
def info(self, x):
pass
def error(self, x):
pass
def test_step_name():
class CustomStep(Step):
STEP_NAME = "custom-step"
step = CustomStep("station_name", 42, None, True)
assert step.getStepName() == "custom-step"
def test_step_ok():
step = Step("station_name", 42, DummyLogger(), True)
assert step.OK() == (
{
"passed": True,
"stepData": {},
"terminate": False,
"infoStr": "",
"errorCode": None,
}
)
def test_step_ok_with_infostr():
step = Step("station_name", 42, DummyLogger(), True)
assert step.OK("infoStr") == (
{
"passed": True,
"stepData": {},
"terminate": False,
"infoStr": "infoStr",
"errorCode": None,
}
)
def test_step_ko():
step = Step("station_name", 42, DummyLogger(), True)
assert step.KO() == (
{
"passed": False,
"stepData": {},
"terminate": False,
"infoStr": "",
"errorCode": None,
}
)
def test_step_ko_terminate():
step = Step("station_name", 42, DummyLogger(), True)
assert step.KO(terminate=True) == (
{
"passed": False,
"stepData": {},
"terminate": True,
"infoStr": "",
"errorCode": None,
}
)
def test_step_ko_errorCode():
step = Step("station_name", 42, DummyLogger(), True)
assert step.KO(errorCode=123) == (
{
"passed": False,
"stepData": {},
"terminate": False,
"infoStr": "",
"errorCode": 123,
}
)
def test_step_ko_errorStr():
step = Step("station_name", 42, DummyLogger(), True)
assert step.KO(errorStr="foo") == (
{
"passed": False,
"stepData": {},
"terminate": False,
"infoStr": "foo",
"errorCode": None,
}
)
def test_step_ko_full():
step = Step("station_name", 42, DummyLogger(), True)
assert step.KO(terminate=True, errorCode=123, errorStr="foo") == (
{
"passed": False,
"stepData": {},
"terminate": True,
"infoStr": "foo",
"errorCode": 123,
}
)
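The five KO tests above repeat the same expected dict with one or two fields changed; they could be collapsed into a single table-driven test (or a `pytest.mark.parametrize`). The sketch below uses a `MinimalStep` stand-in that mimics only the `KO()` signature implied by these assertions; it is not the real `sopic.Step`:

```python
class MinimalStep:
    """Stand-in mimicking only the KO() behaviour the assertions above exercise."""
    def KO(self, terminate=False, errorCode=None, errorStr=""):
        return {
            "passed": False,
            "stepData": {},
            "terminate": terminate,
            "infoStr": errorStr,
            "errorCode": errorCode,
        }

# Table of (KO keyword arguments, expected overrides on the default result).
KO_CASES = [
    ({}, {}),
    ({"terminate": True}, {"terminate": True}),
    ({"errorCode": 123}, {"errorCode": 123}),
    ({"errorStr": "foo"}, {"infoStr": "foo"}),
    (
        {"terminate": True, "errorCode": 123, "errorStr": "foo"},
        {"terminate": True, "errorCode": 123, "infoStr": "foo"},
    ),
]

def test_step_ko_table():
    base = {
        "passed": False,
        "stepData": {},
        "terminate": False,
        "infoStr": "",
        "errorCode": None,
    }
    for kwargs, overrides in KO_CASES:
        assert MinimalStep().KO(**kwargs) == {**base, **overrides}
```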
# accounts/urls.py (repo: pddg/task_categorizer, license: MIT)
import logging
from django.urls import path
from django.contrib.auth import views as auth_views
logger = logging.getLogger(__name__)
app_name = 'accounts'
urlpatterns = [
path('login/', auth_views.LoginView.as_view(template_name='accounts/login.html'), name='login'),
path('logout/', auth_views.LogoutView.as_view(), name='logout'),
]