hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
67fd71b159a22e60b64a07348a0a3e35c2a3b7e5 | 382 | py | Python | phyutil/__init__.py | frib-high-level-controls/phyhlc | 6486607e3aa0212054a12e9f2ad1a3ef15542f48 | [
"BSD-3-Clause"
] | 1 | 2018-03-22T15:18:54.000Z | 2018-03-22T15:18:54.000Z | phyutil/__init__.py | frib-high-level-controls/phyhlc | 6486607e3aa0212054a12e9f2ad1a3ef15542f48 | [
"BSD-3-Clause"
] | null | null | null | phyutil/__init__.py | frib-high-level-controls/phyhlc | 6486607e3aa0212054a12e9f2ad1a3ef15542f48 | [
"BSD-3-Clause"
] | null | null | null | # encoding: UTF-8
"""Physics Applications Utility"""
__copyright__ = "Copyright (c) 2015, Facility for Rare Isotope Beams"
__author__ = "Dylan Maxwell"
__version__ = "0.0.1"
import logging
import phylib
import machine
from machine import *
from phylib.libCore import *
# configure the root logger
logging.basicConfig(format="%(levelname)s: %(asctime)s: %(name)s: %(message)s")
| 21.222222 | 79 | 0.740838 | 50 | 382 | 5.42 | 0.74 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024169 | 0.133508 | 382 | 17 | 80 | 22.470588 | 0.794562 | 0.185864 | 0 | 0 | 0 | 0 | 0.388158 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.555556 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
db15276b717208ef752639b4aaf944577ef66238 | 1,032 | py | Python | mportal/wsgi_start.py | auyeongwy/mportal | e406baea802093569c90c7206649c5afd9431dab | [
"Apache-2.0"
] | null | null | null | mportal/wsgi_start.py | auyeongwy/mportal | e406baea802093569c90c7206649c5afd9431dab | [
"Apache-2.0"
] | null | null | null | mportal/wsgi_start.py | auyeongwy/mportal | e406baea802093569c90c7206649c5afd9431dab | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
# -*- coding: utf-8 -*-
# Copyright 2014 Au Yeong Wing Yau
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
""" All start-up processes to be called when the WSGI process starts.
"""
from mportal_tools import mportal_log
from mportal_tools import mportal_db
import mportal_urls, template_mgr
mportal_log.init_log() # Initializes logging file handler.
mportal_db.init_db() # Initializes database connections.
mportal_urls.init_urls() # Initializes URL list.
template_mgr.init_templates() # Initializes HTML templates.
| 34.4 | 74 | 0.774225 | 156 | 1,032 | 5.032051 | 0.641026 | 0.076433 | 0.033121 | 0.040764 | 0.073885 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010239 | 0.148256 | 1,032 | 29 | 75 | 35.586207 | 0.882821 | 0.749031 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
db33adbcb92391813fa24af06e3df16ea1f77a19 | 236 | py | Python | pyvisdk/enums/virtual_machine_ht_sharing.py | Infinidat/pyvisdk | f2f4e5f50da16f659ccc1d84b6a00f397fa997f8 | [
"MIT"
] | null | null | null | pyvisdk/enums/virtual_machine_ht_sharing.py | Infinidat/pyvisdk | f2f4e5f50da16f659ccc1d84b6a00f397fa997f8 | [
"MIT"
] | null | null | null | pyvisdk/enums/virtual_machine_ht_sharing.py | Infinidat/pyvisdk | f2f4e5f50da16f659ccc1d84b6a00f397fa997f8 | [
"MIT"
] | null | null | null |
########################################
# Automatically generated, do not edit.
########################################
from pyvisdk.thirdparty import Enum
VirtualMachineHtSharing = Enum(
'any',
'internal',
'none',
)
| 15.733333 | 40 | 0.440678 | 15 | 236 | 6.933333 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152542 | 236 | 14 | 41 | 16.857143 | 0.52 | 0.15678 | 0 | 0 | 1 | 0 | 0.12931 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e1d83fca2e1bb93962f5e57c6f7075495edf9d91 | 8,688 | py | Python | src/06_tool/regular_expression.py | edgardeng/python-advance-interview | 59fd7bee8e871acdc7fdfecf2a110db840c47ebb | [
"Apache-2.0"
] | 1 | 2022-03-06T13:03:56.000Z | 2022-03-06T13:03:56.000Z | src/06_tool/regular_expression.py | edgardeng/python-advance-interview | 59fd7bee8e871acdc7fdfecf2a110db840c47ebb | [
"Apache-2.0"
] | null | null | null | src/06_tool/regular_expression.py | edgardeng/python-advance-interview | 59fd7bee8e871acdc7fdfecf2a110db840c47ebb | [
"Apache-2.0"
] | null | null | null | '''
' Python Regular Expression examples
'
'''
import re
def test_match():
s = 'hello python Hello'
p = 'hello'
o = re.match(p, s)
print(o)
print(dir(o))
    print(o.group())  # the matched string
    print(o.span())  # (start, end) span of the match
    print(o.start())  # start index of the match
    print('*' * 30, 'using the flags argument')
    o2 = re.match(p, s, re.IGNORECASE)  # re.L (LOCALE) cannot be used with str patterns in Python 3
    print(o2.group())  # the matched string
# common character classes
def test_match_character():
    print('-' * 30, ' . matches any single character')
print(re.match('.', 'abv'))
print(re.match('.', '12'))
print(re.match('.', '\n'))
    print('-' * 30, ' \d matches a digit 0-9')
print(re.match('\d', 'abc456'))
print(re.match('\d', '234svd'))
    print('-' * 30, ' \D matches a non-digit')
print(re.match('\D', 'abc456'))
print(re.match('\D', '234svd'))
    print('-' * 30, ' \s matches a whitespace character')
print(re.match('\s', '\n12\t'))
print(re.match('\s', '\t'))
print(re.match('\s', 'addd'))
    print('-' * 30, ' \S matches a non-whitespace character')
print(re.match('\S', '\n12\t'))
print(re.match('\S', '\t'))
print(re.match('\S', 'addd'))
    print('-' * 30, ' \w matches a word character (letter, digit, underscore)')
print(re.match('\w', 'AB'))
print(re.match('\w', 'ab'))
print(re.match('\w', '12'))
print(re.match('\w', '__'))
print(re.match('\w', '##'))
    print('-' * 30, ' \W matches a non-word character')
print(re.match('\W', 'AB'))
print(re.match('\W', 'ab'))
print(re.match('\W', '12'))
print(re.match('\W', '__'))
print(re.match('\W', '##'))
    print('-' * 30, ' [] matches one character from the set')
print(re.match('[2468]', '22'))
print(re.match('[2468]', '33'))
print(re.match('[2468]', '83'))
print(re.match('[2468]', '38'))
def test_match_phone():
    print('-' * 30, ' match a mobile phone number')
    pattern = '\d\d\d\d\d\d\d\d\d\d\d'
    print(re.match(pattern, '13466669999'))
print(re.match('1[345789]\d\d\d\d\d\d\d\d\d', '13466669999'))
# quantifiers
def test_match_qualifier():
    print('-' * 30, ' * matches zero or more times')
    print(re.match('\d*', '123abc'))  # match the leading digits
print(re.match('\d*', 'abc'))
    print('-' * 30, ' + matches one or more times')
    print(re.match('\d+', '123abc'))  # match the leading digits
print(re.match('\d+', 'abc'))
    print('-' * 30, ' ? matches zero or one time')
print(re.match('\d?', '1abc'))
    print(re.match('\d?', '123abc'))  # match the leading digit
print(re.match('\d?', 'abc'))
    print('-' * 30, ' {m} matches exactly m times')
    print(re.match('\d{2}', '123abc'))  # match the first 2 digits
print(re.match('\d{2}', '12abc'))
print(re.match('\d{2}', '1abc'))
print(re.match('\d{2}', 'abc'))
    print('-' * 30, ' {m,n} matches m to n times')
    print(re.match('\d{1,3}', '1234abc'))  # match up to 3 leading digits
print(re.match('\d{1,3}', '123abc'))
print(re.match('\d{1,3}', '12abc'))
print(re.match('\d{1,3}', '1abc'))
print(re.match('\d{1,3}', 'abc'))
    print('-' * 30, ' {m,} matches at least m times')
    print(re.match('\d{2,}', '1234abc'))  # match at least 2 leading digits
print(re.match('\d{2,}', '123abc'))
print(re.match('\d{2,}', '12abc'))
print(re.match('\d{2,}', '1abc'))
print(re.match('\d{2,}', 'abc'))
    print('-' * 30, 'Example 1: first letter uppercase, the rest lowercase')
print(re.match('[A-Z][a-z]*', 'abc'))
print(re.match('[A-Z][a-z]*', 'ABC'))
print(re.match('[A-Z][a-z]*', 'Abc'))
print(re.match('[A-Z][a-z]*', 'AbC'))
    print('-' * 30, 'Example 2: valid identifiers (letters, digits, underscore; no leading digit)')
print(re.match('[a-zA-Z_][a-zA-Z0-9_]*', 'abc'))
print(re.match('[a-zA-Z_]\w*', 'abc'))
print(re.match('[a-zA-Z_][a-zA-Z0-9_]*', 'abc123'))
print(re.match('[a-zA-Z_]\w*', '123abc'))
print(re.match('[a-zA-Z_]\w*', '_123abc'))
    print('-' * 30, 'Example 3: numbers 1-99')
print(re.match('[1-9]\d?', '23abc'))
print(re.match('[1-9]\d?', '100'))
print(re.match('[1-9]\d?', '11'))
print(re.match('[1-9]\d?', '1'))
print(re.match('[1-9]\d?', '0'))
print(re.match('[1-9]\d?', '09'))
    print('-' * 30, 'Example 4: 8-20 character password (upper, lower, digits, underscore)')
print(re.match('\w{8,20}', '1234567'))
print(re.match('\w{8,20}', '1234567$$'))
print(re.match('\w{8,20}', '1234567abc_'))
print(re.match('\w{8,20}', '1234567abc#'))
print(re.match('\w{8,20}', '12345678901234567890zx'))
# escape characters vs raw strings
def escape_character():
print('C:\t\d\e')
print('C:\\t\\d\\e')
print(r'C:\t\d\e')
# boundary anchors
def boundary():
    print('-' * 30, ' $ matches the end of the string')
print(re.match('[1-9]\d{4,9}@qq.com', '1234567@qq.com'))
print(re.match('[1-9]\d{4,9}@qq.com', '1234567@qq.com.126.cn'))
print(re.match(r'[1-9]\d{4,9}@qq.com$', '1234567@qq.com'))
print(re.match(r'[1-9]\d{4,9}@qq.com$', '1234567@qq.com.126.cn'))
    print('-' * 30, ' ^ matches the start of the string')
print(re.match(r'^hello.*', 'hello abc'))
print(re.match(r'^hello.*', 'abc hello abc'))
    print('-' * 30, r' \b matches a word boundary')
    print(re.match(r'.*\bab', '123 aabc'))  # word starting with ab
print(re.match(r'.*\bab', '123 abcd'))
print(re.match(r'.*\bab', '123 aaa'))
print(re.match(r'.*\bab', '123 abcd cdab'))
    print(re.match(r'.*ab\b', '123 abc'))  # word ending with ab
print(re.match(r'.*ab\b', '123 aaa'))
print(re.match(r'.*ab\b', '123 ab'))
print(re.match(r'.*ab\b', '123 cdab'))
print(re.match(r'.*ab\b', '123 abcd cdab'))
def test_search():
print(re.match(r'hello', 'hello python'))
print(re.search(r'hello', 'hello python'))
print(re.match(r'hello', 'python hello'))
print(re.search(r'hello', 'python hello '))
print(re.match('aa|bb|cc', 'aa'))
print(re.match('aa|bb|cc', 'bbb'))
print(re.match('aa|bb|cc', 'ccc'))
print(re.match('aa|bb|cc', 'a bb ccc'))
print(re.search('aa|bb|cc', 'a bb ccc'))
# alternation (multiple characters)
def test_multi_character():
    print('-' * 30, 'Example: numbers 0-100: 0-99 | 100')
print(re.match('[1-9]?\d|100', '1'))
print(re.match('[1-9]?\d|100', '11'))
print(re.match('[1-9]?\d|100', '100'))
print(re.match('[1-9]?\d$|100$', '100'))
print(re.match('[1-9]?\d$|100$', '1000'))
    print('-' * 30, 'Example: character sets vs alternation')
print(re.match('[ab][cd]', 'ab'))
print(re.match('[ab][cd]', 'ac'))
print(re.match('[ab][cd]', 'ad'))
print(re.match('ab|cd', 'abc'))
print(re.match('ab|cd', 'ac'))
# capture groups
def test_group():
    print('-' * 30, 'landline numbers: area code {3,4} digits, number {5,8} digits, e.g. 010-0000 0791-222222')
print(re.match(r'\d{3,4}-[1-9]\d{4,7}', '010-10086'))
print(re.match(r'\d{3,4}-[1-9]\d{4,7}', '010-88888888'))
print(re.match(r'\d{3,4}-[1-9]\d{4,7}', '1111-10086'))
print(re.match(r'\d{3,4}-[1-9]\d{4,7}', '1111-88888888'))
    print('-' * 30, ' capture groups')
o = re.match(r'(\d{3,4})-([1-9]\d{4,7})', '1111-88888888')
print(o)
print(o.group(0), o.group(1), o.group(2))
print(o.groups(), o.groups()[0], o.groups()[1])
    print('-' * 30, 'html tags')
print(re.match(r'<.+><.+>.+</.+></.+>', '<html><a>abc</a></html>'))
print(re.match(r'<.+><.+>.+</.+></.+>', '<html><a>abc</b></html>'))
print(re.match(r'<(.*)><(.*)>.*</\2></\1>', '<html><a>abc</b></html>'))
print(re.match(r'<(.*)><(.*)>.*</\2></\1>', '<html><d>abc</d></html>'))
    print('-' * 30, 'html tags - named groups')
print(re.match(r'<(?P<k_html>.+)><(?P<k_head>.+)>.*</(?P=k_head)></(?P=k_html)>', '<html><d>abc</d></html>'))
# search and replace
def test_sub():
    print('-' * 30, ' sub')
    print(re.sub(r'#.*$', '', '2004-222-23322 # what is this'))  # strip the part starting with #
    print(re.sub(r'#\D*', '', '2004-222-23322 # what is this'))
    print('-' * 30, ' subn')
    print(re.subn(r'#\D*', '', '2004-222-23322 # what is this'))
    print(re.subn(r'#.*$', '', '2004-222-23322 # what is this'))
def test_compile():
    print('-' * 30, ' using compile')
    regex = re.compile(r'\w+')  # match letters or digits
print(regex.match('1223dfdf'))
print(regex.match('##1223dfdf'))
def test_findall():
    print('-' * 30, ' findall returns a list')
    print(re.findall(r'\w', '##1223dfdf'))  # match letters or digits
print(re.findall(r'\w+', '## 1223 df df 1'))
    print('-' * 30, ' finditer returns an iterator')
print(re.finditer(r'\w+', '## 1223 df df 1'))
for i in re.finditer(r'\w+', '## 1223 df df 1'):
print(i, i.group())
def test_split():
    print('-' * 30, ' split returns a list')
print(re.split(r'\d+', '123abc123abc'))
print(re.split(r'\d+', '123 abc 123 abc'))
print(re.split(r'\d+', 'abc123 abc 123 abc'))
print(re.split(r'\d+', 'abc 123 abc 123 abc',1))
def greedy_mode():
    print('-' * 30, ' greedy mode')
result = re.match(r'(.+)(\d+-\d+-\d+)', 'this is my tel: 122-1244-1242')
print(result.group(1))
print(result.group(2))
    print('-' * 30, ' non-greedy mode: match as little as possible')
result = re.match(r'(.+?)(\d+-\d+-\d+)', 'this is my tel: 122-1244-1242')
print(result.group(1))
print(result.group(2))
    print('-' * 30, ' greedy vs non-greedy')
print(re.match(r'abc(\d+)', 'abc123456'))
print(re.match(r'abc(\d+?)', 'abc123456'))
if __name__ == '__main__':
# test_match()
# test_match_character()
# test_match_phone()
# test_match_qualifier()
# escape_character()
# boundary()
# test_search()
# test_multi_character()
# test_group()
# test_sub()
# test_compile()
# test_findall()
# test_split()
# greedy_mode()
# <.+><.+>.+</.+></.+>
s = '<link href="../assets/css/app.css?t=20112455" type="text/css" rel="stylesheet">'
    matched = re.findall(r'\S+assets/css/\S+.css\S+"', s)
    for m in matched:
        print(m, m.index('.css'))
        s = s.replace(m, m[:m.index('.css')] + '.css?t=00000"')
print(s)
| 30.484211 | 111 | 0.536027 | 1,432 | 8,688 | 3.210894 | 0.155028 | 0.191823 | 0.292301 | 0.07351 | 0.610048 | 0.555241 | 0.446498 | 0.366464 | 0.332753 | 0.319704 | 0 | 0.09557 | 0.142495 | 8,688 | 284 | 112 | 30.591549 | 0.521611 | 0.05145 | 0 | 0.046512 | 0 | 0.013953 | 0.3446 | 0.057352 | 0 | 0 | 0 | 0 | 0 | 1 | 0.065116 | false | 0 | 0.004651 | 0 | 0.069767 | 0.860465 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
e1ed3b48fe37cb69350c8b6542e4845c264e91f2 | 1,125 | py | Python | src/mist/api/poller/schedulers.py | vladimir-ilyashenko/mist.api | f77c451679732ac1cfdafa85ad023c7c170faec4 | [
"Apache-2.0"
] | null | null | null | src/mist/api/poller/schedulers.py | vladimir-ilyashenko/mist.api | f77c451679732ac1cfdafa85ad023c7c170faec4 | [
"Apache-2.0"
] | null | null | null | src/mist/api/poller/schedulers.py | vladimir-ilyashenko/mist.api | f77c451679732ac1cfdafa85ad023c7c170faec4 | [
"Apache-2.0"
] | null | null | null | from celerybeatmongo.schedulers import MongoScheduler
from mist.api.sharding.mixins import ShardManagerMixin
from mist.api.poller.models import PollingSchedule
from mist.api.poller.models import OwnerPollingSchedule
from mist.api.poller.models import CloudPollingSchedule
from mist.api.poller.models import MachinePollingSchedule
import datetime
class PollingScheduler(MongoScheduler):
Model = PollingSchedule
UPDATE_INTERVAL = datetime.timedelta(seconds=20)
class OwnerPollingScheduler(MongoScheduler):
Model = OwnerPollingSchedule
UPDATE_INTERVAL = datetime.timedelta(seconds=20)
class CloudPollingScheduler(MongoScheduler):
Model = CloudPollingSchedule
UPDATE_INTERVAL = datetime.timedelta(seconds=20)
class MachinePollingScheduler(MongoScheduler):
Model = MachinePollingSchedule
UPDATE_INTERVAL = datetime.timedelta(seconds=20)
class ShardedOwnerScheduler(ShardManagerMixin, OwnerPollingScheduler):
pass
class ShardedCloudScheduler(ShardManagerMixin, CloudPollingScheduler):
pass
class ShardedMachineScheduler(ShardManagerMixin, MachinePollingScheduler):
pass
| 26.162791 | 74 | 0.826667 | 101 | 1,125 | 9.168317 | 0.316832 | 0.043197 | 0.059395 | 0.073434 | 0.319654 | 0.319654 | 0.194384 | 0 | 0 | 0 | 0 | 0.008048 | 0.116444 | 1,125 | 42 | 75 | 26.785714 | 0.923541 | 0 | 0 | 0.28 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.12 | 0.28 | 0 | 0.88 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
c00233a09c18a5f027b1634d9d3dd63a23d04cbb | 1,009 | py | Python | morpfw/authn/pas/user/rulesprovider.py | morpframework/morpfw | b867e5809d6c52e8839586670a29fcd179ce64c7 | [
"Apache-2.0"
] | 8 | 2018-12-08T01:41:58.000Z | 2020-12-21T15:30:12.000Z | morpfw/authn/pas/user/rulesprovider.py | morpframework/morpfw | b867e5809d6c52e8839586670a29fcd179ce64c7 | [
"Apache-2.0"
] | 17 | 2019-02-05T15:01:32.000Z | 2020-04-28T16:17:42.000Z | morpfw/authn/pas/user/rulesprovider.py | morpframework/morpfw | b867e5809d6c52e8839586670a29fcd179ce64c7 | [
"Apache-2.0"
] | 2 | 2018-12-08T05:03:37.000Z | 2019-03-20T07:15:21.000Z | from ....crud.rulesprovider.base import RulesProvider
from .. import exc
from ..app import App
from ..utils import has_role
from .model import UserCollection, UserModel
class UserRulesProvider(RulesProvider):
context: UserModel
def change_password(self, password: str, new_password: str, secure: bool = True):
context = self.context
if secure and not has_role(self.request, "administrator"):
if not context.validate(password, check_state=False):
raise exc.InvalidPasswordError(context.userid)
context.storage.change_password(context, context.identity.userid, new_password)
def validate(self, password: str, check_state=True) -> bool:
context = self.context
if check_state and context.data["state"] != "active":
return False
return context.storage.validate(context, context.userid, password)
@App.rulesprovider(model=UserModel)
def get_user_rulesprovider(context):
return UserRulesProvider(context)
| 34.793103 | 87 | 0.718533 | 117 | 1,009 | 6.102564 | 0.376068 | 0.046218 | 0.042017 | 0.056022 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.189296 | 1,009 | 28 | 88 | 36.035714 | 0.872861 | 0 | 0 | 0.095238 | 0 | 0 | 0.023786 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0.285714 | 0.238095 | 0.047619 | 0.619048 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
c01048af256422693f245a8c170084866f81cf42 | 1,733 | py | Python | bin/plpproject.py | stefanct/pulp-tools | 63a05d59908534ad01133d0111e181fa69d00ce3 | [
"Apache-2.0"
] | 2 | 2018-02-09T08:12:34.000Z | 2020-06-16T17:45:33.000Z | bin/plpproject.py | stefanct/pulp-tools | 63a05d59908534ad01133d0111e181fa69d00ce3 | [
"Apache-2.0"
] | 2 | 2018-02-09T07:54:32.000Z | 2018-03-09T08:51:31.000Z | bin/plpproject.py | stefanct/pulp-tools | 63a05d59908534ad01133d0111e181fa69d00ce3 | [
"Apache-2.0"
] | 6 | 2018-03-08T11:12:22.000Z | 2019-12-05T12:36:47.000Z |
#
# Copyright (C) 2018 ETH Zurich and University of Bologna
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import plptools as plp
class PkgDep(plp.PkgDep):
def __init__(self, *kargs, **kwargs):
super(PkgDep, self).__init__(*kargs, **kwargs)
class Package(plp.Package):
def __init__(self, *kargs, **kwargs):
super(Package, self).__init__(*kargs, **kwargs)
class ArtifactoryServer(object):
def __init__(self, name, url, ssl_verify=True):
self.name = name
self.url = url
self.ssl_verify = ssl_verify
class Module(plp.Module):
def __init__(self, *kargs, **kwargs):
super(Module, self).__init__(*kargs, **kwargs)
class BuildStep(object):
def __init__(self, name, command):
self.name = name
self.command = command
class Group(plp.Group):
def __init__(self, *kargs, **kwargs):
super(Group, self).__init__(*kargs, **kwargs)
class BuildStepMap(object):
def __init__(self, name, stepList):
self.name = name
self.stepList = stepList
class BuildSteps(object):
def __init__(self, stepList):
self.stepList = stepList
self.steps = {}
for step in stepList:
self.steps[step.name] = step
def get(self, name): return self.steps.get(name).stepList
| 23.739726 | 74 | 0.705713 | 240 | 1,733 | 4.883333 | 0.4 | 0.047782 | 0.075085 | 0.054608 | 0.227816 | 0.09215 | 0 | 0 | 0 | 0 | 0 | 0.005634 | 0.180612 | 1,733 | 72 | 75 | 24.069444 | 0.819718 | 0.332949 | 0 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.030303 | 0.030303 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
c01466f2b1b58f8291be4e054c30cb12aa407427 | 326 | py | Python | string_30.py | Technicoryx/python_strings_inbuilt_functions | 78892d043c6c6d65affe3bd4906ba0162c5d6604 | [
"MIT"
] | null | null | null | string_30.py | Technicoryx/python_strings_inbuilt_functions | 78892d043c6c6d65affe3bd4906ba0162c5d6604 | [
"MIT"
] | null | null | null | string_30.py | Technicoryx/python_strings_inbuilt_functions | 78892d043c6c6d65affe3bd4906ba0162c5d6604 | [
"MIT"
] | null | null | null | """Below Python program demonstrates the rpartition
function of a string"""
string = "Python is fun"
# 'is' separator is found
print(string.rpartition('is '))
# 'not' separator is not found
print(string.rpartition('not '))
string = "Python is fun, isn't it"
# splits at the last occurrence of 'is'
print(string.rpartition('is'))
| 21.733333 | 48 | 0.717791 | 47 | 326 | 4.978723 | 0.489362 | 0.141026 | 0.269231 | 0.145299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.147239 | 326 | 14 | 49 | 23.285714 | 0.841727 | 0.472393 | 0 | 0.4 | 0 | 0 | 0.27439 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.6 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
c014e0fef503433734848ae3b6b9307338d2ae08 | 4,583 | py | Python | env/lib/python3.8/site-packages/unidecode/x054.py | avdhari/enigma | b7e965a91ca5f0e929c4c719d695f15ccb8b5a2c | [
"MIT"
] | 48 | 2021-11-20T08:17:53.000Z | 2022-03-19T13:57:15.000Z | venv/lib/python3.6/site-packages/unidecode/x054.py | mrsaicharan1/iiita-updates | a22a0157b90d29b946d0f020e5f76744f73a6bff | [
"Apache-2.0"
] | 392 | 2015-07-30T14:37:05.000Z | 2022-03-21T16:56:09.000Z | venv/lib/python3.6/site-packages/unidecode/x054.py | mrsaicharan1/iiita-updates | a22a0157b90d29b946d0f020e5f76744f73a6bff | [
"Apache-2.0"
] | 15 | 2015-10-01T21:31:08.000Z | 2020-05-05T00:03:27.000Z | data = (
'Mie ', # 0x00
'Xu ', # 0x01
'Mang ', # 0x02
'Chi ', # 0x03
'Ge ', # 0x04
'Xuan ', # 0x05
'Yao ', # 0x06
'Zi ', # 0x07
'He ', # 0x08
'Ji ', # 0x09
'Diao ', # 0x0a
'Cun ', # 0x0b
'Tong ', # 0x0c
'Ming ', # 0x0d
'Hou ', # 0x0e
'Li ', # 0x0f
'Tu ', # 0x10
'Xiang ', # 0x11
'Zha ', # 0x12
'Xia ', # 0x13
'Ye ', # 0x14
'Lu ', # 0x15
'A ', # 0x16
'Ma ', # 0x17
'Ou ', # 0x18
'Xue ', # 0x19
'Yi ', # 0x1a
'Jun ', # 0x1b
'Chou ', # 0x1c
'Lin ', # 0x1d
'Tun ', # 0x1e
'Yin ', # 0x1f
'Fei ', # 0x20
'Bi ', # 0x21
'Qin ', # 0x22
'Qin ', # 0x23
'Jie ', # 0x24
'Bu ', # 0x25
'Fou ', # 0x26
'Ba ', # 0x27
'Dun ', # 0x28
'Fen ', # 0x29
'E ', # 0x2a
'Han ', # 0x2b
'Ting ', # 0x2c
'Hang ', # 0x2d
'Shun ', # 0x2e
'Qi ', # 0x2f
'Hong ', # 0x30
'Zhi ', # 0x31
'Shen ', # 0x32
'Wu ', # 0x33
'Wu ', # 0x34
'Chao ', # 0x35
'Ne ', # 0x36
'Xue ', # 0x37
'Xi ', # 0x38
'Chui ', # 0x39
'Dou ', # 0x3a
'Wen ', # 0x3b
'Hou ', # 0x3c
'Ou ', # 0x3d
'Wu ', # 0x3e
'Gao ', # 0x3f
'Ya ', # 0x40
'Jun ', # 0x41
'Lu ', # 0x42
'E ', # 0x43
'Ge ', # 0x44
'Mei ', # 0x45
'Ai ', # 0x46
'Qi ', # 0x47
'Cheng ', # 0x48
'Wu ', # 0x49
'Gao ', # 0x4a
'Fu ', # 0x4b
'Jiao ', # 0x4c
'Hong ', # 0x4d
'Chi ', # 0x4e
'Sheng ', # 0x4f
'Ne ', # 0x50
'Tun ', # 0x51
'Fu ', # 0x52
'Yi ', # 0x53
'Dai ', # 0x54
'Ou ', # 0x55
'Li ', # 0x56
'Bai ', # 0x57
'Yuan ', # 0x58
'Kuai ', # 0x59
'[?] ', # 0x5a
'Qiang ', # 0x5b
'Wu ', # 0x5c
'E ', # 0x5d
'Shi ', # 0x5e
'Quan ', # 0x5f
'Pen ', # 0x60
'Wen ', # 0x61
'Ni ', # 0x62
'M ', # 0x63
'Ling ', # 0x64
'Ran ', # 0x65
'You ', # 0x66
'Di ', # 0x67
'Zhou ', # 0x68
'Shi ', # 0x69
'Zhou ', # 0x6a
'Tie ', # 0x6b
'Xi ', # 0x6c
'Yi ', # 0x6d
'Qi ', # 0x6e
'Ping ', # 0x6f
'Zi ', # 0x70
'Gu ', # 0x71
'Zi ', # 0x72
'Wei ', # 0x73
'Xu ', # 0x74
'He ', # 0x75
'Nao ', # 0x76
'Xia ', # 0x77
'Pei ', # 0x78
'Yi ', # 0x79
'Xiao ', # 0x7a
'Shen ', # 0x7b
'Hu ', # 0x7c
'Ming ', # 0x7d
'Da ', # 0x7e
'Qu ', # 0x7f
'Ju ', # 0x80
'Gem ', # 0x81
'Za ', # 0x82
'Tuo ', # 0x83
'Duo ', # 0x84
'Pou ', # 0x85
'Pao ', # 0x86
'Bi ', # 0x87
'Fu ', # 0x88
'Yang ', # 0x89
'He ', # 0x8a
'Zha ', # 0x8b
'He ', # 0x8c
'Hai ', # 0x8d
'Jiu ', # 0x8e
'Yong ', # 0x8f
'Fu ', # 0x90
'Que ', # 0x91
'Zhou ', # 0x92
'Wa ', # 0x93
'Ka ', # 0x94
'Gu ', # 0x95
'Ka ', # 0x96
'Zuo ', # 0x97
'Bu ', # 0x98
'Long ', # 0x99
'Dong ', # 0x9a
'Ning ', # 0x9b
'Tha ', # 0x9c
'Si ', # 0x9d
'Xian ', # 0x9e
'Huo ', # 0x9f
'Qi ', # 0xa0
'Er ', # 0xa1
'E ', # 0xa2
'Guang ', # 0xa3
'Zha ', # 0xa4
'Xi ', # 0xa5
'Yi ', # 0xa6
'Lie ', # 0xa7
'Zi ', # 0xa8
'Mie ', # 0xa9
'Mi ', # 0xaa
'Zhi ', # 0xab
'Yao ', # 0xac
'Ji ', # 0xad
'Zhou ', # 0xae
'Ge ', # 0xaf
'Shuai ', # 0xb0
'Zan ', # 0xb1
'Xiao ', # 0xb2
'Ke ', # 0xb3
'Hui ', # 0xb4
'Kua ', # 0xb5
'Huai ', # 0xb6
'Tao ', # 0xb7
'Xian ', # 0xb8
'E ', # 0xb9
'Xuan ', # 0xba
'Xiu ', # 0xbb
'Wai ', # 0xbc
'Yan ', # 0xbd
'Lao ', # 0xbe
'Yi ', # 0xbf
'Ai ', # 0xc0
'Pin ', # 0xc1
'Shen ', # 0xc2
'Tong ', # 0xc3
'Hong ', # 0xc4
'Xiong ', # 0xc5
'Chi ', # 0xc6
'Wa ', # 0xc7
'Ha ', # 0xc8
'Zai ', # 0xc9
'Yu ', # 0xca
'Di ', # 0xcb
'Pai ', # 0xcc
'Xiang ', # 0xcd
'Ai ', # 0xce
'Hen ', # 0xcf
'Kuang ', # 0xd0
'Ya ', # 0xd1
'Da ', # 0xd2
'Xiao ', # 0xd3
'Bi ', # 0xd4
'Yue ', # 0xd5
'[?] ', # 0xd6
'Hua ', # 0xd7
'Sasou ', # 0xd8
'Kuai ', # 0xd9
'Duo ', # 0xda
'[?] ', # 0xdb
'Ji ', # 0xdc
'Nong ', # 0xdd
'Mou ', # 0xde
'Yo ', # 0xdf
'Hao ', # 0xe0
'Yuan ', # 0xe1
'Long ', # 0xe2
'Pou ', # 0xe3
'Mang ', # 0xe4
'Ge ', # 0xe5
'E ', # 0xe6
'Chi ', # 0xe7
'Shao ', # 0xe8
'Li ', # 0xe9
'Na ', # 0xea
'Zu ', # 0xeb
'He ', # 0xec
'Ku ', # 0xed
'Xiao ', # 0xee
'Xian ', # 0xef
'Lao ', # 0xf0
'Bo ', # 0xf1
'Zhe ', # 0xf2
'Zha ', # 0xf3
'Liang ', # 0xf4
'Ba ', # 0xf5
'Mie ', # 0xf6
'Le ', # 0xf7
'Sui ', # 0xf8
'Fou ', # 0xf9
'Bu ', # 0xfa
'Han ', # 0xfb
'Heng ', # 0xfc
'Geng ', # 0xfd
'Shuo ', # 0xfe
'Ge ', # 0xff
)
# --- cnn/test2.py (repo: INFINITSY/darts, commit 684f97e, license: Apache-2.0) ---
import matplotlib.pyplot as plt
import numpy as np
# darts_025 = [0, 0, 0, 0, 2, 5, 6, 7, 8]
darts_025 = [0, 0, 0, 2, 3, 5, 7, 8]
darts_05 = [0, 0, 3, 3, 4, 4, 5, 7, 7]
adas_025_9 = [0, 0, 0, 0, 3, 5, 7]
adas_05_9 = [0, 0, 1, 4, 5, 6, 6, 7, 7, 7, 7]
adas_05_95 = []
adas_05_97 = [0, 0, 0, 2, 4, 4, 4, 4, 4, 6, 8]
mile = [0, 0, 0, 2, 4, 4, 4, 3, 4, 4, 4]
mile_adas_025_9 = [0, 0, 0, 0, 3, 4, 5, 5, 6, 6, 6]
mile_adas_05_9 = [0, 0, 0, 3, 4, 5, 5, 5, 5, 6, 6]
mile_adas_05_95 = [0, 0, 0, 0, 1, 1, 5, 5, 6, 6, 6]
mile_adas_05_97 = [0, 0, 0, 0, 0, 3, 3, 4, 4, 4, 4]
plt.plot(range(0, 36, 5), darts_025, '-o', label='DARTS, lr: 0.025')
# plt.plot(range(0, 41, 5), darts_05, '-o', label='DARTS, lr: 0.05')
#
# # plt.plot(range(0, 31, 5), adas_025_9, '-o', label='DARTS+Adas, lr: 0.025, beta: 0.9')
# # plt.plot(range(0, 51, 5), adas_05_9, '-o', label='DARTS+Adas, lr: 0.05, beta: 0.9')
# # plt.plot(range(0, 51, 5), adas_05_97, '-o', label='DARTS+Adas, lr: 0.05, beta: 0.97')
plt.plot(range(0, 51, 5), mile, '--o', label='MiLeNAS, lr: 0.025')
plt.plot(range(0, 51, 5), mile_adas_025_9, '--o', label='MiLeNAS+Adas, lr: 0.025, beta: 0.9')
plt.plot(range(0, 51, 5), mile_adas_05_9, '--o', label='MiLeNAS+Adas, lr: 0.05, beta: 0.9')
plt.plot(range(0, 51, 5), mile_adas_05_95, '--o', label='MiLeNAS+Adas, lr: 0.05, beta: 0.95')
plt.plot(range(0, 51, 5), mile_adas_05_97, '--o', linewidth=3.0, label='MiLeNAS+Adas, lr: 0.05, beta: 0.97')
plt.xlabel('Epoch')
plt.ylabel('#Skip-connection')
plt.legend()
plt.show()
# --- carbon0/carbon_quiz/migrations/0010_auto_20200909_0853.py (repo: Carbon0-Games/carbon0-web-app, license: MIT) ---
# Generated by Django 3.1.1 on 2020-09-09 12:53
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
("carbon_quiz", "0009_auto_20200908_2201"),
]
operations = [
migrations.RemoveField(
model_name="mission",
name="description",
),
migrations.RemoveField(
model_name="mission",
name="status",
),
migrations.AddField(
model_name="mission",
name="action",
field=models.CharField(
help_text="Describes what the user needs to do.",
max_length=500,
null=True,
),
),
migrations.AddField(
model_name="mission",
name="clicks_needed",
field=models.IntegerField(
default=1, help_text="Number of the links user needs to click."
),
),
]
# --- msgraph-cli-extensions/v1_0/sites_v1_0/azext_sites_v1_0/vendored_sdks/sites/models/_sites_enums.py
# --- (repo: thewahome/msgraph-cli, license: MIT) ---
# coding=utf-8
# --------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for license information.
# Code generated by Microsoft (R) AutoRest Code Generator.
# Changes may cause incorrect behavior and will be lost if the code is regenerated.
# --------------------------------------------------------------------------
from enum import Enum, EnumMeta
from six import with_metaclass
class _CaseInsensitiveEnumMeta(EnumMeta):
def __getitem__(self, name):
return super().__getitem__(name.upper())
def __getattr__(cls, name):
"""Return the enum member matching `name`
We use __getattr__ instead of descriptors or inserting into the enum
class' __dict__ in order to support `name` and `value` being both
properties for enum members (which live in the class' __dict__) and
enum members themselves.
"""
try:
return cls._member_map_[name.upper()]
except KeyError:
raise AttributeError(name)
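The metaclass above is what makes every generated enum in this module tolerant of member-name casing, for both item access and attribute access. A minimal, self-contained sketch of that behavior follows; the `DemoSelect` enum is illustrative only (not part of the generated module), and `metaclass=` is used here in place of `six.with_metaclass` purely for brevity:

```python
from enum import Enum, EnumMeta


class CaseInsensitiveEnumMeta(EnumMeta):
    def __getitem__(cls, name):
        # Normalize item lookups: Enum["web_url"] -> Enum["WEB_URL"]
        return super().__getitem__(name.upper())

    def __getattr__(cls, name):
        # Called only when normal attribute lookup fails, e.g. Enum.web_url;
        # fall back to the upper-cased member name.
        try:
            return cls._member_map_[name.upper()]
        except KeyError:
            raise AttributeError(name)


class DemoSelect(str, Enum, metaclass=CaseInsensitiveEnumMeta):
    ID = "id"
    WEB_URL = "webUrl"


# Item and attribute lookups ignore case:
assert DemoSelect["web_url"] is DemoSelect.WEB_URL
assert DemoSelect.id is DemoSelect.ID
# The str mixin keeps members comparable to their raw values:
assert DemoSelect.WEB_URL == "webUrl"
```

Presumably this exists so that CLI-supplied `$select`/`$orderby` values resolve to the right member regardless of how the user cases them.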
class Enum100(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
DESCRIPTION = "description"
GROUP = "group"
HIDDEN = "hidden"
INHERITED_FROM = "inheritedFrom"
NAME = "name"
ORDER = "order"
PARENT_ID = "parentId"
READ_ONLY = "readOnly"
SEALED = "sealed"
COLUMN_LINKS = "columnLinks"
class Enum101(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
COLUMN_LINKS = "columnLinks"
class Enum102(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
NAME = "name"
NAME_DESC = "name desc"
class Enum103(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
NAME = "name"
class Enum104(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
NAME = "name"
class Enum105(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DRIVE_TYPE = "driveType"
OWNER = "owner"
QUOTA = "quota"
SHARE_POINT_IDS = "sharePointIds"
SYSTEM = "system"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
FOLLOWING = "following"
ITEMS = "items"
LIST = "list"
ROOT = "root"
SPECIAL = "special"
class Enum106(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
FOLLOWING = "following"
ITEMS = "items"
LIST = "list"
ROOT = "root"
SPECIAL = "special"
class Enum107(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
CREATED_BY = "createdBy"
CREATED_BY_DESC = "createdBy desc"
CREATED_DATE_TIME = "createdDateTime"
CREATED_DATE_TIME_DESC = "createdDateTime desc"
DESCRIPTION = "description"
DESCRIPTION_DESC = "description desc"
E_TAG = "eTag"
E_TAG_DESC = "eTag desc"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
NAME = "name"
NAME_DESC = "name desc"
PARENT_REFERENCE = "parentReference"
PARENT_REFERENCE_DESC = "parentReference desc"
WEB_URL = "webUrl"
WEB_URL_DESC = "webUrl desc"
CONTENT_TYPE = "contentType"
CONTENT_TYPE_DESC = "contentType desc"
SHAREPOINT_IDS = "sharepointIds"
SHAREPOINT_IDS_DESC = "sharepointIds desc"
class Enum108(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
CONTENT_TYPE = "contentType"
SHAREPOINT_IDS = "sharepointIds"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
DRIVE_ITEM = "driveItem"
FIELDS = "fields"
VERSIONS = "versions"
class Enum109(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
DRIVE_ITEM = "driveItem"
FIELDS = "fields"
VERSIONS = "versions"
class Enum110(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
CONTENT_TYPE = "contentType"
SHAREPOINT_IDS = "sharepointIds"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
DRIVE_ITEM = "driveItem"
FIELDS = "fields"
VERSIONS = "versions"
class Enum111(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
DRIVE_ITEM = "driveItem"
FIELDS = "fields"
VERSIONS = "versions"
class Enum112(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ALL_TIME = "allTime"
ITEM_ACTIVITY_STATS = "itemActivityStats"
LAST_SEVEN_DAYS = "lastSevenDays"
class Enum113(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
ALL_TIME = "allTime"
ITEM_ACTIVITY_STATS = "itemActivityStats"
LAST_SEVEN_DAYS = "lastSevenDays"
class Enum114(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
AUDIO = "audio"
CONTENT = "content"
C_TAG = "cTag"
DELETED = "deleted"
FILE = "file"
FILE_SYSTEM_INFO = "fileSystemInfo"
FOLDER = "folder"
IMAGE = "image"
LOCATION = "location"
PACKAGE = "package"
PENDING_OPERATIONS = "pendingOperations"
PHOTO = "photo"
PUBLICATION = "publication"
REMOTE_ITEM = "remoteItem"
ROOT = "root"
SEARCH_RESULT = "searchResult"
SHARED = "shared"
SHAREPOINT_IDS = "sharepointIds"
SIZE = "size"
SPECIAL_FOLDER = "specialFolder"
VIDEO = "video"
WEB_DAV_URL = "webDavUrl"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
WORKBOOK = "workbook"
ANALYTICS = "analytics"
CHILDREN = "children"
LIST_ITEM = "listItem"
PERMISSIONS = "permissions"
SUBSCRIPTIONS = "subscriptions"
THUMBNAILS = "thumbnails"
VERSIONS = "versions"
class Enum115(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
WORKBOOK = "workbook"
ANALYTICS = "analytics"
CHILDREN = "children"
LIST_ITEM = "listItem"
PERMISSIONS = "permissions"
SUBSCRIPTIONS = "subscriptions"
THUMBNAILS = "thumbnails"
VERSIONS = "versions"
class Enum116(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
PUBLICATION = "publication"
PUBLICATION_DESC = "publication desc"
class Enum117(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
PUBLICATION = "publication"
FIELDS = "fields"
class Enum118(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
FIELDS = "fields"
class Enum119(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
PUBLICATION = "publication"
FIELDS = "fields"
class Enum120(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
FIELDS = "fields"
class Enum121(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
APPLICATION_ID = "applicationId"
APPLICATION_ID_DESC = "applicationId desc"
CHANGE_TYPE = "changeType"
CHANGE_TYPE_DESC = "changeType desc"
CLIENT_STATE = "clientState"
CLIENT_STATE_DESC = "clientState desc"
CREATOR_ID = "creatorId"
CREATOR_ID_DESC = "creatorId desc"
ENCRYPTION_CERTIFICATE = "encryptionCertificate"
ENCRYPTION_CERTIFICATE_DESC = "encryptionCertificate desc"
ENCRYPTION_CERTIFICATE_ID = "encryptionCertificateId"
ENCRYPTION_CERTIFICATE_ID_DESC = "encryptionCertificateId desc"
EXPIRATION_DATE_TIME = "expirationDateTime"
EXPIRATION_DATE_TIME_DESC = "expirationDateTime desc"
INCLUDE_RESOURCE_DATA = "includeResourceData"
INCLUDE_RESOURCE_DATA_DESC = "includeResourceData desc"
LATEST_SUPPORTED_TLS_VERSION = "latestSupportedTlsVersion"
LATEST_SUPPORTED_TLS_VERSION_DESC = "latestSupportedTlsVersion desc"
LIFECYCLE_NOTIFICATION_URL = "lifecycleNotificationUrl"
LIFECYCLE_NOTIFICATION_URL_DESC = "lifecycleNotificationUrl desc"
NOTIFICATION_URL = "notificationUrl"
NOTIFICATION_URL_DESC = "notificationUrl desc"
RESOURCE = "resource"
RESOURCE_DESC = "resource desc"
class Enum122(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
APPLICATION_ID = "applicationId"
CHANGE_TYPE = "changeType"
CLIENT_STATE = "clientState"
CREATOR_ID = "creatorId"
ENCRYPTION_CERTIFICATE = "encryptionCertificate"
ENCRYPTION_CERTIFICATE_ID = "encryptionCertificateId"
EXPIRATION_DATE_TIME = "expirationDateTime"
INCLUDE_RESOURCE_DATA = "includeResourceData"
LATEST_SUPPORTED_TLS_VERSION = "latestSupportedTlsVersion"
LIFECYCLE_NOTIFICATION_URL = "lifecycleNotificationUrl"
NOTIFICATION_URL = "notificationUrl"
RESOURCE = "resource"
class Enum123(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
APPLICATION_ID = "applicationId"
CHANGE_TYPE = "changeType"
CLIENT_STATE = "clientState"
CREATOR_ID = "creatorId"
ENCRYPTION_CERTIFICATE = "encryptionCertificate"
ENCRYPTION_CERTIFICATE_ID = "encryptionCertificateId"
EXPIRATION_DATE_TIME = "expirationDateTime"
INCLUDE_RESOURCE_DATA = "includeResourceData"
LATEST_SUPPORTED_TLS_VERSION = "latestSupportedTlsVersion"
LIFECYCLE_NOTIFICATION_URL = "lifecycleNotificationUrl"
NOTIFICATION_URL = "notificationUrl"
RESOURCE = "resource"
class Enum127(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
CREATED_BY = "createdBy"
CREATED_BY_DESC = "createdBy desc"
CREATED_DATE_TIME = "createdDateTime"
CREATED_DATE_TIME_DESC = "createdDateTime desc"
DESCRIPTION = "description"
DESCRIPTION_DESC = "description desc"
E_TAG = "eTag"
E_TAG_DESC = "eTag desc"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
NAME = "name"
NAME_DESC = "name desc"
PARENT_REFERENCE = "parentReference"
PARENT_REFERENCE_DESC = "parentReference desc"
WEB_URL = "webUrl"
WEB_URL_DESC = "webUrl desc"
DISPLAY_NAME = "displayName"
DISPLAY_NAME_DESC = "displayName desc"
ERROR = "error"
ERROR_DESC = "error desc"
ROOT = "root"
ROOT_DESC = "root desc"
SHAREPOINT_IDS = "sharepointIds"
SHAREPOINT_IDS_DESC = "sharepointIds desc"
SITE_COLLECTION = "siteCollection"
SITE_COLLECTION_DESC = "siteCollection desc"
class Enum128(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DISPLAY_NAME = "displayName"
ERROR = "error"
ROOT = "root"
SHAREPOINT_IDS = "sharepointIds"
SITE_COLLECTION = "siteCollection"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum129(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum130(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DISPLAY_NAME = "displayName"
ERROR = "error"
ROOT = "root"
SHAREPOINT_IDS = "sharepointIds"
SITE_COLLECTION = "siteCollection"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum131(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum132(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
CREATED_BY = "createdBy"
CREATED_BY_DESC = "createdBy desc"
CREATED_DATE_TIME = "createdDateTime"
CREATED_DATE_TIME_DESC = "createdDateTime desc"
DESCRIPTION = "description"
DESCRIPTION_DESC = "description desc"
E_TAG = "eTag"
E_TAG_DESC = "eTag desc"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
NAME = "name"
NAME_DESC = "name desc"
PARENT_REFERENCE = "parentReference"
PARENT_REFERENCE_DESC = "parentReference desc"
WEB_URL = "webUrl"
WEB_URL_DESC = "webUrl desc"
DISPLAY_NAME = "displayName"
DISPLAY_NAME_DESC = "displayName desc"
ERROR = "error"
ERROR_DESC = "error desc"
ROOT = "root"
ROOT_DESC = "root desc"
SHAREPOINT_IDS = "sharepointIds"
SHAREPOINT_IDS_DESC = "sharepointIds desc"
SITE_COLLECTION = "siteCollection"
SITE_COLLECTION_DESC = "siteCollection desc"
class Enum133(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DISPLAY_NAME = "displayName"
ERROR = "error"
ROOT = "root"
SHAREPOINT_IDS = "sharepointIds"
SITE_COLLECTION = "siteCollection"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum134(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum135(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
CREATED_BY = "createdBy"
CREATED_BY_DESC = "createdBy desc"
CREATED_DATE_TIME = "createdDateTime"
CREATED_DATE_TIME_DESC = "createdDateTime desc"
DESCRIPTION = "description"
DESCRIPTION_DESC = "description desc"
E_TAG = "eTag"
E_TAG_DESC = "eTag desc"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
NAME = "name"
NAME_DESC = "name desc"
PARENT_REFERENCE = "parentReference"
PARENT_REFERENCE_DESC = "parentReference desc"
WEB_URL = "webUrl"
WEB_URL_DESC = "webUrl desc"
DISPLAY_NAME = "displayName"
DISPLAY_NAME_DESC = "displayName desc"
ERROR = "error"
ERROR_DESC = "error desc"
ROOT = "root"
ROOT_DESC = "root desc"
SHAREPOINT_IDS = "sharepointIds"
SHAREPOINT_IDS_DESC = "sharepointIds desc"
SITE_COLLECTION = "siteCollection"
SITE_COLLECTION_DESC = "siteCollection desc"
class Enum65(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DISPLAY_NAME = "displayName"
ERROR = "error"
ROOT = "root"
SHAREPOINT_IDS = "sharepointIds"
SITE_COLLECTION = "siteCollection"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum66(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum68(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
ANALYTICS = "analytics"
COLUMNS = "columns"
CONTENT_TYPES = "contentTypes"
DRIVE = "drive"
DRIVES = "drives"
ITEMS = "items"
LISTS = "lists"
SITES = "sites"
ONENOTE = "onenote"
class Enum69(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ALL_TIME = "allTime"
ITEM_ACTIVITY_STATS = "itemActivityStats"
LAST_SEVEN_DAYS = "lastSevenDays"
class Enum70(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
ALL_TIME = "allTime"
ITEM_ACTIVITY_STATS = "itemActivityStats"
LAST_SEVEN_DAYS = "lastSevenDays"
class Enum71(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
BOOLEAN = "boolean"
BOOLEAN_DESC = "boolean desc"
CALCULATED = "calculated"
CALCULATED_DESC = "calculated desc"
CHOICE = "choice"
CHOICE_DESC = "choice desc"
COLUMN_GROUP = "columnGroup"
COLUMN_GROUP_DESC = "columnGroup desc"
CURRENCY = "currency"
CURRENCY_DESC = "currency desc"
DATE_TIME = "dateTime"
DATE_TIME_DESC = "dateTime desc"
DEFAULT_VALUE = "defaultValue"
DEFAULT_VALUE_DESC = "defaultValue desc"
DESCRIPTION = "description"
DESCRIPTION_DESC = "description desc"
DISPLAY_NAME = "displayName"
DISPLAY_NAME_DESC = "displayName desc"
ENFORCE_UNIQUE_VALUES = "enforceUniqueValues"
ENFORCE_UNIQUE_VALUES_DESC = "enforceUniqueValues desc"
GEOLOCATION = "geolocation"
GEOLOCATION_DESC = "geolocation desc"
HIDDEN = "hidden"
HIDDEN_DESC = "hidden desc"
INDEXED = "indexed"
INDEXED_DESC = "indexed desc"
LOOKUP = "lookup"
LOOKUP_DESC = "lookup desc"
NAME = "name"
NAME_DESC = "name desc"
NUMBER = "number"
NUMBER_DESC = "number desc"
PERSON_OR_GROUP = "personOrGroup"
PERSON_OR_GROUP_DESC = "personOrGroup desc"
READ_ONLY = "readOnly"
READ_ONLY_DESC = "readOnly desc"
REQUIRED = "required"
REQUIRED_DESC = "required desc"
TEXT = "text"
TEXT_DESC = "text desc"
class Enum72(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
BOOLEAN = "boolean"
CALCULATED = "calculated"
CHOICE = "choice"
COLUMN_GROUP = "columnGroup"
CURRENCY = "currency"
DATE_TIME = "dateTime"
DEFAULT_VALUE = "defaultValue"
DESCRIPTION = "description"
DISPLAY_NAME = "displayName"
ENFORCE_UNIQUE_VALUES = "enforceUniqueValues"
GEOLOCATION = "geolocation"
HIDDEN = "hidden"
INDEXED = "indexed"
LOOKUP = "lookup"
NAME = "name"
NUMBER = "number"
PERSON_OR_GROUP = "personOrGroup"
READ_ONLY = "readOnly"
REQUIRED = "required"
TEXT = "text"
class Enum73(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
BOOLEAN = "boolean"
CALCULATED = "calculated"
CHOICE = "choice"
COLUMN_GROUP = "columnGroup"
CURRENCY = "currency"
DATE_TIME = "dateTime"
DEFAULT_VALUE = "defaultValue"
DESCRIPTION = "description"
DISPLAY_NAME = "displayName"
ENFORCE_UNIQUE_VALUES = "enforceUniqueValues"
GEOLOCATION = "geolocation"
HIDDEN = "hidden"
INDEXED = "indexed"
LOOKUP = "lookup"
NAME = "name"
NUMBER = "number"
PERSON_OR_GROUP = "personOrGroup"
READ_ONLY = "readOnly"
REQUIRED = "required"
TEXT = "text"
class Enum74(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
DESCRIPTION = "description"
DESCRIPTION_DESC = "description desc"
GROUP = "group"
GROUP_DESC = "group desc"
HIDDEN = "hidden"
HIDDEN_DESC = "hidden desc"
INHERITED_FROM = "inheritedFrom"
INHERITED_FROM_DESC = "inheritedFrom desc"
NAME = "name"
NAME_DESC = "name desc"
ORDER = "order"
ORDER_DESC = "order desc"
PARENT_ID = "parentId"
PARENT_ID_DESC = "parentId desc"
READ_ONLY = "readOnly"
READ_ONLY_DESC = "readOnly desc"
SEALED = "sealed"
SEALED_DESC = "sealed desc"
class Enum75(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
DESCRIPTION = "description"
GROUP = "group"
HIDDEN = "hidden"
INHERITED_FROM = "inheritedFrom"
NAME = "name"
ORDER = "order"
PARENT_ID = "parentId"
READ_ONLY = "readOnly"
SEALED = "sealed"
COLUMN_LINKS = "columnLinks"
class Enum76(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
COLUMN_LINKS = "columnLinks"
class Enum77(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
DESCRIPTION = "description"
GROUP = "group"
HIDDEN = "hidden"
INHERITED_FROM = "inheritedFrom"
NAME = "name"
ORDER = "order"
PARENT_ID = "parentId"
READ_ONLY = "readOnly"
SEALED = "sealed"
COLUMN_LINKS = "columnLinks"
class Enum78(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
COLUMN_LINKS = "columnLinks"
class Enum79(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
NAME = "name"
NAME_DESC = "name desc"
class Enum80(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
NAME = "name"
class Enum81(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
NAME = "name"
class Enum82(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DRIVE_TYPE = "driveType"
OWNER = "owner"
QUOTA = "quota"
SHARE_POINT_IDS = "sharePointIds"
SYSTEM = "system"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
FOLLOWING = "following"
ITEMS = "items"
LIST = "list"
ROOT = "root"
SPECIAL = "special"
class Enum83(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
FOLLOWING = "following"
ITEMS = "items"
LIST = "list"
ROOT = "root"
SPECIAL = "special"
class Enum84(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
ID_DESC = "id desc"
CREATED_BY = "createdBy"
CREATED_BY_DESC = "createdBy desc"
CREATED_DATE_TIME = "createdDateTime"
CREATED_DATE_TIME_DESC = "createdDateTime desc"
DESCRIPTION = "description"
DESCRIPTION_DESC = "description desc"
E_TAG = "eTag"
E_TAG_DESC = "eTag desc"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
NAME = "name"
NAME_DESC = "name desc"
PARENT_REFERENCE = "parentReference"
PARENT_REFERENCE_DESC = "parentReference desc"
WEB_URL = "webUrl"
WEB_URL_DESC = "webUrl desc"
DRIVE_TYPE = "driveType"
DRIVE_TYPE_DESC = "driveType desc"
OWNER = "owner"
OWNER_DESC = "owner desc"
QUOTA = "quota"
QUOTA_DESC = "quota desc"
SHARE_POINT_IDS = "sharePointIds"
SHARE_POINT_IDS_DESC = "sharePointIds desc"
SYSTEM = "system"
SYSTEM_DESC = "system desc"
class Enum85(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DRIVE_TYPE = "driveType"
OWNER = "owner"
QUOTA = "quota"
SHARE_POINT_IDS = "sharePointIds"
SYSTEM = "system"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
FOLLOWING = "following"
ITEMS = "items"
LIST = "list"
ROOT = "root"
SPECIAL = "special"
class Enum86(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ASTERISK = "*"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
FOLLOWING = "following"
ITEMS = "items"
LIST = "list"
ROOT = "root"
SPECIAL = "special"
class Enum87(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
ID = "id"
CREATED_BY = "createdBy"
CREATED_DATE_TIME = "createdDateTime"
DESCRIPTION = "description"
E_TAG = "eTag"
LAST_MODIFIED_BY = "lastModifiedBy"
LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
NAME = "name"
PARENT_REFERENCE = "parentReference"
WEB_URL = "webUrl"
DRIVE_TYPE = "driveType"
OWNER = "owner"
QUOTA = "quota"
SHARE_POINT_IDS = "sharePointIds"
SYSTEM = "system"
CREATED_BY_USER = "createdByUser"
LAST_MODIFIED_BY_USER = "lastModifiedByUser"
FOLLOWING = "following"
ITEMS = "items"
LIST = "list"
ROOT = "root"
SPECIAL = "special"
class Enum88(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ASTERISK = "*"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    FOLLOWING = "following"
    ITEMS = "items"
    LIST = "list"
    ROOT = "root"
    SPECIAL = "special"
class Enum89(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    ID_DESC = "id desc"
    CREATED_BY = "createdBy"
    CREATED_BY_DESC = "createdBy desc"
    CREATED_DATE_TIME = "createdDateTime"
    CREATED_DATE_TIME_DESC = "createdDateTime desc"
    DESCRIPTION = "description"
    DESCRIPTION_DESC = "description desc"
    E_TAG = "eTag"
    E_TAG_DESC = "eTag desc"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
    NAME = "name"
    NAME_DESC = "name desc"
    PARENT_REFERENCE = "parentReference"
    PARENT_REFERENCE_DESC = "parentReference desc"
    WEB_URL = "webUrl"
    WEB_URL_DESC = "webUrl desc"
    DISPLAY_NAME = "displayName"
    DISPLAY_NAME_DESC = "displayName desc"
    LIST = "list"
    LIST_DESC = "list desc"
    SHAREPOINT_IDS = "sharepointIds"
    SHAREPOINT_IDS_DESC = "sharepointIds desc"
    SYSTEM = "system"
    SYSTEM_DESC = "system desc"
class Enum90(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    CREATED_BY = "createdBy"
    CREATED_DATE_TIME = "createdDateTime"
    DESCRIPTION = "description"
    E_TAG = "eTag"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    NAME = "name"
    PARENT_REFERENCE = "parentReference"
    WEB_URL = "webUrl"
    DISPLAY_NAME = "displayName"
    LIST = "list"
    SHAREPOINT_IDS = "sharepointIds"
    SYSTEM = "system"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    ITEMS = "items"
    SUBSCRIPTIONS = "subscriptions"
class Enum91(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ASTERISK = "*"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    ITEMS = "items"
    SUBSCRIPTIONS = "subscriptions"
class Enum92(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    CREATED_BY = "createdBy"
    CREATED_DATE_TIME = "createdDateTime"
    DESCRIPTION = "description"
    E_TAG = "eTag"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    NAME = "name"
    PARENT_REFERENCE = "parentReference"
    WEB_URL = "webUrl"
    DISPLAY_NAME = "displayName"
    LIST = "list"
    SHAREPOINT_IDS = "sharepointIds"
    SYSTEM = "system"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    ITEMS = "items"
    SUBSCRIPTIONS = "subscriptions"
class Enum93(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ASTERISK = "*"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    ITEMS = "items"
    SUBSCRIPTIONS = "subscriptions"
class Enum94(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    ID_DESC = "id desc"
    BOOLEAN = "boolean"
    BOOLEAN_DESC = "boolean desc"
    CALCULATED = "calculated"
    CALCULATED_DESC = "calculated desc"
    CHOICE = "choice"
    CHOICE_DESC = "choice desc"
    COLUMN_GROUP = "columnGroup"
    COLUMN_GROUP_DESC = "columnGroup desc"
    CURRENCY = "currency"
    CURRENCY_DESC = "currency desc"
    DATE_TIME = "dateTime"
    DATE_TIME_DESC = "dateTime desc"
    DEFAULT_VALUE = "defaultValue"
    DEFAULT_VALUE_DESC = "defaultValue desc"
    DESCRIPTION = "description"
    DESCRIPTION_DESC = "description desc"
    DISPLAY_NAME = "displayName"
    DISPLAY_NAME_DESC = "displayName desc"
    ENFORCE_UNIQUE_VALUES = "enforceUniqueValues"
    ENFORCE_UNIQUE_VALUES_DESC = "enforceUniqueValues desc"
    GEOLOCATION = "geolocation"
    GEOLOCATION_DESC = "geolocation desc"
    HIDDEN = "hidden"
    HIDDEN_DESC = "hidden desc"
    INDEXED = "indexed"
    INDEXED_DESC = "indexed desc"
    LOOKUP = "lookup"
    LOOKUP_DESC = "lookup desc"
    NAME = "name"
    NAME_DESC = "name desc"
    NUMBER = "number"
    NUMBER_DESC = "number desc"
    PERSON_OR_GROUP = "personOrGroup"
    PERSON_OR_GROUP_DESC = "personOrGroup desc"
    READ_ONLY = "readOnly"
    READ_ONLY_DESC = "readOnly desc"
    REQUIRED = "required"
    REQUIRED_DESC = "required desc"
    TEXT = "text"
    TEXT_DESC = "text desc"
class Enum95(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    BOOLEAN = "boolean"
    CALCULATED = "calculated"
    CHOICE = "choice"
    COLUMN_GROUP = "columnGroup"
    CURRENCY = "currency"
    DATE_TIME = "dateTime"
    DEFAULT_VALUE = "defaultValue"
    DESCRIPTION = "description"
    DISPLAY_NAME = "displayName"
    ENFORCE_UNIQUE_VALUES = "enforceUniqueValues"
    GEOLOCATION = "geolocation"
    HIDDEN = "hidden"
    INDEXED = "indexed"
    LOOKUP = "lookup"
    NAME = "name"
    NUMBER = "number"
    PERSON_OR_GROUP = "personOrGroup"
    READ_ONLY = "readOnly"
    REQUIRED = "required"
    TEXT = "text"
class Enum96(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    BOOLEAN = "boolean"
    CALCULATED = "calculated"
    CHOICE = "choice"
    COLUMN_GROUP = "columnGroup"
    CURRENCY = "currency"
    DATE_TIME = "dateTime"
    DEFAULT_VALUE = "defaultValue"
    DESCRIPTION = "description"
    DISPLAY_NAME = "displayName"
    ENFORCE_UNIQUE_VALUES = "enforceUniqueValues"
    GEOLOCATION = "geolocation"
    HIDDEN = "hidden"
    INDEXED = "indexed"
    LOOKUP = "lookup"
    NAME = "name"
    NUMBER = "number"
    PERSON_OR_GROUP = "personOrGroup"
    READ_ONLY = "readOnly"
    REQUIRED = "required"
    TEXT = "text"
class Enum97(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    ID_DESC = "id desc"
    DESCRIPTION = "description"
    DESCRIPTION_DESC = "description desc"
    GROUP = "group"
    GROUP_DESC = "group desc"
    HIDDEN = "hidden"
    HIDDEN_DESC = "hidden desc"
    INHERITED_FROM = "inheritedFrom"
    INHERITED_FROM_DESC = "inheritedFrom desc"
    NAME = "name"
    NAME_DESC = "name desc"
    ORDER = "order"
    ORDER_DESC = "order desc"
    PARENT_ID = "parentId"
    PARENT_ID_DESC = "parentId desc"
    READ_ONLY = "readOnly"
    READ_ONLY_DESC = "readOnly desc"
    SEALED = "sealed"
    SEALED_DESC = "sealed desc"
class Enum98(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    DESCRIPTION = "description"
    GROUP = "group"
    HIDDEN = "hidden"
    INHERITED_FROM = "inheritedFrom"
    NAME = "name"
    ORDER = "order"
    PARENT_ID = "parentId"
    READ_ONLY = "readOnly"
    SEALED = "sealed"
    COLUMN_LINKS = "columnLinks"
class Enum99(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ASTERISK = "*"
    COLUMN_LINKS = "columnLinks"
class Get1ItemsItem(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    CREATED_BY = "createdBy"
    CREATED_DATE_TIME = "createdDateTime"
    DESCRIPTION = "description"
    E_TAG = "eTag"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    NAME = "name"
    PARENT_REFERENCE = "parentReference"
    WEB_URL = "webUrl"
    DISPLAY_NAME = "displayName"
    ERROR = "error"
    ROOT = "root"
    SHAREPOINT_IDS = "sharepointIds"
    SITE_COLLECTION = "siteCollection"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    ANALYTICS = "analytics"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    DRIVES = "drives"
    ITEMS = "items"
    LISTS = "lists"
    SITES = "sites"
    ONENOTE = "onenote"
class Get2ItemsItem(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    CREATED_BY = "createdBy"
    CREATED_DATE_TIME = "createdDateTime"
    DESCRIPTION = "description"
    E_TAG = "eTag"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    NAME = "name"
    PARENT_REFERENCE = "parentReference"
    WEB_URL = "webUrl"
    DISPLAY_NAME = "displayName"
    ERROR = "error"
    ROOT = "root"
    SHAREPOINT_IDS = "sharepointIds"
    SITE_COLLECTION = "siteCollection"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    ANALYTICS = "analytics"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    DRIVES = "drives"
    ITEMS = "items"
    LISTS = "lists"
    SITES = "sites"
    ONENOTE = "onenote"
class Get3ItemsItem(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ASTERISK = "*"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    ANALYTICS = "analytics"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    DRIVES = "drives"
    ITEMS = "items"
    LISTS = "lists"
    SITES = "sites"
    ONENOTE = "onenote"
class Get5ItemsItem(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    ID_DESC = "id desc"
    CREATED_BY = "createdBy"
    CREATED_BY_DESC = "createdBy desc"
    CREATED_DATE_TIME = "createdDateTime"
    CREATED_DATE_TIME_DESC = "createdDateTime desc"
    DESCRIPTION = "description"
    DESCRIPTION_DESC = "description desc"
    E_TAG = "eTag"
    E_TAG_DESC = "eTag desc"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
    NAME = "name"
    NAME_DESC = "name desc"
    PARENT_REFERENCE = "parentReference"
    PARENT_REFERENCE_DESC = "parentReference desc"
    WEB_URL = "webUrl"
    WEB_URL_DESC = "webUrl desc"
    DISPLAY_NAME = "displayName"
    DISPLAY_NAME_DESC = "displayName desc"
    ERROR = "error"
    ERROR_DESC = "error desc"
    ROOT = "root"
    ROOT_DESC = "root desc"
    SHAREPOINT_IDS = "sharepointIds"
    SHAREPOINT_IDS_DESC = "sharepointIds desc"
    SITE_COLLECTION = "siteCollection"
    SITE_COLLECTION_DESC = "siteCollection desc"
class Get6ItemsItem(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    ID_DESC = "id desc"
    CREATED_BY = "createdBy"
    CREATED_BY_DESC = "createdBy desc"
    CREATED_DATE_TIME = "createdDateTime"
    CREATED_DATE_TIME_DESC = "createdDateTime desc"
    DESCRIPTION = "description"
    DESCRIPTION_DESC = "description desc"
    E_TAG = "eTag"
    E_TAG_DESC = "eTag desc"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_BY_DESC = "lastModifiedBy desc"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    LAST_MODIFIED_DATE_TIME_DESC = "lastModifiedDateTime desc"
    NAME = "name"
    NAME_DESC = "name desc"
    PARENT_REFERENCE = "parentReference"
    PARENT_REFERENCE_DESC = "parentReference desc"
    WEB_URL = "webUrl"
    WEB_URL_DESC = "webUrl desc"
    DISPLAY_NAME = "displayName"
    DISPLAY_NAME_DESC = "displayName desc"
    ERROR = "error"
    ERROR_DESC = "error desc"
    ROOT = "root"
    ROOT_DESC = "root desc"
    SHAREPOINT_IDS = "sharepointIds"
    SHAREPOINT_IDS_DESC = "sharepointIds desc"
    SITE_COLLECTION = "siteCollection"
    SITE_COLLECTION_DESC = "siteCollection desc"
class Get7ItemsItem(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ID = "id"
    CREATED_BY = "createdBy"
    CREATED_DATE_TIME = "createdDateTime"
    DESCRIPTION = "description"
    E_TAG = "eTag"
    LAST_MODIFIED_BY = "lastModifiedBy"
    LAST_MODIFIED_DATE_TIME = "lastModifiedDateTime"
    NAME = "name"
    PARENT_REFERENCE = "parentReference"
    WEB_URL = "webUrl"
    DISPLAY_NAME = "displayName"
    ERROR = "error"
    ROOT = "root"
    SHAREPOINT_IDS = "sharepointIds"
    SITE_COLLECTION = "siteCollection"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    ANALYTICS = "analytics"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    DRIVES = "drives"
    ITEMS = "items"
    LISTS = "lists"
    SITES = "sites"
    ONENOTE = "onenote"
class Get8ItemsItem(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ASTERISK = "*"
    CREATED_BY_USER = "createdByUser"
    LAST_MODIFIED_BY_USER = "lastModifiedByUser"
    ANALYTICS = "analytics"
    COLUMNS = "columns"
    CONTENT_TYPES = "contentTypes"
    DRIVE = "drive"
    DRIVES = "drives"
    ITEMS = "items"
    LISTS = "lists"
    SITES = "sites"
    ONENOTE = "onenote"
class MicrosoftGraphActionState(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    PENDING = "pending"
    CANCELED = "canceled"
    ACTIVE = "active"
    DONE = "done"
    FAILED = "failed"
    NOT_SUPPORTED = "notSupported"
class MicrosoftGraphAttendeeType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    REQUIRED = "required"
    OPTIONAL = "optional"
    RESOURCE = "resource"
class MicrosoftGraphAutomaticRepliesStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    DISABLED = "disabled"
    ALWAYS_ENABLED = "alwaysEnabled"
    SCHEDULED = "scheduled"
class MicrosoftGraphBodyType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    TEXT = "text"
    HTML = "html"
class MicrosoftGraphCalendarColor(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    LIGHT_BLUE = "lightBlue"
    LIGHT_GREEN = "lightGreen"
    AUTO = "auto"
    LIGHT_ORANGE = "lightOrange"
    LIGHT_GRAY = "lightGray"
    LIGHT_YELLOW = "lightYellow"
    LIGHT_TEAL = "lightTeal"
    LIGHT_PINK = "lightPink"
    LIGHT_BROWN = "lightBrown"
    LIGHT_RED = "lightRed"
    MAX_COLOR = "maxColor"
class MicrosoftGraphCalendarRoleType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    FREE_BUSY_READ = "freeBusyRead"
    LIMITED_READ = "limitedRead"
    READ = "read"
    WRITE = "write"
    DELEGATE_WITHOUT_PRIVATE_EVENT_ACCESS = "delegateWithoutPrivateEventAccess"
    DELEGATE_WITH_PRIVATE_EVENT_ACCESS = "delegateWithPrivateEventAccess"
    CUSTOM = "custom"
class MicrosoftGraphCategoryColor(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    PRESET0 = "preset0"
    PRESET1 = "preset1"
    NONE = "none"
    PRESET2 = "preset2"
    PRESET3 = "preset3"
    PRESET4 = "preset4"
    PRESET5 = "preset5"
    PRESET6 = "preset6"
    PRESET7 = "preset7"
    PRESET8 = "preset8"
    PRESET9 = "preset9"
    PRESET10 = "preset10"
    PRESET11 = "preset11"
    PRESET12 = "preset12"
    PRESET13 = "preset13"
    PRESET14 = "preset14"
    PRESET15 = "preset15"
    PRESET16 = "preset16"
    PRESET17 = "preset17"
    PRESET18 = "preset18"
    PRESET19 = "preset19"
    PRESET20 = "preset20"
    PRESET21 = "preset21"
    PRESET22 = "preset22"
    PRESET23 = "preset23"
    PRESET24 = "preset24"
class MicrosoftGraphChannelMembershipType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    STANDARD = "standard"
    PRIVATE = "private"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphChatMessageImportance(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NORMAL = "normal"
    HIGH = "high"
    URGENT = "urgent"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphChatMessagePolicyViolationDlpActionTypes(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    NOTIFY_SENDER = "notifySender"
    BLOCK_ACCESS = "blockAccess"
    BLOCK_ACCESS_EXTERNAL = "blockAccessExternal"
class MicrosoftGraphChatMessagePolicyViolationUserActionTypes(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    OVERRIDE = "override"
    REPORT_FALSE_POSITIVE = "reportFalsePositive"
class MicrosoftGraphChatMessagePolicyViolationVerdictDetailsTypes(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    ALLOW_FALSE_POSITIVE_OVERRIDE = "allowFalsePositiveOverride"
    ALLOW_OVERRIDE_WITHOUT_JUSTIFICATION = "allowOverrideWithoutJustification"
    ALLOW_OVERRIDE_WITH_JUSTIFICATION = "allowOverrideWithJustification"
class MicrosoftGraphChatMessageType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    MESSAGE = "message"
    CHAT_EVENT = "chatEvent"
    TYPING = "typing"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphComplianceState(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "unknown"
    COMPLIANT = "compliant"
    NONCOMPLIANT = "noncompliant"
    CONFLICT = "conflict"
    ERROR = "error"
    IN_GRACE_PERIOD = "inGracePeriod"
    CONFIG_MANAGER = "configManager"
class MicrosoftGraphComplianceStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "unknown"
    NOT_APPLICABLE = "notApplicable"
    COMPLIANT = "compliant"
    REMEDIATED = "remediated"
    NON_COMPLIANT = "nonCompliant"
    ERROR = "error"
    CONFLICT = "conflict"
    NOT_ASSIGNED = "notAssigned"
class MicrosoftGraphDayOfWeek(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    SUNDAY = "sunday"
    MONDAY = "monday"
    TUESDAY = "tuesday"
    WEDNESDAY = "wednesday"
    THURSDAY = "thursday"
    FRIDAY = "friday"
    SATURDAY = "saturday"
class MicrosoftGraphDelegateMeetingMessageDeliveryOptions(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    SEND_TO_DELEGATE_AND_INFORMATION_TO_PRINCIPAL = "sendToDelegateAndInformationToPrincipal"
    SEND_TO_DELEGATE_AND_PRINCIPAL = "sendToDelegateAndPrincipal"
    SEND_TO_DELEGATE_ONLY = "sendToDelegateOnly"
class MicrosoftGraphDeviceEnrollmentType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "unknown"
    USER_ENROLLMENT = "userEnrollment"
    DEVICE_ENROLLMENT_MANAGER = "deviceEnrollmentManager"
    APPLE_BULK_WITH_USER = "appleBulkWithUser"
    APPLE_BULK_WITHOUT_USER = "appleBulkWithoutUser"
    WINDOWS_AZURE_AD_JOIN = "windowsAzureADJoin"
    WINDOWS_BULK_USERLESS = "windowsBulkUserless"
    WINDOWS_AUTO_ENROLLMENT = "windowsAutoEnrollment"
    WINDOWS_BULK_AZURE_DOMAIN_JOIN = "windowsBulkAzureDomainJoin"
    WINDOWS_CO_MANAGEMENT = "windowsCoManagement"
class MicrosoftGraphDeviceManagementExchangeAccessState(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    UNKNOWN = "unknown"
    ALLOWED = "allowed"
    BLOCKED = "blocked"
    QUARANTINED = "quarantined"
class MicrosoftGraphDeviceManagementExchangeAccessStateReason(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    UNKNOWN = "unknown"
    EXCHANGE_GLOBAL_RULE = "exchangeGlobalRule"
    EXCHANGE_INDIVIDUAL_RULE = "exchangeIndividualRule"
    EXCHANGE_DEVICE_RULE = "exchangeDeviceRule"
    EXCHANGE_UPGRADE = "exchangeUpgrade"
    EXCHANGE_MAILBOX_POLICY = "exchangeMailboxPolicy"
    OTHER = "other"
    COMPLIANT = "compliant"
    NOT_COMPLIANT = "notCompliant"
    NOT_ENROLLED = "notEnrolled"
    UNKNOWN_LOCATION = "unknownLocation"
    MFA_REQUIRED = "mfaRequired"
    AZURE_AD_BLOCK_DUE_TO_ACCESS_POLICY = "azureADBlockDueToAccessPolicy"
    COMPROMISED_PASSWORD = "compromisedPassword"
    DEVICE_NOT_KNOWN_WITH_MANAGED_APP = "deviceNotKnownWithManagedApp"
class MicrosoftGraphDeviceRegistrationState(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NOT_REGISTERED = "notRegistered"
    REGISTERED = "registered"
    REVOKED = "revoked"
    KEY_CONFLICT = "keyConflict"
    APPROVAL_PENDING = "approvalPending"
    CERTIFICATE_RESET = "certificateReset"
    NOT_REGISTERED_PENDING_ENROLLMENT = "notRegisteredPendingEnrollment"
    UNKNOWN = "unknown"
class MicrosoftGraphEventType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    SINGLE_INSTANCE = "singleInstance"
    OCCURRENCE = "occurrence"
    EXCEPTION = "exception"
    SERIES_MASTER = "seriesMaster"
class MicrosoftGraphExternalAudienceScope(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    CONTACTS_ONLY = "contactsOnly"
    ALL = "all"
class MicrosoftGraphFollowupFlagStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NOT_FLAGGED = "notFlagged"
    COMPLETE = "complete"
    FLAGGED = "flagged"
class MicrosoftGraphFreeBusyStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    FREE = "free"
    TENTATIVE = "tentative"
    UNKNOWN = "unknown"
    BUSY = "busy"
    OOF = "oof"
    WORKING_ELSEWHERE = "workingElsewhere"
class MicrosoftGraphGiphyRatingType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    STRICT = "strict"
    MODERATE = "moderate"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphImportance(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    LOW = "low"
    NORMAL = "normal"
    HIGH = "high"
class MicrosoftGraphInferenceClassificationType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    FOCUSED = "focused"
    OTHER = "other"
class MicrosoftGraphLocationType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    DEFAULT = "default"
    CONFERENCE_ROOM = "conferenceRoom"
    HOME_ADDRESS = "homeAddress"
    BUSINESS_ADDRESS = "businessAddress"
    GEO_COORDINATES = "geoCoordinates"
    STREET_ADDRESS = "streetAddress"
    HOTEL = "hotel"
    RESTAURANT = "restaurant"
    LOCAL_BUSINESS = "localBusiness"
    POSTAL_ADDRESS = "postalAddress"
class MicrosoftGraphLocationUniqueIdType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "unknown"
    LOCATION_STORE = "locationStore"
    DIRECTORY = "directory"
    PRIVATE = "private"
    BING = "bing"
class MicrosoftGraphManagedAppFlaggedReason(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    ROOTED_DEVICE = "rootedDevice"
class MicrosoftGraphManagedDeviceOwnerType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "unknown"
    COMPANY = "company"
    PERSONAL = "personal"
class MicrosoftGraphManagedDevicePartnerReportedHealthState(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "unknown"
    ACTIVATED = "activated"
    DEACTIVATED = "deactivated"
    SECURED = "secured"
    LOW_SEVERITY = "lowSeverity"
    MEDIUM_SEVERITY = "mediumSeverity"
    HIGH_SEVERITY = "highSeverity"
    UNRESPONSIVE = "unresponsive"
    COMPROMISED = "compromised"
    MISCONFIGURED = "misconfigured"
class MicrosoftGraphManagementAgentType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    EAS = "eas"
    MDM = "mdm"
    EAS_MDM = "easMdm"
    INTUNE_CLIENT = "intuneClient"
    EAS_INTUNE_CLIENT = "easIntuneClient"
    CONFIGURATION_MANAGER_CLIENT = "configurationManagerClient"
    CONFIGURATION_MANAGER_CLIENT_MDM = "configurationManagerClientMdm"
    CONFIGURATION_MANAGER_CLIENT_MDM_EAS = "configurationManagerClientMdmEas"
    UNKNOWN = "unknown"
    JAMF = "jamf"
    GOOGLE_CLOUD_DEVICE_POLICY_CONTROLLER = "googleCloudDevicePolicyController"
class MicrosoftGraphMessageActionFlag(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ANY = "any"
    CALL = "call"
    DO_NOT_FORWARD = "doNotForward"
    FOLLOW_UP = "followUp"
    FYI = "fyi"
    FORWARD = "forward"
    NO_RESPONSE_NECESSARY = "noResponseNecessary"
    READ = "read"
    REPLY = "reply"
    REPLY_TO_ALL = "replyToAll"
    REVIEW = "review"
class MicrosoftGraphOnenotePatchActionType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    REPLACE = "Replace"
    APPEND = "Append"
    DELETE = "Delete"
    INSERT = "Insert"
    PREPEND = "Prepend"
class MicrosoftGraphOnenotePatchInsertPosition(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    AFTER = "After"
    BEFORE = "Before"
class MicrosoftGraphOnenoteSourceService(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "Unknown"
    ONE_DRIVE = "OneDrive"
    ONE_DRIVE_FOR_BUSINESS = "OneDriveForBusiness"
    ON_PREM_ONE_DRIVE_FOR_BUSINESS = "OnPremOneDriveForBusiness"
class MicrosoftGraphOnenoteUserRole(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    OWNER = "Owner"
    CONTRIBUTOR = "Contributor"
    NONE = "None"
    READER = "Reader"
class MicrosoftGraphOnlineMeetingProviderType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    UNKNOWN = "unknown"
    SKYPE_FOR_BUSINESS = "skypeForBusiness"
    SKYPE_FOR_CONSUMER = "skypeForConsumer"
    TEAMS_FOR_BUSINESS = "teamsForBusiness"
class MicrosoftGraphOperationStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NOT_STARTED = "NotStarted"
    RUNNING = "Running"
    COMPLETED = "Completed"
    FAILED = "Failed"
class MicrosoftGraphPhoneType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    HOME = "home"
    BUSINESS = "business"
    MOBILE = "mobile"
    OTHER = "other"
    ASSISTANT = "assistant"
    HOME_FAX = "homeFax"
    BUSINESS_FAX = "businessFax"
    OTHER_FAX = "otherFax"
    PAGER = "pager"
    RADIO = "radio"
class MicrosoftGraphPlannerPreviewType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    AUTOMATIC = "automatic"
    NO_PREVIEW = "noPreview"
    CHECKLIST = "checklist"
    DESCRIPTION = "description"
    REFERENCE = "reference"
class MicrosoftGraphPolicyPlatformType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ANDROID = "android"
    I_OS = "iOS"
    MAC_OS = "macOS"
    WINDOWS_PHONE81 = "windowsPhone81"
    WINDOWS81_AND_LATER = "windows81AndLater"
    WINDOWS10_AND_LATER = "windows10AndLater"
    ANDROID_WORK_PROFILE = "androidWorkProfile"
    ALL = "all"
class MicrosoftGraphRecurrencePatternType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    DAILY = "daily"
    WEEKLY = "weekly"
    ABSOLUTE_MONTHLY = "absoluteMonthly"
    RELATIVE_MONTHLY = "relativeMonthly"
    ABSOLUTE_YEARLY = "absoluteYearly"
    RELATIVE_YEARLY = "relativeYearly"
class MicrosoftGraphRecurrenceRangeType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    END_DATE = "endDate"
    NO_END = "noEnd"
    NUMBERED = "numbered"
class MicrosoftGraphResponseType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    ORGANIZER = "organizer"
    TENTATIVELY_ACCEPTED = "tentativelyAccepted"
    ACCEPTED = "accepted"
    DECLINED = "declined"
    NOT_RESPONDED = "notResponded"
class MicrosoftGraphScheduleChangeRequestActor(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    SENDER = "sender"
    RECIPIENT = "recipient"
    MANAGER = "manager"
    SYSTEM = "system"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphScheduleChangeState(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    PENDING = "pending"
    APPROVED = "approved"
    DECLINED = "declined"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphScheduleEntityTheme(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    WHITE = "white"
    BLUE = "blue"
    GREEN = "green"
    PURPLE = "purple"
    PINK = "pink"
    YELLOW = "yellow"
    GRAY = "gray"
    DARK_BLUE = "darkBlue"
    DARK_GREEN = "darkGreen"
    DARK_PURPLE = "darkPurple"
    DARK_PINK = "darkPink"
    DARK_YELLOW = "darkYellow"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphSelectionLikelihoodInfo(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NOT_SPECIFIED = "notSpecified"
    HIGH = "high"
class MicrosoftGraphSensitivity(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NORMAL = "normal"
    PERSONAL = "personal"
    PRIVATE = "private"
    CONFIDENTIAL = "confidential"
class MicrosoftGraphStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    ACTIVE = "active"
    UPDATED = "updated"
    DELETED = "deleted"
    IGNORED = "ignored"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphTeamsAppDistributionMethod(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    STORE = "store"
    ORGANIZATION = "organization"
    SIDELOADED = "sideloaded"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphTeamsAsyncOperationStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    INVALID = "invalid"
    NOT_STARTED = "notStarted"
    IN_PROGRESS = "inProgress"
    SUCCEEDED = "succeeded"
    FAILED = "failed"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphTeamsAsyncOperationType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    INVALID = "invalid"
    CLONE_TEAM = "cloneTeam"
    ARCHIVE_TEAM = "archiveTeam"
    UNARCHIVE_TEAM = "unarchiveTeam"
    CREATE_TEAM = "createTeam"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphTeamSpecialization(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    EDUCATION_STANDARD = "educationStandard"
    EDUCATION_CLASS = "educationClass"
    EDUCATION_PROFESSIONAL_LEARNING_COMMUNITY = "educationProfessionalLearningCommunity"
    EDUCATION_STAFF = "educationStaff"
    HEALTHCARE_STANDARD = "healthcareStandard"
    HEALTHCARE_CARE_COORDINATION = "healthcareCareCoordination"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphTeamVisibilityType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    PRIVATE = "private"
    PUBLIC = "public"
    HIDDEN_MEMBERSHIP = "hiddenMembership"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphTimeOffReasonIconType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NONE = "none"
    CAR = "car"
    CALENDAR = "calendar"
    RUNNING = "running"
    PLANE = "plane"
    FIRST_AID = "firstAid"
    DOCTOR = "doctor"
    NOT_WORKING = "notWorking"
    CLOCK = "clock"
    JURY_DUTY = "juryDuty"
    GLOBE = "globe"
    CUP = "cup"
    PHONE = "phone"
    WEATHER = "weather"
    UMBRELLA = "umbrella"
    PIGGY_BANK = "piggyBank"
    DOG = "dog"
    CAKE = "cake"
    TRAFFIC_CONE = "trafficCone"
    PIN = "pin"
    SUNNY = "sunny"
    UNKNOWN_FUTURE_VALUE = "unknownFutureValue"
class MicrosoftGraphWebsiteType(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    OTHER = "other"
    HOME = "home"
    WORK = "work"
    BLOG = "blog"
    PROFILE = "profile"
class MicrosoftGraphWeekIndex(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    FIRST = "first"
    SECOND = "second"
    THIRD = "third"
    FOURTH = "fourth"
    LAST = "last"
class MicrosoftGraphWorkbookOperationStatus(with_metaclass(_CaseInsensitiveEnumMeta, str, Enum)):
    NOT_STARTED = "notStarted"
    RUNNING = "running"
    SUCCEEDED = "succeeded"
    FAILED = "failed"
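The generated enums above all use `_CaseInsensitiveEnumMeta` (imported elsewhere in this module) so that string lookups succeed regardless of casing. A minimal self-contained sketch of such a metaclass is shown below; `_CaseInsensitiveEnumMetaSketch` and `Color` are hypothetical illustration names, not the SDK's actual implementation.

```python
from enum import Enum, EnumMeta


class _CaseInsensitiveEnumMetaSketch(EnumMeta):
    """Makes member access and value lookup case-insensitive."""

    def __getitem__(cls, name):
        # Member names in the generated enums are upper-case.
        return super().__getitem__(name.upper())

    def __call__(cls, value, *args, **kwargs):
        try:
            return super().__call__(value, *args, **kwargs)
        except ValueError:
            # Fall back to a case-insensitive scan over member values.
            for member in cls:
                if member.value.lower() == str(value).lower():
                    return member
            raise


class Color(str, Enum, metaclass=_CaseInsensitiveEnumMetaSketch):
    LIGHT_BLUE = "lightBlue"
    LIGHT_GREEN = "lightGreen"
```

With this in place, `Color("LIGHTBLUE")` and `Color["light_blue"]` both resolve to `Color.LIGHT_BLUE`, which is convenient when enum values arrive from user input or loosely-cased REST payloads.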
# evntbus/decorators.py (jmwri/eventbus, MIT license)
import typing
if typing.TYPE_CHECKING:
    from evntbus.bus import Bus


def listen_decorator(evntbus: 'Bus'):
    class ListenDecorator(object):
        def __init__(self, event: typing.Type, priority: int = 5):
            self.event = event
            self.priority = priority

        def __call__(self, f: typing.Callable) -> typing.Callable:
            evntbus.listen(self.event, f, self.priority)
            return f

    return ListenDecorator
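The decorator factory above registers a handler on a bus and hands the function back unchanged. A sketch of how it is wired up follows; `StubBus`, `UserCreated`, and `on_user_created` are stand-in names for illustration (the real `Bus` lives in `evntbus.bus`, and only its `listen(event, handler, priority)` call is confirmed by the code above), with the factory restated so the example is self-contained.

```python
import typing


class StubBus:
    """Stand-in for evntbus.bus.Bus, recording registrations only."""

    def __init__(self):
        self.handlers = []

    def listen(self, event: typing.Type, handler: typing.Callable, priority: int):
        self.handlers.append((event, handler, priority))


def listen_decorator(evntbus):  # same shape as the factory defined above
    class ListenDecorator(object):
        def __init__(self, event: typing.Type, priority: int = 5):
            self.event = event
            self.priority = priority

        def __call__(self, f: typing.Callable) -> typing.Callable:
            evntbus.listen(self.event, f, self.priority)
            return f

    return ListenDecorator


bus = StubBus()
listen = listen_decorator(bus)


class UserCreated:
    pass


@listen(UserCreated, priority=1)
def on_user_created(event):
    return "handled"
```

Because `__call__` returns `f` untouched, the decorated function stays directly callable and can be stacked with other decorators; registration is purely a side effect at import time.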
# src/shortcircuit/model/crestprocessor.py (farshield/shortcircu, MIT license)
# crestprocessor.py
import threading

from PySide import QtCore

from crest.crest import Crest


class CrestProcessor(QtCore.QObject):
    """
    CREST Middle-ware
    """
    login_response = QtCore.Signal(str)
    logout_response = QtCore.Signal()
    location_response = QtCore.Signal(str)
    destination_response = QtCore.Signal(bool)

    def __init__(self, implicit, client_id, client_secret, parent=None):
        super(CrestProcessor, self).__init__(parent)
        self.crest = Crest(implicit, client_id, client_secret, self._login_callback, self._logout_callback)

    def login(self):
        return self.crest.start_server()

    def logout(self):
        self.crest.logout()

    def get_location(self):
        server_thread = threading.Thread(target=self._get_location)
        server_thread.setDaemon(True)
        server_thread.start()

    def _get_location(self):
        location = self.crest.get_char_location()
        self.location_response.emit(location)

    def set_destination(self, sys_id):
        server_thread = threading.Thread(target=self._set_destination, args=(sys_id, ))
        server_thread.setDaemon(True)
        server_thread.start()

    def _set_destination(self, sys_id):
        response = self.crest.set_char_destination(sys_id)
        self.destination_response.emit(response)

    def _login_callback(self, char_name):
        self.login_response.emit(char_name)

    def _logout_callback(self):
        self.logout_response.emit()
# OpenGLWrapper_JE/venv/Lib/site-packages/OpenGL/raw/GL/NV/geometry_program4.py (JE-Chen/je_old_repo, MIT license)
'''Autogenerated by xml_generate script, do not edit!'''
from OpenGL import platform as _p, arrays
# Code generation uses this
from OpenGL.raw.GL import _types as _cs
# End users want this...
from OpenGL.raw.GL._types import *
from OpenGL.raw.GL import _errors
from OpenGL.constant import Constant as _C
import ctypes
_EXTENSION_NAME = 'GL_NV_geometry_program4'
def _f( function ):
    return _p.createFunction( function,_p.PLATFORM.GL,'GL_NV_geometry_program4',error_checker=_errors._error_checker)
GL_FRAMEBUFFER_ATTACHMENT_LAYERED_EXT=_C('GL_FRAMEBUFFER_ATTACHMENT_LAYERED_EXT',0x8DA7)
GL_FRAMEBUFFER_ATTACHMENT_TEXTURE_LAYER_EXT=_C('GL_FRAMEBUFFER_ATTACHMENT_TEXTURE_LAYER_EXT',0x8CD4)
GL_FRAMEBUFFER_INCOMPLETE_LAYER_COUNT_EXT=_C('GL_FRAMEBUFFER_INCOMPLETE_LAYER_COUNT_EXT',0x8DA9)
GL_FRAMEBUFFER_INCOMPLETE_LAYER_TARGETS_EXT=_C('GL_FRAMEBUFFER_INCOMPLETE_LAYER_TARGETS_EXT',0x8DA8)
GL_GEOMETRY_INPUT_TYPE_EXT=_C('GL_GEOMETRY_INPUT_TYPE_EXT',0x8DDB)
GL_GEOMETRY_OUTPUT_TYPE_EXT=_C('GL_GEOMETRY_OUTPUT_TYPE_EXT',0x8DDC)
GL_GEOMETRY_PROGRAM_NV=_C('GL_GEOMETRY_PROGRAM_NV',0x8C26)
GL_GEOMETRY_VERTICES_OUT_EXT=_C('GL_GEOMETRY_VERTICES_OUT_EXT',0x8DDA)
GL_LINES_ADJACENCY_EXT=_C('GL_LINES_ADJACENCY_EXT',0x000A)
GL_LINE_STRIP_ADJACENCY_EXT=_C('GL_LINE_STRIP_ADJACENCY_EXT',0x000B)
GL_MAX_GEOMETRY_TEXTURE_IMAGE_UNITS_EXT=_C('GL_MAX_GEOMETRY_TEXTURE_IMAGE_UNITS_EXT',0x8C29)
GL_MAX_PROGRAM_OUTPUT_VERTICES_NV=_C('GL_MAX_PROGRAM_OUTPUT_VERTICES_NV',0x8C27)
GL_MAX_PROGRAM_TOTAL_OUTPUT_COMPONENTS_NV=_C('GL_MAX_PROGRAM_TOTAL_OUTPUT_COMPONENTS_NV',0x8C28)
GL_PROGRAM_POINT_SIZE_EXT=_C('GL_PROGRAM_POINT_SIZE_EXT',0x8642)
GL_TRIANGLES_ADJACENCY_EXT=_C('GL_TRIANGLES_ADJACENCY_EXT',0x000C)
GL_TRIANGLE_STRIP_ADJACENCY_EXT=_C('GL_TRIANGLE_STRIP_ADJACENCY_EXT',0x000D)
@_f
@_p.types(None,_cs.GLenum,_cs.GLenum,_cs.GLuint,_cs.GLint)
def glFramebufferTextureEXT(target,attachment,texture,level):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLenum,_cs.GLuint,_cs.GLint,_cs.GLenum)
def glFramebufferTextureFaceEXT(target,attachment,texture,level,face):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLenum,_cs.GLuint,_cs.GLint,_cs.GLint)
def glFramebufferTextureLayerEXT(target,attachment,texture,level,layer):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLint)
def glProgramVertexLimitNV(target,limit):pass
| 55.214286 | 118 | 0.850367 | 355 | 2,319 | 4.991549 | 0.278873 | 0.027088 | 0.044018 | 0.038375 | 0.564334 | 0.327878 | 0.168736 | 0.091986 | 0.077878 | 0.077878 | 0 | 0.026087 | 0.057784 | 2,319 | 41 | 119 | 56.560976 | 0.784897 | 0.043122 | 0 | 0.108108 | 1 | 0 | 0.256564 | 0.256564 | 0 | 0 | 0.044219 | 0 | 0 | 1 | 0.135135 | false | 0.108108 | 0.162162 | 0.027027 | 0.324324 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
c0556573b1b396000e337b73f3de0c54b4d2d005 | 374 | py | Python | src/viewer/abs/forms.py | ozacas/asxtrade | a3645ae526bfc7a546fdf2a39520feda99e3390a | [
"Apache-2.0"
] | 8 | 2021-03-20T13:12:25.000Z | 2022-02-07T11:17:40.000Z | src/viewer/abs/forms.py | ozacas/asxtrade | a3645ae526bfc7a546fdf2a39520feda99e3390a | [
"Apache-2.0"
] | 8 | 2021-03-07T03:23:46.000Z | 2021-06-01T10:49:56.000Z | src/viewer/abs/forms.py | ozacas/asxtrade | a3645ae526bfc7a546fdf2a39520feda99e3390a | [
"Apache-2.0"
] | 3 | 2020-12-08T10:22:23.000Z | 2021-08-04T01:59:24.000Z | from django import forms
from django.core.exceptions import ValidationError
from abs.models import dataflows
class ABSDataflowForm(forms.Form):
dataflow = forms.ChoiceField(choices=(), required=True)
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.fields["dataflow"].choices = [(i.abs_id, i.name) for i in dataflows()]
| 31.166667 | 83 | 0.71123 | 47 | 374 | 5.468085 | 0.638298 | 0.077821 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.157754 | 374 | 11 | 84 | 34 | 0.815873 | 0 | 0 | 0 | 0 | 0 | 0.02139 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
c058a47a9fcf9cced343a8955317d5594bcf17a7 | 734 | py | Python | pkgs/sdk-pkg/src/genie/libs/sdk/apis/iosxe/dot1x/clear.py | patrickboertje/genielibs | 61c37aacf3dd0f499944555e4ff940f92f53dacb | [
"Apache-2.0"
] | 1 | 2022-01-16T10:00:24.000Z | 2022-01-16T10:00:24.000Z | pkgs/sdk-pkg/src/genie/libs/sdk/apis/iosxe/dot1x/clear.py | patrickboertje/genielibs | 61c37aacf3dd0f499944555e4ff940f92f53dacb | [
"Apache-2.0"
] | null | null | null | pkgs/sdk-pkg/src/genie/libs/sdk/apis/iosxe/dot1x/clear.py | patrickboertje/genielibs | 61c37aacf3dd0f499944555e4ff940f92f53dacb | [
"Apache-2.0"
] | null | null | null | # Python
import logging
# Unicon
from unicon.core.errors import SubCommandFailure
# Logger
log = logging.getLogger(__name__)
def clear_access_session_intf(device, intf):
""" clear access-session interface {}
Args:
device (`obj`): Device object
intf('str'): Name of the interface to clear access-session
Returns:
None
Raises:
SubCommandFailure
"""
try:
device.execute('clear access-session interface {intf}'.format(intf=intf))
except SubCommandFailure as e:
raise SubCommandFailure(
"Could not clear access-session interface on {device}. Error:\n{error}"
.format(device=device, error=e)
)
| 24.466667 | 83 | 0.622616 | 78 | 734 | 5.769231 | 0.525641 | 0.122222 | 0.2 | 0.18 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.284741 | 734 | 29 | 84 | 25.310345 | 0.857143 | 0.280654 | 0 | 0 | 0 | 0 | 0.231441 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.181818 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
c063c02a86fbd38bc9d19422a9222b6d2583e226 | 252 | py | Python | example/func_doc.py | tinashime/Python27 | b632918c7368a9bcfc5af8353e136247d954fb5e | [
"bzip2-1.0.6"
] | null | null | null | example/func_doc.py | tinashime/Python27 | b632918c7368a9bcfc5af8353e136247d954fb5e | [
"bzip2-1.0.6"
] | null | null | null | example/func_doc.py | tinashime/Python27 | b632918c7368a9bcfc5af8353e136247d954fb5e | [
"bzip2-1.0.6"
] | null | null | null | def printMax(x,y):
'''prints the maximum of two numbers.
The two values must be integers.'''
x = int(x)
y = int(y)
if x > y:
        print x, 'is maximum'
else:
print y,'is maximum'
printMax(3,5)
print printMax.__doc__
| 18 | 41 | 0.575397 | 40 | 252 | 3.525 | 0.575 | 0.042553 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011364 | 0.301587 | 252 | 13 | 42 | 19.384615 | 0.789773 | 0 | 0 | 0 | 0 | 0 | 0.114943 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.555556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
fbe3b3f30ddf6f664ac393236c6cc50652de4531 | 9,893 | py | Python | argparser.py | geoff-smith/MCplotscripts | 16dd5fd849671bb082a71f08492676be876209d3 | [
"MIT"
] | null | null | null | argparser.py | geoff-smith/MCplotscripts | 16dd5fd849671bb082a71f08492676be876209d3 | [
"MIT"
] | null | null | null | argparser.py | geoff-smith/MCplotscripts | 16dd5fd849671bb082a71f08492676be876209d3 | [
"MIT"
] | null | null | null | # argParser
# this class generates a RunParams object from the args passed to the script
from runparams import *
import os.path
import string
## handles args passed to the program
#
class ArgParser(object):
def parsePtCutString(self, ptCutString):
return map(float, string.split(ptCutString,',') )
def parseEventsString(self, eventsString):
return map(int, string.split(eventsString,',') )
def displayUserInfo(self):
print ""
print "o------------------o"
print "|Extracthistos Info|"
print "o------------------o"
print ""
print "[example usage]"
print ""
print "extracthistos inputFile.root"
print ""
print "extracthistos inputFile.root /intputDir/*.root --visualize --output outputfile-extracted.root --ptcuts 20,30,50,100 --etacut 2.5 --limit 100"
print ""
print "extracthistos inputFile.root /intputDir/*.root -v -o outputfile-extracted.root -p 20,30,50,100 -e 2.5 -l 100"
print ""
print "[switches]"
print " -d | --debug: Show debug information"
print " -e | --etacut: Set etaCut (double)"
print " -f | --force: Force overwriting of output file"
print " -i | --info: Shows this info"
print " -l | --limit: Limit maximum # of events processed"
print " -o | --output: Set output file (string)"
print " -od | --output-outputdirectory: Set output directory (string)"
print " -p | --ptcuts: Set pTcuts (list of doubles seperated by ',')"
print " -# | --events: Specify events to processed (list of ints seperated by ',')"
print " -m | --multi-processing: create n (int) subprocesses"
print " -% | --modulo: process only every nth event (int)"
print " -%r | --modulo-rest: process only every nth + r event (int)"
print " -v | --visualize: Create visualization(s)"
print " -vs | --visualize-skip-copies: Do not render non-physical particle copies"
print " -vnu | --visualize-no-underlying-event: Do not visualize the underlying event"
print " -vni | --visualize-no-main-interaction: Do not visualize the main interaction"
print " -vsj | --visualize-color-special-jets: Color special particle jets"
print " -vce | --visualize-cutoff-energy: Specify Visualization energy cutoff (double)"
print " -vcs | --visualize-cutoff-special-jets: Cutoff Special Jets"
print " -vcr | --visualize-cutoff-radiation: Cutoff ISR/FSR Jets"
print " -vme | --visualize-mode-energy: Color particles by their energy"
print " -vmp | --visualize-mode-pt: Color particles by their pT"
print " -vr | --visualize-renderer: Specify GraphViz renderer (string), defaults to 'dot'"
print ""
def __init__(self, args):
self.runParams = RunParams()
lenArgs = len(args)
skip = False
forceOutputOverride = False
for i in range (0, lenArgs):
# skip first arg as it's the script's name
if i == 0 or skip:
skip = False
continue
# provide arg and nextArg (if possible)
arg = args[i]
nextArg = None
if (i < lenArgs - 1):
nextArg = args[i+1]
# parse switches
if ( arg == "-d" ) or ( arg == "--debug" ) :
self.runParams.useDebugOutput = True
continue
if ( arg == "-e" ) or ( arg == "--etacut" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.eta = float(nextArg)
skip = True
continue
if ( arg == "-f" ) or ( arg == "--force" ) :
forceOutputOverride = True
continue
if ( arg == "-i" ) or ( arg == "--info" ) :
self.displayUserInfo()
self.runParams.run = False
break
if ( arg == "-l" ) or ( arg == "--limit" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.maxEvents = int(nextArg)
skip = True
continue
if ( arg == "-o" ) or ( arg == "--output" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
			if nextArg[-15:] != '-extracted.root':
raise Exception("'" + arg + "': Output file must end with '-extracted.root'!")
self.runParams.outputFile = nextArg
skip = True
continue
if ( arg == "-p" ) or ( arg == "--ptcuts" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
ptCutString = nextArg
self.runParams.pTCuts = self.parsePtCutString(ptCutString)
skip = True
continue
if ( arg == "-v" ) or ( arg == "--visualize" ) :
self.runParams.useVisualization = True
continue
if ( arg == "-vs" ) or ( arg == "--visualize-skip-copies" ) :
self.runParams.visualizationSkipCopies = True
continue
if ( arg == "-vnu" ) or ( arg == "--visualize-no-underlying-event" ) :
self.runParams.visualizationShowUnderlyingEvent = False
continue
if ( arg == "-vni" ) or ( arg == "--visualize-no-main-interaction" ) :
self.runParams.visualizationShowMainInteraction = False
continue
if ( arg == "-vsj" ) or ( arg == "--visualize-color-special-jets" ) :
self.runParams.visualizationColorSpecialJets = True
continue
if ( arg == "-vme" ) or ( arg == "--visualize-mode-energy" ) :
self.runParams.visualizationEnergyMode = True
continue
if ( arg == "-vmp" ) or ( arg == "--visualize-mode-pt" ) :
self.runParams.visualizationPtMode = True
continue
if ( arg == "-vce" ) or ( arg == "--visualize-cutoff-energy" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.visualizationEnergyCutoff = int(nextArg)
skip = True
continue
if ( arg == "-vcr" ) or ( arg == "--visualize-cutoff-radiation" ) :
self.runParams.visualizationCutoffRadiation = True
continue
if ( arg == "-vcs" ) or ( arg == "--visualize-cutoff-special-jets" ) :
self.runParams.visualizationCutSpecialJets = True
continue
#if ( arg == "-vp" ) or ( arg == "--visualize-pt-cutoff" ) :
#if nextArg is None or nextArg[0] == '-':
#raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
#self.runParams.visualizationPtCutoff = int(nextArg)
#skip = True
#continue
if ( arg == "-vr" ) or ( arg == "--visualize-renderer:" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.visualizationRenderer = nextArg
skip = True
continue
#if ( arg == "-z" ) or ( arg == "--zero-jets" ) :
#self.runParams.zeroAdditionalJets = True
#continue
if ( arg == "-#" ) or ( arg == "--events" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
eventsString = nextArg
self.runParams.events = self.parseEventsString(eventsString)
skip = True
continue
if ( arg == "-od" ) or ( arg == "--output-outputdirectory" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.outputDir = nextArg
skip = True
continue
if ( arg == "-m" ) or ( arg == "--multi-processing" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.multiProcessing = int(nextArg)
skip = True
continue
if ( arg == "-%" ) or ( arg == "--modulo" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.modulo = int(nextArg)
skip = True
continue
if ( arg == "-%r" ) or ( arg == "--modulo-rest" ) :
if nextArg is None or nextArg[0] == '-':
raise Exception("'" + arg + "': Parse Error after '"+arg+"'!")
self.runParams.moduloRest = int(nextArg)
skip = True
continue
if (arg[0] == '-'):
raise Exception("'" + arg + "' is not a valid switch!")
# deny input files ending with '-extracted.root', as this is our signature for output files:
if arg[-15:] == '-extracted.root':
print "Warning: File '" + arg + "' is being skipped."
continue
# parse input files:
if arg[-5:] == '.root':
thisFile = arg
if thisFile[:7] == "/store/":
if not os.path.isfile(thisFile):
thisFile = "root://xrootd.ba.infn.it/" + thisFile
else:
if not os.path.isfile(thisFile):
raise Exception("File '" + thisFile + "' does not exist!")
self.runParams.inputFileList.append(thisFile)
continue
raise Exception("'" + arg + "' is not a valid root file!")
if self.runParams.useVisualization and len(self.runParams.inputFileList) > 1:
raise Exception("Visualization is allowed only for exactly one input file.")
if self.runParams.run:
if os.path.isfile(self.runParams.outputFile) and not forceOutputOverride:
raise Exception("'" + self.runParams.outputFile + "' exists. Use the --force switch to force overriding.")
			if len(self.runParams.outputDir) != 0:
if not os.path.exists(self.runParams.outputDir):
os.makedirs(self.runParams.outputDir)
self.runParams.outputFilePath = self.runParams.outputDir + "/" + self.runParams.outputFile
else:
self.runParams.outputFilePath = self.runParams.outputFile
#self.displayInfo()
| 39.730924 | 153 | 0.574548 | 1,091 | 9,893 | 5.206233 | 0.208066 | 0.089261 | 0.05493 | 0.065845 | 0.276056 | 0.240493 | 0.207218 | 0.147359 | 0.147359 | 0.147359 | 0 | 0.007368 | 0.27292 | 9,893 | 248 | 154 | 39.891129 | 0.782288 | 0.067725 | 0 | 0.376289 | 1 | 0.010309 | 0.342534 | 0.067159 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.015464 | null | null | 0.201031 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
fbe8a390825becc2ff9eab5332457693f2473fbc | 3,606 | py | Python | pysnmp-with-texts/IANA-MALLOC-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 8 | 2019-05-09T17:04:00.000Z | 2021-06-09T06:50:51.000Z | pysnmp-with-texts/IANA-MALLOC-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 4 | 2019-05-31T16:42:59.000Z | 2020-01-31T21:57:17.000Z | pysnmp-with-texts/IANA-MALLOC-MIB.py | agustinhenze/mibs.snmplabs.com | 1fc5c07860542b89212f4c8ab807057d9a9206c7 | [
"Apache-2.0"
] | 10 | 2019-04-30T05:51:36.000Z | 2022-02-16T03:33:41.000Z | #
# PySNMP MIB module IANA-MALLOC-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/IANA-MALLOC-MIB
# Produced by pysmi-0.3.4 at Wed May 1 13:50:25 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
Integer, ObjectIdentifier, OctetString = mibBuilder.importSymbols("ASN1", "Integer", "ObjectIdentifier", "OctetString")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, ConstraintsIntersection, SingleValueConstraint, ConstraintsUnion, ValueSizeConstraint = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "ConstraintsIntersection", "SingleValueConstraint", "ConstraintsUnion", "ValueSizeConstraint")
ModuleCompliance, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "NotificationGroup")
Integer32, iso, MibScalar, MibTable, MibTableRow, MibTableColumn, MibIdentifier, NotificationType, TimeTicks, mib_2, ObjectIdentity, Bits, Counter64, Gauge32, Unsigned32, ModuleIdentity, Counter32, IpAddress = mibBuilder.importSymbols("SNMPv2-SMI", "Integer32", "iso", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "MibIdentifier", "NotificationType", "TimeTicks", "mib-2", "ObjectIdentity", "Bits", "Counter64", "Gauge32", "Unsigned32", "ModuleIdentity", "Counter32", "IpAddress")
DisplayString, TextualConvention = mibBuilder.importSymbols("SNMPv2-TC", "DisplayString", "TextualConvention")
ianaMallocMIB = ModuleIdentity((1, 3, 6, 1, 2, 1, 102))
ianaMallocMIB.setRevisions(('2014-05-22 00:00', '2003-01-27 12:00',))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
if mibBuilder.loadTexts: ianaMallocMIB.setRevisionsDescriptions(('Updated contact info.', 'Initial version.',))
if mibBuilder.loadTexts: ianaMallocMIB.setLastUpdated('201405220000Z')
if mibBuilder.loadTexts: ianaMallocMIB.setOrganization('IANA')
if mibBuilder.loadTexts: ianaMallocMIB.setContactInfo(' Internet Assigned Numbers Authority Internet Corporation for Assigned Names and Numbers 12025 Waterfront Drive, Suite 300 Los Angeles, CA 90094-2536 Phone: +1 310-301-5800 EMail: iana&iana.org')
if mibBuilder.loadTexts: ianaMallocMIB.setDescription('This MIB module defines the IANAscopeSource and IANAmallocRangeSource textual conventions for use in MIBs which need to identify ways of learning multicast scope and range information. Any additions or changes to the contents of this MIB module require either publication of an RFC, or Designated Expert Review as defined in the Guidelines for Writing IANA Considerations Section document. The Designated Expert will be selected by the IESG Area Director(s) of the Transport Area.')
class IANAscopeSource(TextualConvention, Integer32):
description = 'The source of multicast scope information.'
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3, 4, 5))
namedValues = NamedValues(("other", 1), ("manual", 2), ("local", 3), ("mzap", 4), ("madcap", 5))
class IANAmallocRangeSource(TextualConvention, Integer32):
description = 'The source of multicast address allocation range information.'
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(1, 2, 3))
namedValues = NamedValues(("other", 1), ("manual", 2), ("local", 3))
mibBuilder.exportSymbols("IANA-MALLOC-MIB", IANAmallocRangeSource=IANAmallocRangeSource, IANAscopeSource=IANAscopeSource, ianaMallocMIB=ianaMallocMIB, PYSNMP_MODULE_ID=ianaMallocMIB)
| 100.166667 | 537 | 0.781475 | 400 | 3,606 | 7.0375 | 0.495 | 0.049023 | 0.0373 | 0.060391 | 0.329663 | 0.259325 | 0.259325 | 0.218828 | 0.189698 | 0.189698 | 0 | 0.053428 | 0.102052 | 3,606 | 35 | 538 | 103.028571 | 0.815936 | 0.08985 | 0 | 0.08 | 0 | 0.08 | 0.423159 | 0.019859 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.24 | 0 | 0.64 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
fbf29fa665c3f19650fb43d520ce03961090f743 | 7,007 | py | Python | ovs/extensions/hypervisor/hypervisors/vmware.py | mflu/openvstorage_centos | 280a98d3e5d212d58297e0ffcecd325dfecef0f8 | [
"Apache-2.0"
] | 1 | 2015-08-29T16:36:40.000Z | 2015-08-29T16:36:40.000Z | ovs/extensions/hypervisor/hypervisors/vmware.py | rootfs-analytics/openvstorage | 6184822340faea1d2927643330a7aaa781d92d36 | [
"Apache-2.0"
] | null | null | null | ovs/extensions/hypervisor/hypervisors/vmware.py | rootfs-analytics/openvstorage | 6184822340faea1d2927643330a7aaa781d92d36 | [
"Apache-2.0"
] | null | null | null | # Copyright 2014 CloudFounders NV
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""
Module for the VMware hypervisor client
"""
import os
from ovs.extensions.hypervisor.apis.vmware.sdk import Sdk
class VMware(object):
"""
Represents the hypervisor client for VMware
"""
def __init__(self, ip, username, password):
"""
Initializes the object with credentials and connection information
"""
self.sdk = Sdk(ip, username, password)
self.state_mapping = {'poweredOn' : 'RUNNING',
'poweredOff': 'HALTED',
'suspended' : 'PAUSED'}
def get_state(self, vmid):
"""
Get the current power state of a virtual machine
@param vmid: hypervisor id of the virtual machine
"""
return self.state_mapping[self.sdk.get_power_state(vmid)]
def create_vm_from_template(self, name, source_vm, disks, ip, mountpoint, wait=True):
"""
Create a new vmachine from an existing template
"""
task = self.sdk.create_vm_from_template(name, source_vm, disks, ip, mountpoint, wait)
if wait is True:
if self.sdk.validate_result(task):
task_info = self.sdk.get_task_info(task)
return task_info.info.result.value
return None
def clone_vm(self, vmid, name, disks, wait=False):
"""
Clone a vmachine
@param vmid: hypervisor id of the virtual machine
@param name: name of the virtual machine
@param disks: list of disk information
@param wait: wait for action to complete
"""
task = self.sdk.clone_vm(vmid, name, disks, wait)
if wait is True:
if self.sdk.validate_result(task):
task_info = self.sdk.get_task_info(task)
return task_info.info.result.value
return None
def delete_vm(self, vmid, storagedriver_mountpoint, storagedriver_storage_ip, devicename, disks_info=None, wait=False):
"""
Remove the vmachine from the hypervisor
@param vmid: hypervisor id of the virtual machine
@param wait: wait for action to complete
"""
if disks_info is None:
disks_info = []
_ = disks_info
self.sdk.delete_vm(vmid, storagedriver_mountpoint, storagedriver_storage_ip, devicename, wait)
def get_vm_object(self, vmid):
"""
Gets the VMware virtual machine object from VMware by its identifier
"""
return self.sdk.get_vm(vmid)
def get_vm_agnostic_object(self, vmid):
"""
Gets the VMware virtual machine object from VMware by its identifier
"""
return self.sdk.make_agnostic_config(self.sdk.get_vm(vmid))
def get_vm_object_by_devicename(self, devicename, ip, mountpoint):
"""
Gets the VMware virtual machine object from VMware by devicename
and datastore identifiers
"""
return self.sdk.make_agnostic_config(self.sdk.get_nfs_datastore_object(ip, mountpoint, devicename)[0])
def get_vms_by_nfs_mountinfo(self, ip, mountpoint):
"""
Gets a list of agnostic vm objects for a given ip and mountpoint
"""
for vm in self.sdk.get_vms(ip, mountpoint):
yield self.sdk.make_agnostic_config(vm)
def is_datastore_available(self, ip, mountpoint):
"""
@param ip : hypervisor ip to query for datastore presence
@param mountpoint: nfs mountpoint on hypervisor
@rtype: boolean
@return: True | False
"""
return self.sdk.is_datastore_available(ip, mountpoint)
def set_as_template(self, vmid, disks, wait=False):
"""
Configure a vm as template
This lets the machine exist on the hypervisor but configures
all disks as "Independent Non-persistent"
@param vmid: hypervisor id of the virtual machine
"""
return self.sdk.set_disk_mode(vmid, disks, 'independent_nonpersistent', wait)
def mount_nfs_datastore(self, name, remote_host, remote_path):
"""
Mounts a given NFS export as a datastore
"""
return self.sdk.mount_nfs_datastore(name, remote_host, remote_path)
def test_connection(self):
"""
Checks whether this node is a vCenter
"""
return self.sdk.test_connection()
def clean_backing_disk_filename(self, path):
"""
Cleans a backing disk filename to the corresponding disk filename
"""
_ = self
return path.replace('-flat.vmdk', '.vmdk').strip('/')
def get_backing_disk_path(self, machinename, devicename):
"""
Builds the path for the file backing a given device/disk
"""
_ = self
return '/{}/{}-flat.vmdk'.format(machinename.replace(' ', '_'), devicename)
def get_disk_path(self, machinename, devicename):
"""
Builds the path for the file backing a given device/disk
"""
_ = self
return '/{}/{}.vmdk'.format(machinename.replace(' ', '_'), devicename)
def clean_vmachine_filename(self, path):
"""
Cleans a VM filename
"""
_ = self
return path.strip('/')
def get_vmachine_path(self, machinename, storagerouter_machineid):
"""
Builds the path for the file representing a given vmachine
"""
_ = self, storagerouter_machineid # For compatibility purposes only
machinename = machinename.replace(' ', '_')
return '/{}/{}.vmx'.format(machinename, machinename)
def get_rename_scenario(self, old_name, new_name):
"""
Gets the rename scenario based on the old and new name
"""
_ = self
if old_name.endswith('.vmx') and new_name.endswith('.vmx'):
return 'RENAME'
elif old_name.endswith('.vmx~') and new_name.endswith('.vmx'):
return 'UPDATE'
return 'UNSUPPORTED'
def should_process(self, devicename, machine_ids=None):
"""
Checks whether a given device should be processed
"""
_ = self, devicename, machine_ids
return True
def file_exists(self, vpool, devicename):
"""
Check if devicename exists on the given vpool
"""
_ = self
filename = '/mnt/{0}/{1}'.format(vpool.name, devicename)
return os.path.exists(filename) and os.path.isfile(filename)
| 34.860697 | 123 | 0.628942 | 856 | 7,007 | 5.004673 | 0.251168 | 0.03268 | 0.01634 | 0.022176 | 0.347806 | 0.314426 | 0.289916 | 0.234127 | 0.226891 | 0.179272 | 0 | 0.00218 | 0.280006 | 7,007 | 200 | 124 | 35.035 | 0.846977 | 0.330241 | 0 | 0.210526 | 0 | 0 | 0.045298 | 0.006155 | 0 | 0 | 0 | 0 | 0 | 1 | 0.276316 | false | 0.026316 | 0.026316 | 0 | 0.605263 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
fbf4c0c322e799620006a7ec56b567282c3ba0ca | 226 | py | Python | checkTicTacToe/checkTicTacToe.py | nate-ar-williams/coding-questions | 24baa901a786e6e2c4e8ea823a26416bc51e1f6a | [
"MIT"
] | null | null | null | checkTicTacToe/checkTicTacToe.py | nate-ar-williams/coding-questions | 24baa901a786e6e2c4e8ea823a26416bc51e1f6a | [
"MIT"
] | null | null | null | checkTicTacToe/checkTicTacToe.py | nate-ar-williams/coding-questions | 24baa901a786e6e2c4e8ea823a26416bc51e1f6a | [
"MIT"
] | null | null | null | #!/usr/bin/python3
# let board be a 3x3 array of marks ('X'/'O') or None for empty cells
def isWin(board):
    # a win is three equal, non-empty marks in a row, column, or diagonal
    for i in range(3):
        if board[i][0] and board[i][0] == board[i][1] == board[i][2]:
            return True
        if board[0][i] and board[0][i] == board[1][i] == board[2][i]:
            return True
    if board[1][1] and (board[0][0] == board[1][1] == board[2][2] or
                        board[0][2] == board[1][1] == board[2][0]):
        return True
    return False
def main():
pass
if __name__ == '__main__':
main()
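As a sketch of one way to express the same check, the eight winning lines can also be enumerated as coordinate triples; `has_win` and the sample board below are hypothetical illustrations, not part of the original script:

```python
# Assumed representation: a 3x3 board of 'X'/'O'/None marks; a line of
# three equal, non-empty marks is a win.
def has_win(board):
    lines = [[(i, 0), (i, 1), (i, 2)] for i in range(3)]           # rows
    lines += [[(0, j), (1, j), (2, j)] for j in range(3)]          # columns
    lines += [[(0, 0), (1, 1), (2, 2)], [(0, 2), (1, 1), (2, 0)]]  # diagonals
    for line in lines:
        marks = {board[r][c] for r, c in line}
        # exactly one distinct mark on the line, and it is not the empty cell
        if len(marks) == 1 and None not in marks:
            return True
    return False

board = [['X', 'O', None],
         ['O', 'X', None],
         ['O', None, 'X']]
print(has_win(board))  # True (X wins on the main diagonal)
```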
| 12.555556 | 32 | 0.588496 | 37 | 226 | 3.378378 | 0.621622 | 0.048 | 0.048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.0625 | 0.221239 | 226 | 17 | 33 | 13.294118 | 0.647727 | 0.199115 | 0 | 0 | 0 | 0 | 0.044693 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.090909 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
fbf52c7f3a9bab66d56f2bccbaf8974ecb5420d3 | 2,138 | py | Python | openerp/exceptions.py | ntiufalara/openerp7 | 903800da0644ec0dd9c1dcd34205541f84d45fe4 | [
"MIT"
] | 3 | 2016-01-29T14:39:49.000Z | 2018-12-29T22:42:00.000Z | openerp/exceptions.py | ntiufalara/openerp7 | 903800da0644ec0dd9c1dcd34205541f84d45fe4 | [
"MIT"
] | 2 | 2016-03-23T14:29:41.000Z | 2017-02-20T17:11:30.000Z | openerp/exceptions.py | ntiufalara/openerp7 | 903800da0644ec0dd9c1dcd34205541f84d45fe4 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
##############################################################################
#
# OpenERP, Open Source Management Solution
# Copyright (C) 2011 OpenERP s.a. (<http://openerp.com>).
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU Affero General Public License as
# published by the Free Software Foundation, either version 3 of the
# License, or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU Affero General Public License for more details.
#
# You should have received a copy of the GNU Affero General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#
##############################################################################
""" OpenERP core exceptions.
This module defines a few exception types. Those types are understood by the
RPC layer. Any other exception type bubbling up to the RPC layer will be
treated as a 'Server error'.
"""
class Warning(Exception):
pass
class AccessDenied(Exception):
""" Login/password error. No message, no traceback. """
def __init__(self):
super(AccessDenied, self).__init__('Access denied.')
self.traceback = ('', '', '')
class AccessError(Exception):
""" Access rights error. """
class DeferredException(Exception):
""" Exception object holding a traceback for asynchronous reporting.
Some RPC calls (database creation and report generation) happen with
an initial request followed by multiple, polling requests. This class
    is used to store the possible exception occurring in the thread serving
    the first request, and is then sent to a polling request.
    ('Traceback' is misleading, this is really an exc_info() triple.)
"""
def __init__(self, msg, tb):
self.message = msg
self.traceback = tb
# vim:expandtab:smartindent:tabstop=4:softtabstop=4:shiftwidth=4:
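A minimal sketch of how such a deferred exception might be captured and inspected (the `worker` function and the standalone class copy below are assumptions for illustration, not the module's actual call sites):

```python
import sys

# Standalone copy of the DeferredException pattern: hold an exc_info()
# triple from one request so a later (polling) request can report it.
class DeferredException(Exception):
    def __init__(self, msg, tb):
        self.message = msg
        self.traceback = tb

def worker():
    # Hypothetical thread body: capture the failure instead of raising it.
    try:
        1 / 0
    except ZeroDivisionError:
        return DeferredException('division failed', sys.exc_info())

deferred = worker()
print(deferred.message)                 # division failed
print(deferred.traceback[0].__name__)   # ZeroDivisionError
```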
| 37.508772 | 78 | 0.658092 | 273 | 2,138 | 5.106227 | 0.578755 | 0.023673 | 0.025825 | 0.04089 | 0.071736 | 0.071736 | 0.04878 | 0 | 0 | 0 | 0 | 0.005199 | 0.190365 | 2,138 | 56 | 79 | 38.178571 | 0.800116 | 0.713751 | 0 | 0 | 0 | 0 | 0.036269 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.181818 | false | 0.090909 | 0 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
fbfb4b2b18ec51f6264b25bae8ef574c623943f4 | 810 | py | Python | utils/utilsFreq.py | geobook2015/magPy | af0f31fc931786ac6f8d69a5290366418035859d | [
"Apache-2.0"
] | 1 | 2021-05-19T18:29:15.000Z | 2021-05-19T18:29:15.000Z | utils/utilsFreq.py | geobook2015/magPy | af0f31fc931786ac6f8d69a5290366418035859d | [
"Apache-2.0"
] | null | null | null | utils/utilsFreq.py | geobook2015/magPy | af0f31fc931786ac6f8d69a5290366418035859d | [
"Apache-2.0"
] | 2 | 2021-06-03T01:59:02.000Z | 2021-07-03T07:47:10.000Z | # utility functions for frequency related stuff
import numpy as np
import numpy.fft as fft
import math
def getFrequencyArray(fs, samples):
    # frequencies go from 0 to nyquist
nyquist = fs/2
return np.linspace(0, nyquist, samples)
# use this function for all FFT calculations
# then if change FFT later (i.e. FFTW), just replace one function
def forwardFFT(data, **kwargs):
if "norm" in kwargs and not kwargs["norm"]:
return fft.rfft(data, axis=0)
return fft.rfft(data, norm='ortho', axis=0)
def inverseFFT(data, length, **kwargs):
if "norm" in kwargs and not kwargs["norm"]:
return fft.irfft(data, n=length)
return fft.irfft(data, n=length, norm='ortho')
def padNextPower2(size):
next2Power = math.ceil(math.log(size,2))
next2Size = math.pow(2, int(next2Power))
return int(next2Size) - size
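A sketch of how these helpers fit together (assumed usage, with the padding helper copied inline so the snippet is self-contained): pad a signal to the next power of two, then verify that the orthonormal forward/inverse FFT pair round-trips the data.

```python
import math
import numpy as np
import numpy.fft as fft

# Copy of the padding helper: number of zeros needed to reach the next power of 2.
def padNextPower2(size):
    next2Power = math.ceil(math.log(size, 2))
    next2Size = math.pow(2, int(next2Power))
    return int(next2Size) - size

data = np.arange(6, dtype=float)
padded = np.pad(data, (0, padNextPower2(data.size)), mode='constant')
# Orthonormal transforms, matching forwardFFT/inverseFFT with norm enabled.
spectrum = fft.rfft(padded, norm='ortho', axis=0)
restored = fft.irfft(spectrum, n=padded.size, norm='ortho')
print(padded.size)                    # 8, the next power of two above 6
print(np.allclose(restored, padded))  # True
```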
| 27.931034 | 65 | 0.728395 | 127 | 810 | 4.645669 | 0.488189 | 0.061017 | 0.040678 | 0.047458 | 0.222034 | 0.222034 | 0.152542 | 0.152542 | 0.152542 | 0.152542 | 0 | 0.016012 | 0.151852 | 810 | 28 | 66 | 28.928571 | 0.842795 | 0.228395 | 0 | 0.111111 | 0 | 0 | 0.042071 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.166667 | 0 | 0.722222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
fbff951b3453445a7ed046dfadb09ce047c59a21 | 1,766 | py | Python | authentik/stages/password/migrations/0007_app_password.py | BeryJu/passbook | 350f0d836580f4411524614f361a76c4f27b8a2d | [
"MIT"
] | 15 | 2020-01-05T09:09:57.000Z | 2020-11-28T05:27:39.000Z | authentik/stages/password/migrations/0007_app_password.py | BeryJu/passbook | 350f0d836580f4411524614f361a76c4f27b8a2d | [
"MIT"
] | 302 | 2020-01-21T08:03:59.000Z | 2020-12-04T05:04:57.000Z | authentik/stages/password/migrations/0007_app_password.py | BeryJu/passbook | 350f0d836580f4411524614f361a76c4f27b8a2d | [
"MIT"
] | 3 | 2020-03-04T08:21:59.000Z | 2020-08-01T20:37:18.000Z | # Generated by Django 3.2.6 on 2021-08-23 14:34
import django.contrib.postgres.fields
from django.apps.registry import Apps
from django.db import migrations, models
from django.db.backends.base.schema import BaseDatabaseSchemaEditor
from authentik.stages.password import BACKEND_APP_PASSWORD, BACKEND_INBUILT
def update_default_backends(apps: Apps, schema_editor: BaseDatabaseSchemaEditor):
PasswordStage = apps.get_model("authentik_stages_password", "passwordstage")
db_alias = schema_editor.connection.alias
stages = PasswordStage.objects.using(db_alias).filter(name="default-authentication-password")
if not stages.exists():
return
stage = stages.first()
stage.backends.append(BACKEND_APP_PASSWORD)
stage.save()
class Migration(migrations.Migration):
dependencies = [
("authentik_stages_password", "0006_passwordchange_rename"),
]
operations = [
migrations.AlterField(
model_name="passwordstage",
name="backends",
field=django.contrib.postgres.fields.ArrayField(
base_field=models.TextField(
choices=[
("authentik.core.auth.InbuiltBackend", "User database + standard password"),
("authentik.core.auth.TokenBackend", "User database + app passwords"),
(
"authentik.sources.ldap.auth.LDAPBackend",
"User database + LDAP password",
),
]
),
help_text="Selection of backends to test the password against.",
size=None,
),
),
migrations.RunPython(update_default_backends),
]
| 36.040816 | 100 | 0.625708 | 169 | 1,766 | 6.402367 | 0.52071 | 0.027726 | 0.063771 | 0.049908 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.015008 | 0.283126 | 1,766 | 48 | 101 | 36.791667 | 0.839652 | 0.025481 | 0 | 0.102564 | 1 | 0 | 0.225713 | 0.123328 | 0 | 0 | 0 | 0 | 0 | 1 | 0.025641 | false | 0.25641 | 0.128205 | 0 | 0.25641 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
220378f315f7e2f7d8cd6b8b856c000fc8a490f5 | 12,933 | py | Python | 2020/day11.py | asmeurer/advent-of-code | 3ba3edb0c29994487f1b3344383dc41dfea9bfcb | [
"MIT"
] | null | null | null | 2020/day11.py | asmeurer/advent-of-code | 3ba3edb0c29994487f1b3344383dc41dfea9bfcb | [
"MIT"
] | null | null | null | 2020/day11.py | asmeurer/advent-of-code | 3ba3edb0c29994487f1b3344383dc41dfea9bfcb | [
"MIT"
] | null | null | null | test_input = """
L.LL.LL.LL
LLLLLLL.LL
L.L.L..L..
LLLL.LL.LL
L.LL.LL.LL
L.LLLLL.LL
..L.L.....
LLLLLLLLLL
L.LLLLLL.L
L.LLLLL.LL
"""
test_input2 = """
.......#.
...#.....
.#.......
.........
..#L....#
....#....
.........
#........
...#.....
"""
test_input3 = """
.............
.L.L.#.#.#.#.
.............
"""
test_input4 = """
.##.##.
#.#.#.#
##...##
...L...
##...##
#.#.#.#
.##.##.
"""
input = """
LL.LL.LLLLLL.LLLLLLLLLLLLLLLLLL.LLLLL..LLLLLLL.LLLLLLLLLLLL.LLLL.LLLLL.LL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LL.LLLLLLLLLLLLLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLLLLLLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLLLLLLLL
.LL...LL.L.L....LL..LL..L.L.L..L.....L...LL.....LLL..L..L..L.....L.L..LLLL...LL.LL.L.......
LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLL..LLLLLLLLLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLLLLLLL.LLLLLLLLLLLLLL.LLLLL.LLLLLLLLLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLLLLLLLLLL.LL.LLLLLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLLLLLLLLLLLLLLLLLL.LLLL.LLLLLLLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LL.L......L...LL....L...L.LL.L.....L.LL.L....L...LLL....LL.....LL.L.LLL...LL.L...LLL.L.L...
LLLLLLLLLLLL.LLLLLLLL.L.LL.L.LLLLLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLL.LLLLLLLLLL.LL.LLLLLLLLLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLL.LLLLLL.L.LLLLL.LLLLLLLLLLLL.LLLL.LLLLLLL..LLLLLL.LLLL.LLLLL
LLLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLL.L.LL.LLLLL
.LLLL.LLLLLL.LLLLLLLL.LLLLLLLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
...L..L......L..L.L.......LL...L.LL.L...LL...L..LL....L....L.L..L...L...L.L.....LL.....L..L
LLLLL.LLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLLLLL.LL
LLLLL.LLLLLLLL.LL.LLLLLLLL.LLLL.LLLLLL.LLLLLLLLLLL.L.LLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLL.LLL.LLLLL.LLLLLL.LLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLLLLLLLLLL.L.LLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLL.LLLLLLLLLLL.LLLLL.LLLLLLLLLLLLLLL.LLLLLLLLLL
.......LL.L.L...LL..L....LL....L.L.L....L......L..LL...LL.LLL..L....L......L.LLL.L.....LLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLL.LLLLLLLLLL.LLLLLLLLL.LLLL.L.LLLL.LLLLLLLL.LLLLLL.L.LLLLLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLLLLLLL.
LLLLL.LLLLLLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLL.LLL.LLLLLLLL.LLLL.LLLLLLLL.LLLLLL.LLL..LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLLLLLLLLLL.LLLLLLLLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLLLLLLL.LLLLLLLLLLLLLL.LLLLL.LLLLLLLLLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLLLLLLLLLLLL.LLLLLLL
LLLLL.LLLLLL.LL.LLLLLLLLLL.LLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLLLLLLLLLL.LLLLL.LLLLLLLLLLL.LLL.LLLL.LLLLLLLLLLLLLLLLL
.L........L..L.L.LLLLL.......LL.......L..L.L..LL.L......L.......LLL..LLL.LL...L.L...L.LL.L.
LLLLL.LLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLLLLLLLLLLLLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLL..LLLLL.LLLLLLLL.LLLL.LLL..LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLL.LLLLL..LLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLLLLLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
..L..LL.......L.LLLL.L.....L...L.LL...LLLLL.L.....L..L...LL.LL..L..LLLLLL..........LL.....L
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLLLLLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLL.LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLL.LLLLL.LLLLLLLLLLL.LLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLL..LLLLL.LLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLL..LLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLLL.LL.LLLLLLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLL.LLLL..LLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
L...LL....L..L..LL.........L.L...LL..LL.L....L...........LL.L.......L.L.L.......L..L..LL..L
LLLLL.LLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLL.LLLLL.LL.LLL.LLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLL.LLLLLL.L.LLLLLLLLLLL.LL.LLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLLLLLLLLLLLLLLLLLL.LLLL.LLLL.L.LLLL.LLLLLLLLLLLL..L.LLLL.L.LL.LLLLLLLL.LLLLLLLLLLLLLLLL.
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLLL.LLL.LLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLL.LLLLL.LLLLLLLLLLL.LLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLLLLLLLL
.....L.LLL...LL..LL.....L....LL.......L...LL..L..L...L...L.LL.LL.LL...LL..LLL.L..LLL..LLLL.
LLLLLLLLLLLLLLLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLLLLLLLL.L.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLLLLLLL.LLLLLLLLL.LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL..LLL.LLLLLLLLLLLLLL.LLLL..LLLLLLLLLLL.LLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLLLLLL.LL.LLLLLLLLLLLLL.LL.LLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
..L..LL.........L....L.L.L.L...L....L...........LL....L..L...L.LL..L..LL.L..LL..L..L.L..L.L
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLLLLLLLLLLLL.LLLL.LLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLLLLLLLLLLLLLLLLLLLLLL.LLLLLLLLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLL.LLLLLLLLLLLLLLLLLLLL.LLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
....L............L....LL......L.LLL.LLL....LL.....L..L.LL.L........L..L......L.LLL..LL..LL.
LL.LLLLLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLL.LLLLLL.LLLLLLLL.L.LLLLLLL.LLLLLLLLLLLLLL.LLLLLLLLLLLLLLLLL.LLLLLLLL..LLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLLLLLLL.LLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLLL..LLLLLLLLLLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLLLLLLLLLLL.LL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLLLLLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLL
LLLLL.L.LLLL.LLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLLLLLLLL
LLLLL.LLLLLLLLLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLL.LLLLLL.LLLLLLLLLL
.L......LLL...L.L.LL.L.....LL.L..L.L.LLLLL....LL..L...L..L.....L.L...L...L.L.LL.LL.L.......
LLLLLLLLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLLLLLLLLL.LLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLL.LLLL.LLLLLL.LLLLLLL.LLLLL.LLLLLLLLLLL.LLLLLLLLLLLLLLLLLLLL.LLLLL
LLLLL.LLLLLL.LLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLL.LLLLLLLLLLLL.LLLL.LLLLLLLL.LLLLLLLLLLL.LLLLL
LLLLLLLLLLLLLLLLLLLLL.LLLLLLLLL.LLLLLL.LLLLLLLLLLLLL.LLLLLLLLLLL.LLLLLLLLLLLLLLL.LLLLLLLLLL
LLLLL.LLLLLLLLLLLLLL..LLLLLLLLL.LLLLLL.LLLLLLL.LLLLL.LLLLL..LLLL.LLLLLLLLLLLLLLLLLLLLLLLLLL
LLLLLLLLLLLLLLLLLLLLL.LLLL.LLLLLLLLLLLLLLLLLLL.LLLLL.LLLLLL.LLLL.LLLLLLLL.LLLLLL.LLLL.LLLLL
"""
import numpy as np
val = {'L': -1, '#': 1, '.': 0}
rval = {v: k for k, v in val.items()}
def strtoarray(text):
return np.array([[val[i] for i in line] for line in
text.strip().splitlines()])
def arraytostr(a):
if a.ndim == 1:
a = a.reshape((1, a.size))
return '\n'.join([''.join([rval[i] for i in row]) for row in a])
def adjacent(a, i, j):
rows, cols = a.shape
adj = []
for newi in [i - 1, i, i + 1]:
for newj in [j - 1, j, j + 1]:
if (newi, newj) == (i, j):
continue
if newi < 0 or newj < 0:
continue
if newi >= rows or newj >= cols:
continue
adj.append(a[newi, newj])
return np.array(adj)
def adjacent2(a, i, j):
rows, cols = a.shape
adj = []
for idir in [-1, 0, 1]:
for jdir in [-1, 0, 1]:
if idir == jdir == 0:
continue
for x in range(1, max(rows, cols)):
newi = i + idir*x
newj = j + jdir*x
if newi < 0 or newi >= rows or newj < 0 or newj >= cols:
break
c = a[newi, newj]
if c in [-1, 1]:
adj.append(c)
break
return np.array(adj)
def apply_rules(a):
newa = a.copy()
rows, cols = a.shape
changed = False
for i in range(rows):
for j in range(cols):
if a[i, j] == 0:
continue
adj = adjacent(a, i, j)
if a[i, j] == -1 and np.sum(adj==1) == 0:
changed = True
newa[i, j] = 1
elif a[i, j] == 1 and np.sum(adj==1) >= 4:
changed = True
newa[i, j] = -1
return newa, changed
def generations(a):
n = 0
while True:
print("Generation", n)
print(arraytostr(a))
print()
a, changed = apply_rules(a)
if not changed:
return a
n += 1
def apply_rules2(a):
newa = a.copy()
rows, cols = a.shape
changed = False
for i in range(rows):
for j in range(cols):
if a[i, j] == 0:
continue
adj = adjacent2(a, i, j)
if a[i, j] == -1 and np.sum(adj==1) == 0:
changed = True
newa[i, j] = 1
elif a[i, j] == 1 and np.sum(adj==1) >= 5:
changed = True
newa[i, j] = -1
return newa, changed
def generations2(a):
n = 0
while True:
print("Generation", n)
print(arraytostr(a))
print()
a, changed = apply_rules2(a)
if not changed:
return a
n += 1
print("Day 11")
print("Part 1")
print("Test input")
testa = strtoarray(test_input)
print(test_input)
print(testa)
print(arraytostr(testa))
print("Adjacent to 0, 0", arraytostr(adjacent(testa, 0, 0)))
print("Adjacent to 2, 2", arraytostr(adjacent(testa, 2, 2)))
test_finala = generations(testa)
print(np.sum(test_finala == 1))
print("Puzzle input")
a = strtoarray(input)
finala = generations(a)
print(np.sum(finala == 1))
print("Part 2")
print("Test input")
testa2 = strtoarray(test_input2)
assert testa2[4, 3] == -1
print(adjacent2(testa2, 4, 3))
testa3 = strtoarray(test_input3)
assert testa3[1, 3] == -1
print(adjacent2(testa3, 1, 3))
testa4 = strtoarray(test_input4)
assert testa4[3, 3] == -1
print(adjacent2(testa4, 3, 3))
test_finala = generations2(testa)
print(np.sum(test_finala==1))
print("Puzzle input")
finala = generations2(a)
print(np.sum(finala == 1))
| 45.861702 | 91 | 0.718163 | 1,657 | 12,933 | 5.595051 | 0.069403 | 0.018984 | 0.016179 | 0.012944 | 0.59616 | 0.507604 | 0.424442 | 0.3155 | 0.178945 | 0.123827 | 0 | 0.008788 | 0.111343 | 12,933 | 281 | 92 | 46.024911 | 0.797877 | 0 | 0 | 0.335907 | 0 | 0.189189 | 0.722647 | 0.682518 | 0 | 0 | 0 | 0 | 0.011583 | 1 | 0.030888 | false | 0 | 0.003861 | 0.003861 | 0.065637 | 0.096525 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2206a89728beed4abfd89a30818175cab85e95be | 825 | py | Python | P20-Stack Abstract Data Type/Stack - Reverse Stack.py | necrospiritus/Python-Working-Examples | 075d410673e470fc7c4ffc262e92109a3032132f | [
"MIT"
] | null | null | null | P20-Stack Abstract Data Type/Stack - Reverse Stack.py | necrospiritus/Python-Working-Examples | 075d410673e470fc7c4ffc262e92109a3032132f | [
"MIT"
] | null | null | null | P20-Stack Abstract Data Type/Stack - Reverse Stack.py | necrospiritus/Python-Working-Examples | 075d410673e470fc7c4ffc262e92109a3032132f | [
"MIT"
] | null | null | null | """Reverse stack is using a list where the top is at the beginning instead of at the end."""
class Reverse_Stack:
def __init__(self):
self.items = []
def is_empty(self): # test to see whether the stack is empty.
return self.items == []
def push(self, item): # adds a new item to the base of the stack.
self.items.insert(0, item)
def pop(self): # removes the base item from the stack.
return self.items.pop(0)
def peek(self): # return the base item from the stack.
return self.items[0]
def size(self): # returns the number of items on the stack.
return len(self.items)
s = Reverse_Stack()
print(s.is_empty())
s.push(4)
s.push("Dog")
print(s.peek())
s.push("Cat")
print(s.size())
print(s.is_empty())
s.pop()
print(s.peek())
print(s.size())
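An aside not in the original: because `list.insert(0, ...)` and `list.pop(0)` shift every remaining element, this reversed layout makes `push` and `pop` O(n). A `collections.deque` keeps the same interface with O(1) operations at the left end (a sketch; `DequeStack` is a hypothetical name):

```python
from collections import deque


class DequeStack:
    """Same interface as Reverse_Stack, but O(1) push/pop at the top."""
    def __init__(self):
        self.items = deque()

    def is_empty(self):
        return not self.items

    def push(self, item):
        self.items.appendleft(item)  # top stays at the left end

    def pop(self):
        return self.items.popleft()

    def peek(self):
        return self.items[0]

    def size(self):
        return len(self.items)


d = DequeStack()
d.push(4)
d.push("Dog")
assert d.peek() == "Dog"
assert d.pop() == "Dog"
assert d.size() == 1 and not d.is_empty()
```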
| 22.916667 | 92 | 0.632727 | 138 | 825 | 3.717391 | 0.333333 | 0.105263 | 0.087719 | 0.05848 | 0.202729 | 0.148148 | 0.148148 | 0.148148 | 0.148148 | 0 | 0 | 0.006329 | 0.233939 | 825 | 35 | 93 | 23.571429 | 0.80538 | 0.346667 | 0 | 0.25 | 0 | 0 | 0.011342 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0 | 0.166667 | 0.458333 | 0.25 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
220da7c8db31ca8e3ea4491d39c1e1bb6b8b46fe | 1,018 | py | Python | examples/cam.py | jtme/button-shim | 19b80a236866fad068e6d3aeb643a1270d6ae934 | [
"MIT"
] | null | null | null | examples/cam.py | jtme/button-shim | 19b80a236866fad068e6d3aeb643a1270d6ae934 | [
"MIT"
] | null | null | null | examples/cam.py | jtme/button-shim | 19b80a236866fad068e6d3aeb643a1270d6ae934 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import signal
import buttonshim
print("""
Button SHIM: cam.py
Command on button press.
Press Ctrl+C to exit.
""")
import subprocess
@buttonshim.on_press(buttonshim.BUTTON_A)
def button_a(button, pressed):
    buttonshim.set_pixel(0x94, 0x00, 0xd3)
    # subprocess.getstatusoutput replaces the Python 2-only commands module
    status, output = subprocess.getstatusoutput("raspistill -w 320 -h 240 -o IMG/snap.jpg")
    if status == 0:
        print(output)
    else:
        print("error occurred", status, output)
@buttonshim.on_press(buttonshim.BUTTON_B)
def button_b(button, pressed):
buttonshim.set_pixel(0x00, 0x00, 0xff)
@buttonshim.on_press(buttonshim.BUTTON_C)
def button_c(button, pressed):
buttonshim.set_pixel(0x00, 0xff, 0x00)
@buttonshim.on_press(buttonshim.BUTTON_D)
def button_d(button, pressed):
buttonshim.set_pixel(0xff, 0xff, 0x00)
@buttonshim.on_press(buttonshim.BUTTON_E)
def button_e(button, pressed):
buttonshim.set_pixel(0xff, 0x00, 0x00)
signal.pause()
| 19.960784 | 74 | 0.681729 | 142 | 1,018 | 4.746479 | 0.373239 | 0.089021 | 0.126113 | 0.200297 | 0.522255 | 0.329377 | 0.121662 | 0 | 0 | 0 | 0 | 0.055012 | 0.196464 | 1,018 | 50 | 75 | 20.36 | 0.768949 | 0 | 0 | 0 | 0 | 0 | 0.129388 | 0 | 0 | 0 | 0.060181 | 0 | 0 | 0 | null | null | 0 | 0.1 | null | null | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2221af1a0ee8e71a36084e82816e4e484658018d | 1,245 | py | Python | api/voters/tests/test_models.py | citizenlabsgr/voter-engagement | 2d33eac1531471988543c6c3781b95ac73ec6dd9 | [
"MIT"
] | 6 | 2017-11-10T00:50:17.000Z | 2018-03-25T02:26:19.000Z | api/voters/tests/test_models.py | citizenlabsgr/voter-engagement | 2d33eac1531471988543c6c3781b95ac73ec6dd9 | [
"MIT"
] | 40 | 2017-10-25T16:16:55.000Z | 2018-08-15T05:27:36.000Z | api/voters/tests/test_models.py | citizenlabsgr/voter-engagement | 2d33eac1531471988543c6c3781b95ac73ec6dd9 | [
"MIT"
] | 3 | 2017-11-22T01:50:41.000Z | 2018-04-17T23:33:08.000Z | # pylint: disable=unused-variable,unused-argument,expression-not-assigned
from django.forms.models import model_to_dict
import arrow
import pytest
from expecter import expect
from api.elections.models import Election
from .. import models
@pytest.fixture
def info():
return models.Identity(
first_name="John",
last_name="Doe",
birth_date=arrow.get("1985-06-19"),
)
@pytest.fixture
def voter(info):
return models.Voter(
email="john@example.com",
**model_to_dict(info),
)
@pytest.fixture
def status(voter):
return models.Status(
voter=voter,
election=Election(name="Sample Election"),
)
def describe_registration_info():
def describe_birth_month():
def is_parsed_from_date(info):
expect(info.birth_month) == "June"
def describe_birth_year():
def is_parsed_from_date(info):
expect(info.birth_year) == 1985
def describe_voter():
def describe_str():
def is_based_on_name(voter):
expect(str(voter)) == "John Doe"
def describe_status():
def describe_str():
def is_based_on_voter_and_election(status):
expect(str(status)) == "Sample Election: John Doe"
| 18.863636 | 73 | 0.658635 | 157 | 1,245 | 5.012739 | 0.356688 | 0.09784 | 0.060991 | 0.038119 | 0.162643 | 0.162643 | 0.162643 | 0.096569 | 0.096569 | 0 | 0 | 0.012526 | 0.230522 | 1,245 | 65 | 74 | 19.153846 | 0.808977 | 0.057028 | 0 | 0.175 | 0 | 0 | 0.072526 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.35 | false | 0 | 0.15 | 0.075 | 0.575 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
222b145f8daf822353fc31ee9861239abfadffb3 | 11,613 | py | Python | dotmotif/parsers/v2/test_v2_parser.py | aplbrain/dotmotif | db093ddad7308756e9cf7ee01199f0dca1369872 | [
"Apache-2.0"
] | 28 | 2020-06-12T20:46:15.000Z | 2022-02-05T18:33:46.000Z | dotmotif/parsers/v2/test_v2_parser.py | aplbrain/dotmotif | db093ddad7308756e9cf7ee01199f0dca1369872 | [
"Apache-2.0"
] | 26 | 2020-06-09T20:09:32.000Z | 2022-02-01T18:22:20.000Z | dotmotif/parsers/v2/test_v2_parser.py | aplbrain/dotmotif | db093ddad7308756e9cf7ee01199f0dca1369872 | [
"Apache-2.0"
] | 4 | 2021-03-08T02:47:49.000Z | 2021-09-13T19:16:29.000Z | from . import ParserV2
import dotmotif
import unittest
_THREE_CYCLE = """A -> B\nB -> C\nC -> A\n"""
_THREE_CYCLE_NEG = """A !> B\nB !> C\nC !> A\n"""
_THREE_CYCLE_INH = """A -| B\nB -| C\nC -| A\n"""
_THREE_CYCLE_NEG_INH = """A !| B\nB !| C\nC !| A\n"""
_ABC_TO_D = """\nA -> D\nB -> D\nC -> D\n"""
_THREE_CYCLE_CSV = """\nA,B\nB,C\nC,A\n"""
_THREE_CYCLE_NEG_CSV = """\nA,B\nB,C\nC,A\n"""
class TestDotmotif_Parserv2_DM(unittest.TestCase):
def test_sanity(self):
self.assertEqual(1, 1)
def test_dm_parser(self):
dm = dotmotif.Motif(_THREE_CYCLE)
self.assertEqual(len(dm._g.edges()), 3)
self.assertEqual(len(dm._g.nodes()), 3)
def test_dm_parser_actions(self):
dm = dotmotif.Motif(_THREE_CYCLE)
self.assertEqual([e[2]["action"] for e in dm._g.edges(data=True)], ["SYN"] * 3)
dm = dotmotif.Motif(_THREE_CYCLE_INH)
self.assertEqual([e[2]["action"] for e in dm._g.edges(data=True)], ["INH"] * 3)
def test_dm_parser_edge_exists(self):
dm = dotmotif.Motif(_THREE_CYCLE)
self.assertEqual([e[2]["exists"] for e in dm._g.edges(data=True)], [True] * 3)
dm = dotmotif.Motif(_THREE_CYCLE_NEG)
self.assertEqual([e[2]["exists"] for e in dm._g.edges(data=True)], [False] * 3)
dm = dotmotif.Motif(_THREE_CYCLE_NEG_INH)
self.assertEqual([e[2]["exists"] for e in dm._g.edges(data=True)], [False] * 3)
class TestDotmotif_Parserv2_DM_Macros(unittest.TestCase):
def test_macro_not_added(self):
exp = """\
edge(A, B) {
A -> B
}
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm._g.edges()), 0)
def test_simple_macro(self):
exp = """\
edge(A, B) {
A -> B
}
edge(C, D)
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm._g.edges()), 1)
def test_simple_macro_construction(self):
exp = """\
edge(A, B) {
A -> B
}
edge(C, D)
"""
dm = dotmotif.Motif(exp)
exp_edge = list(dm._g.edges(data=True))[0]
self.assertEqual(exp_edge[0], "C")
self.assertEqual(exp_edge[1], "D")
def test_multiline_macro_construction(self):
exp = """\
dualedge(A, B) {
A -> B
B -> A
}
dualedge(C, D)
"""
dm = dotmotif.Motif(exp)
exp_edge = list(dm._g.edges(data=True))[0]
self.assertEqual(exp_edge[0], "C")
self.assertEqual(exp_edge[1], "D")
def test_undefined_macro(self):
exp = """\
dualedge(A, B) {
A -> B
B -> A
}
foo(C, D)
"""
# with self.assertRaises(ValueError):
with self.assertRaises(Exception):
dotmotif.Motif(exp)
def test_wrong_args_macro(self):
exp = """\
edge(A, B) {
A -> B
B -> A
}
edge(C, D, E)
"""
# with self.assertRaises(ValueError):
with self.assertRaises(Exception):
dotmotif.Motif(exp)
def test_more_complex_macro(self):
exp = """\
tri(A, B, C) {
A -> B
B -> C
C -> A
}
tri(C, D, E)
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 3)
def test_macro_reuse(self):
exp = """\
tri(A, B, C) {
A -> B
B -> C
C -> A
}
tri(C, D, E)
tri(F, G, H)
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 6)
def test_conflicting_macro_invalid_edge_throws(self):
exp = """\
tri(A, B, C) {
A -> B
B -> C
C -> A
}
nontri(A, B, C) {
A !> B
B !> C
C !> A
}
tri(C, D, E)
nontri(D, E, F)
"""
# with self.assertRaises(dotmotif.validators.DisagreeingEdgesValidatorError):
with self.assertRaises(Exception):
dotmotif.Motif(exp)
def test_nested_macros(self):
exp = """\
dualedge(A, B) {
A -> B
B -> A
}
dualtri(A, B, C) {
dualedge(A, B)
dualedge(B, C)
dualedge(C, A)
}
dualtri(foo, bar, baz)
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 6)
def test_deeply_nested_macros(self):
exp = """\
edge(A, B) {
A -> B
}
dualedge(A, B) {
edge(A, B)
edge(B, A)
}
dualtri(A, B, C) {
dualedge(A, B)
dualedge(B, C)
dualedge(C, A)
}
dualtri(foo, bar, baz)
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 6)
def test_clustercuss_macros_no_repeats(self):
exp = """\
edge(A, B) {
A -> B
}
dualedge(A, B) {
edge(A, B)
edge(B, A)
}
dualtri(A, B, C) {
dualedge(A, B)
dualedge(B, C)
dualedge(C, A)
}
dualtri(foo, bar, baz)
dualtri(foo, bar, baf)
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 10)
def test_comment_in_macro(self):
exp = """\
# Outside comment
edge(A, B) {
# Inside comment
A -> B
}
dualedge(A, B) {
# Nested-inside comment
edge(A, B)
edge(B, A)
}
dualedge(foo, bar)
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 2)
def test_combo_macro(self):
exp = """\
edge(A, B) {
A -> B
}
dualedge(A, B) {
# Nested-inside comment!
edge(A, B)
B -> A
}
dualedge(foo, bar)
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 2)
def test_comment_macro_inline(self):
exp = """\
edge(A, B) {
A -> B
}
dualedge(A, B) {
# Nested-inside comment!
edge(A, B) # inline comment
B -> A
}
dualedge(foo, bar) # inline comment
# standalone comment
foo -> bar # inline comment
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 2)
def test_alphanumeric_variables(self):
exp = """\
edge(A, B) {
A -> B
}
dualedge(A1, B) {
# Nested-inside comment!
edge(A1, B) # inline comment
B -> A1
}
dualedge(foo_1, bar_2) # inline comment
# standalone comment
foo_1 -> bar_2 # inline comment
"""
dm = dotmotif.Motif(exp)
edges = list(dm._g.edges(data=True))
self.assertEqual(len(edges), 2)
self.assertEqual(list(dm._g.nodes()), ["foo_1", "bar_2"])
self.assertEqual(type(list(dm._g.nodes())[0]), str)
new_exp = """
L1 -> Mi1
L1 -> Tm3
L3 -> Mi9
"""
dm = dotmotif.Motif(new_exp)
self.assertEqual(list(dm._g.nodes()), ["L1", "Mi1", "Tm3", "L3", "Mi9"])
class TestDotmotif_Parserv2_DM_EdgeAttributes(unittest.TestCase):
def test_basic_edge_attr(self):
exp = """\
Aa -> Ba [type == 1]
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm._g.edges()), 1)
u, v, d = list(dm._g.edges(["Aa", "Bb"], data=True))[0]
self.assertEqual(type(list(dm._g.nodes())[0]), str)
self.assertEqual(type(list(dm._g.nodes())[1]), str)
self.assertEqual(d["constraints"]["type"], {"==": [1]})
def test_edge_multi_attr(self):
exp = """\
Aa -> Ba [type != 1, type != 12]
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm._g.edges()), 1)
u, v, d = list(dm._g.edges(data=True))[0]
self.assertEqual(d["constraints"]["type"], {"!=": [1, 12]})
def test_edge_macro_attr(self):
exp = """\
macro(Aa, Ba) {
Aa -> Ba [type != 1, type != 12]
}
macro(X, Y)
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm._g.edges()), 1)
u, v, d = list(dm._g.edges(data=True))[0]
self.assertEqual(d["constraints"]["type"], {"!=": [1, 12]})
class TestDotmotif_Parserv2_DM_NodeAttributes(unittest.TestCase):
def test_basic_node_attr(self):
exp = """\
Aa -> Ba
Aa.type = "excitatory"
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm.list_node_constraints()), 1)
self.assertEqual(list(dm.list_node_constraints().keys()), ["Aa"])
def test_node_multi_attr(self):
exp = """\
Aa -> Ba
Aa.type = "excitatory"
Aa.size = 4.5
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm.list_node_constraints()), 1)
self.assertEqual(len(dm.list_node_constraints()["Aa"]), 2)
self.assertEqual(dm.list_node_constraints()["Aa"]["type"]["="], ["excitatory"])
self.assertEqual(dm.list_node_constraints()["Aa"]["size"]["="], [4.5])
self.assertEqual(list(dm.list_node_constraints().keys()), ["Aa"])
def test_multi_node_attr(self):
exp = """\
Aa -> Ba
Aa.type = "excitatory"
Ba.size=4.0
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm.list_node_constraints()), 2)
self.assertEqual(list(dm.list_node_constraints().keys()), ["Aa", "Ba"])
def test_node_macro_attr(self):
exp = """\
macro(A) {
A.type = "excitatory"
A.size >= 4.0
}
Aaa -> Ba
macro(Aaa)
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm.list_node_constraints()), 1)
self.assertEqual(list(dm.list_node_constraints().keys()), ["Aaa"])
exp = """\
macro(A) {
A.type = "excitatory"
A.size >= 4.0
}
Aaa -> Ba
macro(Aaa)
macro(Ba)
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm.list_node_constraints()), 2)
self.assertEqual(list(dm.list_node_constraints().keys()), ["Aaa", "Ba"])
class TestDynamicNodeConstraints(unittest.TestCase):
def test_dynamic_constraints(self):
"""
Test that comparisons may be made between variables, e.g.:
A.type != B.type
"""
exp = """\
A -> B
A.radius < B.radius
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm.list_dynamic_node_constraints()), 1)
def test_dynamic_constraints_in_macro(self):
"""
Test that comparisons may be made between variables in a macro, e.g.:
A.type != B.type
"""
exp = """\
macro(A, B) {
A.radius > B.radius
}
macro(A, B)
A -> B
"""
dm = dotmotif.Motif(exp)
self.assertEqual(len(dm.list_dynamic_node_constraints()), 1)
| 26.819861 | 87 | 0.490313 | 1,418 | 11,613 | 3.868829 | 0.09732 | 0.021145 | 0.082027 | 0.075465 | 0.792016 | 0.744987 | 0.704703 | 0.655669 | 0.626322 | 0.553956 | 0 | 0.013016 | 0.351675 | 11,613 | 432 | 88 | 26.881944 | 0.715633 | 0.026953 | 0 | 0.634615 | 0 | 0 | 0.37372 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0.07967 | false | 0 | 0.008242 | 0 | 0.101648 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
223135074d80c82a77c3e9b47c439c7c3abe7792 | 632 | py | Python | brp/formutils.py | chop-dbhi/biorepo-portal | 7db13c40b2b9d62af43a28e4af08c2472b98fc96 | [
"BSD-2-Clause"
] | 6 | 2016-10-26T19:51:11.000Z | 2021-03-18T16:05:55.000Z | brp/formutils.py | chop-dbhi/biorepo-portal | 7db13c40b2b9d62af43a28e4af08c2472b98fc96 | [
"BSD-2-Clause"
] | 207 | 2015-09-24T17:41:37.000Z | 2021-05-18T18:14:08.000Z | brp/formutils.py | chop-dbhi/biorepo-portal | 7db13c40b2b9d62af43a28e4af08c2472b98fc96 | [
"BSD-2-Clause"
] | 8 | 2016-04-27T19:04:50.000Z | 2020-08-24T02:33:05.000Z | from django import template
from django.forms import widgets
register = template.Library()
@register.inclusion_tag('formfield.html')
def formfield(field):
widget = field.field.widget
type_ = None
    # check subclasses before their base classes: CheckboxInput derives
    # from Input, so testing Input first would shadow the checkbox branch
    if isinstance(widget, widgets.CheckboxInput):
        type_ = 'checkbox'
    elif isinstance(widget, widgets.RadioInput):
        type_ = 'radio'
    elif isinstance(widget, widgets.Input):
        type_ = 'input'
    elif isinstance(widget, widgets.Textarea):
        type_ = 'textarea'
    elif isinstance(widget, widgets.Select):
        type_ = 'select'
return {'field': field, 'form': field.form, 'type': type_}
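The chain above dispatches on `isinstance`, where order matters: when widget classes form a hierarchy, a base-class test placed first swallows its subclasses (in newer Django versions, for example, `CheckboxInput` subclasses `Input`). A self-contained sketch of the pitfall, using hypothetical stand-in classes rather than Django's real widgets:

```python
class Input:                  # stand-in for a generic input widget
    pass

class CheckboxInput(Input):   # stand-in for a specialised subclass
    pass

def widget_type_wrong(widget):
    # Base class tested first: every subclass instance is misclassified.
    if isinstance(widget, Input):
        return 'input'
    elif isinstance(widget, CheckboxInput):
        return 'checkbox'

def widget_type_right(widget):
    # Most specific class tested first, base class as the fallback.
    if isinstance(widget, CheckboxInput):
        return 'checkbox'
    elif isinstance(widget, Input):
        return 'input'

print(widget_type_wrong(CheckboxInput()))  # input (misclassified)
print(widget_type_right(CheckboxInput()))  # checkbox
```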
# File: explicalib/distribution/multiclass_distribution.py (euranova/estimating_eces, MIT)
# -*- coding: utf-8 -*-
"""
@author: nicolas.posocco
"""
from abc import ABC
import numpy as np
class MulticlassDistribution(ABC):
def __init__(self):
"""
        Initializes the distribution, allowing later sampling and posterior probability calculations.
"""
pass
def sample(self, n_samples, return_posterior=True, reproducible=None):
"""
        Draws n_samples samples from the distribution, together with their labels.
        Also returns the array of posterior probabilities if return_posterior=True.
"""
# n_samples
assert type(n_samples) is int, "n_samples should be an integer."
assert n_samples > 0, "n_samples should be positive."
# return_posterior
assert type(return_posterior) is bool, "return_posterior should be a boolean."
# reproducible
        assert reproducible is None or isinstance(reproducible, np.random.RandomState), \
            "reproducible should be None or a np.random.RandomState object."
raise NotImplementedError
def posteriors(self, X):
# X
assert type(X) is np.ndarray, "X should be a numpy array."
assert X.ndim == 2, "X should be of shape (n_samples, n_features), here is of shape {}".format(X.shape)
raise NotImplementedError
def get_bayes_classifier(self):
"""
        Instantiates the optimal Bayes classifier for this distribution.
"""
return BayesClassifier(distribution=self)
class BayesClassifier(ABC):
def __init__(self, distribution):
# distribution
assert isinstance(distribution,
MulticlassDistribution), "distribution should inherit from MulticlassDistribution."
self.distribution = distribution
def fit(self, X):
pass
def predict_proba(self, X):
# X
assert type(X) is np.ndarray, "X should be a numpy array, here is a {}.".format(type(X))
assert X.ndim == 2, "X should be of shape (n_samples, n_features), here is of shape {}".format(X.shape)
posteriors = self.distribution.posteriors(X)
return posteriors
def predict(self, X):
# X
assert type(X) is np.ndarray, "X should be a numpy array, here is a {}.".format(type(X))
assert X.ndim == 2, "X should be of shape (n_samples, n_features), here is of shape {}".format(X.shape)
posteriors = self.predict_proba(X)
        return np.argmax(posteriors, axis=0)
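For intuition, the `predict` chain above (compute posteriors, then take the arg-max) can be reproduced without NumPy for a toy case. The following sketch is illustrative only — the distribution, names, and parameters are made up, not part of this package: two unit-variance 1-D Gaussians with equal priors, classified by Bayes' rule:

```python
import math

def gaussian_pdf(x, mean, std):
    """Density of N(mean, std^2) at x."""
    z = (x - mean) / std
    return math.exp(-0.5 * z * z) / (std * math.sqrt(2.0 * math.pi))

def two_class_posteriors(x, means=(-1.0, 1.0), std=1.0):
    """P(class | x) for two Gaussians with equal priors (Bayes' rule)."""
    likelihoods = [gaussian_pdf(x, m, std) for m in means]
    total = sum(likelihoods)
    return [lk / total for lk in likelihoods]

def bayes_predict(x):
    """Optimal Bayes decision: the class with the largest posterior."""
    post = two_class_posteriors(x)
    return max(range(len(post)), key=post.__getitem__)

print(bayes_predict(-2.0))  # 0 (closer to the mean at -1)
print(bayes_predict(+2.0))  # 1 (closer to the mean at +1)
```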
# File: middleware/login_required.py (ahmetelgun/flask-boilerplate, Apache-2.0)
from flask import request, make_response, jsonify, g
from datetime import datetime
from functools import wraps
import jwt
from models import DBContext, User
from settings import SECRET_KEY
from service import is_token_valid
def login_required(func):
@wraps(func)
def wrapper(*args, **kwargs):
not_authenticated_response = make_response(
jsonify({"message": "Login required"}),
401
)
if g.user:
return func(g.user, *args, **kwargs)
return not_authenticated_response
return wrapper
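The decorator pattern above — wrap the view with `functools.wraps`, short-circuit with a 401 when no user is attached to the request context, otherwise pass the user through as the first argument — can be exercised without Flask. A minimal sketch with a stand-in for `flask.g` (the names here are illustrative, not Flask's API):

```python
from functools import wraps

class G:       # stand-in for flask.g
    user = None

g = G()

def login_required(func):
    @wraps(func)
    def wrapper(*args, **kwargs):
        if g.user:
            return func(g.user, *args, **kwargs)
        return ('{"message": "Login required"}', 401)
    return wrapper

@login_required
def profile(user):
    return (f'hello {user}', 200)

print(profile())   # ('{"message": "Login required"}', 401)
g.user = 'alice'
print(profile())   # ('hello alice', 200)
```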
# File: config/environments/__init__.py (mihail-ivanov/flask-init, MIT)
from .development import DevelopmentConfig
from .testing import TestingConfig
from .production import ProductionConfig
app_config = {
'development': DevelopmentConfig,
'testing': TestingConfig,
'production': ProductionConfig,
}
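A registry like `app_config` is usually indexed by an environment name read from the process environment when the app factory runs. A minimal, self-contained sketch (the stand-in config classes and the `APP_ENV` variable name are assumptions for illustration, not part of this package):

```python
import os

class DevelopmentConfig:
    DEBUG = True

class ProductionConfig:
    DEBUG = False

app_config = {
    'development': DevelopmentConfig,
    'production': ProductionConfig,
}

def select_config(default='development'):
    """Look up a config class by the APP_ENV environment variable."""
    return app_config[os.environ.get('APP_ENV', default)]

os.environ['APP_ENV'] = 'production'
print(select_config().DEBUG)  # False
```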
# File: troposphere/kendra.py (marinpurgar/troposphere, BSD-2-Clause)
# Copyright (c) 2012-2020, Mark Peek <mark@peek.org>
# All rights reserved.
#
# See LICENSE file for full license.
#
# *** Do not modify - this file is autogenerated ***
# Resource specification version: 18.6.0
from . import AWSObject
from . import AWSProperty
from . import Tags
from .validators import boolean
from .validators import integer
class AclConfiguration(AWSProperty):
props = {
'AllowedGroupsColumnName': (basestring, True),
}
class ChangeDetectingColumns(AWSProperty):
props = {
'ChangeDetectingColumns': ([basestring], False),
}
class DataSourceToIndexFieldMapping(AWSProperty):
props = {
'DataSourceFieldName': (basestring, True),
'DateFieldFormat': (basestring, False),
'IndexFieldName': (basestring, True),
}
class DataSourceToIndexFieldMappingList(AWSProperty):
props = {
'DataSourceToIndexFieldMappingList':
([DataSourceToIndexFieldMapping], False),
}
class ColumnConfiguration(AWSProperty):
props = {
'ChangeDetectingColumns': (ChangeDetectingColumns, True),
'DocumentDataColumnName': (basestring, True),
'DocumentIdColumnName': (basestring, True),
'DocumentTitleColumnName': (basestring, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
}
class ConnectionConfiguration(AWSProperty):
props = {
'DatabaseHost': (basestring, True),
'DatabaseName': (basestring, True),
'DatabasePort': (integer, True),
'SecretArn': (basestring, True),
'TableName': (basestring, True),
}
class DataSourceVpcConfiguration(AWSProperty):
props = {
'SecurityGroupIds': ([basestring], True),
'SubnetIds': ([basestring], True),
}
class SqlConfiguration(AWSProperty):
props = {
'QueryIdentifiersEnclosingOption': (basestring, False),
}
class DatabaseConfiguration(AWSProperty):
props = {
'AclConfiguration': (AclConfiguration, False),
'ColumnConfiguration': (ColumnConfiguration, True),
'ConnectionConfiguration': (ConnectionConfiguration, True),
'DatabaseEngineType': (basestring, True),
'SqlConfiguration': (SqlConfiguration, False),
'VpcConfiguration': (DataSourceVpcConfiguration, False),
}
class DataSourceInclusionsExclusionsStrings(AWSProperty):
props = {
'DataSourceInclusionsExclusionsStrings': ([basestring], False),
}
class OneDriveUserList(AWSProperty):
props = {
'OneDriveUserList': ([basestring], False),
}
class S3Path(AWSProperty):
props = {
'Bucket': (basestring, True),
'Key': (basestring, True),
}
class OneDriveUsers(AWSProperty):
props = {
'OneDriveUserList': (OneDriveUserList, False),
'OneDriveUserS3Path': (S3Path, False),
}
class OneDriveConfiguration(AWSProperty):
props = {
'ExclusionPatterns': (DataSourceInclusionsExclusionsStrings, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
'InclusionPatterns': (DataSourceInclusionsExclusionsStrings, False),
'OneDriveUsers': (OneDriveUsers, True),
'SecretArn': (basestring, True),
'TenantDomain': (basestring, True),
}
class AccessControlListConfiguration(AWSProperty):
props = {
'KeyPath': (basestring, False),
}
class DocumentsMetadataConfiguration(AWSProperty):
props = {
'S3Prefix': (basestring, False),
}
class S3DataSourceConfiguration(AWSProperty):
props = {
'AccessControlListConfiguration':
(AccessControlListConfiguration, False),
'BucketName': (basestring, True),
'DocumentsMetadataConfiguration':
(DocumentsMetadataConfiguration, False),
'ExclusionPatterns': (DataSourceInclusionsExclusionsStrings, False),
'InclusionPrefixes': (DataSourceInclusionsExclusionsStrings, False),
}
class SalesforceChatterFeedIncludeFilterTypes(AWSProperty):
props = {
'SalesforceChatterFeedIncludeFilterTypes': ([basestring], False),
}
class SalesforceChatterFeedConfiguration(AWSProperty):
props = {
'DocumentDataFieldName': (basestring, True),
'DocumentTitleFieldName': (basestring, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
'IncludeFilterTypes':
(SalesforceChatterFeedIncludeFilterTypes, False),
}
class SalesforceCustomKnowledgeArticleTypeConfiguration(AWSProperty):
props = {
'DocumentDataFieldName': (basestring, True),
'DocumentTitleFieldName': (basestring, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
'Name': (basestring, True),
}
class SalesforceCustomKnowledgeArticleTypeConfigurationList(AWSProperty):
props = {
'SalesforceCustomKnowledgeArticleTypeConfigurationList':
([SalesforceCustomKnowledgeArticleTypeConfiguration], False),
}
class SalesforceKnowledgeArticleStateList(AWSProperty):
props = {
'SalesforceKnowledgeArticleStateList': ([basestring], False),
}
class SalesforceStandardKnowledgeArticleTypeConfiguration(AWSProperty):
props = {
'DocumentDataFieldName': (basestring, True),
'DocumentTitleFieldName': (basestring, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
}
class SalesforceKnowledgeArticleConfiguration(AWSProperty):
props = {
'CustomKnowledgeArticleTypeConfigurations':
(SalesforceCustomKnowledgeArticleTypeConfigurationList, False),
'IncludedStates': (SalesforceKnowledgeArticleStateList, True),
'StandardKnowledgeArticleTypeConfiguration':
(SalesforceStandardKnowledgeArticleTypeConfiguration, False),
}
class SalesforceStandardObjectAttachmentConfiguration(AWSProperty):
props = {
'DocumentTitleFieldName': (basestring, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
}
class SalesforceStandardObjectConfiguration(AWSProperty):
props = {
'DocumentDataFieldName': (basestring, True),
'DocumentTitleFieldName': (basestring, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
'Name': (basestring, True),
}
class SalesforceStandardObjectConfigurationList(AWSProperty):
props = {
'SalesforceStandardObjectConfigurationList':
([SalesforceStandardObjectConfiguration], False),
}
class SalesforceConfiguration(AWSProperty):
props = {
'ChatterFeedConfiguration':
(SalesforceChatterFeedConfiguration, False),
'CrawlAttachments': (boolean, False),
'ExcludeAttachmentFilePatterns':
(DataSourceInclusionsExclusionsStrings, False),
'IncludeAttachmentFilePatterns':
(DataSourceInclusionsExclusionsStrings, False),
'KnowledgeArticleConfiguration':
(SalesforceKnowledgeArticleConfiguration, False),
'SecretArn': (basestring, True),
'ServerUrl': (basestring, True),
'StandardObjectAttachmentConfiguration':
(SalesforceStandardObjectAttachmentConfiguration, False),
'StandardObjectConfigurations':
(SalesforceStandardObjectConfigurationList, False),
}
class ServiceNowKnowledgeArticleConfiguration(AWSProperty):
props = {
'CrawlAttachments': (boolean, False),
'DocumentDataFieldName': (basestring, True),
'DocumentTitleFieldName': (basestring, False),
'ExcludeAttachmentFilePatterns':
(DataSourceInclusionsExclusionsStrings, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
'IncludeAttachmentFilePatterns':
(DataSourceInclusionsExclusionsStrings, False),
}
class ServiceNowServiceCatalogConfiguration(AWSProperty):
props = {
'CrawlAttachments': (boolean, False),
'DocumentDataFieldName': (basestring, True),
'DocumentTitleFieldName': (basestring, False),
'ExcludeAttachmentFilePatterns':
(DataSourceInclusionsExclusionsStrings, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
'IncludeAttachmentFilePatterns':
(DataSourceInclusionsExclusionsStrings, False),
}
class ServiceNowConfiguration(AWSProperty):
props = {
'HostUrl': (basestring, True),
'KnowledgeArticleConfiguration':
(ServiceNowKnowledgeArticleConfiguration, False),
'SecretArn': (basestring, True),
'ServiceCatalogConfiguration':
(ServiceNowServiceCatalogConfiguration, False),
'ServiceNowBuildVersion': (basestring, True),
}
class SharePointConfiguration(AWSProperty):
props = {
'CrawlAttachments': (boolean, False),
'DocumentTitleFieldName': (basestring, False),
'ExclusionPatterns': (DataSourceInclusionsExclusionsStrings, False),
'FieldMappings': (DataSourceToIndexFieldMappingList, False),
'InclusionPatterns': (DataSourceInclusionsExclusionsStrings, False),
'SecretArn': (basestring, True),
'SharePointVersion': (basestring, True),
'Urls': ([basestring], True),
'UseChangeLog': (boolean, False),
'VpcConfiguration': (DataSourceVpcConfiguration, False),
}
class DataSourceConfiguration(AWSProperty):
props = {
'DatabaseConfiguration': (DatabaseConfiguration, False),
'OneDriveConfiguration': (OneDriveConfiguration, False),
'S3Configuration': (S3DataSourceConfiguration, False),
'SalesforceConfiguration': (SalesforceConfiguration, False),
'ServiceNowConfiguration': (ServiceNowConfiguration, False),
'SharePointConfiguration': (SharePointConfiguration, False),
}
class DataSource(AWSObject):
resource_type = "AWS::Kendra::DataSource"
props = {
'DataSourceConfiguration': (DataSourceConfiguration, True),
'Description': (basestring, False),
'IndexId': (basestring, True),
'Name': (basestring, True),
'RoleArn': (basestring, True),
'Schedule': (basestring, False),
'Tags': (Tags, False),
'Type': (basestring, True),
}
class Faq(AWSObject):
resource_type = "AWS::Kendra::Faq"
props = {
'Description': (basestring, False),
'FileFormat': (basestring, False),
'IndexId': (basestring, True),
'Name': (basestring, True),
'RoleArn': (basestring, True),
'S3Path': (S3Path, True),
'Tags': (Tags, False),
}
class CapacityUnitsConfiguration(AWSProperty):
props = {
'QueryCapacityUnits': (integer, True),
'StorageCapacityUnits': (integer, True),
}
class ValueImportanceItem(AWSProperty):
props = {
'Key': (basestring, False),
'Value': (integer, False),
}
class ValueImportanceItems(AWSProperty):
props = {
'ValueImportanceItems': ([ValueImportanceItem], False),
}
class Relevance(AWSProperty):
props = {
'Duration': (basestring, False),
'Freshness': (boolean, False),
'Importance': (integer, False),
'RankOrder': (basestring, False),
'ValueImportanceItems': (ValueImportanceItems, False),
}
class Search(AWSProperty):
props = {
'Displayable': (boolean, False),
'Facetable': (boolean, False),
'Searchable': (boolean, False),
'Sortable': (boolean, False),
}
class DocumentMetadataConfiguration(AWSProperty):
props = {
'Name': (basestring, True),
'Relevance': (Relevance, False),
'Search': (Search, False),
'Type': (basestring, True),
}
class DocumentMetadataConfigurationList(AWSProperty):
props = {
'DocumentMetadataConfigurationList':
([DocumentMetadataConfiguration], False),
}
class ServerSideEncryptionConfiguration(AWSProperty):
props = {
'KmsKeyId': (basestring, False),
}
class Index(AWSObject):
resource_type = "AWS::Kendra::Index"
props = {
'CapacityUnits': (CapacityUnitsConfiguration, False),
'Description': (basestring, False),
'DocumentMetadataConfigurations':
(DocumentMetadataConfigurationList, False),
'Edition': (basestring, True),
'Name': (basestring, True),
'RoleArn': (basestring, True),
'ServerSideEncryptionConfiguration':
(ServerSideEncryptionConfiguration, False),
'Tags': (Tags, False),
}
# File: q.py (Akatsuki1910/tokuron, MIT)
""" Q learning """
import numpy as np
import plot
Q = np.array(np.zeros([11, 3]))
GAMMA = 0.9
ALPHA = 0.1
def action_select(s_s):
""" action select """
return np.random.choice([i for i in range(1, 4) if i + s_s < 11])
# Train over 10000 episodes of the counting game (state 10 is terminal).
for i in range(10000):
    S_STATE = 0
    while S_STATE != 10:
        a_state = action_select(S_STATE)
        R = 0.001  # small step reward while the game continues
        s_state_dash = S_STATE + a_state
        if s_state_dash == 10:
            R = -10  # the agent itself landed on 10: loss
        else:
            # the opponent replies with a random legal move
            s_state_dash = action_select(s_state_dash) + s_state_dash
            if s_state_dash == 10:
                R = 10  # the opponent landed on 10: win
        # standard Q-learning (temporal-difference) update
        Q[S_STATE, a_state-1] = Q[S_STATE, a_state-1] + ALPHA * \
            (R + GAMMA * Q[s_state_dash,
                           np.argmax(Q[s_state_dash, ])] - Q[S_STATE, a_state-1])
        S_STATE = s_state_dash
plot.plot_func(Q)
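The temporal-difference update inside the loop above can be isolated as a small pure function for testing (a restatement of the same rule; the function name is illustrative):

```python
def q_update(q_sa, reward, q_next_max, alpha=0.1, gamma=0.9):
    """One Q-learning step: Q(s,a) += alpha * (r + gamma * max Q(s',.) - Q(s,a))."""
    return q_sa + alpha * (reward + gamma * q_next_max - q_sa)

# A terminal win (reward 10, no future value) pulls Q(s,a) from 0 toward 10.
print(q_update(0.0, 10.0, 0.0))  # 1.0
```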
# File: NFCow/malls/migrations/0001_initial.py (jojoriveraa/titulacion-NFCOW, Apache-2.0)
# -*- coding: utf-8 -*-
# Generated by Django 1.9 on 2015-12-23 08:44
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Mall',
fields=[
('id', models.PositiveIntegerField(primary_key=True, serialize=False)),
('name', models.CharField(max_length=255)),
('address', models.CharField(max_length=255)),
('group', models.CharField(max_length=255)),
('image', models.ImageField(upload_to='malls')),
('postcode', models.PositiveIntegerField()),
],
),
]
# File: tests/helpers/test_file.py (Centaurioun/PyFunceble, Apache-2.0)
"""
The tool to check the availability or syntax of domain, IP or URL.
::
██████╗ ██╗ ██╗███████╗██╗ ██╗███╗ ██╗ ██████╗███████╗██████╗ ██╗ ███████╗
██╔══██╗╚██╗ ██╔╝██╔════╝██║ ██║████╗ ██║██╔════╝██╔════╝██╔══██╗██║ ██╔════╝
██████╔╝ ╚████╔╝ █████╗ ██║ ██║██╔██╗ ██║██║ █████╗ ██████╔╝██║ █████╗
██╔═══╝ ╚██╔╝ ██╔══╝ ██║ ██║██║╚██╗██║██║ ██╔══╝ ██╔══██╗██║ ██╔══╝
██║ ██║ ██║ ╚██████╔╝██║ ╚████║╚██████╗███████╗██████╔╝███████╗███████╗
╚═╝ ╚═╝ ╚═╝ ╚═════╝ ╚═╝ ╚═══╝ ╚═════╝╚══════╝╚═════╝ ╚══════╝╚══════╝
Tests of the file helper.
Author:
Nissar Chababy, @funilrys, contactTATAfunilrysTODTODcom
Special thanks:
https://pyfunceble.github.io/special-thanks.html
Contributors:
https://pyfunceble.github.io/contributors.html
Project link:
https://github.com/funilrys/PyFunceble
Project documentation:
https://pyfunceble.readthedocs.io/en/dev/
Project homepage:
https://pyfunceble.github.io/
License:
::
    Copyright 2017, 2018, 2019, 2020, 2021 Nissar Chababy
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
"""
import os
import secrets
import tempfile
import unittest
from PyFunceble.helpers.file import FileHelper
from PyFunceble.utils.platform import PlatformUtility
class TestFileHelper(unittest.TestCase):
"""
Tests of the file helpers.
"""
def test_set_path_return(self) -> None:
"""
Tests the response from the method which let us set the path to work with.
"""
given = tempfile.NamedTemporaryFile()
file_helper = FileHelper()
actual = file_helper.set_path(given.name)
self.assertIsInstance(actual, FileHelper)
def test_set_path(self) -> None:
"""
Tests the method which let us set the path to work with.
"""
given = tempfile.NamedTemporaryFile()
expected = given.name
file_helper = FileHelper()
file_helper.set_path(given.name)
actual = file_helper.path
self.assertEqual(expected, actual)
file_helper = FileHelper(given.name)
actual = file_helper.path
self.assertEqual(expected, actual)
def test_set_path_not_str(self) -> None:
"""
Tests the method which let us set the path to work with for the case
that it's not a string.
"""
given = ["Hello", "World"]
file_helper = FileHelper()
self.assertRaises(TypeError, lambda: file_helper.set_path(given))
def test_join_path(self) -> None:
"""
Tests the method which let us join paths.
"""
given = "/hello/world"
if PlatformUtility.is_windows():
expected = "/hello/world\\hello\\world"
else:
expected = "/hello/world/hello/world"
actual = FileHelper(given).join_path("hello", "world")
self.assertEqual(expected, actual)
def test_exists(self) -> None:
"""
Tests the method which let us check if the given file exists.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
with open(file_helper.path, "w") as file_stream:
file_stream.write("Hello, World!")
expected = True
actual = file_helper.exists()
self.assertEqual(expected, actual)
os.remove(file_helper.path)
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
def test_get_size(self) -> None:
"""
Tests the method which let us get the size of a file.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
with open(file_helper.path, "w") as file_stream:
file_stream.write("Hello, World!")
expected = True
actual = file_helper.exists()
self.assertEqual(expected, actual)
expected = 13
actual = file_helper.get_size()
self.assertEqual(expected, actual)
os.remove(file_helper.path)
def test_is_empty(self) -> None:
"""
Tests the method which let us check if a file is empty.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
with open(file_helper.path, "w") as file_stream:
file_stream.write("")
expected = True
actual = file_helper.is_empty()
self.assertEqual(expected, actual)
with open(file_helper.path, "w") as file_stream:
file_stream.write("Hello, World!")
expected = False
actual = file_helper.is_empty()
self.assertEqual(expected, actual)
os.remove(file_helper.path)
def test_delete(self) -> None:
"""
Tests the method which let us delete a file.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
with open(file_helper.path, "w") as file_stream:
file_stream.write("")
expected = True
actual = file_helper.exists()
self.assertEqual(expected, actual)
file_helper.delete()
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
def test_write(self) -> None:
"""
Tests the method which let us write a file.
"""
given = tempfile.NamedTemporaryFile(delete=False)
file_helper = FileHelper(given.name)
file_helper.write("Hello, World!")
given.seek(0)
expected = b"Hello, World!"
actual = given.read()
self.assertEqual(expected, actual)
file_helper.write("Hello, this is Funilrys!")
given.seek(0)
expected = b"Hello, World!Hello, this is Funilrys!"
actual = given.read()
self.assertEqual(expected, actual)
file_helper.write("Hello, World!", overwrite=True)
given.seek(0)
expected = b"Hello, World!"
actual = given.read()
self.assertEqual(expected, actual)
def test_read(self) -> None:
"""
Tests the method which let us read a file.
"""
given = tempfile.NamedTemporaryFile(delete=False)
file_helper = FileHelper(given.name)
file_helper.write("Hello, World!")
given.seek(0)
expected = "Hello, World!"
actual = file_helper.read()
self.assertEqual(expected, actual)
def test_read_file_does_not_exists(self) -> None:
"""
Tests the method which let us read a file for the case that the given
file does not exists.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
expected = None
actual = file_helper.read()
self.assertEqual(expected, actual)
def test_read_bytes(self) -> None:
"""
Tests the method which let us read (bytes) a file.
"""
given = tempfile.NamedTemporaryFile(delete=False)
file_helper = FileHelper(given.name)
file_helper.write("Hello, World!")
given.seek(0)
expected = b"Hello, World!"
actual = file_helper.read_bytes()
self.assertEqual(expected, actual)
def test_read_bytes_file_does_not_exists(self) -> None:
"""
Tests the method which let us read (bytes) a file for the case that
the given file does not exists.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
expected = None
actual = file_helper.read_bytes()
self.assertEqual(expected, actual)
def test_open(self) -> None:
"""
Tests the method which let us open the given file as we want.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
with file_helper.open("w") as file_stream:
file_stream.write("Hello, World!")
expected = True
actual = file_helper.exists()
self.assertEqual(expected, actual)
expected = "Hello, World!"
actual = file_helper.read()
self.assertEqual(expected, actual)
def test_copy(self) -> None:
"""
Tests the method which let us copy a file to another place.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
copy_file_helper = FileHelper(tempfile.gettempdir())
copy_file_helper.set_path(copy_file_helper.join_path(secrets.token_hex(8)))
expected = False
actual = file_helper.exists()
actual_copy = copy_file_helper.exists()
self.assertEqual(expected, actual)
self.assertEqual(expected, actual_copy)
file_helper.write("Hello, World!")
expected = True
actual = file_helper.exists()
self.assertEqual(expected, actual)
expected = False
actual_copy = copy_file_helper.exists()
self.assertEqual(expected, actual_copy)
file_helper.copy(copy_file_helper.path)
expected = True
actual_copy = copy_file_helper.exists()
self.assertEqual(expected, actual_copy)
expected = "Hello, World!"
actual = copy_file_helper.read()
self.assertEqual(expected, actual)
expected = True
actual = file_helper.exists()
actual_copy = copy_file_helper.exists()
self.assertEqual(expected, actual)
self.assertEqual(expected, actual_copy)
def test_move(self) -> None:
"""
Tests of the method which let us move a file to another location.
"""
file_helper = FileHelper(tempfile.gettempdir())
file_helper.set_path(file_helper.join_path(secrets.token_hex(8)))
destination_file_helper = FileHelper(tempfile.gettempdir())
destination_file_helper.set_path(
destination_file_helper.join_path(secrets.token_hex(8))
)
expected = False
actual = file_helper.exists()
actual_destination = destination_file_helper.exists()
self.assertEqual(expected, actual)
self.assertEqual(expected, actual_destination)
file_helper.write("Hello, World!")
expected = True
actual = file_helper.exists()
self.assertEqual(expected, actual)
expected = False
actual_destination = destination_file_helper.exists()
self.assertEqual(expected, actual_destination)
file_helper.move(destination_file_helper.path)
expected = True
actual_destination = destination_file_helper.exists()
self.assertEqual(expected, actual_destination)
expected = "Hello, World!"
actual = destination_file_helper.read()
self.assertEqual(expected, actual)
expected = False
actual = file_helper.exists()
self.assertEqual(expected, actual)
expected = True
actual_destination = destination_file_helper.exists()
self.assertEqual(expected, actual_destination)
if __name__ == "__main__":
unittest.main()
| 26.627083 | 88 | 0.606838 | 1,461 | 12,781 | 5.415469 | 0.140999 | 0.131446 | 0.125 | 0.157609 | 0.716507 | 0.692113 | 0.67366 | 0.665571 | 0.630308 | 0.617922 | 0 | 0.004941 | 0.271575 | 12,781 | 479 | 89 | 26.682672 | 0.8029 | 0.209999 | 0 | 0.76 | 0 | 0 | 0.039042 | 0.005164 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.071111 | false | 0 | 0.026667 | 0 | 0.102222 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
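The tests above exercise a `FileHelper` whose implementation is not part of this record. A minimal sketch of the assumed API, built on `shutil` (the class name `MiniFileHelper` and the exact method semantics are inferred from the assertions, not taken from the real library):

```python
import os
import shutil
import tempfile


class MiniFileHelper:
    """Minimal sketch of the helper the tests above exercise (assumed API)."""

    def __init__(self, path):
        self.path = path

    def join_path(self, *parts):
        return os.path.join(self.path, *parts)

    def set_path(self, path):
        self.path = path

    def exists(self):
        return os.path.isfile(self.path)

    def write(self, data):
        with open(self.path, "w") as fh:
            fh.write(data)

    def read(self):
        with open(self.path) as fh:
            return fh.read()

    def copy(self, destination):
        shutil.copy(self.path, destination)   # source file keeps existing

    def move(self, destination):
        shutil.move(self.path, destination)   # source file no longer exists
```

The copy/move distinction is exactly what the two test methods check: after `copy` both paths exist with equal contents, after `move` only the destination does.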
97dd106f5157a62375f9741a6b7c0edb0c3a8dee | 1,240 | py | Python | tests/test_util_matrix.py | PeerHerholz/pyrsa | 994007086c59de93d86b982f1fff73fe6a8ea929 | [
"MIT"
] | 4 | 2015-08-10T18:34:21.000Z | 2018-05-15T20:43:15.000Z | tests/test_util_matrix.py | PeerHerholz/pyrsa | 994007086c59de93d86b982f1fff73fe6a8ea929 | [
"MIT"
] | null | null | null | tests/test_util_matrix.py | PeerHerholz/pyrsa | 994007086c59de93d86b982f1fff73fe6a8ea929 | [
"MIT"
] | 2 | 2018-03-26T03:02:07.000Z | 2021-11-10T21:09:48.000Z | #!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
test_util_matrix
@author: jdiedrichsen
"""
import unittest
import pyrsa.util as rsu
import numpy as np
class TestIndicator(unittest.TestCase):
def test_indicator(self):
a = np.array(range(0, 5))
a = np.concatenate((a, a))
X = rsu.matrix.indicator(a)
n_row, n_col = X.shape
self.assertEqual(n_row, 10)
self.assertEqual(n_col, 5)
self.assertEqual(X[0, 0], 1.0)
def test_indicator_pos(self):
a = np.array(range(0, 5))
a = np.concatenate((a, a))
X = rsu.matrix.indicator(a, positive=True)
n_row, n_col = X.shape
self.assertEqual(n_row, 10)
self.assertEqual(n_col, 4)
self.assertEqual(X[0, 0], 0.0)
def test_pairwise(self):
a = np.array(range(0, 5))
X = rsu.matrix.pairwise_contrast(a)
n_row, n_col = X.shape
self.assertEqual(n_row, 10)
self.assertEqual(n_col, 5)
self.assertEqual(X[0, 0], 1.0)
def test_centering(self):
X = rsu.matrix.centering(10)
n_row, n_col = X.shape
self.assertEqual(n_row, 10)
self.assertEqual(n_col, 10)
if __name__ == '__main__':
unittest.main()
| 24.313725 | 50 | 0.592742 | 185 | 1,240 | 3.8 | 0.264865 | 0.234708 | 0.182077 | 0.045519 | 0.59744 | 0.571835 | 0.571835 | 0.544808 | 0.544808 | 0.544808 | 0 | 0.038589 | 0.268548 | 1,240 | 50 | 51 | 24.8 | 0.736494 | 0.066935 | 0 | 0.5 | 0 | 0 | 0.006969 | 0 | 0 | 0 | 0 | 0 | 0.323529 | 1 | 0.117647 | false | 0 | 0.088235 | 0 | 0.235294 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
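The `pyrsa.util.matrix` helpers under test are not included in this record. A minimal re-implementation of `indicator`, inferred only from the assertions above (one-hot coding of condition labels, with `positive=True` assumed to drop the zero/reference level, which is what makes the expected column count 4 instead of 5):

```python
import numpy as np


def indicator(labels, positive=False):
    """One-hot indicator matrix for integer condition labels.

    A guess at pyrsa.util.matrix.indicator's behaviour, reconstructed
    from the test expectations above; positive=True drops level 0.
    """
    labels = np.asarray(labels)
    levels = np.unique(labels)
    if positive:
        levels = levels[levels > 0]
    # Broadcast comparison: rows are observations, columns are levels.
    return (labels[:, None] == levels[None, :]).astype(float)
```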
97efd442d5baa89669000d346b5c499ecd9f4c0b | 203 | py | Python | qtapps/skrf_qtwidgets/analyzers/analyzer_rs_zva.py | mike0164/scikit-rf | 0af25754b097ee24089ea7e0eacde426a51df563 | [
"BSD-3-Clause"
] | 379 | 2015-01-25T12:19:19.000Z | 2022-03-29T14:01:07.000Z | qtapps/skrf_qtwidgets/analyzers/analyzer_rs_zva.py | mike0164/scikit-rf | 0af25754b097ee24089ea7e0eacde426a51df563 | [
"BSD-3-Clause"
] | 456 | 2015-01-06T19:15:55.000Z | 2022-03-31T06:42:57.000Z | qtapps/skrf_qtwidgets/analyzers/analyzer_rs_zva.py | mike0164/scikit-rf | 0af25754b097ee24089ea7e0eacde426a51df563 | [
"BSD-3-Clause"
] | 211 | 2015-01-06T17:14:06.000Z | 2022-03-31T01:36:00.000Z | from skrf.vi.vna import rs_zva
class Analyzer(rs_zva.ZVA):
DEFAULT_VISA_ADDRESS = "GPIB::16::INSTR"
NAME = "Rhode & Schwartz ZVA"
NPORTS = 4
NCHANNELS = 32
SCPI_VERSION_TESTED = ''
| 20.3 | 44 | 0.665025 | 29 | 203 | 4.448276 | 0.862069 | 0.077519 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.031847 | 0.226601 | 203 | 9 | 45 | 22.555556 | 0.789809 | 0 | 0 | 0 | 0 | 0 | 0.172414 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
97fbc7c518483b22e3bd3fb0a4313e038f0a4e05 | 508 | py | Python | nanome/_internal/_network/_commands/_serialization/_open_url.py | rramji/nanome-lib | 2806598af31cfb4bb6e16366f0b300d2ddcc9c13 | [
"MIT"
] | null | null | null | nanome/_internal/_network/_commands/_serialization/_open_url.py | rramji/nanome-lib | 2806598af31cfb4bb6e16366f0b300d2ddcc9c13 | [
"MIT"
] | null | null | null | nanome/_internal/_network/_commands/_serialization/_open_url.py | rramji/nanome-lib | 2806598af31cfb4bb6e16366f0b300d2ddcc9c13 | [
"MIT"
] | null | null | null | from nanome._internal._util._serializers import _StringSerializer
from nanome._internal._util._serializers import _TypeSerializer
class _OpenURL(_TypeSerializer):
def __init__(self):
self.string = _StringSerializer()
def version(self):
return 0
def name(self):
return "OpenURL"
def serialize(self, version, value, context):
context.write_using_serializer(self.string, value)
def deserialize(self, version, context):
raise NotImplementedError
| 25.4 | 65 | 0.720472 | 53 | 508 | 6.584906 | 0.509434 | 0.057307 | 0.103152 | 0.126075 | 0.223496 | 0.223496 | 0 | 0 | 0 | 0 | 0 | 0.002463 | 0.200787 | 508 | 19 | 66 | 26.736842 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0.01378 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.384615 | false | 0 | 0.153846 | 0.153846 | 0.769231 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
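`_OpenURL` above owns a `_StringSerializer` and forwards the payload to it. The nanome context objects are not available here, so a stand-in with an explicit byte buffer illustrates the same delegation pattern; the length-prefixed wire format is an assumption for the sketch, not nanome's actual protocol:

```python
import struct


class StringSerializer:
    """Length-prefixed UTF-8 strings (assumed format, not nanome's)."""

    def serialize(self, value, buf):
        data = value.encode("utf-8")
        buf += struct.pack("<I", len(data)) + data  # 4-byte LE length, then bytes

    def deserialize(self, buf, offset):
        (n,) = struct.unpack_from("<I", buf, offset)
        start = offset + 4
        return buf[start:start + n].decode("utf-8"), start + n


class OpenURL:
    """Mirrors _OpenURL: owns a string serializer and delegates to it."""

    def __init__(self):
        self.string = StringSerializer()

    def name(self):
        return "OpenURL"

    def serialize(self, value, buf):
        self.string.serialize(value, buf)
```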
97fdbd42de4debdf4f69ae07026eb489c9f50129 | 2,772 | py | Python | CorpusToolkit/ply_parser/test.py | howl-anderson/tools_for_corpus_of_people_daily | 8178d9a62c356f83723d42ced60f8269eed84861 | [
"Apache-2.0"
] | 243 | 2018-09-12T01:05:03.000Z | 2022-03-30T11:25:59.000Z | CorpusToolkit/ply_parser/test.py | nkkkyyy/tools_for_corpus_of_people_daily | 8178d9a62c356f83723d42ced60f8269eed84861 | [
"Apache-2.0"
] | 3 | 2018-10-18T10:13:07.000Z | 2020-09-10T06:34:40.000Z | CorpusToolkit/ply_parser/test.py | nkkkyyy/tools_for_corpus_of_people_daily | 8178d9a62c356f83723d42ced60f8269eed84861 | [
"Apache-2.0"
] | 56 | 2018-09-11T12:56:20.000Z | 2021-11-09T04:02:00.000Z | import logging
from CorpusToolkit.ply_parser import make_parser, lexer
logging.basicConfig(
level=logging.DEBUG,
filename="parselog.txt",
filemode="w",
format="%(filename)10s:%(lineno)4d:%(message)s"
)
log = logging.getLogger()
test_data = (
"19980101-01-001-002/m 中共中央/nt 总书记/n 、/wu 国家/n 主席/n 江/nrf 泽民/nrg",
"19980101-01-001-006/m 在/p 1998年/t 来临/vi 之际/f ,/wd 我/rr 十分/dc 高兴/a 地/ui 通过/p [中央/n 人民/n 广播/vn 电台/n]nt 、/wu [中国/ns 国际/n 广播/vn 电台/n]nt 和/c [中央/n 电视台/n]nt ,/wd 向/p 全国/n 各族/rz 人民/n ,/wd 向/p [香港/ns 特别/a 行政区/n]ns 同胞/n 、/wu 澳门/ns 和/c 台湾/ns 同胞/n 、/wu 海外/s 侨胞/n ,/wd 向/p 世界/n 各国/rz 的/ud 朋友/n 们/k ,/wd 致以/vt 诚挚/a 的/ud 问候/vn 和/c 良好/a 的/ud 祝愿/vn !/wt",
"19980131-04-013-024/m 那{na4}/rz 音韵/n 如/vt 轻柔/a 的/ud 夜风/n ,/wd ",
"19980103-04-003-007/m 图文/n 兼/vt 重/a 的/ud 中国/ns 文明史/n ,/wd 就/p 方向/n 言/Vg 有利于/vt 历史学/n 和/c 考古学/n 的/ud 进一步/d 结合/vt 。/wj 考古学/n 本身/rz 是/vl 具有/vt 独立/a 的/ud 理论/n 和/c 方法/n 的/ud 学科/n ,/wd 然而/c 中国/ns 考古学/n 从/p 一/d 开始/vt 便/d 以/p 同/p 历史学/n 的/ud 密切/ad 结合/vt 为/vl 特点/n 。/wj 大家/rr 知道/vt ,/wd 王/nrf 国维/nrg 先生/n 二十/m 年代/n 在/p [清华/jn 国学/n 研究院/n]nt 的/ud 讲义/n 《/wkz 古史/n 新/a 证/n 》/wky 中/f 提出/vt 的/ud “/wyz 二/m 重/qc 证据法/n ”/wyy ,/wd 在/p 方法论/n 上{shang5}/f 为{wei4}/p 考古学/n 的/ud 建立/vn 发展/vn 开拓/vt 了/ul 道路/n 。/wj “/wyz 二/m 重/qc 证据法/n ”/wyy 指/vt 文献/n 同/p 文物/n 的/ud 互相/d 印证/vt ,/wd 即/vl 蕴涵/vt 着/uz 历史/n 、/wu 考古/n 的/ud 结合/vn 。/wj 亲手/d 在/p 中国/ns 开展/vt 考古学/n 工作/vn 的/ud 考古学家/n ,/wd 都/d 以/p 探索/vt 和/c 重建/vt 古史/n 为/vl 职/Ng 志/n 。/wj 最/dc 早/a 得到/vt 大规模/d 系统/ad 发掘/vt 的/ud 遗址/n 殷墟/ns ,/wd 其/rz 被/p 选定/vt 正是/vl 出于/vt 这样/rz 的/ud 要求/n 。/wj 长期/d 领导/vt [中国/ns 科学院/n (/wkz 后/f 属/vl [中国/ns 社会/n 科学院/n]nt )/wky 考古/vn 研究所/n]nt 的/ud 夏/nrf 鼐/nrg 先生/n ,/wd 1984年/t 在/p 《/wkz 什么/ryw 是/vl 考古学/n 》/wky 文/Ng 中/f 说/vt ,/wd 考古学/n 和/p 利用/vt 文献/n 记载/vn 进行/vx 研究/vn 的/ud 狭义/b 历史学/n 不/df 同/vt ,/wd 研究/vt 的/ud 对象/n 只/d 是/vl 物质/n 的/ud 遗存/vn ,/wd 但/c 两者/rz 同/d 以/p 恢复/vt 人类/n 历史/n 的/ud 本来面目/in 为/vl 目标/n ,/wd 如/vt 车{che1}/n 之/u 两/m 轮/Ng ,/wd 鸟/n 之/u 两翼/n 。/wj 对于/p 了解/vt 中国/ns 有着/vt 悠久/a 的/ud 文明/n 和/c 丰富/a 的/ud 文献/n 传统/n 的/ud 人们/n 来说/u ,/wd 中国/ns 考古学/n 的/ud 这种/r 特点/n 乃是/vl 自然/a 的/ud 。/wj"
)
s = test_data[3]
def test_lexer():
lexer.input(s)
while True:
tok = lexer.token()
if not tok:
break # No more input
print(tok.type, tok.value, tok.lineno, tok.lexpos)
def test_parser():
parser = make_parser()
result = parser.parse(s)
for token in result:
print(token.token, token.pinyin, token.pos)
| 72.947368 | 1,579 | 0.533911 | 703 | 2,772 | 2.095306 | 0.41394 | 0.05499 | 0.027155 | 0.014257 | 0.033944 | 0.033944 | 0.020367 | 0.020367 | 0 | 0 | 0 | 0.040609 | 0.289322 | 2,772 | 37 | 1,580 | 74.918919 | 0.707107 | 0.00469 | 0 | 0 | 0 | 0.142857 | 0.783098 | 0.044251 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.071429 | 0 | 0.142857 | 0.071429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
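`test_lexer` above uses PLY's standard `while True: tok = lexer.token(); if not tok: break` idiom. Since PLY itself may not be installed, a stand-in lexer shows that the same loop can be collapsed with Python's two-argument `iter(callable, sentinel)` form:

```python
class FakeLexer:
    """Stand-in for a PLY lexer: token() yields items, then None when done."""

    def __init__(self, tokens):
        self._it = iter(tokens)

    def token(self):
        return next(self._it, None)


def drain(lexer):
    # iter(f, sentinel) calls f() repeatedly until it returns the sentinel
    # (None here), replacing the explicit while/break loop in test_lexer.
    return list(iter(lexer.token, None))
```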
3f05ec3f00a5d7d90f5ef0232521b059bc84d999 | 672 | py | Python | src/AuShadha/registry/icd10/aushadha.py | GosthMan/AuShadha | 3ab48825a0dba19bf880b6ac6141ab7a6adf1f3e | [
"PostgreSQL"
] | 46 | 2015-03-04T14:19:47.000Z | 2021-12-09T02:58:46.000Z | src/AuShadha/registry/icd10/aushadha.py | aytida23/AuShadha | 3ab48825a0dba19bf880b6ac6141ab7a6adf1f3e | [
"PostgreSQL"
] | 2 | 2015-06-05T10:29:04.000Z | 2015-12-06T16:54:10.000Z | src/AuShadha/registry/icd10/aushadha.py | aytida23/AuShadha | 3ab48825a0dba19bf880b6ac6141ab7a6adf1f3e | [
"PostgreSQL"
] | 24 | 2015-03-23T01:38:11.000Z | 2022-01-24T16:23:42.000Z | ################################################################################
# Create a Registration with the UI for a Role.
# Each module's aushadha.py is screened for this
#
# Each Class is registered for a Role in UI
# These can be used to generate Role based UI elements later.
#
# As of now string base role assignement is done.
# This can be later extended to class based role
################################################################################
from .models import Chapter, Section, Diagnosis
from AuShadha.apps.ui.ui import ui as UI
UI.register('RegistryApp', Chapter)
UI.register('DiseaseCodes', Chapter)
UI.register('ReferenceApp', Chapter)
| 37.333333 | 80 | 0.577381 | 83 | 672 | 4.674699 | 0.566265 | 0.07732 | 0.041237 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129464 | 672 | 17 | 81 | 39.529412 | 0.663248 | 0.4375 | 0 | 0 | 0 | 0 | 0.167464 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
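The registration file above binds the same model class to several role names via `UI.register`. A minimal registry with the same `register(role, cls)` shape (the `RoleRegistry` class and the `Chapter` stand-in are illustrative assumptions, not AuShadha's actual UI implementation):

```python
class RoleRegistry:
    """String-keyed role -> model-class registry (illustrative only)."""

    def __init__(self):
        self._roles = {}

    def register(self, role, cls):
        # One role may accumulate several classes; registration order is kept.
        self._roles.setdefault(role, []).append(cls)

    def classes_for(self, role):
        return tuple(self._roles.get(role, ()))


class Chapter:
    """Hypothetical stand-in for the imported model class."""


UI = RoleRegistry()
UI.register('RegistryApp', Chapter)
UI.register('DiseaseCodes', Chapter)
UI.register('ReferenceApp', Chapter)
```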
3f2894b54d3e8597c52938f696795d8309755127 | 239 | py | Python | controllers/social_auth/kivyauth/__init__.py | richierh/SalesKivyMD | f445adc701946ff38865b4a1a00a03529142613e | [
"MIT"
] | 126 | 2020-06-12T15:02:19.000Z | 2022-03-31T10:13:29.000Z | controllers/social_auth/kivyauth/__init__.py | richierh/SalesKivyMD | f445adc701946ff38865b4a1a00a03529142613e | [
"MIT"
] | 13 | 2020-07-01T01:03:26.000Z | 2022-02-21T02:21:24.000Z | controllers/social_auth/kivyauth/__init__.py | richierh/SalesKivyMD | f445adc701946ff38865b4a1a00a03529142613e | [
"MIT"
] | 22 | 2020-06-12T22:24:27.000Z | 2022-03-10T13:24:33.000Z | from kivy.logger import Logger
from kivy.utils import platform
__version__ = "2.3.2"
_log_message = "KivyAuth:" + f" {__version__}" + f' (installed at "{__file__}")'
__all__ = ("login_providers", "auto_login")
Logger.info(_log_message)
| 23.9 | 80 | 0.723849 | 32 | 239 | 4.71875 | 0.65625 | 0.10596 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014354 | 0.125523 | 239 | 9 | 81 | 26.555556 | 0.708134 | 0 | 0 | 0 | 0 | 0 | 0.338912 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
3f3537a4e2a9c606bd390358c783d299bde031c0 | 2,125 | py | Python | ooobuild/lo/smarttags/x_range_based_smart_tag_recognizer.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | ooobuild/lo/smarttags/x_range_based_smart_tag_recognizer.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | ooobuild/lo/smarttags/x_range_based_smart_tag_recognizer.py | Amourspirit/ooo_uno_tmpl | 64e0c86fd68f24794acc22d63d8d32ae05dd12b8 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
#
# Copyright 2022 :Barry-Thomas-Paul: Moss
#
# Licensed under the Apache License, Version 2.0 (the "License")
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http: // www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
# Interface Class
# this is a auto generated file generated by Cheetah
# Libre Office Version: 7.3
# Namespace: com.sun.star.smarttags
import typing
from abc import abstractmethod
from ..lang.x_initialization import XInitialization as XInitialization_d46c0cca
if typing.TYPE_CHECKING:
from ..frame.x_controller import XController as XController_b00e0b8f
from .smart_tag_recognizer_mode import SmartTagRecognizerMode as SmartTagRecognizerMode_9179119e
from ..text.x_text_markup import XTextMarkup as XTextMarkup_a5d60b3a
from ..text.x_text_range import XTextRange as XTextRange_9a910ab7
class XRangeBasedSmartTagRecognizer(XInitialization_d46c0cca):
"""
provides access to a range based smart tag recognizer.
See Also:
`API XRangeBasedSmartTagRecognizer <https://api.libreoffice.org/docs/idl/ref/interfacecom_1_1sun_1_1star_1_1smarttags_1_1XRangeBasedSmartTagRecognizer.html>`_
"""
__ooo_ns__: str = 'com.sun.star.smarttags'
__ooo_full_ns__: str = 'com.sun.star.smarttags.XRangeBasedSmartTagRecognizer'
__ooo_type_name__: str = 'interface'
__pyunointerface__: str = 'com.sun.star.smarttags.XRangeBasedSmartTagRecognizer'
@abstractmethod
def recognizeTextRange(self, xRange: 'XTextRange_9a910ab7', eDataType: 'SmartTagRecognizerMode_9179119e', xTextMarkup: 'XTextMarkup_a5d60b3a', aApplicationName: str, xController: 'XController_b00e0b8f') -> None:
"""
recognizes smart tags.
"""
__all__ = ['XRangeBasedSmartTagRecognizer']
| 42.5 | 215 | 0.777412 | 260 | 2,125 | 6.146154 | 0.561538 | 0.037547 | 0.025031 | 0.047559 | 0.0801 | 0.0801 | 0 | 0 | 0 | 0 | 0 | 0.035912 | 0.148235 | 2,125 | 49 | 216 | 43.367347 | 0.846961 | 0.449882 | 0 | 0 | 0 | 0 | 0.23049 | 0.168784 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.4375 | 0 | 0.8125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
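The interface above marks `recognizeTextRange` with `@abstractmethod`. A compact, generic demonstration of what that buys (not tied to the UNO bridge): Python refuses to instantiate the abstract base, while a concrete subclass that supplies the method works normally.

```python
from abc import ABC, abstractmethod


class SmartTagRecognizer(ABC):
    """Toy analogue of the abstract interface above."""

    @abstractmethod
    def recognize(self, text):
        ...


class NoopRecognizer(SmartTagRecognizer):
    def recognize(self, text):
        return []  # recognizes no smart tags
```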
3f3653bf5b35e045e2b4c2aeff6f681433eea55f | 924 | py | Python | apprest/plugins/icat/views/ICAT.py | acampsm/calipsoplus-backend | b66690124bd2f2541318ddb83b18e082b5df5676 | [
"MIT"
] | 4 | 2018-12-04T15:08:27.000Z | 2019-04-11T09:49:41.000Z | apprest/plugins/icat/views/ICAT.py | acampsm/calipsoplus-backend | b66690124bd2f2541318ddb83b18e082b5df5676 | [
"MIT"
] | 63 | 2018-11-22T13:07:56.000Z | 2021-06-10T20:55:58.000Z | apprest/plugins/icat/views/ICAT.py | AlexRogalskiy/calipsoplus-backend | 3f6b034f16668bc154b0f4b759ed62b055f41647 | [
"MIT"
] | 10 | 2018-11-23T08:17:28.000Z | 2022-01-15T23:41:59.000Z | from rest_framework import status
from rest_framework.authentication import SessionAuthentication, BasicAuthentication
from rest_framework.permissions import IsAuthenticated
from rest_framework.views import APIView
from apprest.plugins.icat.helpers.complex_encoder import JsonResponse
from apprest.plugins.icat.services.ICAT import ICATService
class GetInvestigationUsers(APIView):
"""
get:
Return: Users involved in an investigation
"""
authentication_classes = (SessionAuthentication, BasicAuthentication)
permission_classes = (IsAuthenticated,)
pagination_class = None
def get(self, request, *args, **kwargs):
service = ICATService()
investigation_id = self.kwargs.get('investigation_id')
investigation_users = service.get_users_involved_in_investigation(investigation_id, request)
return JsonResponse(investigation_users, status=status.HTTP_200_OK)
| 34.222222 | 100 | 0.787879 | 96 | 924 | 7.385417 | 0.458333 | 0.045134 | 0.09591 | 0.062059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003802 | 0.146104 | 924 | 26 | 101 | 35.538462 | 0.894804 | 0.050866 | 0 | 0 | 0 | 0 | 0.01867 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0.4 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
3f3f001f639e3ff68f19c91e138db8007658913f | 998 | py | Python | py/book/ShortestSubarrayLength.py | danyfang/SourceCode | 8168f6058648f2a330a7354daf3a73a4d8a4e730 | [
"MIT"
] | null | null | null | py/book/ShortestSubarrayLength.py | danyfang/SourceCode | 8168f6058648f2a330a7354daf3a73a4d8a4e730 | [
"MIT"
] | null | null | null | py/book/ShortestSubarrayLength.py | danyfang/SourceCode | 8168f6058648f2a330a7354daf3a73a4d8a4e730 | [
"MIT"
] | null | null | null | '''
Leetcode problem No 862 Shortest Subarray with Sum at Least K
Solution written by Xuqiang Fang on 1 July, 2018
'''
import collections
class Solution(object):
def shortestSubarray(self, A, K):
"""
:type A: List[int]
:type K: int
:rtype: int
"""
n = len(A)
B = [0] * (n + 1)
for i in range(n):
B[i+1] = B[i] + A[i]
d = collections.deque()
ans = n + 1
for i in range(n+1):
while d and B[i] - B[d[0]] >= K:
ans = min(ans, i-d.popleft())
while d and B[i] <= B[d[-1]]:
d.pop()
d.append(i)
return ans if ans <= n else -1
def main():
s = Solution()
print(s.shortestSubarray([2,-1,2], 3))
print(s.shortestSubarray([1,2], 4))
print(s.shortestSubarray([1], 1))
print(s.shortestSubarray([1,2,3,-5,4,-7,5,-8,6,-9,7,8,-4], 5)) #1
print(s.shortestSubarray([1,2,-5,3,-5,4,-7,5,-8,6,-9,7,8,-4], 5))
main()
| 28.514286 | 69 | 0.490982 | 164 | 998 | 2.987805 | 0.384146 | 0.061224 | 0.22449 | 0.187755 | 0.310204 | 0.261224 | 0.159184 | 0.04898 | 0.04898 | 0.04898 | 0 | 0.081241 | 0.321643 | 998 | 34 | 70 | 29.352941 | 0.642541 | 0.156313 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.041667 | 0 | 0.208333 | 0.208333 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
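In the solution above, `B` is the prefix-sum array and the deque holds indices with strictly increasing `B`: the first inner `while` closes off candidate answers from the front, the second maintains monotonicity at the back, giving O(n) overall. An O(n²) brute force confirms the behaviour on small random inputs (the solution is repeated here only so the block is self-contained):

```python
import collections
import random


def shortest_subarray(A, K):
    """Monotonic-deque solution, same as the class method above."""
    n = len(A)
    B = [0] * (n + 1)
    for i in range(n):
        B[i + 1] = B[i] + A[i]
    d = collections.deque()
    ans = n + 1
    for i in range(n + 1):
        while d and B[i] - B[d[0]] >= K:
            ans = min(ans, i - d.popleft())
        while d and B[i] <= B[d[-1]]:
            d.pop()
        d.append(i)
    return ans if ans <= n else -1


def brute_force(A, K):
    """Check every (i, j) window; O(n^2) reference implementation."""
    best = None
    for i in range(len(A)):
        for j in range(i, len(A)):
            if sum(A[i:j + 1]) >= K:
                best = (j - i + 1) if best is None else min(best, j - i + 1)
    return -1 if best is None else best
```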
3f47e5cac2344784ba9a8fd0999bd621214986ec | 669 | py | Python | DesignPatterns/FactoryPattern/SimpleFactory/autoFactory.py | Py-Himanshu-Patel/Learn-Python | 47a50a934cabcce3b1cbdd4c88141a51f21d3a05 | [
"MIT"
] | null | null | null | DesignPatterns/FactoryPattern/SimpleFactory/autoFactory.py | Py-Himanshu-Patel/Learn-Python | 47a50a934cabcce3b1cbdd4c88141a51f21d3a05 | [
"MIT"
] | null | null | null | DesignPatterns/FactoryPattern/SimpleFactory/autoFactory.py | Py-Himanshu-Patel/Learn-Python | 47a50a934cabcce3b1cbdd4c88141a51f21d3a05 | [
"MIT"
] | null | null | null | from inspect import isclass, isabstract, getmembers
import autos
def isconcrete(obj):
return isclass(obj) and not isabstract(obj)
class AutoFactory:
vehicles = {} # { car model name: class for the car}
def __init__(self):
self.load_autos()
def load_autos(self):
classes = getmembers(autos, isconcrete)
for name, _type in classes:
if issubclass(_type, autos.AbstractAuto):  # isconcrete already guarantees isclass
self.vehicles[name] = _type
def create_instance(self, carname):
if carname in self.vehicles:
return self.vehicles[carname]()
return autos.NullCar(carname)
| 25.730769 | 72 | 0.647235 | 78 | 669 | 5.410256 | 0.448718 | 0.085308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.26009 | 669 | 25 | 73 | 26.76 | 0.852525 | 0.053812 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.117647 | 0.058824 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
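`AutoFactory` above discovers concrete classes in the `autos` package via `inspect` and falls back to `NullCar` for unknown names. A reduced, self-contained version of the same pattern; the `autos` module is faked in-file and the class names are borrowed from the code above, not from the real package:

```python
from inspect import isclass, isabstract, getmembers
from abc import ABC, abstractmethod
import types


class AbstractAuto(ABC):
    @abstractmethod
    def start(self):
        ...


class Sedan(AbstractAuto):
    def start(self):
        return "sedan started"


class NullCar(AbstractAuto):
    """Null-object fallback for unknown model names."""

    def __init__(self, name):
        self.name = name

    def start(self):
        return "unknown model: " + self.name


# Stand-in for the `autos` package that the real factory scans.
autos = types.ModuleType("autos")
autos.AbstractAuto = AbstractAuto
autos.Sedan = Sedan


def isconcrete(obj):
    return isclass(obj) and not isabstract(obj)


class AutoFactory:
    def __init__(self):
        # Instance-level dict, so separate factories do not share state.
        self.vehicles = {}
        for name, _type in getmembers(autos, isconcrete):
            if issubclass(_type, AbstractAuto):
                self.vehicles[name] = _type

    def create_instance(self, carname):
        if carname in self.vehicles:
            return self.vehicles[carname]()
        return NullCar(carname)
```

Making `vehicles` an instance attribute (rather than the shared class attribute used above) avoids one factory's discoveries leaking into another.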
3f4d396e7dff26260074f0fb74d95a3f3b759b61 | 7,358 | py | Python | dlkit/json_/authentication/queries.py | UOC/dlkit | a9d265db67e81b9e0f405457464e762e2c03f769 | [
"MIT"
] | 2 | 2018-02-23T12:16:11.000Z | 2020-10-08T17:54:24.000Z | dlkit/json_/authentication/queries.py | UOC/dlkit | a9d265db67e81b9e0f405457464e762e2c03f769 | [
"MIT"
] | 87 | 2017-04-21T18:57:15.000Z | 2021-12-13T19:43:57.000Z | dlkit/json_/authentication/queries.py | UOC/dlkit | a9d265db67e81b9e0f405457464e762e2c03f769 | [
"MIT"
] | 1 | 2018-03-01T16:44:25.000Z | 2018-03-01T16:44:25.000Z | """JSON implementations of authentication queries."""
# pylint: disable=no-init
# Numerous classes don't require __init__.
# pylint: disable=too-many-public-methods,too-few-public-methods
# Number of methods are defined in specification
# pylint: disable=protected-access
# Access to protected methods allowed in package json package scope
# pylint: disable=too-many-ancestors
# Inheritance defined in specification
from .. import utilities
from ..osid import queries as osid_queries
from ..primitives import Id
from ..utilities import get_registry
from dlkit.abstract_osid.authentication import queries as abc_authentication_queries
from dlkit.abstract_osid.osid import errors
class AgentQuery(abc_authentication_queries.AgentQuery, osid_queries.OsidObjectQuery):
"""This is the query for searching agents.
Each method specifies an ``AND`` term while multiple invocations of
the same method produce a nested ``OR``.
The following example returns agents whose display name begins with
"Tom" and whose "login name" is "tom" or "tjcoppet" in an agent
record specified by ``companyAgentType``.
Agent Query query = session.getAgentQuery();
query.matchDisplayName("Tom*", wildcardStringMatchType, true);
companyAgentQuery = query.getAgentQueryRecord(companyAgentType);
companyAgentQuery.matchLoginName("tom");
companyAgentQuery = query.getAgentQueryRecord(companyAgentType);
companyAgentQuery.matchLoginName("tjcoppet");
AgentList agentList = session.getAgentsByQuery(query);
"""
def __init__(self, runtime):
self._namespace = 'authentication.Agent'
self._runtime = runtime
record_type_data_sets = get_registry('AGENT_RECORD_TYPES', runtime)
self._all_supported_record_type_data_sets = record_type_data_sets
self._all_supported_record_type_ids = []
for data_set in record_type_data_sets:
self._all_supported_record_type_ids.append(str(Id(**record_type_data_sets[data_set])))
osid_queries.OsidObjectQuery.__init__(self, runtime)
@utilities.arguments_not_none
def match_resource_id(self, agency_id, match):
"""Sets the resource ``Id`` for this query.
arg: agency_id (osid.id.Id): a resource ``Id``
arg: match (boolean): ``true`` for a positive match,
``false`` for a negative match
raise: NullArgument - ``agency_id`` is ``null``
*compliance: mandatory -- This method must be implemented.*
"""
# Implemented from template for osid.resource.ResourceQuery.match_avatar_id
self._add_match('resourceId', str(agency_id), match)
def clear_resource_id_terms(self):
"""Clears the resource ``Id`` terms.
*compliance: mandatory -- This method must be implemented.*
"""
# Implemented from template for osid.resource.ResourceQuery.clear_avatar_id
self._clear_terms('resourceId')
resource_id_terms = property(fdel=clear_resource_id_terms)
def supports_resource_query(self):
"""Tests if a ``ResourceQuery`` is available.
return: (boolean) - ``true`` if a resource query is available,
``false`` otherwise
*compliance: mandatory -- This method must be implemented.*
"""
raise errors.Unimplemented()
def get_resource_query(self):
"""Gets the query for a resource.
Multiple retrievals produce a nested ``OR`` term.
return: (osid.resource.ResourceQuery) - the resource query
raise: Unimplemented - ``supports_resource_query()`` is
``false``
*compliance: optional -- This method must be implemented if
``supports_resource_query()`` is ``true``.*
"""
raise errors.Unimplemented()
resource_query = property(fget=get_resource_query)
@utilities.arguments_not_none
def match_any_resource(self, match):
"""Matches agents with any resource.
arg: match (boolean): ``true`` if to match agents with a
resource, ``false`` to match agents with no resource
*compliance: mandatory -- This method must be implemented.*
"""
raise errors.Unimplemented()
def clear_resource_terms(self):
"""Clears the resource terms.
*compliance: mandatory -- This method must be implemented.*
"""
raise errors.Unimplemented()
resource_terms = property(fdel=clear_resource_terms)
@utilities.arguments_not_none
def match_agency_id(self, agency_id, match):
"""Sets the agency ``Id`` for this query.
arg: agency_id (osid.id.Id): an agency ``Id``
arg: match (boolean): ``true`` for a positive match,
``false`` for negative match
raise: NullArgument - ``agency_id`` is ``null``
*compliance: mandatory -- This method must be implemented.*
"""
# Implemented from template for osid.resource.ResourceQuery.match_bin_id
self._add_match('assignedAgencyIds', str(agency_id), match)
def clear_agency_id_terms(self):
"""Clears the agency ``Id`` terms.
*compliance: mandatory -- This method must be implemented.*
"""
# Implemented from template for osid.resource.ResourceQuery.clear_bin_id_terms
self._clear_terms('assignedAgencyIds')
agency_id_terms = property(fdel=clear_agency_id_terms)
def supports_agency_query(self):
"""Tests if an ``AgencyQuery`` is available.
return: (boolean) - ``true`` if an agency query is available,
``false`` otherwise
*compliance: mandatory -- This method must be implemented.*
"""
raise errors.Unimplemented()
def get_agency_query(self):
"""Gets the query for an agency.
Multiple retrievals produce a nested ``OR`` term.
return: (osid.authentication.AgencyQuery) - the agency query
raise: Unimplemented - ``supports_agency_query()`` is ``false``
*compliance: optional -- This method must be implemented if
``supports_agency_query()`` is ``true``.*
"""
raise errors.Unimplemented()
agency_query = property(fget=get_agency_query)
def clear_agency_terms(self):
"""Clears the agency terms.
*compliance: mandatory -- This method must be implemented.*
"""
# Implemented from template for osid.resource.ResourceQuery.clear_group_terms
self._clear_terms('agency')
agency_terms = property(fdel=clear_agency_terms)
@utilities.arguments_not_none
def get_agent_query_record(self, agent_record_type):
"""Gets the agent query record corresponding to the given ``Agent`` record ``Type``.
Multiple retrievals produce a nested ``OR`` term.
arg: agent_record_type (osid.type.Type): an agent record type
return: (osid.authentication.records.AgentQueryRecord) - the
agent query record
raise: NullArgument - ``agent_record_type`` is ``null``
raise: OperationFailed - unable to complete request
raise: Unsupported - ``has_record_type(agent_record_type)`` is
``false``
*compliance: mandatory -- This method must be implemented.*
"""
raise errors.Unimplemented()
| 36.425743 | 98 | 0.671786 | 838 | 7,358 | 5.706444 | 0.196897 | 0.025094 | 0.035132 | 0.040151 | 0.503555 | 0.451276 | 0.338561 | 0.319741 | 0.318695 | 0.29862 | 0 | 0 | 0.231177 | 7,358 | 201 | 99 | 36.606965 | 0.845324 | 0.574477 | 0 | 0.22 | 0 | 0 | 0.038735 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.26 | false | 0 | 0.12 | 0 | 0.52 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
3f731bc8d56706afd6b8d8a2244161c707b604bd | 6,047 | py | Python | manage/fuzzytranslation.py | Acidburn0zzz/browser-update | fed7b4c52deccd582fcf8b8cca4809607bbb32cd | [
"MIT"
] | 2 | 2017-10-06T15:53:23.000Z | 2017-10-06T15:53:38.000Z | manage/fuzzytranslation.py | Acidburn0zzz/browser-update | fed7b4c52deccd582fcf8b8cca4809607bbb32cd | [
"MIT"
] | null | null | null | manage/fuzzytranslation.py | Acidburn0zzz/browser-update | fed7b4c52deccd582fcf8b8cca4809607bbb32cd | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Created on Sun Jun 12 14:21:31 2016
@author: TH
"""
#%%
import polib
#%%
#old (translated) string
#new renamed string
pairs="""
An initiative by web designers to inform users about browser-updates
An initiative by websites to inform users to update their web browser
If you are on a computer that is maintained by an admin and you cannot install a new browser, ask your admin about it.
Ask your admin to update your browser if you cannot install updates yourself.
blaasdasdfsdaf
faselsdfsadf"""
pairs = pairs.replace("\r", "").strip("\n").split("\n\n")
mappings={s.split("\n")[0]:s.split("\n")[1] for s in pairs}
#%%
po = polib.pofile('lang/de_DE/LC_MESSAGES/update.po')
valid_entries = [e for e in po if not e.obsolete]
for entry in valid_entries:
#print(entry.msgid)
if entry.msgid in mappings:
print("replacing", entry.msgid[:10], "with",mappings[entry.msgid][:10])
entry.msgid=mappings[entry.msgid]
po.save()
po.save_as_mofile('lang/de_DE/LC_MESSAGES/update.mo')
#%%
pairs="""aaa
bbb
Subtle
Unobtrusive
bla
fasel"""
pairs = pairs.replace("\r", "").strip("\n").split("\n\n")
mappings={s.split("\n")[0]:s.split("\n")[1] for s in pairs}
#%%
po = polib.pofile('lang/de_DE/LC_MESSAGES/site.po')
valid_entries = [e for e in po if not e.obsolete]
for entry in valid_entries:
#print(entry.msgid)
if entry.msgid in mappings:
print("replacing", entry.msgid[:10], "with",mappings[entry.msgid][:10])
entry.msgid=mappings[entry.msgid]
po.save()
po.save_as_mofile('lang/de_DE/LC_MESSAGES/site.mo')
#%%
pot = polib.pofile('lang/update.pot')
for entry in pot:
print (entry.msgid, entry.msgstr)
#%%
#%% display old translations
po = polib.pofile('lang/de_DE/LC_MESSAGES/update.po')
valid_entries = [e for e in po if not e.obsolete]
for entry in valid_entries:
print(entry.msgid)
#%%
#%% getting files
from glob import glob
paths = glob('lang/*/LC_MESSAGES/')
paths=[p[5:10] for p in paths]
paths
#%% updating all site.po
for p in paths:
print("updating %s"%p)
try:
po = polib.pofile('lang/%s/LC_MESSAGES/site.po'%p)
except OSError:
print("no file found")
continue
valid_entries = [e for e in po if not e.obsolete]
for entry in valid_entries:
#print(entry.msgid)
if entry.msgid in mappings:
print(" ", entry.msgid[:10], "-->",mappings[entry.msgid][:10])
entry.msgid=mappings[entry.msgid]
po.save()
po.save_as_mofile('lang/%s/LC_MESSAGES/site.mo'%p)
#%% updating all update.po
for p in paths:
print("updating %s"%p)
try:
po = polib.pofile('lang/%s/LC_MESSAGES/update.po'%p)
except OSError:
print("no file found")
continue
valid_entries = [e for e in po if not e.obsolete]
for entry in valid_entries:
#print(entry.msgid)
if entry.msgid in mappings:
print(" ", entry.msgid[:10], "-->",mappings[entry.msgid][:10])
entry.msgid=mappings[entry.msgid]
po.save()
po.save_as_mofile('lang/%s/LC_MESSAGES/update.mo'%p)
#%%
pairs="""aaa
bbb
Optionally include up to two placeholders "%s" which will be replaced with the browser version and contents of the link tag. Example: "Your browser (%s) is old. Please <a%s>update</a>"
Optionally include up to two placeholders "%s" which will be replaced with the browser version and contents of the link tag. Example: "Your browser (%s) is old. Please <a%s>update</a>"
bla
fasel"""
pairs = pairs.replace("\r", "").strip("\n").split("\n\n")
mappings={s.split("\n")[0]:s.split("\n")[1] for s in pairs}
#%%
from glob import glob
paths = glob('lang/*/LC_MESSAGES/')
paths=[p[5:10] for p in paths]
paths
#%% updating all customize.po
for p in paths:
    print("customize %s" % p)
    try:
        po = polib.pofile('lang/%s/LC_MESSAGES/customize.po' % p)
    except OSError:
        print("no file found")
        continue
    valid_entries = [e for e in po if not e.obsolete]
    for entry in valid_entries:
        # print(entry.msgid)
        if entry.msgid in mappings:
            print(" ", entry.msgid[:10], "-->", mappings[entry.msgid][:10])
            entry.msgid = mappings[entry.msgid]
    po.save()
    po.save_as_mofile('lang/%s/LC_MESSAGES/customize.mo' % p)
#%% extract strings
import subprocess

subprocess.call(['xgettext',
                 "header.php",
                 "footer.php",
                 "update-browser.php",
                 "--keyword=T_gettext",
                 "--keyword=T_",
                 "--keyword=T_ngettext:1,2",
                 "--from-code=utf-8",
                 "--package-name=browser-update-update",
                 "--language=PHP",
                 "--output=lang/update.pot"])
#%% extract site strings
import subprocess

subprocess.call(['xgettext',
                 "blog.php",
                 "stat.php",
                 "index.php",
                 "contact.php",
                 "update.testing.php",
                 "--keyword=T_gettext",
                 "--keyword=T_",
                 "--keyword=T_ngettext:1,2",
                 "--from-code=utf-8",
                 "--package-name=browser-update-site",
                 "--language=PHP",
                 "--output=lang/site.pot"])
#%% extract customize strings
import subprocess

subprocess.call(['xgettext',
                 "customize.php",
                 "--keyword=T_gettext",
                 "--keyword=T_",
                 "--keyword=T_ngettext:1,2",
                 "--from-code=utf-8",
                 "--package-name=browser-update-customize",
                 "--language=PHP",
                 "--output=lang/customize.pot"])
#%% upload new sources for translations
import subprocess

subprocess.call(['crowdin-cli-py', 'upload', 'sources'])
#subprocess.call(['java', '-jar', 'manage\crowdin-cli.jar', 'upload', 'sources','--config','manage\crowdin.yaml'])
#subprocess.call(['java', '-jar', 'manage\crowdin-cli.jar', 'upload', 'sources'])

# --- python/0011. maxArea.py (repo: whtahy/leetcode, license: CC0-1.0) ---
class Solution:
    def maxArea(self, ls):
        n = len(ls) - 1
        v, left, right = [], 0, n
        while 0 <= left < right <= n:
            h = min(ls[left], ls[right])
            v += [h * (right - left)]
            while ls[left] <= h and left < right:
                left += 1
            while ls[right] <= h and left < right:
                right -= 1
        return max(v)
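For quick verification, a self-contained restatement of the same two-pointer approach as a plain function, checked against the well-known LeetCode #11 example input (the sample list is an illustration, not taken from this repo):

```python
# Two-pointer max-area: move inward from both ends, skipping every bar
# that is not taller than the current limiting height.
def max_area(ls):
    n = len(ls) - 1
    v, left, right = [], 0, n
    while 0 <= left < right <= n:
        h = min(ls[left], ls[right])
        v.append(h * (right - left))
        while ls[left] <= h and left < right:
            left += 1
        while ls[right] <= h and left < right:
            right -= 1
    return max(v)

print(max_area([1, 8, 6, 2, 5, 4, 8, 3, 7]))  # 49
```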
# --- safe/geokdbush/kdbushTest.py (repo: s-a-f-e/backend, license: MIT) ---
from kdbush import KDBush
# test data
points = [
[54,1],[97,21],[65,35],[33,54],[95,39],[54,3],[53,54],[84,72],[33,34],[43,15],[52,83],[81,23],[1,61],[38,74],
[11,91],[24,56],[90,31],[25,57],[46,61],[29,69],[49,60],[4,98],[71,15],[60,25],[38,84],[52,38],[94,51],[13,25],
[77,73],[88,87],[6,27],[58,22],[53,28],[27,91],[96,98],[93,14],[22,93],[45,94],[18,28],[35,15],[19,81],[20,81],
[67,53],[43,3],[47,66],[48,34],[46,12],[32,38],[43,12],[39,94],[88,62],[66,14],[84,30],[72,81],[41,92],[26,4],
[6,76],[47,21],[57,70],[71,82],[50,68],[96,18],[40,31],[78,53],[71,90],[32,14],[55,6],[32,88],[62,32],[21,67],
[73,81],[44,64],[29,50],[70,5],[6,22],[68,3],[11,23],[20,42],[21,73],[63,86],[9,40],[99,2],[99,76],[56,77],
[83,6],[21,72],[78,30],[75,53],[41,11],[95,20],[30,38],[96,82],[65,48],[33,18],[87,28],[10,10],[40,34],
[10,20],[47,29],[46,78]]
ids = [
97, 74, 95, 30, 77, 38, 76, 27, 80, 55, 72, 90, 88, 48, 43, 46, 65, 39, 62, 93, 9, 96, 47, 8, 3, 12, 15, 14, 21, 41, 36, 40, 69, 56, 85, 78, 17, 71, 44,
19, 18, 13, 99, 24, 67, 33, 37, 49, 54, 57, 98, 45, 23, 31, 66, 68, 0, 32, 5, 51, 75, 73, 84, 35, 81, 22, 61, 89, 1, 11, 86, 52, 94, 16, 2, 6, 25, 92,
42, 20, 60, 58, 83, 79, 64, 10, 59, 53, 26, 87, 4, 63, 50, 7, 28, 82, 70, 29, 34, 91]
coords = [
10,20,6,22,10,10,6,27,20,42,18,28,11,23,13,25,9,40,26,4,29,50,30,38,41,11,43,12,43,3,46,12,32,14,35,15,40,31,33,18,
43,15,40,34,32,38,33,34,33,54,1,61,24,56,11,91,4,98,20,81,22,93,19,81,21,67,6,76,21,72,21,73,25,57,44,64,47,66,29,
69,46,61,38,74,46,78,38,84,32,88,27,91,45,94,39,94,41,92,47,21,47,29,48,34,60,25,58,22,55,6,62,32,54,1,53,28,54,3,
66,14,68,3,70,5,83,6,93,14,99,2,71,15,96,18,95,20,97,21,81,23,78,30,84,30,87,28,90,31,65,35,53,54,52,38,65,48,67,
53,49,60,50,68,57,70,56,77,63,86,71,90,52,83,71,82,72,81,94,51,75,53,95,39,78,53,88,62,84,72,77,73,99,76,73,81,88,
87,96,98,96,82]
index = KDBush(points)

result = index.range(20, 30, 50, 70)
print(result)  # [60, 20, 45, 3, 17, 71, 44, 19, 18, 15, 69, 90, 62, 96, 47, 8, 77, 72]

for id in result:
    p = points[id]
    if p[0] < 20 or p[0] > 50 or p[1] < 30 or p[1] > 70:
        print("FAIL")
# complement check: every point inside the box must appear in the result.
# (The original looped over `result` itself, so `id not in result` could
# never be true; iterate over all points instead.)
for i, p in enumerate(points):
    if i not in result and 20 <= p[0] <= 50 and 30 <= p[1] <= 70:
        print("FAIL: in-range point missing from result")
def sqDist2(a, b):
    dx = a[0] - b[0]
    dy = a[1] - b[1]
    return dx * dx + dy * dy
index2 = KDBush(points)
qp = [50, 50]
r = 20
r2 = r * r

result = index2.within(qp[0], qp[1], r)
print(result)  # [60, 6, 25, 92, 42, 20, 45, 3, 71, 44, 18, 96]

for id in result:
    p = points[id]
    if sqDist2(p, qp) > r2:
        print('FAIL: result point out of range')
# complement check: every point within radius r of qp must appear in the result
for i, p in enumerate(points):
    if i not in result and sqDist2(p, qp) <= r2:
        print('FAIL: in-range point missing from result')
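As a cross-check of the `range` query semantics used above, a brute-force box filter can be compared against the index result. This helper is an illustration only (it is not part of the kdbush module) and is shown on a small made-up point set:

```python
# Brute-force counterpart of KDBush.range: scan every point and keep the
# indices inside the axis-aligned box [minx, maxx] x [miny, maxy].
def brute_range(pts, minx, miny, maxx, maxy):
    return [i for i, p in enumerate(pts)
            if minx <= p[0] <= maxx and miny <= p[1] <= maxy]

sample = [[54, 1], [97, 21], [65, 35], [33, 54]]
print(brute_range(sample, 20, 30, 70, 60))  # [2, 3]
```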
# --- hansberger/analysis/migrations/0001_initial.py (repo: 097475/hansberger, license: MIT) ---
# Generated by Django 2.0.13 on 2019-06-27 17:04
import django.contrib.postgres.fields.jsonb
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        ('research', '0001_initial'),
        ('datasets', '0001_initial'),
    ]

    operations = [
        migrations.CreateModel(
            name='Bottleneck',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('homology', models.PositiveIntegerField()),
                ('kind', models.CharField(choices=[('consecutive', 'consecutive'), ('one_to_all', 'one_to_all'), ('all_to_all', 'all_to_all')], max_length=20)),
            ],
        ),
        migrations.CreateModel(
            name='Diagram',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('image', models.TextField()),
                ('bottleneck_distance', models.FloatField()),
                ('bottleneck', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analysis.Bottleneck')),
            ],
        ),
        migrations.CreateModel(
            name='FiltrationAnalysis',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(help_text='Name this analysis', max_length=100)),
                ('slug', models.SlugField(max_length=110)),
                ('description', models.TextField(blank=True, help_text='Write a brief description of the analysis', max_length=500)),
                ('creation_date', models.DateTimeField(auto_now_add=True)),
                ('precomputed_distance_matrix_json', django.contrib.postgres.fields.jsonb.JSONField(default='"[]"')),
                ('window_size', models.PositiveIntegerField(blank=True, default=None, help_text="Leave window size blank to not use windows. Window parameter\n            is ignored when dealing with precomputed distance matrix. Always check\n            the dimensions of the dataset your are operating on and plan your windows\n            accordingly; eventual data that won't fit into the final window will be\n            discarded.", null=True)),
                ('window_overlap', models.PositiveIntegerField(default=0, help_text='How many columns of overlap to have in\n            consequent windows, if windows are being used. It must be at most 1\n            less than window size.')),
                ('filtration_type', models.CharField(choices=[('VRF', 'Vietoris Rips Filtration'), ('CWRF', 'Clique Weighted Rank Filtration')], help_text='Choose the type of analysis.', max_length=50)),
                ('distance_matrix_metric', models.CharField(blank=True, choices=[('braycurtis', 'Braycurtis'), ('canberra', 'Canberra'), ('chebyshev', 'Chebyshev'), ('cityblock', 'City block'), ('correlation', 'Correlation'), ('cosine', 'Cosine'), ('dice', 'Dice'), ('euclidean', 'Euclidean'), ('hamming', 'Hamming'), ('jaccard', 'Jaccard'), ('jensenshannon', 'Jensen Shannon'), ('kulsinski', 'Kulsinski'), ('mahalanobis', 'Mahalonobis'), ('matching', 'Matching'), ('minkowski', 'Minkowski'), ('rogerstanimoto', 'Rogers-Tanimoto'), ('russellrao', 'Russel Rao'), ('seuclidean', 'Seuclidean'), ('sokalmichener', 'Sojal-Michener'), ('sokalsneath', 'Sokal-Sneath'), ('sqeuclidean', 'Sqeuclidean'), ('yule', 'Yule')], help_text='If Vietoris-Rips filtration is selected and not using a precomputed distance matrix, choose the\n            distance metric to use on the selected dataset. This parameter is ignored in all other cases.', max_length=20)),
                ('max_homology_dimension', models.PositiveIntegerField(default=1, help_text='Maximum homology dimension computed. Will compute all dimensions lower than and equal to this value.\n            For 1, H_0 and H_1 will be computed.')),
                ('max_distances_considered', models.FloatField(blank=True, default=None, help_text='Maximum distances considered when constructing filtration.\n            If blank, compute the entire filtration.', null=True)),
                ('coeff', models.PositiveIntegerField(default=2, help_text='Compute homology with coefficients in the prime field Z/pZ for\n            p=coeff.')),
                ('do_cocycles', models.BooleanField(default=False, help_text='Indicator of whether to compute cocycles.')),
                ('n_perm', models.IntegerField(blank=True, default=None, help_text='The number of points to subsample in\n            a “greedy permutation,” or a furthest point sampling of the points. These points will\n            be used in lieu of the full point cloud for a faster computation, at the expense of\n            some accuracy, which can be bounded as a maximum bottleneck distance to all diagrams\n            on the original point set', null=True)),
                ('entropy_normalized_graph', models.TextField(blank=True, null=True)),
                ('entropy_unnormalized_graph', models.TextField(blank=True, null=True)),
                ('dataset', models.ForeignKey(blank=True, default=None, help_text='Select the source dataset from the loaded datasets', null=True, on_delete=django.db.models.deletion.CASCADE, to='datasets.Dataset')),
                ('research', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='research.Research')),
            ],
            options={
                'verbose_name': 'filtration analysis',
                'verbose_name_plural': 'filtration analyses',
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='FiltrationWindow',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.PositiveIntegerField()),
                ('slug', models.SlugField(max_length=150)),
                ('creation_date', models.DateTimeField(auto_now_add=True)),
                ('start', models.PositiveIntegerField(blank=True, null=True)),
                ('end', models.PositiveIntegerField(blank=True, null=True)),
                ('result_matrix', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
                ('diagrams', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
                ('result_entropy_normalized', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
                ('result_entropy_unnormalized', django.contrib.postgres.fields.jsonb.JSONField(blank=True, null=True)),
                ('analysis', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='analysis.FiltrationAnalysis')),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='MapperAnalysis',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(help_text='Name this analysis', max_length=100)),
                ('slug', models.SlugField(max_length=110)),
                ('description', models.TextField(blank=True, help_text='Write a brief description of the analysis', max_length=500)),
                ('creation_date', models.DateTimeField(auto_now_add=True)),
                ('precomputed_distance_matrix_json', django.contrib.postgres.fields.jsonb.JSONField(default='"[]"')),
                ('window_size', models.PositiveIntegerField(blank=True, default=None, help_text="Leave window size blank to not use windows. Window parameter\n            is ignored when dealing with precomputed distance matrix. Always check\n            the dimensions of the dataset your are operating on and plan your windows\n            accordingly; eventual data that won't fit into the final window will be\n            discarded.", null=True)),
                ('window_overlap', models.PositiveIntegerField(default=0, help_text='How many columns of overlap to have in\n            consequent windows, if windows are being used. It must be at most 1\n            less than window size.')),
                ('distance_matrix_metric', models.CharField(blank=True, choices=[('braycurtis', 'Braycurtis'), ('canberra', 'Canberra'), ('chebyshev', 'Chebyshev'), ('cityblock', 'City block'), ('correlation', 'Correlation'), ('cosine', 'Cosine'), ('dice', 'Dice'), ('euclidean', 'Euclidean'), ('hamming', 'Hamming'), ('jaccard', 'Jaccard'), ('jensenshannon', 'Jensen Shannon'), ('kulsinski', 'Kulsinski'), ('mahalanobis', 'Mahalonobis'), ('matching', 'Matching'), ('minkowski', 'Minkowski'), ('rogerstanimoto', 'Rogers-Tanimoto'), ('russellrao', 'Russel Rao'), ('seuclidean', 'Seuclidean'), ('sokalmichener', 'Sojal-Michener'), ('sokalsneath', 'Sokal-Sneath'), ('sqeuclidean', 'Sqeuclidean'), ('yule', 'Yule')], help_text='If not using a precomputed matrix, choose the distance metric to use on the dataset.', max_length=20)),
                ('projection', models.CharField(choices=[('sum', 'Sum'), ('mean', 'Mean'), ('median', 'Median'), ('max', 'Max'), ('min', 'Min'), ('std', 'Std'), ('dist_mean', 'Dist_mean'), ('l2norm', 'L2norm'), ('knn_distance_n', 'knn_distance_n')], default='sum', help_text='Specify a projection/lens type.', max_length=50)),
                ('knn_n_value', models.PositiveIntegerField(blank=True, help_text='Specify the value of n in knn_distance_n', null=True)),
                ('scaler', models.CharField(choices=[('None', 'None'), ('MinMaxScaler', 'MinMaxScaler'), ('MaxAbsScaler', 'MaxAbsScaler'), ('RobustScaler', 'RobustScaler'), ('StandardScaler', 'StandardScaler')], default='MinMaxScaler', help_text='Scaler of the data applied after mapping. Use None for no scaling.', max_length=50)),
                ('use_original_data', models.BooleanField(default=False, help_text='If ticked, clustering is run on the original data,\n            else it will be run on the lower dimensional projection.')),
                ('clusterer', models.CharField(choices=[('k-means', 'K-Means'), ('affinity_propagation', 'Affinity propagation'), ('mean-shift', 'Mean-shift'), ('spectral_clustering', 'Spectral clustering'), ('agglomerative_clustering', 'StandardScaler'), ('DBSCAN(min_samples=1)', 'DBSCAN(min_samples=1)'), ('DBSCAN', 'DBSCAN'), ('gaussian_mixtures', 'Gaussian mixtures'), ('birch', 'Birch')], default='DBSCAN', help_text='Select the clustering algorithm.', max_length=50)),
                ('cover_n_cubes', models.PositiveIntegerField(default=10, help_text='Number of hypercubes along each dimension.\n            Sometimes referred to as resolution.')),
                ('cover_perc_overlap', models.FloatField(default=0.5, help_text='Amount of overlap between adjacent cubes calculated\n            only along 1 dimension.')),
                ('graph_nerve_min_intersection', models.IntegerField(default=1, help_text='Minimum intersection considered when\n            computing the nerve. An edge will be created only when the\n            intersection between two nodes is greater than or equal to\n            min_intersection')),
                ('precomputed', models.BooleanField(default=False, help_text='Tell Mapper whether the data that you are clustering on\n            is a precomputed distance matrix. If set to True, the assumption is that you are\n            also telling your clusterer that metric=’precomputed’ (which is an argument for\n            DBSCAN among others), which will then cause the clusterer to expect a square\n            distance matrix for each hypercube. precomputed=True will give a square matrix\n            to the clusterer to fit on for each hypercube.')),
                ('remove_duplicate_nodes', models.BooleanField(default=False, help_text='Removes duplicate nodes before edges are\n            determined. A node is considered to be duplicate if it has exactly\n            the same set of points as another node.')),
                ('dataset', models.ForeignKey(blank=True, default=None, help_text='Select the source dataset from the loaded datasets', null=True, on_delete=django.db.models.deletion.CASCADE, to='datasets.Dataset')),
                ('research', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='research.Research')),
            ],
            options={
                'verbose_name': 'mapper algorithm analysis',
                'verbose_name_plural': 'mapper algoritm analyses',
                'abstract': False,
            },
        ),
        migrations.CreateModel(
            name='MapperWindow',
            fields=[
                ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.PositiveIntegerField()),
                ('slug', models.SlugField(max_length=150)),
                ('creation_date', models.DateTimeField(auto_now_add=True)),
                ('start', models.PositiveIntegerField(blank=True, null=True)),
                ('end', models.PositiveIntegerField(blank=True, null=True)),
                ('graph', models.TextField(blank=True, null=True)),
                ('analysis', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='windows', related_query_name='window', to='analysis.MapperAnalysis')),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.AddField(
            model_name='diagram',
            name='window1',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='window1', to='analysis.FiltrationWindow'),
        ),
        migrations.AddField(
            model_name='diagram',
            name='window2',
            field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='window2', to='analysis.FiltrationWindow'),
        ),
        migrations.AddField(
            model_name='bottleneck',
            name='analysis',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='analysis.FiltrationAnalysis'),
        ),
        migrations.AddField(
            model_name='bottleneck',
            name='window',
            field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='analysis.FiltrationWindow'),
        ),
        migrations.AlterUniqueTogether(
            name='mapperanalysis',
            unique_together={('slug', 'research')},
        ),
        migrations.AlterUniqueTogether(
            name='filtrationanalysis',
            unique_together={('slug', 'research')},
        ),
    ]
# --- traits/util/tests/test_import_symbol.py (repo: Raniac/NEURO-LEARN, license: Apache-2.0) ---
""" Tests for the import manager. """
from traits.util.api import import_symbol
from traits.testing.unittest_tools import unittest


class TestImportSymbol(unittest.TestCase):
    """ Tests for the import manager. """

    def test_import_dotted_symbol(self):
        """ import dotted symbol """
        import tarfile

        symbol = import_symbol("tarfile.TarFile")
        self.assertEqual(symbol, tarfile.TarFile)
        return

    def test_import_nested_symbol(self):
        """ import nested symbol """
        import tarfile

        symbol = import_symbol("tarfile:TarFile.open")
        self.assertEqual(symbol, tarfile.TarFile.open)
        return

    def test_import_dotted_module(self):
        """ import dotted module """
        symbol = import_symbol("traits.util.import_symbol:import_symbol")
        self.assertEqual(symbol, import_symbol)
        return


if __name__ == "__main__":
    unittest.main()

#### EOF ######################################################################
# --- HanderCode/aidaiwangApp/aidaiwangApp/Login_aidaiwangApp.py (repo: mocne/PycharmProjects, license: MIT) ---
# -*- coding: utf-8 -*-
import time
import xlrd
import Register_aidaiwangApp
import LogOut_aidiawangApp


def start_to_login(filename):
    print(u'login')
    driver = Register_aidaiwangApp.driver
    driver.launch_app()
    time.sleep(3)
    # navigate to the "Mine" tab, by resource id first, then by text
    try:
        driver.find_element_by_id('cn.phaidai.loan:id/rb_mine')
        print('id')
    except:
        try:
            # u"我的" means "Mine"
            driver.find_element_by_android_uiautomator('new UiSelector().text(u"我的")')
        except:
            return 'can not jump to mine'
        else:
            driver.find_element_by_android_uiautomator('new UiSelector().text(u"我的")').click()
            print('text')
    else:
        driver.find_element_by_id('cn.phaidai.loan:id/rb_mine').click()
    # check login status via the login label, falling back to the avatar
    try:
        driver.find_element_by_id('cn.phaidai.loan:id/tv_click')
    except:
        try:
            driver.find_element_by_id('cn.phaidai.loan:id/iv_avatar')
        except:
            return 'can not check status'
        else:
            driver.find_element_by_id('cn.phaidai.loan:id/iv_avatar').click()
    else:
        usernameLabel = driver.find_element_by_id('cn.phaidai.loan:id/tv_click')
        loginfo = usernameLabel.text
        # keep logging out until the label reads u'立即登录' ("Log in now");
        # re-fetch the label each pass so the loop condition can change
        while loginfo != u'立即登录':
            LogOut_aidiawangApp.start_to_logout()
            usernameLabel = driver.find_element_by_id('cn.phaidai.loan:id/tv_click')
            loginfo = usernameLabel.text
        usernameLabel.click()
    currentAC = driver.current_activity
    print(currentAC)
    print(filename)
    # read the login credentials from the Excel test-data file
    userData = xlrd.open_workbook(r'%s' % filename)
    print('open user file success')
    userSheet = userData.sheet_by_name('login')
    loginName = str(userSheet.cell_value(1, 0))
    loginPassword = str(userSheet.cell_value(1, 1))
    print(loginName.split('.')[0], loginPassword)
    try:
        userNameLabel = driver.find_element_by_id('cn.phaidai.loan:id/et_login_name')
        userNameLabel.clear()
        userNameLabel.send_keys(loginName.split('.')[0])
    except:
        return 'can not input username : %s' % loginName.split('.')[0]
    driver.find_element_by_id('cn.phaidai.loan:id/et_login_password').send_keys(loginPassword)
    driver.find_element_by_id('cn.phaidai.loan:id/bt_login_into').click()
    # confirm login landed on the home screen (u"首页" means "Home")
    try:
        driver.find_element_by_android_uiautomator('new UiSelector().text(u"首页")').click()
    except:
        driver.find_element_by_id('cn.phaidai.loan:id/rb_home').click()
# --- tests/functional/test_uploads.py (repo: jounile/nollanet, license: MIT) ---
import io
import pytest
from requests import get
from urllib.parse import urljoin


def test_my_uploads_page(wait_for_api, login_user):
    """
    GIVEN a user has logged in (login_user)
    WHEN the '/my/uploads' page is navigated to (GET)
    THEN check the response is valid and page title is correct
    """
    request_session, api_url = wait_for_api
    response = request_session.get(urljoin(api_url, '/my/uploads'))
    assert response.status_code == 200
    assert '<h1>My uploads</h1>' in response.text


def test_valid_new_upload_page(wait_for_api, login_user):
    """
    GIVEN a user has logged in (login_user)
    WHEN the '/media/newupload' page is navigated to (GET)
    THEN check the response is valid and page title is correct
    """
    request_session, api_url = wait_for_api
    response = request_session.get(urljoin(api_url, '/media/newupload'))
    assert response.status_code == 200
    assert '<h1>New upload</h1>' in response.text


def test_invalid_new_upload_page(wait_for_api):
    """
    GIVEN a user has not logged in
    WHEN the '/media/newupload' page is navigated to (GET)
    THEN check the response is valid and page title is correct
    """
    request_session, api_url = wait_for_api
    response = request_session.get(urljoin(api_url, '/media/newupload'))
    assert response.status_code == 200
    assert '<div class="flash">Please login first</div>' in response.text


def test_new_upload(wait_for_api, login_user):
    """
    GIVEN a user has logged in (login_user)
    WHEN the '/media/newupload' page is posted an example image (POST)
    THEN check the response is valid and the page title is correct
    """
    example_file = open("./app/static/gfx/example.png", "rb")
    files = {'file': example_file}
    request_session, api_url = wait_for_api
    response = request_session.post(urljoin(api_url, '/media/newupload'), files=files, allow_redirects=True)
    assert response.status_code == 200
    assert '<h1>My uploads</h1>' in response.text


#def test_remove_upload(wait_for_api, login_user):
#    """
#    GIVEN a user has logged in (login_user)
#    WHEN the '/blob/delete' page is posted (POST)
#    THEN check the response is valid and the user is logged in
#    """
#    valid_blob = dict(blob_path='images/*example.png', upload_id=2)
#    request_session, api_url = wait_for_api
#    response = request_session.post(urljoin(api_url, '/blob/delete'), data=valid_blob, allow_redirects=True)
#    assert response.status_code == 200
#    assert 'example.png was deleted successfully' in response.text
# --- nicegui/elements/row.py (repo: florianwittkamp/nicegui, license: MIT) ---
import justpy as jp
from .group import Group


class Row(Group):

    def __init__(self):
        '''Row Element

        Provides a container which arranges its child in a row.
        '''
        view = jp.QDiv(classes='row items-start', style='gap: 1em', delete_flag=False)
        super().__init__(view)
# --- tests/guinea-pigs/unittest/expected_failure.py (repo: Tirzono/teamcity-messages, license: Apache-2.0) ---
# coding=utf-8
import sys

from teamcity.unittestpy import TeamcityTestRunner

if sys.version_info < (2, 7):
    from unittest2 import main, TestCase, expectedFailure
else:
    from unittest import main, TestCase, expectedFailure


class TestSkip(TestCase):
    def test_expected_failure(self):
        self.fail("this should happen unfortunately")
    test_expected_failure = expectedFailure(test_expected_failure)


main(testRunner=TeamcityTestRunner)
# --- wrappers/Python/sbmlsolver/__init__.py (repo: gitter-badger/sbmlsolver, license: Apache-2.0) ---
"""
The LibRoadRunner SBML Simulation Engine, (c) 2009-2014 Andy Somogyi and Herbert Sauro
LibRoadRunner is an SBML JIT compiler and simulation engine with a variety of analysis
functions. LibRoadRunner is a self-contained library which is designed to be integrated
into existing simulation platforms or may be used as a stand-alone simulation and analysis
package.
"""
from sbmlsolver import *
__version__ = getVersionStr()
| 32.692308 | 87 | 0.807059 | 60 | 425 | 5.65 | 0.75 | 0.094395 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.022284 | 0.155294 | 425 | 12 | 88 | 35.416667 | 0.922006 | 0.844706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
58f0dabb24cb5744c956fc257b97c051c5d3142b | 674 | py | Python | scronsole/widgets/main_screen.py | bastianh/screeps_console_mod | e093cc1e071fae5bdf106674b97e71902fbbb6ff | [
"MIT"
] | 2 | 2017-10-08T19:39:27.000Z | 2017-10-08T19:51:18.000Z | scronsole/widgets/main_screen.py | bastianh/screeps_console_mod | e093cc1e071fae5bdf106674b97e71902fbbb6ff | [
"MIT"
] | null | null | null | scronsole/widgets/main_screen.py | bastianh/screeps_console_mod | e093cc1e071fae5bdf106674b97e71902fbbb6ff | [
"MIT"
] | null | null | null | import urwid
from scronsole.config_manager import ConfigManager
from scronsole.plugin_manager import PluginManager
from scronsole.widgets.main_menu import MainMenu
from scronsole.widgets.server_screen import ServerScreen
class MainScreen(urwid.WidgetPlaceholder):
def __init__(self):
super().__init__(urwid.SolidFill(u'/'))
self.config = ConfigManager()
self.show_main_menu()
self.plugins = PluginManager(self)
self.plugins.load_plugins()
def show_server_screen(self, server_data):
self.original_widget = ServerScreen(self, server_data)
def show_main_menu(self):
self.original_widget = MainMenu(self)
| 30.636364 | 62 | 0.746291 | 79 | 674 | 6.075949 | 0.392405 | 0.108333 | 0.083333 | 0.066667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.172107 | 674 | 21 | 63 | 32.095238 | 0.860215 | 0 | 0 | 0 | 0 | 0 | 0.001484 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1875 | false | 0 | 0.3125 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
58f1e5bfcc6007b51ace335dfbea68c9b539583f | 436 | py | Python | sql/language.py | skylarkgit/sql2java | befd55180969b0ec68e242991c3260272d755cc9 | [
"MIT"
] | 2 | 2019-10-23T08:27:30.000Z | 2019-10-23T09:58:45.000Z | sql/language.py | skylarkgit/sql2java | befd55180969b0ec68e242991c3260272d755cc9 | [
"MIT"
] | null | null | null | sql/language.py | skylarkgit/sql2java | befd55180969b0ec68e242991c3260272d755cc9 | [
"MIT"
] | null | null | null | import re
from csv import reader
def splitEscaped(str, by, escapeChar):
infile = [str]
return reader(infile, delimiter=by, quotechar=escapeChar)
def removeComments(text):
p = r'/\*[^*]*\*+([^/*][^*]*\*+)*/|("(\\.|[^"\\])*"|\'(\\.|[^\'\\])*\'|.[^/"\'\\]*)'
return ''.join(m.group(2) for m in re.finditer(p, text, re.M|re.S) if m.group(2))
def escapeAnnotations(text):
return re.sub(r'(/\*@)(.*)(\*/)',r'@\2',text)
| 31.142857 | 88 | 0.53211 | 55 | 436 | 4.218182 | 0.527273 | 0.051724 | 0.060345 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.007958 | 0.135321 | 436 | 13 | 89 | 33.538462 | 0.607427 | 0 | 0 | 0 | 0 | 0 | 0.174312 | 0.107798 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.2 | 0.1 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
58f9aedfba7b25435acbe41455b6f6873bd36f40 | 2,768 | py | Python | tests/io/v3/base/test_csv_iterator.py | alpesh-te/pyTenable | 4b5381a7757561f7ac1e79c2e2679356dd533540 | [
"MIT"
] | null | null | null | tests/io/v3/base/test_csv_iterator.py | alpesh-te/pyTenable | 4b5381a7757561f7ac1e79c2e2679356dd533540 | [
"MIT"
] | 25 | 2021-11-16T18:41:36.000Z | 2022-03-25T05:43:31.000Z | tests/io/v3/base/test_csv_iterator.py | alpesh-te/pyTenable | 4b5381a7757561f7ac1e79c2e2679356dd533540 | [
"MIT"
] | 2 | 2022-03-02T12:24:40.000Z | 2022-03-29T05:12:04.000Z | '''
Testing the CSV iterators
'''
import responses
from tenable.io.v3.base.iterators.explore_iterator import CSVChunkIterator
USERS_BASE_URL = r'https://cloud.tenable.com/api/v3/assets/search'
CSV_TEXT = (
'created,display_ipv4_address,first_observed,id,'
'ipv4_addresses,ipv6_addresses,is_deleted,is_licensed,'
'is_public,last_observed,name,network.id,network.name,'
'observation_sources,sources,types,updated\n'
'2021-11-24T13:43:56.709Z,192.12.13.7,2021-11-24T13:43:56.442Z,'
'"0142df77-dbc4-4706-8456-b756c06ee8a2",192.12.13.7,,false,'
'false,true,2021-11-24T13:43:56.442Z,192.12.13.7,'
'"00000000-0000-0000-0000-000000000000",Default,'
'"test_v3;2021-11-24T13:43:56.442Z;2021-11-24T13:43:56.442Z",'
'test_v3,host,2021-11-24T13:43:56.709Z\n'
)
CSV_TEXT_2 = (
'created,display_ipv4_address,first_observed,id,ipv4_addresses,'
'ipv6_addresses,is_deleted,is_licensed,is_public,last_observed,'
'name,network.id,network.name,observation_sources,sources,'
'types,updated\ncreated,display_ipv4_address,first_observed,id,'
'ipv4_addresses,ipv6_addresses,is_deleted,is_licensed,'
'is_public,last_observed,name,network.id,network.name,'
'observation_sources,sources,types,updated\n'
'2021-11-24T13:43:56.709Z,192.12.13.7,2021-11-24T13:43:56.442Z,'
'"0142df77-dbc4-4706-8456-b756c06ee8a2",192.12.13.7,,'
'false,false,true,2021-11-24T13:43:56.442Z,192.12.13.7,'
'"00000000-0000-0000-0000-000000000000",Default,'
'"test_v3;2021-11-24T13:43:56.442Z;2021-11-24T13:43:56.442Z",'
'test_v3,host,2021-11-24T13:43:56.709Z\n'
)
CSV_HEADERS = {
'Date': 'Wed, 08 Dec 2021 04:42:28 GMT',
'Content-Type': 'text/csv;charset=UTF-8',
'Content-Length': '508',
'Connection': 'keep-alive',
'Set-Cookie': 'nginx-cloud-site-id=qa-develop; path=/; '
'HttpOnly; SameSite=Strict; Secure',
'X-Request-Uuid': '4d43db5bac4decd79fc198e06a8113bd',
'X-Continuation-Token': 'fasd563456fghfgfFGHFGHRT',
'X-Content-Type-Options': 'nosniff',
'X-Frame-Options': 'DENY',
'X-Xss-Protection': '1; mode=block',
'Cache-Control': 'no-store',
'Strict-Transport-Security': 'max-age=63072000; includeSubDomains',
'X-Gateway-Site-ID': 'nginx-router-jm8uw-us-east-1-eng',
'Pragma': 'no-cache',
'Expect-CT': 'enforce, max-age=86400',
'X-Path-Handler': 'tenable-io',
}
@responses.activate
def test_csv_iterator(api):
responses.add(
method=responses.POST,
url=USERS_BASE_URL,
body=CSV_TEXT,
headers=CSV_HEADERS
)
csv_iterator = CSVChunkIterator(
api=api,
_path='api/v3/assets/search',
_payload={}
)
assert next(csv_iterator) == CSV_TEXT
assert next(csv_iterator) == CSV_TEXT_2
| 35.948052 | 74 | 0.695087 | 400 | 2,768 | 4.6725 | 0.37 | 0.038523 | 0.070626 | 0.083467 | 0.553772 | 0.553772 | 0.52381 | 0.52381 | 0.52381 | 0.52381 | 0 | 0.18 | 0.132948 | 2,768 | 76 | 75 | 36.421053 | 0.59875 | 0.009032 | 0 | 0.21875 | 0 | 0.109375 | 0.670932 | 0.513346 | 0 | 0 | 0 | 0 | 0.03125 | 1 | 0.015625 | false | 0 | 0.03125 | 0 | 0.046875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
4507d40889bdeb2efc06f9fd94721d09e699f4c0 | 159 | py | Python | Asignacion2/App/main.py | HarambeGeek/uip-iq-pc3 | 6e9a0678a90c4bfd7499dfb5c71c9a3ea9effe1e | [
"Apache-2.0"
] | null | null | null | Asignacion2/App/main.py | HarambeGeek/uip-iq-pc3 | 6e9a0678a90c4bfd7499dfb5c71c9a3ea9effe1e | [
"Apache-2.0"
] | null | null | null | Asignacion2/App/main.py | HarambeGeek/uip-iq-pc3 | 6e9a0678a90c4bfd7499dfb5c71c9a3ea9effe1e | [
"Apache-2.0"
] | null | null | null | from App.numeros import numeros
if __name__ == "__main__":
    x = int(input("Enter the number you want to evaluate: \n"))
pi = numeros(x)
pi.parImpar() | 26.5 | 61 | 0.660377 | 23 | 159 | 4.217391 | 0.826087 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.207547 | 159 | 6 | 62 | 26.5 | 0.769841 | 0 | 0 | 0 | 0 | 0 | 0.29375 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
451a01d6bf880434d082fec4bb6d94642deb72ee | 2,195 | py | Python | moex/tests/test_service.py | ChanTerelLy/broker-account-analist | a723c83fe9a924905eb0754b4acb1231b31f9c87 | [
"MIT"
] | null | null | null | moex/tests/test_service.py | ChanTerelLy/broker-account-analist | a723c83fe9a924905eb0754b4acb1231b31f9c87 | [
"MIT"
] | 11 | 2021-02-21T19:39:41.000Z | 2021-06-13T16:29:47.000Z | moex/tests/test_service.py | ChanTerelLy/broker-account-analist | a723c83fe9a924905eb0754b4acb1231b31f9c87 | [
"MIT"
] | 2 | 2021-11-16T16:31:37.000Z | 2022-02-11T02:55:37.000Z | import asyncio
import unittest
from moex.service import Cbr, Moex
class CbrTest(unittest.TestCase):
def setUp(self) -> None:
self.cbr = Cbr('01.01.2021')
def test_usd(self):
self.assertEqual(self.cbr.USD, 73.88)
def test_euro(self):
self.assertEqual(self.cbr.EUR, 90.79)
class MoexTest(unittest.TestCase):
def setUp(self) -> None:
self.loop = asyncio.get_event_loop()
def test_not_contain_empty_list(self):
data = self.loop.run_until_complete(Moex().get_shares('etf', ['RU000A100JH0', 'RU0009029540', 'IE00BD3QFB18']))
for d in data:
self.assertTrue(d)
def test_bonds_tsqb(self):
data = self.loop.run_until_complete(Moex().get_shares('bonds', ['RU000A100JH0']))
self.assertEqual(data[0][0], 'RU000A100JH0')
self.assertEqual(data[0][1], 'RU000A100JH0')
def test_bonds_tqir(self):
data = self.loop.run_until_complete(Moex().get_shares('bonds', ['RU000A1015P6']))
self.assertEqual(data[0][0], 'RU000A1015P6')
self.assertEqual(data[0][1], 'RU000A1015P6')
def test_etf(self):
data = self.loop.run_until_complete(Moex().get_shares('etf', ['IE00BD3QFB18', 'US0231351067']))
self.assertEqual(data[0][0], 'IE00BD3QFB18')
self.assertEqual(data[0][1], 'FXCN')
def test_foreignshares(self):
data = self.loop.run_until_complete(Moex().get_shares('foreignshares', ['US0231351067']))
self.assertEqual(data[0][0], 'US0231351067')
self.assertEqual(data[0][1], 'AMZN-RM')
def test_shares(self):
data = self.loop.run_until_complete(Moex().get_shares('shares', ['RU0009029540']))
self.assertEqual(data[0][0], 'RU0009029540')
self.assertEqual(data[0][1], 'SBER')
def test_coupons_tsqb(self):
data = self.loop.run_until_complete(Moex().get_coupon_by_isins(['RU000A100JH0']))
self.assertEqual(data[0][0]['isin'], 'RU000A100JH0')
def test_coupons_tqir(self):
data = self.loop.run_until_complete(Moex().get_coupon_by_isins(['RU000A1015P6']))
self.assertEqual(data[0][0]['isin'], 'RU000A1015P6')
if __name__ == '__main__':
unittest.main() | 34.296875 | 119 | 0.655125 | 279 | 2,195 | 4.956989 | 0.236559 | 0.151844 | 0.164859 | 0.173536 | 0.666667 | 0.494577 | 0.345625 | 0.293565 | 0.293565 | 0.293565 | 0 | 0.109749 | 0.182232 | 2,195 | 64 | 120 | 34.296875 | 0.660724 | 0 | 0 | 0.044444 | 0 | 0 | 0.143898 | 0 | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0.266667 | false | 0 | 0.066667 | 0 | 0.377778 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
45284a1d25fe21c81004bcc320ecfac7a3fe05f4 | 907 | py | Python | src/dask_awkward/tests/test_utils.py | douglasdavis/dask-awkward | e8829d32ed080d643c7e4242036ce64aee60eda6 | [
"BSD-3-Clause"
] | 21 | 2021-09-09T19:32:30.000Z | 2022-03-01T15:42:06.000Z | src/dask_awkward/tests/test_utils.py | douglasdavis/dask-awkward | e8829d32ed080d643c7e4242036ce64aee60eda6 | [
"BSD-3-Clause"
] | 14 | 2021-09-23T16:54:10.000Z | 2022-03-23T19:24:53.000Z | src/dask_awkward/tests/test_utils.py | douglasdavis/dask-awkward | e8829d32ed080d643c7e4242036ce64aee60eda6 | [
"BSD-3-Clause"
] | 3 | 2021-09-09T19:32:32.000Z | 2021-11-18T17:27:35.000Z | from __future__ import annotations
from ..utils import normalize_single_outer_inner_index
def test_normalize_single_outer_inner_index() -> None:
divisions = (0, 12, 14, 20, 23, 24)
indices = [0, 1, 2, 8, 12, 13, 14, 15, 17, 20, 21, 22]
results = [
(0, 0),
(0, 1),
(0, 2),
(0, 8),
(1, 0),
(1, 1),
(2, 0),
(2, 1),
(2, 3),
(3, 0),
(3, 1),
(3, 2),
]
for i, r in zip(indices, results):
res = normalize_single_outer_inner_index(divisions, i)
assert r == res
divisions = (0, 12) # type: ignore
indices = [0, 2, 3, 6, 8, 11]
results = [
(0, 0),
(0, 2),
(0, 3),
(0, 6),
(0, 8),
(0, 11),
]
for i, r in zip(indices, results):
res = normalize_single_outer_inner_index(divisions, i)
assert r == res
| 22.675 | 62 | 0.46527 | 126 | 907 | 3.18254 | 0.333333 | 0.149626 | 0.199501 | 0.249377 | 0.533666 | 0.38404 | 0.38404 | 0.38404 | 0.38404 | 0.38404 | 0 | 0.138053 | 0.377067 | 907 | 39 | 63 | 23.25641 | 0.571681 | 0.01323 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.057143 | 1 | 0.028571 | false | 0 | 0.057143 | 0 | 0.085714 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
45288cac480034ff3c670253791c5dd9e04dcb61 | 16,413 | py | Python | core/client_socket.py | schalekamp/ibapipy | a9e02d604d9f4a2ad87e78089654b29305aa110d | [
"Apache-2.0"
] | 1 | 2020-08-13T05:45:48.000Z | 2020-08-13T05:45:48.000Z | core/client_socket.py | schalekamp/ibapipy | a9e02d604d9f4a2ad87e78089654b29305aa110d | [
"Apache-2.0"
] | null | null | null | core/client_socket.py | schalekamp/ibapipy | a9e02d604d9f4a2ad87e78089654b29305aa110d | [
"Apache-2.0"
] | null | null | null | """Implements the EClientSocket interface for the Interactive Brokers API."""
import threading
import ibapipy.config as config
from ibapipy.core.network_handler import NetworkHandler
class ClientSocket:
"""Provides methods for sending requests to TWS."""
def __init__(self):
"""Initialize a new instance of a ClientSocket."""
self.__listener_thread__ = None
self.__network_handler__ = NetworkHandler()
self.server_version = 0
self.tws_connection_time = ''
self.is_connected = False
def __send__(self, *args):
"""Hand off each element in args to the NetworkHandler for sending over
the network.
*args -- items to send
"""
for item in args:
self.__network_handler__.socket_out_queue.put(item, block=False)
def account_download_end(self, account_name):
pass
def cancel_calculate_implied_volatility(self, req_id):
raise NotImplementedError()
def calculate_option_price(self, req_id, contract, volatility,
under_price):
raise NotImplementedError()
def calculate_implied_volatility(self, req_id, contract, price,
under_price):
raise NotImplementedError()
def commission_report(self, report):
pass
def contract_details(self, req_id, contract):
pass
def contract_details_end(self, req_id):
pass
def cancel_calculate_option_price(self, req_id):
raise NotImplementedError()
def cancel_fundamental_data(self, req_id):
raise NotImplementedError()
def cancel_historical_data(self, req_id):
version = 1
self.__send__(config.CANCEL_HISTORICAL_DATA, version, req_id)
def cancel_mkt_data(self, req_id):
version = 1
self.__send__(config.CANCEL_MKT_DATA, version, req_id)
def cancel_mkt_depth(self, req_id):
raise NotImplementedError()
def cancel_news_bulletins(self):
raise NotImplementedError()
def cancel_order(self, req_id):
version = 1
self.__send__(config.CANCEL_ORDER, version, req_id)
def cancel_real_time_bars(self, req_id):
raise NotImplementedError()
def cancel_scanner_subscription(self, req_id):
raise NotImplementedError()
def connect(self, host=config.HOST, port=config.PORT,
client_id=config.CLIENT_ID):
"""Connect to the remote TWS.
Keyword arguments:
host -- host name or IP address of the TWS machine
port -- port number on the TWS machine
client_id -- number used to identify this client connection
"""
if self.is_connected:
return
# Connect
results = self.__network_handler__.connect(host, port, client_id)
self.server_version, self.tws_connection_time = results
self.is_connected = True
# Listen for incoming messages
self.__listener_thread__ = threading.Thread(
target=listen, args=(self, self.__network_handler__.message_queue))
self.__listener_thread__.start()
def disconnect(self):
"""Disconnect from the remote TWS."""
self.__network_handler__.disconnect()
self.is_connected = False
self.server_version = 0
self.tws_connection_time = ''
def error(self, req_id, code, message):
pass
def exercise_options(self, req_id, contract, action, quantity, account,
override):
raise NotImplementedError()
def exec_details(self, req_id, contract, execution):
pass
def exec_details_end(self, req_id):
pass
def historical_data(self, req_id, date, open, high, low, close, volume,
bar_count, wap, has_gaps):
pass
def managed_accounts(self, accounts):
pass
def next_valid_id(self, req_id):
pass
def open_order(self, req_id, contract, order):
pass
def open_order_end(self):
pass
def order_status(self, req_id, status, filled, remaining, avg_fill_price,
perm_id, parent_id, last_fill_price, client_id, why_held):
pass
def place_order(self, req_id, contract, order):
version = 35
# Intro and request ID
self.__send__(config.PLACE_ORDER, version, req_id)
# Contract fields
self.__send__(contract.con_id, contract.symbol, contract.sec_type,
contract.expiry, contract.strike, contract.right,
contract.multiplier, contract.exchange,
contract.primary_exch, contract.currency,
contract.local_symbol, contract.sec_id_type,
contract.sec_id)
# Main order fields
self.__send__(order.action, order.total_quantity, order.order_type,
order.lmt_price, order.aux_price)
# Extended order fields
self.__send__(order.tif, order.oca_group, order.account,
order.open_close, order.origin, order.order_ref,
order.transmit, order.parent_id, order.block_order,
order.sweep_to_fill, order.display_size,
order.trigger_method, order.outside_rth, order.hidden)
# Send combo legs for bag requests
if config.BAG_SEC_TYPE == contract.sec_type.upper():
raise NotImplementedError('Bag type not supported yet.')
self.__send__('') # deprecated shares_allocation field
        # Everything else (broken into quasi-readable chunks)
self.__send__(order.discretionary_amt, order.good_after_time,
order.good_till_date, order.fa_group, order.fa_method,
order.fa_percentage, order.fa_profile,
order.short_sale_slot, order.designated_location)
self.__send__(order.exempt_code, order.oca_type, order.rule_80a,
order.settling_firm, order.all_or_none,
check(order.min_qty), check(order.percent_offset),
order.etrade_only, order.firm_quote_only,
check(order.nbbo_price_cap))
self.__send__(check(order.auction_strategy),
check(order.starting_price),
check(order.stock_ref_price), check(order.delta),
check(order.stock_range_lower),
check(order.stock_range_upper),
order.override_percentage_constraints,
check(order.volatility), check(order.volatility_type),
order.delta_neutral_order_type,
check(order.delta_neutral_aux_price))
if len(order.delta_neutral_order_type) > 0:
self.__send__(order.delta_neutral_con_id,
order.delta_neutral_settling_firm,
order.delta_neutral_clearing_account,
order.delta_neutral_clearing_intent)
self.__send__(order.continuous_update,
check(order.reference_price_type),
check(order.trail_stop_price),
check(order.scale_init_level_size),
check(order.scale_subs_level_size),
check(order.scale_price_increment), order.hedge_type)
if len(order.hedge_type) > 0:
self.__send__(order.hedge_param)
self.__send__(order.opt_out_smart_routing, order.clearing_account,
order.clearing_intent, order.not_held)
if contract.under_comp is not None:
raise NotImplementedError('Under comp not supported yet.')
else:
self.__send__(False)
self.__send__(order.algo_strategy)
if len(order.algo_strategy) > 0:
raise NotImplementedError('Algo strategy not supported yet.')
self.__send__(order.what_if)
def replace_fa(self, fa_data_type, xml):
raise NotImplementedError()
def req_account_updates(self, subscribe, acct_code):
version = 2
self.__send__(config.REQ_ACCOUNT_DATA, version, subscribe, acct_code)
def req_all_open_orders(self):
version = 1
self.__send__(config.REQ_ALL_OPEN_ORDERS, version)
def req_auto_open_orders(self, auto_bind):
version = 1
self.__send__(config.REQ_AUTO_OPEN_ORDERS, version, auto_bind)
def req_contract_details(self, req_id, contract):
version = 6
# Contract data message
self.__send__(config.REQ_CONTRACT_DATA, version, req_id)
# Contract fields
self.__send__(contract.con_id, contract.symbol, contract.sec_type,
contract.expiry, contract.strike, contract.right,
contract.multiplier, contract.exchange,
contract.currency, contract.local_symbol,
contract.include_expired, contract.sec_id_type,
contract.sec_id)
def req_current_time(self):
"""Returns the current system time on the server side via the
current_time() wrapper method.
"""
version = 1
self.__send__(config.REQ_CURRENT_TIME, version)
def req_executions(self, req_id, exec_filter):
version = 3
# Execution message
self.__send__(config.REQ_EXECUTIONS, version, req_id)
# Execution report filter
self.__send__(exec_filter.client_id, exec_filter.acct_code,
exec_filter.time, exec_filter.symbol,
exec_filter.sec_type, exec_filter.exchange,
exec_filter.side)
def req_fundamental_data(self, req_id, contract, report_type):
raise NotImplementedError()
def req_historical_data(self, req_id, contract, end_date_time,
duration_str, bar_size_setting, what_to_show,
use_rth, format_date):
version = 4
self.__send__(config.REQ_HISTORICAL_DATA, version, req_id)
# Contract fields
self.__send__(contract.symbol, contract.sec_type, contract.expiry,
contract.strike, contract.right, contract.multiplier,
contract.exchange, contract.primary_exch,
contract.currency, contract.local_symbol,
contract.include_expired)
# Other stuff
self.__send__(end_date_time, bar_size_setting, duration_str, use_rth,
what_to_show, format_date)
# Combo legs for bag requests
if config.BAG_SEC_TYPE == contract.sec_type.upper():
raise NotImplementedError('Bag type not supported yet.')
def req_ids(self, num_ids):
version = 1
self.__send__(config.REQ_IDS, version, num_ids)
def req_managed_accts(self):
version = 1
self.__send__(config.REQ_MANAGED_ACCTS, version)
def req_market_data_type(self, type):
raise NotImplementedError()
def req_mkt_data(self, req_id, contract, generic_ticklist='',
snapshot=False):
"""Return market data via the tick_price(), tick_size(),
tick_option_computation(), tick_generic(), tick_string() and
tick_EFP() wrapper methods.
Keyword arguments:
req_id -- unique request ID
contract -- ibapi.contract.Contract object
generic_ticklist -- comma delimited list of generic tick types
(default: '')
snapshot -- True to return a single snapshot of market data
and have the market data subscription cancel;
False, otherwise (default: False)
"""
version = 9
# Intro and request ID
self.__send__(config.REQ_MKT_DATA, version, req_id)
# Contract fields
self.__send__(contract.con_id, contract.symbol, contract.sec_type,
contract.expiry, contract.strike, contract.right,
contract.multiplier, contract.exchange,
contract.primary_exch, contract.currency,
contract.local_symbol)
if config.BAG_SEC_TYPE == contract.sec_type:
raise NotImplementedError('Bag type not supported yet.')
if contract.under_type is not None:
raise NotImplementedError('Under comp not supported yet.')
else:
self.__send__(False)
# Remaining parameters
self.__send__(generic_ticklist, snapshot)
def req_mkt_depth(self, req_id, contract, num_rows):
raise NotImplementedError()
def req_news_bulletins(self, all_msgs):
raise NotImplementedError()
def req_open_orders(self):
version = 1
self.__send__(config.REQ_OPEN_ORDERS, version)
def req_real_time_bars(self, req_id, contract, bar_size, what_to_show,
use_rth):
raise NotImplementedError()
def req_scanner_parameters(self):
raise NotImplementedError()
def req_scanner_subscription(self, req_id, subscription):
raise NotImplementedError()
def request_fa(self, fa_data_type):
raise NotImplementedError()
def set_server_log_level(self, log_level=2):
"""Set the logging level of the server.
Keyword arguments:
log_level -- level of log entry detail used by the server (TWS)
when processing API requests. Valid values include:
1 = SYSTEM; 2 = ERROR; 3 = WARNING; 4 = INFORMATION;
5 = DETAIL (default: 2)
"""
version = 1
self.__send__(config.SET_SERVER_LOGLEVEL, version, log_level)
def tick_price(self, req_id, tick_type, price, can_auto_execute):
pass
def tick_size(self, req_id, tick_type, size):
pass
def update_account_time(self, timestamp):
pass
def update_account_value(self, key, value, currency, account_name):
pass
def update_portfolio(self, contract, position, market_price, market_value,
average_cost, unrealized_pnl, realized_pnl,
account_name):
pass
def update_unknown(self, *args):
"""Callback for updated known data that does not match any existing
callbacks.
"""
pass
def check(value):
"""Check to see if the specified value is equal to JAVA_INT_MAX or
JAVA_DOUBLE_MAX and return None if such is the case; otherwise return
'value'.
Interactive Brokers will set certain integers and floats to be their
maximum possible value in Java.
    This is used as a sentinel value that should be replaced with an EOL when
transmitting. Here, we check the value and, if it is a max, return None
which the codec will interpret as an EOL.
Keyword arguments:
value -- integer or floating-point value to check
"""
if is_java_int_max(value) or is_java_double_max(value):
return None
else:
return value
def is_java_double_max(number):
"""Returns True if the specified number is equal to the maximum value of
a Double in Java; False, otherwise.
Keyword arguments:
number -- number to check
"""
return type(number) == float and number == config.JAVA_DOUBLE_MAX
def is_java_int_max(number):
"""Returns True if the specified number is equal to the maximum value of
an Integer in Java; False, otherwise.
Keyword arguments:
number -- number to check
"""
return type(number) == int and number == config.JAVA_INT_MAX
def listen(client, in_queue):
"""Listen to messages in the specified incoming queue and call the
appropriate methods in the client.
Keyword arguments:
client -- client
in_queue -- incoming message queue
"""
# Loop until we receive a stop message in the incoming queue
while True:
method, parms = in_queue.get()
if method == 'stop':
return
elif method is None:
continue
elif hasattr(client, method):
getattr(client, method)(*parms)
else:
parms = list(parms)
parms.insert(0, method)
getattr(client, 'update_unknown')(*parms)
| 36.718121 | 79 | 0.628709 | 1,915 | 16,413 | 5.063708 | 0.21201 | 0.021141 | 0.0297 | 0.022791 | 0.37558 | 0.27524 | 0.233165 | 0.185212 | 0.167268 | 0.15149 | 0 | 0.002772 | 0.296716 | 16,413 | 446 | 80 | 36.800448 | 0.837304 | 0.180893 | 0 | 0.315018 | 0 | 0 | 0.014527 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.227106 | false | 0.069597 | 0.010989 | 0 | 0.263736 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
4529524b72ee8b6f655a486a5542d22fd69041be | 2,234 | py | Python | common.py | shawnau/DataScienceBowl2018 | 3c6f0f26dd86b71aad55fca52314e6432d0b3a82 | [
"MIT"
] | null | null | null | common.py | shawnau/DataScienceBowl2018 | 3c6f0f26dd86b71aad55fca52314e6432d0b3a82 | [
"MIT"
] | null | null | null | common.py | shawnau/DataScienceBowl2018 | 3c6f0f26dd86b71aad55fca52314e6432d0b3a82 | [
"MIT"
] | null | null | null | import os
from datetime import datetime
PROJECT_PATH = os.path.dirname(os.path.realpath(__file__))
IDENTIFIER = datetime.now().strftime('%Y-%m-%d_%H-%M-%S')
#numerical libs
import numpy as np
import random
import matplotlib
matplotlib.use('TkAgg')
import cv2
# torch libs
import torch
from torch.utils.data.sampler import *
import torchvision.transforms as transforms
from torch.utils.data.dataset import Dataset
from torch.utils.data import DataLoader
from torch.nn import init
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd import Variable
import torch.optim as optim
from torch.nn.parallel.data_parallel import data_parallel
# std libs
import collections
import copy
import numbers
import math
import inspect
import shutil
from timeit import default_timer as timer
import csv
import pandas as pd
import pickle
import glob
import sys
from distutils.dir_util import copy_tree
import time
import matplotlib.pyplot as plt
import skimage
import skimage.color
import skimage.morphology
from scipy import ndimage
print('@%s: ' % os.path.basename(__file__))
if 1:
SEED = int(time.time())
random.seed(SEED)
np.random.seed(SEED)
torch.manual_seed(SEED)
torch.cuda.manual_seed_all(SEED)
print ('\tset random seed')
print ('\t\tSEED=%d'%SEED)
if 1:
# uses the inbuilt cudnn auto-tuner to find the fastest convolution algorithms
torch.backends.cudnn.benchmark = True
torch.backends.cudnn.enabled = True
print('\tset cuda environment')
print('\t\ttorch.__version__ =', torch.__version__)
print('\t\ttorch.version.cuda =', torch.version.cuda)
print('\t\ttorch.backends.cudnn.version() =', torch.backends.cudnn.version())
try:
print('\t\tos[\'CUDA_VISIBLE_DEVICES\'] =',os.environ['CUDA_VISIBLE_DEVICES'])
NUM_CUDA_DEVICES = len(os.environ['CUDA_VISIBLE_DEVICES'].split(','))
except Exception:
print('\t\tos[\'CUDA_VISIBLE_DEVICES\'] =','None')
NUM_CUDA_DEVICES = 1
print('\t\ttorch.cuda.device_count() =', torch.cuda.device_count())
print('\t\ttorch.cuda.current_device() =', torch.cuda.current_device())
print('')
| 26.282353 | 90 | 0.723814 | 314 | 2,234 | 5.015924 | 0.366242 | 0.030476 | 0.038095 | 0.045714 | 0.107937 | 0.073651 | 0 | 0 | 0 | 0 | 0 | 0.002138 | 0.162489 | 2,234 | 84 | 91 | 26.595238 | 0.839658 | 0.049687 | 0 | 0.061538 | 0 | 0 | 0.15715 | 0.064653 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.569231 | 0 | 0.569231 | 0.184615 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
452ce291eab1e58321278df273620d4a3c795783 | 678 | py | Python | zombieclusters.py | tnkteja/notthisagain | 85e2b2cbea1298a052986e9dfe5e73d022b537f3 | [
"MIT"
] | null | null | null | zombieclusters.py | tnkteja/notthisagain | 85e2b2cbea1298a052986e9dfe5e73d022b537f3 | [
"MIT"
] | null | null | null | zombieclusters.py | tnkteja/notthisagain | 85e2b2cbea1298a052986e9dfe5e73d022b537f3 | [
"MIT"
] | null | null | null | class cluster(object):
def __init__(self,members=[]):
self.s=set(members)
	def merge(self, other):
		self.s.update(other.s)  # union in place; set.union returns a new set and leaves self.s unchanged
		return self
class clusterManager(object):
def __init__(self,clusters={}):
self.c=clusters
	def merge(self, i, j):
		old = self.c[j]
		merged = self.c[i].merge(old)
		# repoint every key that referenced j's old cluster; otherwise
		# count() still sees the stale cluster object as a separate cluster
		for k, v in self.c.items():
			if v is old:
				self.c[k] = merged
def count(self):
return len(set(self.c.values()))
def zombieCluster(zombies):
	cm=clusterManager(clusters={i:cluster(members=[i]) for i in range(len(zombies))})
for i,row in enumerate(zombies):
for j,column in enumerate(row):
if column == '1':
cm.merge(i,j)
return cm.count()
| 26.076923 | 86 | 0.59292 | 97 | 678 | 4.061856 | 0.329897 | 0.076142 | 0.06599 | 0.086294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001965 | 0.249263 | 678 | 25 | 87 | 27.12 | 0.772102 | 0 | 0 | 0 | 0 | 0 | 0.001475 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0 | 0.05 | 0.55 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
4538624158b0321268253bb048733d15b3730192 | 873 | py | Python | mltoolkit/mldp/utils/helpers/nlp/token_matching.py | mancunian1792/FewSum | c2f9ef0ae7445bdb188b6ceb28e998b3fd12b78e | [
"MIT"
] | 28 | 2020-10-12T19:05:22.000Z | 2022-03-18T01:19:29.000Z | mltoolkit/mldp/utils/helpers/nlp/token_matching.py | mancunian1792/FewSum | c2f9ef0ae7445bdb188b6ceb28e998b3fd12b78e | [
"MIT"
] | 1 | 2022-01-30T01:52:59.000Z | 2022-02-19T08:04:54.000Z | mltoolkit/mldp/utils/helpers/nlp/token_matching.py | mancunian1792/FewSum | c2f9ef0ae7445bdb188b6ceb28e998b3fd12b78e | [
"MIT"
] | 7 | 2020-10-29T14:01:04.000Z | 2022-02-22T18:33:10.000Z | from .constants import SPECIAL_TOKENS
try:
import re2 as re
except ImportError:
import re
def twitter_sentiment_token_matching(token):
"""Special token matching function for twitter sentiment data."""
if 'URL_TOKEN' in SPECIAL_TOKENS and re.match(r'https?:\/\/[^\s]+', token):
return SPECIAL_TOKENS['URL_TOKEN']
if 'POS_EM_TOKEN' in SPECIAL_TOKENS and re.match(r':-?(\)|D|p)+', token):
return SPECIAL_TOKENS['POS_EM_TOKEN']
if 'NEG_EM_TOKEN' in SPECIAL_TOKENS and re.match(r':-?(\(|\\|/)+', token):
return SPECIAL_TOKENS['NEG_EM_TOKEN']
if 'USER_TOKEN' in SPECIAL_TOKENS and re.match(
r'(?<=^|(?<=[^a-zA-Z0-9-_\.]))@([A-Za-z]+[A-Za-z0-9]+)', token):
return SPECIAL_TOKENS['USER_TOKEN']
if 'HEART_TOKEN' in SPECIAL_TOKENS and re.match(r'<3+', token):
return SPECIAL_TOKENS['HEART_TOKEN']
| 41.571429 | 79 | 0.651775 | 128 | 873 | 4.21875 | 0.320313 | 0.264815 | 0.12963 | 0.185185 | 0.294444 | 0.294444 | 0.294444 | 0.294444 | 0.122222 | 0 | 0 | 0.008357 | 0.177549 | 873 | 20 | 80 | 43.65 | 0.743733 | 0.067583 | 0 | 0 | 0 | 0.058824 | 0.253713 | 0.064356 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0 | 0.235294 | 0 | 0.588235 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
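A self-contained sketch of how the token matcher above behaves. The real `SPECIAL_TOKENS` mapping is imported from `.constants` and is not shown in this file, so the placeholder values below are assumptions; the regexes are taken from the function itself:

```python
import re

# Stand-in for the imported SPECIAL_TOKENS mapping (its real contents
# live in .constants; these placeholder values are assumptions).
SPECIAL_TOKENS = {'URL_TOKEN': '<url>', 'POS_EM_TOKEN': '<pos>', 'HEART_TOKEN': '<heart>'}

def match(token):
    if 'URL_TOKEN' in SPECIAL_TOKENS and re.match(r'https?:\/\/[^\s]+', token):
        return SPECIAL_TOKENS['URL_TOKEN']
    if 'POS_EM_TOKEN' in SPECIAL_TOKENS and re.match(r':-?(\)|D|p)+', token):
        return SPECIAL_TOKENS['POS_EM_TOKEN']
    if 'HEART_TOKEN' in SPECIAL_TOKENS and re.match(r'<3+', token):
        return SPECIAL_TOKENS['HEART_TOKEN']
    return None

assert match('https://example.com/x') == '<url>'
assert match(':-)') == '<pos>'
assert match('<333') == '<heart>'
assert match('hello') is None
```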
1892414a440ce9963c67565a93d5515f1867c2ed | 53 | py | Python | utils/__init__.py | Rfam/rfam-production | 36f3963380da2a08e9cf73c951691c4e95738ac4 | [
"Apache-2.0"
] | 7 | 2016-06-17T09:21:11.000Z | 2021-10-13T20:25:06.000Z | utils/__init__.py | mb1069/rfam-production | 10c76e249dc22d30862b3a873fd54f390e859ad8 | [
"Apache-2.0"
] | 82 | 2016-04-08T10:51:32.000Z | 2022-03-11T13:49:18.000Z | utils/__init__.py | mb1069/rfam-production | 10c76e249dc22d30862b3a873fd54f390e859ad8 | [
"Apache-2.0"
] | 3 | 2019-09-01T09:46:35.000Z | 2021-11-29T08:01:58.000Z | __all__ = ['db_utils', 'RfamDB', 'parse_taxbrowser']
| 26.5 | 52 | 0.698113 | 6 | 53 | 5.166667 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.09434 | 53 | 1 | 53 | 53 | 0.645833 | 0 | 0 | 0 | 0 | 0 | 0.566038 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
18a994a759d85007cf88e43e5353bf80d7ac9a5c | 3,055 | py | Python | src/onegov/core/datamanager.py | politbuero-kampagnen/onegov-cloud | 20148bf321b71f617b64376fe7249b2b9b9c4aa9 | [
"MIT"
] | null | null | null | src/onegov/core/datamanager.py | politbuero-kampagnen/onegov-cloud | 20148bf321b71f617b64376fe7249b2b9b9c4aa9 | [
"MIT"
] | null | null | null | src/onegov/core/datamanager.py | politbuero-kampagnen/onegov-cloud | 20148bf321b71f617b64376fe7249b2b9b9c4aa9 | [
"MIT"
] | null | null | null | import os
import tempfile
import transaction
from onegov.core import log
from onegov.core.utils import safe_move
class MailDataManager(object):
""" Takes a postman and an envelope and sends it when the transaction
    is committed.
Since we can't really know if a mail can be sent until it happens, we
simply log an exception if the sending failed.
"""
transaction_manager = transaction.manager
def __init__(self, postman, envelope):
self.postman = postman
self.envelope = envelope
@classmethod
def send_email(cls, postman, envelope):
transaction.get().join(cls(postman, envelope))
def sortKey(self):
return 'mails'
def bind_connection(self, transaction, connection):
assert 'mail_connection' not in transaction.extension
def after_commit_hook(*args):
connection.quit()
transaction.addAfterCommitHook(after_commit_hook)
transaction.extension['mail_connection'] = connection
def open_connection(self):
connection = self.postman.transport(
self.postman.host,
self.postman.port,
**self.postman.options
)
connection.ehlo()
for item in self.postman.middlewares:
item(connection)
return connection
def commit(self, transaction):
if 'mail_connection' not in transaction.extension:
self.bind_connection(transaction, self.open_connection())
try:
self.postman.deliver(
transaction.extension['mail_connection'],
self.envelope
)
except Exception:
log.exception("Failed to send e-mail")
def abort(self, transaction):
pass
def tpc_vote(self, transaction):
pass
def tpc_abort(self, transaction):
pass
def tpc_begin(self, transaction):
pass
def tpc_finish(self, transaction):
pass
class FileDataManager(object):
""" Writes a file when the transaction is commited. """
transaction_manager = transaction.manager
def __init__(self, data, path):
self.data = data
self.path = path
@classmethod
def write_file(cls, data, path):
transaction.get().join(cls(data, path))
def sortKey(self):
return 'files'
def commit(self, transaction):
with tempfile.NamedTemporaryFile(delete=False) as temp:
self.tempfn = temp.name
temp.write(self.data)
def abort(self, transaction):
pass
def tpc_vote(self, transaction):
if not os.path.exists(self.tempfn):
raise ValueError('%s doesnt exist' % self.tempfn)
if os.path.exists(self.path):
raise ValueError('file already exists')
def tpc_abort(self, transaction):
try:
os.remove(self.tempfn)
except OSError:
pass
def tpc_begin(self, transaction):
pass
def tpc_finish(self, transaction):
safe_move(self.tempfn, self.path)
| 24.637097 | 73 | 0.629133 | 345 | 3,055 | 5.472464 | 0.321739 | 0.103284 | 0.070445 | 0.069915 | 0.270127 | 0.221398 | 0.169492 | 0.119703 | 0.119703 | 0.119703 | 0 | 0 | 0.285106 | 3,055 | 123 | 74 | 24.837398 | 0.864469 | 0.080196 | 0 | 0.345679 | 0 | 0 | 0.04498 | 0 | 0 | 0 | 0 | 0 | 0.012346 | 1 | 0.259259 | false | 0.098765 | 0.061728 | 0.024691 | 0.407407 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
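The write-temp-then-move pattern that `FileDataManager` spreads across `commit`, `tpc_vote`, and `tpc_finish` can be sketched without the `transaction` machinery. This is a stdlib-only analogue (using `os.replace` in place of `safe_move`, which is an assumption about that helper's semantics), not the module's actual API:

```python
import os
import tempfile

def write_file_atomically(data: bytes, path: str) -> None:
    """Stage bytes in a temp file, then move it into place, mirroring
    the commit/tpc_vote/tpc_finish split in FileDataManager above."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or '.')
    try:
        with os.fdopen(fd, 'wb') as f:
            f.write(data)          # "commit": stage the bytes
        if os.path.exists(path):   # "tpc_vote": refuse to clobber
            raise ValueError('file already exists')
        os.replace(tmp, path)      # "tpc_finish": atomic move
    except Exception:
        if os.path.exists(tmp):
            os.remove(tmp)         # "tpc_abort": clean up the temp file
        raise
```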
18ad36444d5128007b08506ac3f31875adc10b4d | 127 | py | Python | books/SystemProgramming/ch4_advanced/echo_command.py | zeroam/TIL | 43e3573be44c7f7aa4600ff8a34e99a65cbdc5d1 | [
"MIT"
] | null | null | null | books/SystemProgramming/ch4_advanced/echo_command.py | zeroam/TIL | 43e3573be44c7f7aa4600ff8a34e99a65cbdc5d1 | [
"MIT"
] | null | null | null | books/SystemProgramming/ch4_advanced/echo_command.py | zeroam/TIL | 43e3573be44c7f7aa4600ff8a34e99a65cbdc5d1 | [
"MIT"
] | null | null | null | from subprocess import Popen, PIPE
cmd = "echo hello world"
p = Popen(cmd, shell=True, stdout=PIPE)
out, err = p.communicate()  # communicate() returns a (stdout, stderr) tuple
18be667bef982c766e8e51b2444d4138ae324879 | 7,182 | py | Python | mojo/public/tools/bindings/pylib/parse/mojo_lexer_unittest.py | Acidburn0zzz/chromium-1 | 4c08f442d2588a2c7cfaa117a55bd87d2ac32f9a | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | mojo/public/tools/bindings/pylib/parse/mojo_lexer_unittest.py | Acidburn0zzz/chromium-1 | 4c08f442d2588a2c7cfaa117a55bd87d2ac32f9a | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | mojo/public/tools/bindings/pylib/parse/mojo_lexer_unittest.py | Acidburn0zzz/chromium-1 | 4c08f442d2588a2c7cfaa117a55bd87d2ac32f9a | [
"BSD-3-Clause-No-Nuclear-License-2014",
"BSD-3-Clause"
] | null | null | null | # Copyright 2014 The Chromium Authors. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import mojo_lexer
import os
import sys
import unittest
# Try to load the ply module, if not, then assume it is in the third_party
# directory.
try:
# Disable lint check which fails to find the ply module.
# pylint: disable=F0401
from ply import lex
except ImportError:
# This assumes this file is in src/mojo/public/tools/bindings/pylib/parse/.
module_path, module_name = os.path.split(__file__)
third_party = os.path.join(module_path, os.pardir, os.pardir, os.pardir,
os.pardir, os.pardir, os.pardir, 'third_party')
sys.path.append(third_party)
# pylint: disable=F0401
from ply import lex
# This (monkey-patching LexToken to make comparison value-based) is evil, but
# we'll do it anyway. (I'm pretty sure ply's lexer never cares about comparing
# for object identity.)
def _LexTokenEq(self, other):
return self.type == other.type and self.value == other.value and \
self.lineno == other.lineno and self.lexpos == other.lexpos
setattr(lex.LexToken, '__eq__', _LexTokenEq)
def _MakeLexToken(type, value, lineno=1, lexpos=0):
"""Makes a LexToken with the given parameters. (Note that lineno is 1-based,
but lexpos is 0-based.)"""
rv = lex.LexToken()
rv.type, rv.value, rv.lineno, rv.lexpos = type, value, lineno, lexpos
return rv
def _MakeLexTokenForKeyword(keyword, **kwargs):
"""Makes a LexToken for the given keyword."""
return _MakeLexToken(keyword.upper(), keyword.lower(), **kwargs)
class MojoLexerTest(unittest.TestCase):
"""Tests mojo_lexer (in particular, Lexer)."""
def __init__(self, *args, **kwargs):
unittest.TestCase.__init__(self, *args, **kwargs)
# Clone all lexer instances from this one, since making a lexer is slow.
self._zygote_lexer = lex.lex(mojo_lexer.Lexer("my_file.mojom"))
def testValidSingleKeywords(self):
"""Tests valid, single keywords."""
self.assertEquals(self._SingleTokenForInput("handle"),
_MakeLexTokenForKeyword("handle"))
self.assertEquals(self._SingleTokenForInput("data_pipe_consumer"),
_MakeLexTokenForKeyword("data_pipe_consumer"))
self.assertEquals(self._SingleTokenForInput("data_pipe_producer"),
_MakeLexTokenForKeyword("data_pipe_producer"))
self.assertEquals(self._SingleTokenForInput("message_pipe"),
_MakeLexTokenForKeyword("message_pipe"))
self.assertEquals(self._SingleTokenForInput("import"),
_MakeLexTokenForKeyword("import"))
self.assertEquals(self._SingleTokenForInput("module"),
_MakeLexTokenForKeyword("module"))
self.assertEquals(self._SingleTokenForInput("struct"),
_MakeLexTokenForKeyword("struct"))
self.assertEquals(self._SingleTokenForInput("interface"),
_MakeLexTokenForKeyword("interface"))
self.assertEquals(self._SingleTokenForInput("enum"),
_MakeLexTokenForKeyword("enum"))
def testValidSingleTokens(self):
"""Tests valid, single (non-keyword) tokens."""
self.assertEquals(self._SingleTokenForInput("asdf"),
_MakeLexToken("NAME", "asdf"))
self.assertEquals(self._SingleTokenForInput("@123"),
_MakeLexToken("ORDINAL", "@123"))
self.assertEquals(self._SingleTokenForInput("456"),
_MakeLexToken("INT_CONST_DEC", "456"))
self.assertEquals(self._SingleTokenForInput("0765"),
_MakeLexToken("INT_CONST_OCT", "0765"))
self.assertEquals(self._SingleTokenForInput("0x01aB2eF3"),
_MakeLexToken("INT_CONST_HEX", "0x01aB2eF3"))
self.assertEquals(self._SingleTokenForInput("123.456"),
_MakeLexToken("FLOAT_CONST", "123.456"))
self.assertEquals(self._SingleTokenForInput("'x'"),
_MakeLexToken("CHAR_CONST", "'x'"))
self.assertEquals(self._SingleTokenForInput("\"hello\""),
_MakeLexToken("STRING_LITERAL", "\"hello\""))
self.assertEquals(self._SingleTokenForInput("+"),
_MakeLexToken("PLUS", "+"))
self.assertEquals(self._SingleTokenForInput("-"),
_MakeLexToken("MINUS", "-"))
self.assertEquals(self._SingleTokenForInput("*"),
_MakeLexToken("TIMES", "*"))
self.assertEquals(self._SingleTokenForInput("/"),
_MakeLexToken("DIVIDE", "/"))
self.assertEquals(self._SingleTokenForInput("%"),
_MakeLexToken("MOD", "%"))
self.assertEquals(self._SingleTokenForInput("|"),
_MakeLexToken("OR", "|"))
self.assertEquals(self._SingleTokenForInput("~"),
_MakeLexToken("NOT", "~"))
self.assertEquals(self._SingleTokenForInput("^"),
_MakeLexToken("XOR", "^"))
self.assertEquals(self._SingleTokenForInput("<<"),
_MakeLexToken("LSHIFT", "<<"))
self.assertEquals(self._SingleTokenForInput(">>"),
_MakeLexToken("RSHIFT", ">>"))
self.assertEquals(self._SingleTokenForInput("="),
_MakeLexToken("EQUALS", "="))
self.assertEquals(self._SingleTokenForInput("=>"),
_MakeLexToken("RESPONSE", "=>"))
self.assertEquals(self._SingleTokenForInput("("),
_MakeLexToken("LPAREN", "("))
self.assertEquals(self._SingleTokenForInput(")"),
_MakeLexToken("RPAREN", ")"))
self.assertEquals(self._SingleTokenForInput("["),
_MakeLexToken("LBRACKET", "["))
self.assertEquals(self._SingleTokenForInput("]"),
_MakeLexToken("RBRACKET", "]"))
self.assertEquals(self._SingleTokenForInput("{"),
_MakeLexToken("LBRACE", "{"))
self.assertEquals(self._SingleTokenForInput("}"),
_MakeLexToken("RBRACE", "}"))
self.assertEquals(self._SingleTokenForInput("<"),
_MakeLexToken("LANGLE", "<"))
self.assertEquals(self._SingleTokenForInput(">"),
_MakeLexToken("RANGLE", ">"))
self.assertEquals(self._SingleTokenForInput(";"),
_MakeLexToken("SEMI", ";"))
self.assertEquals(self._SingleTokenForInput(","),
_MakeLexToken("COMMA", ","))
self.assertEquals(self._SingleTokenForInput("."),
_MakeLexToken("DOT", "."))
def _TokensForInput(self, input):
"""Gets a list of tokens for the given input string."""
lexer = self._zygote_lexer.clone()
lexer.input(input)
rv = []
while True:
tok = lexer.token()
if not tok:
return rv
rv.append(tok)
def _SingleTokenForInput(self, input):
"""Gets the single token for the given input string. (Raises an exception if
the input string does not result in exactly one token.)"""
toks = self._TokensForInput(input)
assert len(toks) == 1
return toks[0]
if __name__ == "__main__":
unittest.main()
| 44.608696 | 80 | 0.632693 | 671 | 7,182 | 6.554396 | 0.314456 | 0.145521 | 0.181901 | 0.354707 | 0.362665 | 0.047749 | 0.026376 | 0.010914 | 0.010914 | 0.010914 | 0 | 0.010864 | 0.230994 | 7,182 | 160 | 81 | 44.8875 | 0.785443 | 0.151768 | 0 | 0.033058 | 0 | 0 | 0.089102 | 0 | 0 | 0 | 0.003312 | 0 | 0.338843 | 1 | 0.066116 | false | 0 | 0.057851 | 0.008264 | 0.173554 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
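The `setattr(lex.LexToken, '__eq__', ...)` trick the test file uses (and calls "evil") is plain class monkey-patching: swapping identity-based equality for value-based equality at runtime. A toy class, not ply's actual `LexToken`, shows the effect:

```python
class Token(object):
    def __init__(self, type_, value):
        self.type, self.value = type_, value

# By default, instances compare by object identity
assert Token('NAME', 'x') != Token('NAME', 'x')

# Patch in value-based equality, as the test above does for lex.LexToken
def _token_eq(self, other):
    return self.type == other.type and self.value == other.value

setattr(Token, '__eq__', _token_eq)
assert Token('NAME', 'x') == Token('NAME', 'x')
```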
18ceea954bda99122d17bf7b1a926a3bf8227da9 | 270 | py | Python | Main/apps.py | Naretto95/Django-Vault | 36fac69873c844bf72732ff635513f0204b7d61a | [
"MIT"
] | null | null | null | Main/apps.py | Naretto95/Django-Vault | 36fac69873c844bf72732ff635513f0204b7d61a | [
"MIT"
] | null | null | null | Main/apps.py | Naretto95/Django-Vault | 36fac69873c844bf72732ff635513f0204b7d61a | [
"MIT"
] | null | null | null | from django.apps import AppConfig
from django.contrib.admin.apps import AdminConfig
class AdminSiteConfig(AdminConfig):
default_site = 'Main.admin.MyAdminSite'
class MainConfig(AppConfig):
default_auto_field = 'django.db.models.BigAutoField'
name = 'Main'
| 27 | 56 | 0.781481 | 32 | 270 | 6.5 | 0.65625 | 0.096154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.12963 | 270 | 9 | 57 | 30 | 0.885106 | 0 | 0 | 0 | 0 | 0 | 0.203704 | 0.188889 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
18dd011d855404f1d1af53f818b57ec996f325ba | 1,060 | py | Python | examples/props.py | SandNerd/notional | ccab44bc4c5d19d4546156f0d72b22b93e28e2ed | [
"MIT"
] | 23 | 2021-08-03T08:13:14.000Z | 2022-03-27T13:13:54.000Z | examples/props.py | SandNerd/notional | ccab44bc4c5d19d4546156f0d72b22b93e28e2ed | [
"MIT"
] | 15 | 2021-08-03T04:04:23.000Z | 2022-03-31T14:27:26.000Z | examples/props.py | SandNerd/notional | ccab44bc4c5d19d4546156f0d72b22b93e28e2ed | [
"MIT"
] | 3 | 2021-08-08T04:47:48.000Z | 2022-03-06T23:13:52.000Z | #!/usr/bin/env python3
"""This script demonstrates setting properties on a page manually.
The script accepts a single command line option, which is a page ID. It will then
display information about the properties and update a few of them.
Note that this script assumes the database has already been created with required
fields.
The caller must set `NOTION_AUTH_TOKEN` to a valid integration token.
"""
import logging
import os
import sys
logging.basicConfig(level=logging.INFO)
import notional
from notional import types
page_id = sys.argv[1]
auth_token = os.getenv("NOTION_AUTH_TOKEN")
notion = notional.connect(auth=auth_token)
# get an existing page...
page = notion.pages.retrieve(page_id)
print(f"{page.Title} => {page.url}")
# print all current properties on the page...
for name, prop in page.properties.items():
print(f"{name} => {prop}")
# update a property on the page...
page["Complete"] = types.Checkbox.from_value(True)
# FIXME this feature is broken - https://github.com/jheddings/notional/issues/9
# notion.pages.update(page)
| 25.853659 | 82 | 0.756604 | 164 | 1,060 | 4.835366 | 0.591463 | 0.045397 | 0.037831 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003297 | 0.141509 | 1,060 | 40 | 83 | 26.5 | 0.868132 | 0.568868 | 0 | 0 | 0 | 0 | 0.150562 | 0 | 0 | 0 | 0 | 0.025 | 0 | 1 | 0 | false | 0 | 0.357143 | 0 | 0.357143 | 0.142857 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
18f75103fffe006c35337768f20ad10b43a5b636 | 411 | py | Python | hack_today_2017/web/web_time_solver.py | runsel/CTF_Writeups | df3d8469b981265d4d43bfc90e75075a95acb1dd | [
"MIT"
] | 4 | 2019-01-07T03:15:45.000Z | 2021-01-10T04:58:15.000Z | hack_today_2017/web/web_time_solver.py | runsel/CTF_Writeups | df3d8469b981265d4d43bfc90e75075a95acb1dd | [
"MIT"
] | null | null | null | hack_today_2017/web/web_time_solver.py | runsel/CTF_Writeups | df3d8469b981265d4d43bfc90e75075a95acb1dd | [
"MIT"
] | 3 | 2018-10-21T19:17:34.000Z | 2020-07-07T08:58:25.000Z | import requests
charset = "abcdefghijklmnopqrstuvwxyz0123456789_{}"
password = "HackToday{"
url = "http://sawah.ittoday.web.id:40137/"
while(password[-1]!="}"):
for i in charset:
r = requests.get(url)
payload = {'password': password+i, 'submit': 'Submit+Query'}
r = requests.post(url, data=payload)
if r.status_code==302:
password+=i
            print(password)
| 27.4 | 68 | 0.615572 | 46 | 411 | 5.456522 | 0.652174 | 0.071713 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.060317 | 0.233577 | 411 | 14 | 69 | 29.357143 | 0.736508 | 0 | 0 | 0 | 0 | 0 | 0.26764 | 0.094891 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.416667 | 0.083333 | null | null | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
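The solver above recovers the flag one character at a time, using the server's 302 redirect as a prefix oracle. The same loop can be exercised offline against a mock oracle (the `secret` value and `oracle` function here are stand-ins, not the real challenge endpoint):

```python
charset = "abcdefghijklmnopqrstuvwxyz0123456789_{}"
secret = "flag{abc_123}"

def oracle(guess):
    # Stand-in for the HTTP 302 check: true when `guess` is a prefix of the flag
    return secret.startswith(guess)

password = "flag{"
while password[-1] != "}":
    for c in charset:
        if oracle(password + c):
            password += c
            break

assert password == secret
```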
7a0383028d6c513dd8786b4e28fcf20c534cff1a | 341 | py | Python | CS1/Ch11/Artwork.py | DoctorOac/SwosuCsPythonExamples | 07476b9b4ef9a6f8bd68921aef19e8f00183b1e7 | [
"Apache-2.0"
] | 1 | 2022-03-28T18:27:10.000Z | 2022-03-28T18:27:10.000Z | CS1/Ch11/Artwork.py | DoctorOac/SwosuCsPythonExamples | 07476b9b4ef9a6f8bd68921aef19e8f00183b1e7 | [
"Apache-2.0"
] | 1 | 2022-01-11T16:27:40.000Z | 2022-01-11T16:27:40.000Z | CS1/Ch11/Artwork.py | DoctorOac/SwosuCsPythonExamples | 07476b9b4ef9a6f8bd68921aef19e8f00183b1e7 | [
"Apache-2.0"
] | 7 | 2022-03-25T21:01:42.000Z | 2022-03-28T18:51:24.000Z | from Artist import Artist
class Artwork:
def __init__(self, title='None', year_created=0,\
artist=Artist()):
self.title = title
self.year_created = year_created
self.artist = artist
def print_info(self):
self.artist.print_info()
print('Title: %s, %d' % (self.title, self.year_created))
| 26.230769 | 64 | 0.630499 | 44 | 341 | 4.659091 | 0.386364 | 0.214634 | 0.126829 | 0.195122 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003891 | 0.246334 | 341 | 12 | 65 | 28.416667 | 0.793774 | 0 | 0 | 0 | 0 | 0 | 0.049853 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.1 | 0 | 0.4 | 0.3 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7a03cb031046f0f5a4ab04de791c5d2ae9f6699d | 2,249 | py | Python | nearproteins/__init__.py | audy/nearproteins | ed426a98004c7608894a63c6b445ff60ae251d05 | [
"MIT"
] | null | null | null | nearproteins/__init__.py | audy/nearproteins | ed426a98004c7608894a63c6b445ff60ae251d05 | [
"MIT"
] | 1 | 2019-07-10T05:47:01.000Z | 2019-07-10T17:23:52.000Z | nearproteins/__init__.py | audy/nearproteins | ed426a98004c7608894a63c6b445ff60ae251d05 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
from collections import defaultdict
from itertools import product
import json
import random
import sys
from annoy import AnnoyIndex
from Bio import SeqIO
import numpy as np
class FeatureGenerator:
def __init__(self, k=2):
''' '''
self.k = k
self.alphabet = [ 'A', 'R', 'N', 'D', 'C', 'E', 'Q', 'G', 'H', 'I', 'L', 'K',
'M', 'F', 'P', 'S', 'T', 'W', 'Y', 'V', ]
assert len(self.alphabet) == 20
self.feature_space = list(''.join(i) for i in product(self.alphabet,
repeat=self.k))
self.n_features = len(self.feature_space)
def shingles(self, s, k):
''' return shingles of a given string given a k-mer size k '''
return [ s[i : i + k ] for i in range(0, len(s) - k + 1) ]
def vectorize(self, s):
''' convert shingles to features vector '''
d = defaultdict(lambda: 0)
for i in s:
d[i] += 1
# convert to counts in feature space
vector = np.array([ d[i] for i in self.feature_space ])
return vector
    def transform(self, s):
        return self.vectorize(self.shingles(s, self.k))
class SimilarStringStore:
def __init__(self, **kwargs):
self.transformer = FeatureGenerator(k=1)
print(self.transformer.n_features)
self.store = AnnoyIndex(self.transformer.n_features)
def vectorize(self, s):
return self.transformer.transform(s)
def add(self, id, s):
''' add a string to index '''
vector = self.transformer.transform(s)
self.store.add_item(int(id), vector)
return vector
def build(self):
self.store.build(500)
def save(self, filename='store.knn'):
self.store.save(filename)
def build_and_save(self, filename='store.knn'):
self.build()
self.save(filename)
def load(self, filename='store.knn'):
self.store.load(filename)
def query(self, s):
''' query index '''
vector = self.transformer.transform(s)
neighbors = self.store.get_nns_by_vector(vector, 40)
return neighbors
def remove(self, id):
''' remove a string from the index '''
pass
| 22.267327 | 85 | 0.578924 | 300 | 2,249 | 4.273333 | 0.343333 | 0.070203 | 0.018721 | 0.058502 | 0.126365 | 0.126365 | 0 | 0 | 0 | 0 | 0 | 0.0081 | 0.286349 | 2,249 | 100 | 86 | 22.49 | 0.790654 | 0.096043 | 0 | 0.113208 | 0 | 0 | 0.023571 | 0 | 0 | 0 | 0 | 0 | 0.018868 | 1 | 0.245283 | false | 0.018868 | 0.150943 | 0.037736 | 0.54717 | 0.018868 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
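The shingling-plus-count-vector idea behind `FeatureGenerator` can be shown without numpy or annoy. This sketch uses a toy 4-letter alphabet instead of the 20 amino acids above:

```python
from itertools import product

alphabet = ['A', 'C', 'G', 'T']   # toy alphabet; the class above uses 20 amino acids
k = 2
feature_space = [''.join(p) for p in product(alphabet, repeat=k)]

def shingles(s, k):
    # All overlapping k-mers of s
    return [s[i:i + k] for i in range(0, len(s) - k + 1)]

def vectorize(s):
    # Count each k-mer's occurrences in a fixed feature order
    grams = shingles(s, k)
    return [grams.count(f) for f in feature_space]

v = vectorize('ACGT')             # shingles: AC, CG, GT
assert len(v) == 16               # |alphabet| ** k dimensions
assert sum(v) == 3
assert v[feature_space.index('AC')] == 1
```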
7a0b7b8522bbe2e3900e18756663a43a8ac174f7 | 2,765 | py | Python | functions/print_initial_values.py | CINPLA/edNEGmodel_analysis | be8854c563376a14ee7d15e51d98d0d82be96a35 | [
"MIT"
] | null | null | null | functions/print_initial_values.py | CINPLA/edNEGmodel_analysis | be8854c563376a14ee7d15e51d98d0d82be96a35 | [
"MIT"
] | null | null | null | functions/print_initial_values.py | CINPLA/edNEGmodel_analysis | be8854c563376a14ee7d15e51d98d0d82be96a35 | [
"MIT"
] | null | null | null | import numpy as np
def print_initial_values(init_cell):
phi_sn, phi_se, phi_sg, phi_dn, phi_de, phi_dg, phi_msn, phi_mdn, phi_msg, phi_mdg = init_cell.membrane_potentials()
E_Na_sn, E_Na_sg, E_Na_dn, E_Na_dg, E_K_sn, E_K_sg, E_K_dn, E_K_dg, E_Cl_sn, E_Cl_sg, E_Cl_dn, E_Cl_dg, E_Ca_sn, E_Ca_dn = init_cell.reversal_potentials()
q_sn = init_cell.total_charge(np.array([init_cell.Na_sn, init_cell.K_sn, init_cell.Cl_sn, init_cell.Ca_sn, init_cell.X_sn]))
q_se = init_cell.total_charge(np.array([init_cell.Na_se, init_cell.K_se, init_cell.Cl_se, init_cell.Ca_se, init_cell.X_se]))
q_sg = init_cell.total_charge(np.array([init_cell.Na_sg, init_cell.K_sg, init_cell.Cl_sg, 0, init_cell.X_sg]))
q_dn = init_cell.total_charge(np.array([init_cell.Na_dn, init_cell.K_dn, init_cell.Cl_dn, init_cell.Ca_dn, init_cell.X_dn]))
q_de = init_cell.total_charge(np.array([init_cell.Na_de, init_cell.K_de, init_cell.Cl_de, init_cell.Ca_de, init_cell.X_de]))
q_dg = init_cell.total_charge(np.array([init_cell.Na_dg, init_cell.K_dg, init_cell.Cl_dg, 0, init_cell.X_dg]))
print("----------------------------")
print("Initial values")
print("----------------------------")
print("initial total charge(C):", q_sn + q_se + q_sg + q_dn + q_de + q_dg)
print("Q_sn + Q_sg (C):", q_sn+q_sg)
print("Q_se (C):", q_se)
print("Q_dn + Q_sg (C):", q_dn+q_dg)
print("Q_de (C):", q_de)
print("----------------------------")
print('phi_sn: ', round(phi_sn*1000, 1))
print('phi_se: ', round(phi_se*1000, 1))
print('phi_sg: ', round(phi_sg*1000, 1))
print('phi_dn: ', round(phi_dn*1000, 1))
print('phi_de: ', round(phi_de*1000, 1))
print('phi_dg: ', round(phi_dg*1000, 1))
print('phi_msn: ', round(phi_msn*1000, 1))
print('phi_mdn: ', round(phi_mdn*1000, 1))
print('phi_msg: ', round(phi_msg*1000, 1))
print('phi_mdg: ', round(phi_mdg*1000, 1))
print('E_Na_sn: ', round(E_Na_sn*1000))
print('E_Na_sg: ', round(E_Na_sg*1000))
print('E_K_sn: ', round(E_K_sn*1000))
print('E_K_sg: ', round(E_K_sg*1000))
print('E_Cl_sn: ', round(E_Cl_sn*1000))
print('E_Cl_sg: ', round(E_Cl_sg*1000))
print('E_Ca_sn: ', round(E_Ca_sn*1000))
print("----------------------------")
print('psi_se-psi_sn', init_cell.psi_se-init_cell.psi_sn)
print('psi_se-psi_sg', init_cell.psi_se-init_cell.psi_sg)
print('psi_de-psi_dn', init_cell.psi_de-init_cell.psi_dn)
print('psi_de-psi_dg', init_cell.psi_de-init_cell.psi_dg)
print("----------------------------")
print('initial total volume (m^3):', init_cell.V_sn + init_cell.V_se + init_cell.V_sg + init_cell.V_dn + init_cell.V_de + init_cell.V_dg)
print("----------------------------")
| 56.428571 | 158 | 0.637975 | 525 | 2,765 | 2.939048 | 0.08381 | 0.26442 | 0.064809 | 0.075826 | 0.202204 | 0.202204 | 0.202204 | 0.139987 | 0.139987 | 0 | 0 | 0.034221 | 0.143942 | 2,765 | 48 | 159 | 57.604167 | 0.617659 | 0 | 0 | 0.136364 | 0 | 0 | 0.173599 | 0.060759 | 0 | 0 | 0 | 0 | 0 | 1 | 0.022727 | false | 0 | 0.022727 | 0 | 0.045455 | 0.795455 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
7a0f470f2ade1699e468a55aa0458f89b6b1d2f2 | 17,965 | py | Python | bddtests/peer/admin_pb2.py | hacera-jonathan/fabric | 3ba291e8fbb0246aa440e02cba54d16924649479 | [
"Apache-2.0"
] | null | null | null | bddtests/peer/admin_pb2.py | hacera-jonathan/fabric | 3ba291e8fbb0246aa440e02cba54d16924649479 | [
"Apache-2.0"
] | 1 | 2021-03-20T05:34:24.000Z | 2021-03-20T05:34:24.000Z | bddtests/peer/admin_pb2.py | hacera-jonathan/fabric | 3ba291e8fbb0246aa440e02cba54d16924649479 | [
"Apache-2.0"
] | null | null | null | # Generated by the protocol buffer compiler. DO NOT EDIT!
# source: peer/admin.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
from google.protobuf import empty_pb2 as google_dot_protobuf_dot_empty__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='peer/admin.proto',
package='protos',
syntax='proto3',
serialized_pb=_b('\n\x10peer/admin.proto\x12\x06protos\x1a\x1bgoogle/protobuf/empty.proto\"\x9a\x01\n\x0cServerStatus\x12/\n\x06status\x18\x01 \x01(\x0e\x32\x1f.protos.ServerStatus.StatusCode\"Y\n\nStatusCode\x12\r\n\tUNDEFINED\x10\x00\x12\x0b\n\x07STARTED\x10\x01\x12\x0b\n\x07STOPPED\x10\x02\x12\n\n\x06PAUSED\x10\x03\x12\t\n\x05\x45RROR\x10\x04\x12\x0b\n\x07UNKNOWN\x10\x05\"8\n\x0fLogLevelRequest\x12\x12\n\nlog_module\x18\x01 \x01(\t\x12\x11\n\tlog_level\x18\x02 \x01(\t\"9\n\x10LogLevelResponse\x12\x12\n\nlog_module\x18\x01 \x01(\t\x12\x11\n\tlog_level\x18\x02 \x01(\t2\xd5\x02\n\x05\x41\x64min\x12;\n\tGetStatus\x12\x16.google.protobuf.Empty\x1a\x14.protos.ServerStatus\"\x00\x12=\n\x0bStartServer\x12\x16.google.protobuf.Empty\x1a\x14.protos.ServerStatus\"\x00\x12<\n\nStopServer\x12\x16.google.protobuf.Empty\x1a\x14.protos.ServerStatus\"\x00\x12H\n\x11GetModuleLogLevel\x12\x17.protos.LogLevelRequest\x1a\x18.protos.LogLevelResponse\"\x00\x12H\n\x11SetModuleLogLevel\x12\x17.protos.LogLevelRequest\x1a\x18.protos.LogLevelResponse\"\x00\x42+Z)github.com/hyperledger/fabric/protos/peerb\x06proto3')
,
dependencies=[google_dot_protobuf_dot_empty__pb2.DESCRIPTOR,])
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
_SERVERSTATUS_STATUSCODE = _descriptor.EnumDescriptor(
name='StatusCode',
full_name='protos.ServerStatus.StatusCode',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='UNDEFINED', index=0, number=0,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='STARTED', index=1, number=1,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='STOPPED', index=2, number=2,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='PAUSED', index=3, number=3,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='ERROR', index=4, number=4,
options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='UNKNOWN', index=5, number=5,
options=None,
type=None),
],
containing_type=None,
options=None,
serialized_start=123,
serialized_end=212,
)
_sym_db.RegisterEnumDescriptor(_SERVERSTATUS_STATUSCODE)
_SERVERSTATUS = _descriptor.Descriptor(
name='ServerStatus',
full_name='protos.ServerStatus',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='status', full_name='protos.ServerStatus.status', index=0,
number=1, type=14, cpp_type=8, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
_SERVERSTATUS_STATUSCODE,
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=58,
serialized_end=212,
)
_LOGLEVELREQUEST = _descriptor.Descriptor(
name='LogLevelRequest',
full_name='protos.LogLevelRequest',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='log_module', full_name='protos.LogLevelRequest.log_module', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='log_level', full_name='protos.LogLevelRequest.log_level', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=214,
serialized_end=270,
)
_LOGLEVELRESPONSE = _descriptor.Descriptor(
name='LogLevelResponse',
full_name='protos.LogLevelResponse',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='log_module', full_name='protos.LogLevelResponse.log_module', index=0,
number=1, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='log_level', full_name='protos.LogLevelResponse.log_level', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=_b("").decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=272,
serialized_end=329,
)
_SERVERSTATUS.fields_by_name['status'].enum_type = _SERVERSTATUS_STATUSCODE
_SERVERSTATUS_STATUSCODE.containing_type = _SERVERSTATUS
DESCRIPTOR.message_types_by_name['ServerStatus'] = _SERVERSTATUS
DESCRIPTOR.message_types_by_name['LogLevelRequest'] = _LOGLEVELREQUEST
DESCRIPTOR.message_types_by_name['LogLevelResponse'] = _LOGLEVELRESPONSE
ServerStatus = _reflection.GeneratedProtocolMessageType('ServerStatus', (_message.Message,), dict(
DESCRIPTOR = _SERVERSTATUS,
__module__ = 'peer.admin_pb2'
# @@protoc_insertion_point(class_scope:protos.ServerStatus)
))
_sym_db.RegisterMessage(ServerStatus)
LogLevelRequest = _reflection.GeneratedProtocolMessageType('LogLevelRequest', (_message.Message,), dict(
DESCRIPTOR = _LOGLEVELREQUEST,
__module__ = 'peer.admin_pb2'
# @@protoc_insertion_point(class_scope:protos.LogLevelRequest)
))
_sym_db.RegisterMessage(LogLevelRequest)
LogLevelResponse = _reflection.GeneratedProtocolMessageType('LogLevelResponse', (_message.Message,), dict(
DESCRIPTOR = _LOGLEVELRESPONSE,
__module__ = 'peer.admin_pb2'
# @@protoc_insertion_point(class_scope:protos.LogLevelResponse)
))
_sym_db.RegisterMessage(LogLevelResponse)
DESCRIPTOR.has_options = True
DESCRIPTOR._options = _descriptor._ParseOptions(descriptor_pb2.FileOptions(), _b('Z)github.com/hyperledger/fabric/protos/peer'))
try:
# THESE ELEMENTS WILL BE DEPRECATED.
# Please use the generated *_pb2_grpc.py files instead.
import grpc
from grpc.framework.common import cardinality
from grpc.framework.interfaces.face import utilities as face_utilities
from grpc.beta import implementations as beta_implementations
from grpc.beta import interfaces as beta_interfaces
class AdminStub(object):
"""Interface exported by the server.
"""
def __init__(self, channel):
"""Constructor.
Args:
channel: A grpc.Channel.
"""
self.GetStatus = channel.unary_unary(
'/protos.Admin/GetStatus',
request_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
response_deserializer=ServerStatus.FromString,
)
self.StartServer = channel.unary_unary(
'/protos.Admin/StartServer',
request_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
response_deserializer=ServerStatus.FromString,
)
self.StopServer = channel.unary_unary(
'/protos.Admin/StopServer',
request_serializer=google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
response_deserializer=ServerStatus.FromString,
)
self.GetModuleLogLevel = channel.unary_unary(
'/protos.Admin/GetModuleLogLevel',
request_serializer=LogLevelRequest.SerializeToString,
response_deserializer=LogLevelResponse.FromString,
)
self.SetModuleLogLevel = channel.unary_unary(
'/protos.Admin/SetModuleLogLevel',
request_serializer=LogLevelRequest.SerializeToString,
response_deserializer=LogLevelResponse.FromString,
)
class AdminServicer(object):
"""Interface exported by the server.
"""
def GetStatus(self, request, context):
"""Return the serve status.
"""
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def StartServer(self, request, context):
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def StopServer(self, request, context):
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def GetModuleLogLevel(self, request, context):
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def SetModuleLogLevel(self, request, context):
context.set_code(grpc.StatusCode.UNIMPLEMENTED)
context.set_details('Method not implemented!')
raise NotImplementedError('Method not implemented!')
def add_AdminServicer_to_server(servicer, server):
rpc_method_handlers = {
'GetStatus': grpc.unary_unary_rpc_method_handler(
servicer.GetStatus,
request_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
response_serializer=ServerStatus.SerializeToString,
),
'StartServer': grpc.unary_unary_rpc_method_handler(
servicer.StartServer,
request_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
response_serializer=ServerStatus.SerializeToString,
),
'StopServer': grpc.unary_unary_rpc_method_handler(
servicer.StopServer,
request_deserializer=google_dot_protobuf_dot_empty__pb2.Empty.FromString,
response_serializer=ServerStatus.SerializeToString,
),
'GetModuleLogLevel': grpc.unary_unary_rpc_method_handler(
servicer.GetModuleLogLevel,
request_deserializer=LogLevelRequest.FromString,
response_serializer=LogLevelResponse.SerializeToString,
),
'SetModuleLogLevel': grpc.unary_unary_rpc_method_handler(
servicer.SetModuleLogLevel,
request_deserializer=LogLevelRequest.FromString,
response_serializer=LogLevelResponse.SerializeToString,
),
}
generic_handler = grpc.method_handlers_generic_handler(
'protos.Admin', rpc_method_handlers)
server.add_generic_rpc_handlers((generic_handler,))
class BetaAdminServicer(object):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This class was generated
only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0."""
"""Interface exported by the server.
"""
def GetStatus(self, request, context):
"""Return the serve status.
"""
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def StartServer(self, request, context):
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def StopServer(self, request, context):
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def GetModuleLogLevel(self, request, context):
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
def SetModuleLogLevel(self, request, context):
context.code(beta_interfaces.StatusCode.UNIMPLEMENTED)
class BetaAdminStub(object):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This class was generated
only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0."""
"""Interface exported by the server.
"""
def GetStatus(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
"""Return the serve status.
"""
raise NotImplementedError()
GetStatus.future = None
def StartServer(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
raise NotImplementedError()
StartServer.future = None
def StopServer(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
raise NotImplementedError()
StopServer.future = None
def GetModuleLogLevel(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
raise NotImplementedError()
GetModuleLogLevel.future = None
def SetModuleLogLevel(self, request, timeout, metadata=None, with_call=False, protocol_options=None):
raise NotImplementedError()
SetModuleLogLevel.future = None
def beta_create_Admin_server(servicer, pool=None, pool_size=None, default_timeout=None, maximum_timeout=None):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This function was
generated only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0"""
request_deserializers = {
('protos.Admin', 'GetModuleLogLevel'): LogLevelRequest.FromString,
('protos.Admin', 'GetStatus'): google_dot_protobuf_dot_empty__pb2.Empty.FromString,
('protos.Admin', 'SetModuleLogLevel'): LogLevelRequest.FromString,
('protos.Admin', 'StartServer'): google_dot_protobuf_dot_empty__pb2.Empty.FromString,
('protos.Admin', 'StopServer'): google_dot_protobuf_dot_empty__pb2.Empty.FromString,
}
response_serializers = {
('protos.Admin', 'GetModuleLogLevel'): LogLevelResponse.SerializeToString,
('protos.Admin', 'GetStatus'): ServerStatus.SerializeToString,
('protos.Admin', 'SetModuleLogLevel'): LogLevelResponse.SerializeToString,
('protos.Admin', 'StartServer'): ServerStatus.SerializeToString,
('protos.Admin', 'StopServer'): ServerStatus.SerializeToString,
}
method_implementations = {
('protos.Admin', 'GetModuleLogLevel'): face_utilities.unary_unary_inline(servicer.GetModuleLogLevel),
('protos.Admin', 'GetStatus'): face_utilities.unary_unary_inline(servicer.GetStatus),
('protos.Admin', 'SetModuleLogLevel'): face_utilities.unary_unary_inline(servicer.SetModuleLogLevel),
('protos.Admin', 'StartServer'): face_utilities.unary_unary_inline(servicer.StartServer),
('protos.Admin', 'StopServer'): face_utilities.unary_unary_inline(servicer.StopServer),
}
server_options = beta_implementations.server_options(request_deserializers=request_deserializers, response_serializers=response_serializers, thread_pool=pool, thread_pool_size=pool_size, default_timeout=default_timeout, maximum_timeout=maximum_timeout)
return beta_implementations.server(method_implementations, options=server_options)
def beta_create_Admin_stub(channel, host=None, metadata_transformer=None, pool=None, pool_size=None):
"""The Beta API is deprecated for 0.15.0 and later.
It is recommended to use the GA API (classes and functions in this
file not marked beta) for all further purposes. This function was
generated only to ease transition from grpcio<0.15.0 to grpcio>=0.15.0"""
request_serializers = {
('protos.Admin', 'GetModuleLogLevel'): LogLevelRequest.SerializeToString,
('protos.Admin', 'GetStatus'): google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
('protos.Admin', 'SetModuleLogLevel'): LogLevelRequest.SerializeToString,
('protos.Admin', 'StartServer'): google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
('protos.Admin', 'StopServer'): google_dot_protobuf_dot_empty__pb2.Empty.SerializeToString,
}
response_deserializers = {
('protos.Admin', 'GetModuleLogLevel'): LogLevelResponse.FromString,
('protos.Admin', 'GetStatus'): ServerStatus.FromString,
('protos.Admin', 'SetModuleLogLevel'): LogLevelResponse.FromString,
('protos.Admin', 'StartServer'): ServerStatus.FromString,
('protos.Admin', 'StopServer'): ServerStatus.FromString,
}
cardinalities = {
'GetModuleLogLevel': cardinality.Cardinality.UNARY_UNARY,
'GetStatus': cardinality.Cardinality.UNARY_UNARY,
'SetModuleLogLevel': cardinality.Cardinality.UNARY_UNARY,
'StartServer': cardinality.Cardinality.UNARY_UNARY,
'StopServer': cardinality.Cardinality.UNARY_UNARY,
}
stub_options = beta_implementations.stub_options(host=host, metadata_transformer=metadata_transformer, request_serializers=request_serializers, response_deserializers=response_deserializers, thread_pool=pool, thread_pool_size=pool_size)
return beta_implementations.dynamic_stub(channel, 'protos.Admin', cardinalities, options=stub_options)
except ImportError:
pass
# @@protoc_insertion_point(module_scope)
| 41.77907 | 1,109 | 0.74044 | 2,018 | 17,965 | 6.364222 | 0.142716 | 0.027408 | 0.018532 | 0.021802 | 0.581017 | 0.551662 | 0.517325 | 0.469594 | 0.431441 | 0.413922 | 0 | 0.02224 | 0.151517 | 17,965 | 429 | 1,110 | 41.876457 | 0.820311 | 0.092291 | 0 | 0.494118 | 1 | 0.002941 | 0.177841 | 0.092711 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055882 | false | 0.002941 | 0.038235 | 0 | 0.111765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
e1404a753371b136c19314c274ee0f8405dd2c32 | 1,598 | py | Python | docs/example/advanced/view.py | Kozea/Pynuts | f2eb1839f59d2e8a4ec96175726186e67f85c4b0 | [
"BSD-3-Clause"
] | 1 | 2016-06-16T15:31:30.000Z | 2016-06-16T15:31:30.000Z | docs/example/advanced/view.py | Kozea/Pynuts | f2eb1839f59d2e8a4ec96175726186e67f85c4b0 | [
"BSD-3-Clause"
] | null | null | null | docs/example/advanced/view.py | Kozea/Pynuts | f2eb1839f59d2e8a4ec96175726186e67f85c4b0 | [
"BSD-3-Clause"
] | null | null | null | from wtforms import TextField, IntegerField, PasswordField
from wtforms.ext.sqlalchemy.fields import (
QuerySelectField, QuerySelectMultipleField)
from wtforms.validators import Required
from pynuts.view import BaseForm
import database
from application import nuts
class EmployeeView(nuts.ModelView):
    model = database.Employee
    list_column = 'fullname'
    table_columns = ('fullname', )
    create_columns = ('login', 'password', 'name', 'firstname', 'company')
    read_columns = ('person_id', 'name', 'firstname', 'fullname', 'company')
    update_columns = ('name', 'firstname')

    class Form(BaseForm):
        person_id = IntegerField('ID')
        login = TextField(u'Login', validators=[Required()])
        password = PasswordField(u'Password', validators=[Required()])
        name = TextField(u'Surname', validators=[Required()])
        firstname = TextField(u'Firstname', validators=[Required()])
        fullname = TextField(u'Employee name')
        company = QuerySelectField(
            u'Company', get_label='name',
            query_factory=lambda: database.Company.query, allow_blank=True)


class CompanyView(nuts.ModelView):
    model = database.Company
    list_column = 'name'
    create_columns = ('name', 'employees')
    read_columns = ('name', 'employees')

    class Form(BaseForm):
        company_id = IntegerField('Company')
        name = TextField('Company name')
        employees = QuerySelectMultipleField(
            u'Employees', get_label='fullname', query_factory=
            lambda: database.Employee.query.filter_by(company_id=None))
| 35.511111 | 76 | 0.682728 | 161 | 1,598 | 6.664596 | 0.341615 | 0.037279 | 0.033551 | 0.048462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.194618 | 1,598 | 44 | 77 | 36.318182 | 0.833722 | 0 | 0 | 0.057143 | 0 | 0 | 0.137672 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.085714 | 0.171429 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
e1414f639d12d9584079f8b303441fd98b73dfdd | 772 | py | Python | giosgappsdk/giosg_api.py | mentholi/giosgapp-python-sdk | 2a5ea25e223dc4a88a32e917dd393cc9a07f9999 | [
"MIT"
] | null | null | null | giosgappsdk/giosg_api.py | mentholi/giosgapp-python-sdk | 2a5ea25e223dc4a88a32e917dd393cc9a07f9999 | [
"MIT"
] | null | null | null | giosgappsdk/giosg_api.py | mentholi/giosgapp-python-sdk | 2a5ea25e223dc4a88a32e917dd393cc9a07f9999 | [
"MIT"
] | null | null | null | import json
import requests
class GiosgApiMixin(object):
    URL_USERS = '/api/v3/customer/personnel'
    URL_CHATS = '/api/v3/chat/chatsessions'

    def build_request_url(self, base, page_size=25, page=1):
        domain = self.data.get('sub')
        return '%s://%s%s?page_size=%s&page=%s' % (self._protocol, domain, base, page_size, page)

    def get_users(self, page=1, page_size=25):
        response = requests.get(self.build_request_url(self.URL_USERS, page_size, page), headers=self.get_auth_header())
        return json.loads(response.content)

    def get_chats(self, page=1, page_size=25):
        response = requests.get(self.build_request_url(self.URL_CHATS, page_size, page), headers=self.get_auth_header())
        return json.loads(response.content)
| 38.6 | 120 | 0.700777 | 115 | 772 | 4.495652 | 0.321739 | 0.108317 | 0.087041 | 0.110251 | 0.502901 | 0.502901 | 0.502901 | 0.502901 | 0.502901 | 0.502901 | 0 | 0.017002 | 0.161917 | 772 | 19 | 121 | 40.631579 | 0.782071 | 0 | 0 | 0.142857 | 0 | 0 | 0.108808 | 0.104922 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.142857 | 0 | 0.785714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
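For illustration, the URL construction done by `build_request_url` in the mixin above can be sketched standalone. In the mixin the protocol and domain come from `self._protocol` and the `sub` claim in `self.data`; the values used here are made up:

```python
# Standalone sketch of the mixin's build_request_url.
# 'service.giosg.com' is a hypothetical domain, not a documented endpoint.
def build_request_url(protocol, domain, base, page_size=25, page=1):
    return '%s://%s%s?page_size=%s&page=%s' % (protocol, domain, base, page_size, page)

url = build_request_url('https', 'service.giosg.com', '/api/v3/customer/personnel')
print(url)  # https://service.giosg.com/api/v3/customer/personnel?page_size=25&page=1
```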
e15f87d69b9f385338407a9fb5c01c89ecaa7425 | 2,065 | py | Python | lib/utils/timeout.py | kustodian/aerospike-admin | 931ee55ccd65ba3e20e6611a0294c92b09e8cfcb | [
"Apache-2.0"
] | null | null | null | lib/utils/timeout.py | kustodian/aerospike-admin | 931ee55ccd65ba3e20e6611a0294c92b09e8cfcb | [
"Apache-2.0"
] | null | null | null | lib/utils/timeout.py | kustodian/aerospike-admin | 931ee55ccd65ba3e20e6611a0294c92b09e8cfcb | [
"Apache-2.0"
] | null | null | null | # Copyright 2013-2018 Aerospike, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
import signal
import commands
DEFAULT_TIMEOUT = 5.0
class TimeoutException(Exception):
    """A timeout has occurred."""
    pass


class call_with_timeout:
    def __init__(self, function, timeout=DEFAULT_TIMEOUT):
        self.timeout = timeout
        self.function = function

    def handler(self, signum, frame):
        raise TimeoutException()

    def __call__(self, *args):
        # get the old SIGALRM handler
        old = signal.signal(signal.SIGALRM, self.handler)
        # set the alarm
        signal.setitimer(signal.ITIMER_REAL, self.timeout)
        try:
            result = self.function(*args)
        finally:
            # restore the existing SIGALRM handler
            signal.signal(signal.SIGALRM, old)
            signal.setitimer(signal.ITIMER_REAL, 0)
        return result


def timeout(timeout):
    """This decorator takes a timeout parameter in seconds."""
    def wrap_function(function):
        return call_with_timeout(function, timeout)
    return wrap_function


def default_timeout(function):
    """This simple decorator times out after DEFAULT_TIMEOUT seconds."""
    return call_with_timeout(function)


def getstatusoutput(command, timeout=DEFAULT_TIMEOUT):
    """This is a timeout wrapper around getstatusoutput."""
    _gso = call_with_timeout(commands.getstatusoutput, timeout)
    try:
        return _gso(command)
    except TimeoutException:
        return (-1, "The command '%s' timed out after %i seconds." % (command, timeout))
| 29.927536 | 88 | 0.701695 | 259 | 2,065 | 5.490347 | 0.455598 | 0.042194 | 0.042194 | 0.022504 | 0.084388 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009852 | 0.213559 | 2,065 | 68 | 89 | 30.367647 | 0.865764 | 0.397094 | 0 | 0.0625 | 0 | 0 | 0.036394 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.21875 | false | 0.03125 | 0.0625 | 0.03125 | 0.53125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
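The module above targets Python 2 (the `commands` module no longer exists in Python 3). A minimal Python 3 sketch of the same SIGALRM-based pattern, assuming a POSIX system and execution on the main thread (the only place `signal.signal` may install handlers):

```python
import signal
import time

class TimeoutException(Exception):
    pass

def with_timeout(function, timeout=5.0):
    """Run function under a SIGALRM-based deadline (POSIX, main thread only)."""
    def wrapper(*args):
        def handler(signum, frame):
            raise TimeoutException()
        old = signal.signal(signal.SIGALRM, handler)
        signal.setitimer(signal.ITIMER_REAL, timeout)
        try:
            return function(*args)
        finally:
            signal.signal(signal.SIGALRM, old)       # restore previous handler
            signal.setitimer(signal.ITIMER_REAL, 0)  # cancel the pending alarm
    return wrapper

fast = with_timeout(lambda: 'done', timeout=1.0)
assert fast() == 'done'

slow = with_timeout(lambda: time.sleep(1.0), timeout=0.2)
try:
    slow()
except TimeoutException:
    print('timed out as expected')
```

The `finally` block matters: without it, an alarm armed for a call that returned early could fire later inside unrelated code.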
e171d508606a36edd465712b9674cad13c99de99 | 1,554 | py | Python | jsConsole/__init__.py | Animenosekai/jsConsole | e2604988f20a0d0d93578f786ee7beaf72b9afbc | [
"MIT"
] | null | null | null | jsConsole/__init__.py | Animenosekai/jsConsole | e2604988f20a0d0d93578f786ee7beaf72b9afbc | [
"MIT"
] | null | null | null | jsConsole/__init__.py | Animenosekai/jsConsole | e2604988f20a0d0d93578f786ee7beaf72b9afbc | [
"MIT"
] | null | null | null | """
pyJsConsole wrapper.
© Anime no Sekai - 2020
"""
from .internal.javascript import classes as JSClass
console = JSClass._Console()
document = JSClass._Document()
history = JSClass._History()
Math = JSClass._Math()
navigator = JSClass._Navigator()
screen = JSClass._Screen()
window = JSClass._Window()
browser = JSClass.BrowserObject
'''
import threading
from lifeeasy import sleep
def reloadElements():
    global document
    global window
    lastURL = 'data:,'
    while True:
        sleep(0.1)
        try:
            if JSClass.evaluate('window.location.href') != lastURL:
                document = JSClass._Document()
                window = JSClass._Window()
                lastURL = JSClass.evaluate('window.location.href')
        except:
            break

thread = threading.Thread(target=reloadElements)
thread.daemon = True
thread.start()
'''
def newDocument():
    return JSClass._Document()


def newWindow():
    return JSClass._Window()


def newHistory():
    return JSClass._History()


def fresh():
    return (JSClass._Document(), JSClass._Window(), JSClass._History())


def clearInterval(intervalID):
    JSClass.clearInterval(intervalID)


def clearTimeout(timeoutID):
    JSClass.clearTimeout(timeoutID)


def evaluate(code_to_execute, return_value=False):
    return JSClass.evaluate(code_to_execute, return_value=return_value)


def setInterval(function, milliseconds):
    return JSClass.setInterval(function, milliseconds)


def setTimeout(function, milliseconds):
    return JSClass.setTimeout(function, milliseconds)
| 23.19403 | 71 | 0.70592 | 161 | 1,554 | 6.68323 | 0.397516 | 0.084572 | 0.042751 | 0.053903 | 0.120818 | 0.05948 | 0 | 0 | 0 | 0 | 0 | 0.004739 | 0.185328 | 1,554 | 66 | 72 | 23.545455 | 0.844392 | 0.028958 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.037037 | 0.259259 | 0.62963 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
e17d136893ad674b6eda0ec3efee1f8fda058d2d | 341 | py | Python | Data Analysis/csv remove other label.py | byew/python-do-differernt-csv | 094b154834ee48210c2ee4a6a529d8fe76055fb7 | [
"MIT"
] | null | null | null | Data Analysis/csv remove other label.py | byew/python-do-differernt-csv | 094b154834ee48210c2ee4a6a529d8fe76055fb7 | [
"MIT"
] | null | null | null | Data Analysis/csv remove other label.py | byew/python-do-differernt-csv | 094b154834ee48210c2ee4a6a529d8fe76055fb7 | [
"MIT"
] | null | null | null | import pandas as pd
exa = pd.read_csv('en_dup.csv')
exa.loc[exa['label'] =='F', 'label']= 0
exa.loc[exa['label'] =='T', 'label']= 1
exa.loc[exa['label'] =='U', 'label']= 2
# Skip label 2 ('U'); keep only rows with labels 0 and 1
exa0 = exa.loc[exa["label"] == 0]
exa1 = exa.loc[exa["label"] == 1]
exa = [exa0, exa1]
exa = pd.concat(exa)
exa.to_csv('train.csv', index=0)
| 17.947368 | 39 | 0.595308 | 61 | 341 | 3.278689 | 0.42623 | 0.15 | 0.225 | 0.35 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044218 | 0.13783 | 341 | 18 | 40 | 18.944444 | 0.636054 | 0.055718 | 0 | 0 | 0 | 0 | 0.193146 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.1 | 0 | 0.1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
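The same map-then-filter step can be sketched with only the standard library; the inline sample below is a hypothetical stand-in for `en_dup.csv`:

```python
import csv
import io

# Inline sample standing in for en_dup.csv (invented contents).
raw = "text,label\nclaim one,F\nclaim two,T\nclaim three,U\nclaim four,T\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Map the textual labels to integers, as the pandas script does: F->0, T->1, U->2
mapping = {'F': 0, 'T': 1, 'U': 2}
for row in rows:
    row['label'] = mapping[row['label']]

# Keep only labels 0 and 1, dropping the 'U' (label 2) rows
train = [row for row in rows if row['label'] in (0, 1)]
print(len(train))  # 3
```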
e18273cb48126cd36d2e98bfcd448716a51f67d4 | 396 | py | Python | axicli.py | notpeter/AxiDraw_API | d9c35eb93fd85f96cf197908415822af9a725b41 | [
"MIT"
] | null | null | null | axicli.py | notpeter/AxiDraw_API | d9c35eb93fd85f96cf197908415822af9a725b41 | [
"MIT"
] | 3 | 2021-01-17T04:31:57.000Z | 2021-01-17T04:36:41.000Z | axicli.py | notpeter/AxiDraw_API | d9c35eb93fd85f96cf197908415822af9a725b41 | [
"MIT"
] | null | null | null | '''
axicli.py - Command line interface (CLI) for AxiDraw.
For quick help:
python axicli.py --help
Full user guide:
https://axidraw.com/doc/cli_api/
This script is a stand-alone version of AxiDraw Control, accepting
various options and providing a facility for setting default values.
'''
from axicli.axidraw_cli import axidraw_CLI
if __name__ == '__main__':
    axidraw_CLI()
| 19.8 | 68 | 0.729798 | 57 | 396 | 4.859649 | 0.719298 | 0.108303 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184343 | 396 | 19 | 69 | 20.842105 | 0.857585 | 0.747475 | 0 | 0 | 0 | 0 | 0.087912 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
e18d760e51cdf1f8ab9695881861681dcd4595c4 | 214 | py | Python | silver_bullet/contain_value.py | Hojung-Jeong/Silver-Bullet-Encryption-Tool | 5ea29b3cd78cf7488e0cbdcf4ea60d7c9151c2a7 | [
"Apache-2.0"
] | null | null | null | silver_bullet/contain_value.py | Hojung-Jeong/Silver-Bullet-Encryption-Tool | 5ea29b3cd78cf7488e0cbdcf4ea60d7c9151c2a7 | [
"Apache-2.0"
] | null | null | null | silver_bullet/contain_value.py | Hojung-Jeong/Silver-Bullet-Encryption-Tool | 5ea29b3cd78cf7488e0cbdcf4ea60d7c9151c2a7 | [
"Apache-2.0"
] | null | null | null | '''
>List of functions
1. contain(value, limit) - wraps a value into the range 0 (inclusive) to limit (exclusive)
'''
def contain(value,limit):
if value<0:
return value+limit
elif value>=limit:
return value-limit
else:
return value | 16.461538 | 62 | 0.705607 | 33 | 214 | 4.575758 | 0.515152 | 0.331126 | 0.225166 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.017143 | 0.182243 | 214 | 13 | 63 | 16.461538 | 0.845714 | 0.439252 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
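A quick usage sketch of `contain`, restated here so it runs standalone. Note the wrap is applied at most once, so inputs are expected to lie within (-limit, 2*limit):

```python
def contain(value, limit):
    if value < 0:
        return value + limit
    elif value >= limit:
        return value - limit
    else:
        return value

# Wrapping an hour-of-day counter into [0, 24)
assert contain(25, 24) == 1
assert contain(-3, 24) == 21
assert contain(10, 24) == 10
```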
e195971c01d6f8dcda846bd7ff1f32bb1f7099e8 | 4,248 | py | Python | src/RosGazeboLibrary/Gazebo.py | hielsnoppe/robotframework-rosgazebolibrary | a91d48413d4af95856964644b149898b538c6724 | [
"Apache-2.0"
] | null | null | null | src/RosGazeboLibrary/Gazebo.py | hielsnoppe/robotframework-rosgazebolibrary | a91d48413d4af95856964644b149898b538c6724 | [
"Apache-2.0"
] | null | null | null | src/RosGazeboLibrary/Gazebo.py | hielsnoppe/robotframework-rosgazebolibrary | a91d48413d4af95856964644b149898b538c6724 | [
"Apache-2.0"
] | null | null | null | from robot.api.deco import keyword
from robot.libraries.BuiltIn import BuiltIn
class Gazebo(object):
"""Robot Framework test library for the Gazebo simulator
See also http://gazebosim.org/tutorials/?tut=ros_comm
== Table of contents ==
%TOC%
"""
ROBOT_LIBRARY_SCOPE = 'SUITE'
def __init__(self):
self.ros_lib = BuiltIn().get_library_instance('RosGazeboLibrary.ROS')
# Create and destroy models in simulation
# http://gazebosim.org/tutorials/?tut=ros_comm#Services:Createanddestroymodelsinsimulation
@keyword
def spawn_urdf_model(self, urdf_path: str, position: tuple, model_name: str):
''' TODO: Refactor to use service call '''
return self.ros_lib.rosrun('gazebo_ros', 'spawn_model', *[
'-file', urdf_path,
'-urdf',
'-model', model_name,
'-x', position[0],
'-y', position[1],
'-z', position[2],
])
@keyword
def spawn_sdf_model(self, sdf_path: str, position: tuple, model_name: str):
''' TODO: Refactor to use service call '''
return self.ros_lib.rosrun('gazebo_ros', 'spawn_model', *[
'-file', sdf_path,
'-sdf',
'-model', model_name,
'-x', position[0],
'-y', position[1],
'-z', position[2],
])
@keyword
def delete_model(self, model_name: str):
''' Delete a model from simulation
http://gazebosim.org/tutorials/?tut=ros_comm#DeleteModel
'''
return self.ros_lib.rosservice_call(
'gazebo/delete_model', 'gazebo_msgs/DeleteModel',
{ 'model_name': model_name }
)
# State and property setters
# http://gazebosim.org/tutorials/?tut=ros_comm#Services:Stateandpropertysetters
''' TODO
def set_link_properties(self, ...):
def set_physics_properties(self, ...):
def set_model_state(self, ...):
def set_model_configuration(self, ...):
def set_joint_properties(self, ...):
def set_link_state(self, ...):
'''
# State and property getters
# http://gazebosim.org/tutorials/?tut=ros_comm#Services:Stateandpropertygetters
@keyword
def get_model_properties(self, model_name: str):
return self.ros_lib.rosservice_call(
'gazebo/get_model_properties', 'gazebo_msgs/GetModelProperties',
{ 'model_name': model_name }
)
@keyword
def get_model_state(self, model_name: str):
return self.ros_lib.rosservice_call(
'gazebo/get_model_state', 'gazebo_msgs/GetModelState',
{ 'model_name': model_name }
)
''' TODO
def get_world_properties(self, ...):
def get_joint_properties(self, ...):
def get_link_properties(self, ...):
def get_link_state(self, ...):
def get_physics_properties(self, ...):
def link_states(self, ...): # investigate
def model_states(self, ...): # investigate
'''
# Force control
# http://gazebosim.org/tutorials/?tut=ros_comm#Services:Forcecontrol
''' TODO
/gazebo/apply_body_wrench
/gazebo/apply_joint_effort
/gazebo/clear_body_wrenches
/gazebo/clear_joint_forces
'''
# Simulation control
# http://gazebosim.org/tutorials/?tut=ros_comm#Services:Simulationcontrol
@keyword
def reset_simulation(self):
return self.ros_lib.rosservice_call('/gazebo/reset_simulation')
@keyword
def reset_world(self):
return self.ros_lib.rosservice_call('/gazebo/reset_world')
@keyword
def pause_physics(self):
return self.ros_lib.rosservice_call('/gazebo/pause_physics')
@keyword
def unpause_physics(self):
return self.ros_lib.rosservice_call('/gazebo/unpause_physics')
# Undocumented services
# Found via `rosservice list`
'''
/gazebo/delete_light
/gazebo/get_light_properties
/gazebo/get_loggers
/gazebo/set_light_properties
/gazebo/set_logger_level
/gazebo/set_parameters
/gazebo_gui/get_loggers
/gazebo_gui/set_logger_level
'''
# Convenience keywords
@keyword
def launch_empty_world(self):
return self.ros_lib.roslaunch('gazebo_ros', 'empty_world.launch') | 29.296552 | 94 | 0.638889 | 488 | 4,248 | 5.297131 | 0.25 | 0.045261 | 0.042553 | 0.061896 | 0.408124 | 0.389555 | 0.37795 | 0.350484 | 0.279304 | 0.169439 | 0 | 0.001852 | 0.237524 | 4,248 | 145 | 95 | 29.296552 | 0.796233 | 0.206686 | 0 | 0.474576 | 0 | 0 | 0.170784 | 0.083049 | 0 | 0 | 0 | 0.027586 | 0 | 1 | 0.186441 | false | 0 | 0.033898 | 0.118644 | 0.423729 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
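Because every keyword in the class above simply delegates to the shared ROS library instance, the delegation can be exercised without a running ROS/Gazebo stack by injecting a recording stub. A minimal sketch — `FakeRosLib`, `GazeboSketch`, and the returned success dict are all invented here, not part of the library:

```python
class FakeRosLib:
    """Stand-in for the RosGazeboLibrary.ROS instance; records calls."""
    def __init__(self):
        self.calls = []

    def rosservice_call(self, *args):
        self.calls.append(args)
        return {'success': True}

class GazeboSketch:
    """Minimal sketch of the keyword class, with the stub injected."""
    def __init__(self, ros_lib):
        self.ros_lib = ros_lib

    def delete_model(self, model_name):
        return self.ros_lib.rosservice_call(
            'gazebo/delete_model', 'gazebo_msgs/DeleteModel',
            {'model_name': model_name})

ros = FakeRosLib()
gazebo = GazeboSketch(ros)
assert gazebo.delete_model('my_robot') == {'success': True}
assert ros.calls[0][0] == 'gazebo/delete_model'
```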
e196eb274e00b4e5d8027a1161feb36eab5a1ff6 | 1,931 | py | Python | src/MainAPP/migrations/0030_auto_20181211_1246.py | mizamae/HomeAutomation | 8c462ee4c31c1fea6792cb19af66a4d2cf7bb2ca | [
"MIT"
] | null | null | null | src/MainAPP/migrations/0030_auto_20181211_1246.py | mizamae/HomeAutomation | 8c462ee4c31c1fea6792cb19af66a4d2cf7bb2ca | [
"MIT"
] | 9 | 2017-11-21T15:45:18.000Z | 2022-02-11T03:37:54.000Z | src/MainAPP/migrations/0030_auto_20181211_1246.py | mizamae/HomeAutomation | 8c462ee4c31c1fea6792cb19af66a4d2cf7bb2ca | [
"MIT"
] | 1 | 2020-07-22T02:24:17.000Z | 2020-07-22T02:24:17.000Z | # -*- coding: utf-8 -*-
# Generated by Django 1.11.4 on 2018-12-11 11:46
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('MainAPP', '0029_auto_20181211_1237'),
    ]

    operations = [
        migrations.AlterField(
            model_name='sitesettings',
            name='ESIOS_TOKEN',
            field=models.CharField(blank=True, default='', help_text='The token assigned by the ESIOS service. You should ask for yours to: Consultas Sios <consultasios@ree.es>', max_length=50, verbose_name='Token for the ESIOS page'),
        ),
        migrations.AlterField(
            model_name='sitesettings',
            name='IBERDROLA_PASSW',
            field=models.CharField(blank=True, default='', help_text='Password registered on the Iberdrola Distribucion webpage', max_length=50, verbose_name='Iberdrola password'),
        ),
        migrations.AlterField(
            model_name='sitesettings',
            name='IBERDROLA_USER',
            field=models.CharField(blank=True, default='', help_text='Username registered into the Iberdrola Distribucion webpage', max_length=50, verbose_name='Iberdrola username'),
        ),
        migrations.AlterField(
            model_name='sitesettings',
            name='OWM_TOKEN',
            field=models.CharField(blank=True, default='', help_text='The token assigned by the OpenWeatherMap service. You should ask yours following https://openweathermap.org/appid', max_length=50, verbose_name='Token for the openweathermap page'),
        ),
        migrations.AlterField(
            model_name='sitesettings',
            name='TELEGRAM_TOKEN',
            field=models.CharField(blank=True, default='', help_text='The token assigned by the BotFather', max_length=50, verbose_name='Token for the telegram bot'),
        ),
    ]
| 47.097561 | 252 | 0.651476 | 215 | 1,931 | 5.697674 | 0.372093 | 0.081633 | 0.102041 | 0.118367 | 0.630204 | 0.630204 | 0.556735 | 0.425306 | 0.272653 | 0.272653 | 0 | 0.029352 | 0.241326 | 1,931 | 40 | 253 | 48.275 | 0.806826 | 0.035215 | 0 | 0.454545 | 1 | 0.030303 | 0.353297 | 0.024176 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.060606 | 0.060606 | 0 | 0.151515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
# src/evaluators/sample_evaluators/swd_sample_evaluator.py (gmum/cwae-pytorch, MIT)
import torch
from metrics.swd import sliced_wasserstein_distance
from evaluators.sample_evaluators.base_sample_evaluator import BaseSampleEvaluator
from noise_creator import NoiseCreator
class SWDSampleEvaluator(BaseSampleEvaluator):
    def __init__(self, noise_creator: NoiseCreator):
        self.__noise_creator = noise_creator

    def evaluate(self, sample: torch.Tensor) -> torch.Tensor:
        comparison_sample = self.__noise_creator.create(sample.size(0)).type_as(sample)
        swd_penalty_value = sliced_wasserstein_distance(sample, comparison_sample, 50)
        return swd_penalty_value
# utils/migrations/0002_alter_electricitybilling_unit_price_and_more.py (shumwe/rental-house-management-system, MIT)
# Generated by Django 4.0.3 on 2022-04-02 17:24
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('utils', '0001_initial'),
    ]

    operations = [
        migrations.AlterField(
            model_name='electricitybilling',
            name='unit_price',
            field=models.DecimalField(decimal_places=2, default=24.18, max_digits=9),
        ),
        migrations.AlterField(
            model_name='mpesaonline',
            name='update_status',
            field=models.CharField(choices=[('recieved', 'Received'), ('updated', 'Updated')], default='recieved', max_length=10),
        ),
        migrations.AlterField(
            model_name='waterbilling',
            name='unit_price',
            field=models.DecimalField(decimal_places=2, default=53.0, max_digits=9, verbose_name='Unit Price (KES)'),
        ),
    ]
# osx/GUI/Configuration.py (WeAreAVP/mdqc, Apache-2.0)
# -*- coding: UTF-8 -*-
'''
Created on May 14, 2014

@author: Furqan Wasi <furqan@avpreserve.com>
'''
import os, datetime, sys, platform, base64


class Configuration(object):
    def __init__(self):
        # Detect the operating system first.
        if os.name == 'posix':
            self.OsType = 'linux'
        elif os.name == 'nt':
            self.OsType = 'Windows'
        elif os.name == 'os2':
            self.OsType = 'check'

        self.application_name = 'Metadata Quality Control'
        self.application_version = '0.4'
        self.user_home_path = os.path.expanduser('~')

        if self.OsType == 'Windows':
            self.base_path = str(os.getcwd()) + str(os.sep)
            self.assets_path = r'' + (os.path.join(self.base_path, 'assets' + str(os.sep)))
            try:
                # sys._MEIPASS only exists inside a PyInstaller bundle.
                self.avpreserve_img = os.path.join(sys._MEIPASS, 'assets' + str(os.sep) + 'avpreserve.png')
            except AttributeError:
                pass
        else:
            self.base_path = str(os.getcwd()) + str(os.sep)
            self.assets_path = r'' + (os.path.join(self.base_path, 'assets' + str(os.sep)))
            self.avpreserve_img = r'' + (os.path.join(self.assets_path) + 'avpreserve.png')

        self.logo_sign_small = 'logo_sign_small.png'

    def getImagesPath(self):
        return str(self.assets_path)

    def getAvpreserve_img(self):
        return self.avpreserve_img

    def getBasePath(self):
        return str(self.base_path)

    def getApplicationVersion(self):
        return str(self.application_version)

    def getConfig_file_path(self):
        # Note: self.config_file_path is expected to be set elsewhere.
        return self.config_file_path

    def EncodeInfo(self, string_to_be_encoded):
        # Obfuscates a value with two rounds of base16 encoding.
        # (Python 2-era call: on Python 3, b16encode requires bytes.)
        string_to_be_encoded = str(string_to_be_encoded).strip()
        return base64.b16encode(base64.b16encode(string_to_be_encoded))

    def getLogoSignSmall(self):
        if self.getOsType() == 'Windows':
            try:
                return os.path.join(sys._MEIPASS, 'assets' + str(os.sep) + str(self.logo_sign_small))
            except AttributeError:
                pass
        return os.path.join(self.assets_path, str(self.logo_sign_small))

    def getOsType(self):
        return str(self.OsType)

    def getApplicationName(self):
        return str(self.application_name)

    def getUserHomePath(self):
        return str(os.path.expanduser('~'))

    def getDebugFilePath(self):
        # Note: self.log_file_path is expected to be set elsewhere.
        return str(self.log_file_path)

    def getWindowsInformation(self):
        """
        Gets detailed information about the running Windows installation.

        @return: dict Windows information
        """
        WindowsInformation = {}
        try:
            major, minor, build, platformType, servicePack = sys.getwindowsversion()
            WindowsInformation['major'] = major
            WindowsInformation['minor'] = minor
            WindowsInformation['build'] = build
            WindowsInformation['platformType'] = platformType
            WindowsInformation['servicePack'] = servicePack

            windowDetailedName = platform.platform()
            WindowsInformation['platform'] = windowDetailedName
            windowDetailedName = str(windowDetailedName).split('-')
            if windowDetailedName[0] is not None and (str(windowDetailedName[0]) == 'Windows' or str(windowDetailedName[0]) == 'windows'):
                WindowsInformation['isWindows'] = True
            else:
                WindowsInformation['isWindows'] = False

            if windowDetailedName[1] is not None and (str(windowDetailedName[1]) != ''):
                WindowsInformation['WindowsType'] = str(windowDetailedName[1])
            else:
                WindowsInformation['WindowsType'] = None

            WindowsInformation['ProcessorInfo'] = platform.processor()
            try:
                # The 32-bit Program Files variable only exists on 64-bit Windows.
                os.environ["PROGRAMFILES(X86)"]
                bits = 64
            except KeyError:
                bits = 32
            WindowsInformation['bitType'] = "Win{0}".format(bits)
        except Exception:
            pass
        return WindowsInformation

    def CleanStringForBreaks(self, StringToBeCleaned):
        """
        Removes line breaks from a string.

        @param StringToBeCleaned: string to clean
        @return: cleaned string
        """
        CleanString = StringToBeCleaned.strip()
        try:
            CleanString = CleanString.replace('\r\n', '')
            CleanString = CleanString.replace('\n', '')
            CleanString = CleanString.replace('\r', '')
        except Exception:
            pass
        return CleanString
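`Configuration.EncodeInfo` above obfuscates a credential with two rounds of base16 encoding. A standalone sketch of that round trip (the helper names `encode_info`/`decode_info` are illustrative, not part of mdqc; note that on Python 3 `base64.b16encode` only accepts bytes, so the value is encoded to ASCII first):

```python
import base64


def encode_info(value):
    # Two rounds of base16, mirroring Configuration.EncodeInfo; the
    # .encode('ascii') step is required on Python 3.
    raw = str(value).strip().encode('ascii')
    return base64.b16encode(base64.b16encode(raw))


def decode_info(encoded):
    # Reverse both rounds to recover the original string.
    return base64.b16decode(base64.b16decode(encoded)).decode('ascii')


token = encode_info('my-esios-token')
print(token)               # doubly hex-encoded bytes
print(decode_info(token))  # my-esios-token
```

This is obfuscation, not encryption: anyone who knows the scheme can decode the value, so it only guards against casual reading of the config file.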
# app/urlshortener.py (felixbade/minimal-url-shortener, MIT)
class URLShortener:
    def __init__(self):
        self.id_counter = 0
        self.links = {}

    def getURL(self, short_id):
        return self.links.get(short_id)

    def shorten(self, url):
        short_id = self.getNextId()
        self.links.update({short_id: url})
        return short_id

    def getNextId(self):
        self.id_counter += 1
        # Id got from a URL is type str anyway so it is easiest to just use
        # type str everywhere after this point.
        return str(self.id_counter)
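A self-contained usage sketch of the class above (the class body is repeated here so the snippet runs on its own; ids are handed out sequentially as strings):

```python
class URLShortener:
    def __init__(self):
        self.id_counter = 0
        self.links = {}

    def getURL(self, short_id):
        # Returns None for unknown ids, thanks to dict.get.
        return self.links.get(short_id)

    def shorten(self, url):
        short_id = self.getNextId()
        self.links.update({short_id: url})
        return short_id

    def getNextId(self):
        # Ids are sequential and stored as strings.
        self.id_counter += 1
        return str(self.id_counter)


s = URLShortener()
sid = s.shorten('https://example.com')
print(sid, s.getURL(sid))  # 1 https://example.com
```

Note that the mapping lives only in memory, so all short links are lost on restart; a persistent store would be needed for real use.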
# homeworks/vecutil.py (JediKoder/coursera-CodeMatrix, MIT)
# Copyright 2013 Philip N. Klein
from vec import Vec
def list2vec(L):
    """Given a list L of field elements, return a Vec with domain {0...len(L)-1}
    whose entry i is L[i]

    >>> list2vec([10, 20, 30])
    Vec({0, 1, 2},{0: 10, 1: 20, 2: 30})
    """
    return Vec(set(range(len(L))), {k: L[k] for k in range(len(L))})


def zero_vec(D):
    """Returns a zero vector with the given domain
    """
    return Vec(D, {})
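The `Vec` constructor above takes a domain set and an index-to-value dict. A stand-in sketch (no `Vec` class required; `list2vec_parts` is an illustrative name, not part of the course module) showing exactly the two arguments that `list2vec` builds:

```python
def list2vec_parts(L):
    # The same domain set and {index: value} function that list2vec
    # passes to the Vec constructor.
    domain = set(range(len(L)))
    func = {k: L[k] for k in range(len(L))}
    return domain, func


domain, func = list2vec_parts([10, 20, 30])
print(domain)  # {0, 1, 2}
print(func)    # {0: 10, 1: 20, 2: 30}
```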
# users/forms.py (kurosh-wss/Personal-Finance-Management, MIT)
from django import forms
from django.contrib.auth.forms import UserCreationForm
from crispy_bootstrap5.bootstrap5 import FloatingField
from crispy_forms.layout import Layout
from crispy_forms.helper import FormHelper
class CustomUserCreationForm(UserCreationForm):
    email = forms.EmailField()

    class Meta(UserCreationForm.Meta):
        fields = UserCreationForm.Meta.fields + ("email",)
# Extra kunskap/Kod/Farmen.py (abbindustrigymnasium/Programmering-1-Slutuppgift, Apache-2.0)
import openpyxl
wb = openpyxl.load_workbook('Farmen.xlsx')

# sheet = wb.active
# print(wb.get_sheet_names())
# Deltagar_sheet = wb.get_sheet_by_name('Deltagare')
# artists = [{"Namn": sheet.cell(row=i, column=2).value,
#             "Sång": sheet.cell(row=i, column=3).value,
#             "Poäng": sheet.cell(row=i, column=6).value,
#             "Röst": sheet.cell(row=i, column=5).value
#             } for i in range(2, sheet.max_row)]
# print(artists)
# plot_loss.py (ngachago/tabular_comp, MIT)
import matplotlib.pyplot as plt
def plot_loss_mae(history):
    plt.plot(history.history['loss'])
    plt.plot(history.history['val_loss'])
    plt.title('Model loss')
    plt.ylabel('Loss')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Validation'], loc='best')
    plt.show()

    plt.plot(history.history['mae'])
    plt.plot(history.history['val_mae'])
    plt.title('Model MAE')
    plt.ylabel('MAE')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Validation'], loc='best')
    plt.show()


def plot_loss_accuracy(history):
    plt.plot(history.history['loss'])
    plt.plot(history.history['val_loss'])
    plt.title('Model loss')
    plt.ylabel('Loss')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Validation'], loc='best')
    plt.show()

    plt.plot(history.history['accuracy'])
    plt.plot(history.history['val_accuracy'])
    plt.title('Model Accuracy')
    plt.ylabel('Accuracy')
    plt.xlabel('Epoch')
    plt.legend(['Train', 'Validation'], loc='best')
    plt.show()
# __determineTripplesSumToZeroFromList.py (simdevex/01.Basics, MIT)
'''
Python program to determine which triples sum to zero from a given list of lists.
Input: [[1343532, -2920635, 332], [-27, 18, 9], [4, 0, -4], [2, 2, 2], [-20, 16, 4]]
Output:
[False, True, True, False, True]
Input: [[1, 2, -3], [-4, 0, 4], [0, 1, -5], [1, 1, 1], [-2, 4, -1]]
Output:
[True, True, False, False, False]
'''
# License: https://bit.ly/3oLErEI
def test(nums):
    return [sum(t) == 0 for t in nums]
nums = [[1343532, -2920635, 332], [-27, 18, 9], [4, 0, -4], [2, 2, 2], [-20, 16, 4]]
print("Original list of lists:",nums)
print("Determine which triples sum to zero:")
print(test(nums))
nums = [[1, 2, -3], [-4, 0, 4], [0, 1, -5], [1, 1, 1], [-2, 4, -1]]
print("\nOriginal list of lists:",nums)
print("Determine which triples sum to zero:")
print(test(nums))
# fastNLP/modules/encoder/lstm.py (h00Jiang/fastNLP, Apache-2.0)
import torch.nn as nn
class Lstm(nn.Module):
    """
    LSTM module

    Args:
        input_size : input size
        hidden_size : hidden size
        num_layers : number of hidden layers. Default: 1
        dropout : dropout rate. Default: 0
        bidirectional : If True, becomes a bidirectional RNN. Default: False.
    """

    def __init__(self, input_size, hidden_size=100, num_layers=1, dropout=0, bidirectional=False):
        super(Lstm, self).__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers, bias=True, batch_first=True,
                            dropout=dropout, bidirectional=bidirectional)

    def forward(self, x):
        x, _ = self.lstm(x)
        return x
# geoLocApp/signals.py (KKWaxy/geoLoc, Apache-2.0)
from django.db.models.signals import pre_save, post_save
from django.dispatch import receiver
import geoLocApp.models
import geoLocApp.distance
# @receiver(post_save, sender=geoLocApp.models.Position, dispatch_uid="only_before_registered")
# def setDistance(sender, **kwargs):
#     position = kwargs["instance"]
#     coordonnees = position.coordonnees.all()
#     print(coordonnees)
#     for coordonnee in coordonnees:
#         coordonnee.distance = geoLocApp.distance.distance(coordonnee.latitude, position.latitude, coordonnee.longitude, position.longitude)
#         print(coordonnee.distance)


# @receiver(post_save, sender=geoLocApp.models.Position, dispatch_uid="new_position_added")
# def new_position(sender, **kwargs):
#     if kwargs['created'] == True:
#         return ['intance']
#     else:
#         return 0