hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
a46a117f128d637051d582f6f7515bc9e1f902a2 | 1,688 | py | Python | bc211/import_open_referral_csv/inactive_records_collector.py | pg-irc/pathways-backend | 05a8c4e750523d2d081b030a248c5444d1ed7992 | [
"BSD-3-Clause"
] | 12 | 2017-08-30T18:21:00.000Z | 2021-12-09T04:04:17.000Z | bc211/import_open_referral_csv/inactive_records_collector.py | pg-irc/pathways-backend | 05a8c4e750523d2d081b030a248c5444d1ed7992 | [
"BSD-3-Clause"
] | 424 | 2017-08-08T18:32:14.000Z | 2022-03-30T21:42:51.000Z | bc211/import_open_referral_csv/inactive_records_collector.py | pg-irc/pathways-backend | 05a8c4e750523d2d081b030a248c5444d1ed7992 | [
"BSD-3-Clause"
] | 7 | 2017-09-29T21:14:37.000Z | 2019-12-30T21:07:37.000Z | from bc211.is_inactive import is_inactive
class InactiveRecordsCollector:
    def __init__(self):
        self.inactive_organizations_ids = []
        self.inactive_services_ids = []
        self.inactive_locations_ids = []

    def add_inactive_organization_id(self, organization_id):
        self.inactive_organizations_ids.append(organization_id)

    def add_inactive_service_id(self, service_id):
        self.inactive_services_ids.append(service_id)

    def add_inactive_location_id(self, location_id):
        self.inactive_locations_ids.append(location_id)

    def organization_has_inactive_data(self, organization_id, description):
        if is_inactive(description):
            self.add_inactive_organization_id(organization_id)
            return True
        return False

    def service_has_inactive_data(self, organization_id, service_id, description):
        if is_inactive(description) or self.has_inactive_organization_id(organization_id):
            self.add_inactive_service_id(service_id)
            return True
        return False

    def location_has_inactive_data(self, organization_id, location_id, description):
        if is_inactive(description) or self.has_inactive_organization_id(organization_id):
            self.add_inactive_location_id(location_id)
            return True
        return False

    def has_inactive_organization_id(self, organization_id):
        return organization_id in self.inactive_organizations_ids

    def has_inactive_service_id(self, service_id):
        return service_id in self.inactive_services_ids

    def has_inactive_location_id(self, location_id):
        return location_id in self.inactive_locations_ids
| 37.511111 | 90 | 0.745853 | 209 | 1,688 | 5.588517 | 0.138756 | 0.179795 | 0.094178 | 0.071918 | 0.548801 | 0.519692 | 0.164384 | 0.164384 | 0.164384 | 0.164384 | 0 | 0.002212 | 0.196682 | 1,688 | 44 | 91 | 38.363636 | 0.859145 | 0 | 0 | 0.242424 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.30303 | false | 0 | 0.030303 | 0.090909 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
a4886cf13df1e14ff908460445c6c266aa67e98b | 57 | py | Python | demo/setting.py | cyril-pierro/Demo_logger_and_analysis | 9d6205f09561ba7706d3af8f70f0a880140b7342 | [
"MIT"
] | null | null | null | demo/setting.py | cyril-pierro/Demo_logger_and_analysis | 9d6205f09561ba7706d3af8f70f0a880140b7342 | [
"MIT"
] | null | null | null | demo/setting.py | cyril-pierro/Demo_logger_and_analysis | 9d6205f09561ba7706d3af8f70f0a880140b7342 | [
"MIT"
] | null | null | null | """
Setting for project
"""
logfile = "demo_project.log"
| 11.4 | 28 | 0.684211 | 7 | 57 | 5.428571 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140351 | 57 | 4 | 29 | 14.25 | 0.77551 | 0.333333 | 0 | 0 | 0 | 0 | 0.533333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a490e563029fc8f47bc4f6326f16f09f99fb52cb | 1,664 | py | Python | tests/test_config.py | shmir/PyIxNetwork | d32d516bac15d2c5bb941927fe72bd598ff029e2 | [
"Apache-2.0"
] | 2 | 2017-05-24T07:39:01.000Z | 2021-07-27T06:07:46.000Z | tests/test_config.py | shmir/IxNetwork | d32d516bac15d2c5bb941927fe72bd598ff029e2 | [
"Apache-2.0"
] | 4 | 2018-05-17T20:55:01.000Z | 2021-03-04T16:53:48.000Z | tests/test_config.py | shmir/IxNetwork | d32d516bac15d2c5bb941927fe72bd598ff029e2 | [
"Apache-2.0"
] | 5 | 2017-05-22T17:16:46.000Z | 2020-11-26T01:02:59.000Z |
chassis_900 = '192.168.65.36'
chassis_910 = '192.168.65.21'
linux_900 = '192.168.65.34:443'
linux_910 = '192.168.65.23:443'
windows_900 = 'localhost:11009'
windows_910 = 'localhost:11009'
cm_900 = '172.40.0.204:443'
server_properties = {
    'linux_900': {'server': linux_900,
                  'locations': [f'{chassis_900}/1/1', f'{chassis_900}/1/2'],
                  'auth': ('admin', 'admin')},
    'linux_910': {'server': linux_910,
                  'locations': [f'{chassis_910}/1/1', f'{chassis_910}/1/2'],
                  'auth': ('admin', 'admin')},
    'windows_900': {'server': windows_900,
                    'locations': [f'{chassis_900}/1/1', f'{chassis_900}/1/2'],
                    'auth': None,
                    'install_dir': 'C:/Program Files (x86)/Ixia/IxNetwork/9.00.1915.16'},
    'windows_910': {'server': windows_910,
                    'locations': [f'{chassis_910}/1/1', f'{chassis_910}/1/2'],
                    'auth': None,
                    'install_dir': 'C:/Program Files (x86)/Ixia/IxNetwork/9.10.2007.7'},
    'cm_900': {'server': cm_900,
               'locations': [f'{chassis_900}/1/1', f'{chassis_900}/1/2'],
               'auth': None,
               'install_dir': 'C:/Program Files (x86)/Ixia/IxNetwork/9.00.1915.16'}}
license_servers = ['192.168.42.61']
# Default for options.
api = ['rest']
server = ['linux_910']
| 44.972973 | 106 | 0.444712 | 185 | 1,664 | 3.821622 | 0.286486 | 0.113154 | 0.093352 | 0.101839 | 0.534653 | 0.506365 | 0.506365 | 0.506365 | 0.506365 | 0.506365 | 0 | 0.20878 | 0.384014 | 1,664 | 36 | 107 | 46.222222 | 0.480976 | 0.012019 | 0 | 0.357143 | 0 | 0 | 0.393053 | 0.05972 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a496b62ed23a944d3d13a82ee54cea965125b724 | 896 | py | Python | release/ci.py | romatroskin/mapbox-plugins-android | 0a4d416b7ba95ea7fabbdf04c3fd8f396fe077e2 | [
"BSD-2-Clause"
] | null | null | null | release/ci.py | romatroskin/mapbox-plugins-android | 0a4d416b7ba95ea7fabbdf04c3fd8f396fe077e2 | [
"BSD-2-Clause"
] | null | null | null | release/ci.py | romatroskin/mapbox-plugins-android | 0a4d416b7ba95ea7fabbdf04c3fd8f396fe077e2 | [
"BSD-2-Clause"
] | null | null | null | #
# CircleCI
#
from utils import abort_with_message
import constants
import json
import os
import requests
import sys
def get_circleci_api_token():
    circleci_api_token = os.environ.get(constants.CIRCLECI_API_TOKEN_ENV_VAR)
    if not circleci_api_token:
        abort_with_message('You need to set the CIRCLECI_API_TOKEN environment variable.')
    print('Found CircleCI API token.')
    return circleci_api_token


def do_circleci_request(branch):
    url = constants.URL_CIRCLECI + branch
    params = {'circle-token': get_circleci_api_token()}
    print('CircleCI request to %s (params: %s)' % (url, json.dumps(params)))

    continue_release = input("\nDo you want to start a build? ").lower()
    if not continue_release.startswith('y'):
        print('Aborting release')
        sys.exit()

    r = requests.post(url, params)
    print('- CircleCI response code: %s' % r.status_code)
| 27.151515 | 90 | 0.722098 | 125 | 896 | 4.944 | 0.464 | 0.142395 | 0.20712 | 0.061489 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.185268 | 896 | 32 | 91 | 28 | 0.846575 | 0.008929 | 0 | 0 | 0 | 0 | 0.236425 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.272727 | null | null | 0.181818 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a4a38762d9a0ceb8f8ec8d1af8d906705ea7490a | 5,669 | py | Python | cs15211/PathSumII.py | JulyKikuAkita/PythonPrac | 0ba027d9b8bc7c80bc89ce2da3543ce7a49a403c | [
"Apache-2.0"
] | 1 | 2021-07-05T01:53:30.000Z | 2021-07-05T01:53:30.000Z | cs15211/PathSumII.py | JulyKikuAkita/PythonPrac | 0ba027d9b8bc7c80bc89ce2da3543ce7a49a403c | [
"Apache-2.0"
] | null | null | null | cs15211/PathSumII.py | JulyKikuAkita/PythonPrac | 0ba027d9b8bc7c80bc89ce2da3543ce7a49a403c | [
"Apache-2.0"
] | 1 | 2018-01-08T07:14:08.000Z | 2018-01-08T07:14:08.000Z | __source__ = 'https://leetcode.com/problems/path-sum-ii/'
# https://github.com/kamyu104/LeetCode/blob/master/Python/path-sum-ii.py
# Time: O(n)
# Space: O(h), h is height of binary tree
# DFS
#
# Description: Leetcode # 113. Path Sum II
#
# Given a binary tree and a sum, find all root-to-leaf paths where each path's sum equals the given sum.
#
# For example:
# Given the below binary tree and sum = 22,
#       5
#      / \
#     4   8
#    /   / \
#   11  13  4
#  /  \    / \
# 7    2  5   1
# return
# [
# [5,4,11,2],
# [5,8,4,5]
# ]
#
#
# Companies
# Bloomberg
# Related Topics
# Tree Depth-first Search
# Similar Questions
# Path Sum Binary Tree Paths Path Sum III
#
import unittest
# Definition for a binary tree node
class TreeNode:
    def __init__(self, x):
        self.val = x
        self.left = None
        self.right = None

class Solution:
    # @param root, a tree node
    # @param sum, an integer
    # @return a list of lists of integers
    def pathSum(self, root, sum):
        return self.pathSumRecu([], [], root, sum)

    def pathSumRecu(self, result, cur, root, sum):
        if root is None:
            return result
        if root.left is None and root.right is None and root.val == sum:
            result.append(cur + [root.val])
            return result
        cur.append(root.val)
        self.pathSumRecu(result, cur, root.left, sum - root.val)
        self.pathSumRecu(result, cur, root.right, sum - root.val)
        cur.pop()
        return result

class Solution2:
    # @param root, a tree node
    # @param sum, an integer
    # @return a list of lists of integers
    def pathSum(self, root, sum):
        result = []
        self.pathSumRecu(result, [], root, sum)
        return result

    def pathSumRecu(self, result, cur, root, sum):
        if root is None:
            return result  # without this return, the function would return None, bad behavior
        if root.left is None and root.right is None and root.val == sum:
            result.append(cur + [root.val])
            return result
        self.pathSumRecu(result, cur + [root.val], root.left, sum - root.val)
        self.pathSumRecu(result, cur + [root.val], root.right, sum - root.val)
        return result

class TestMethods(unittest.TestCase):
    def test_Local(self):
        self.assertEqual(1, 1)
        root = TreeNode(5)
        root.left = TreeNode(4)
        root.right = TreeNode(8)
        root.left.left = TreeNode(11)
        root.left.left.left = TreeNode(7)
        root.left.left.right = TreeNode(2)
        print(Solution().pathSum(root, 22))
        print(Solution2().pathSum(root, 77))

if __name__ == '__main__':
    unittest.main()
Java = '''
# Thought:
/**
 * Definition for a binary tree node.
 * public class TreeNode {
 *     int val;
 *     TreeNode left;
 *     TreeNode right;
 *     TreeNode(int x) { val = x; }
 * }
 */

# DFS
# 2ms 61.15%
class Solution {
    public List<List<Integer>> pathSum(TreeNode root, int sum) {
        List<List<Integer>> result = new ArrayList<>();
        if (root == null) {
            return result;
        }
        pathSum(root, sum, result, new ArrayList<>());
        return result;
    }

    private void pathSum(TreeNode root, int sum, List<List<Integer>> result, List<Integer> list) {
        sum -= root.val;
        list.add(root.val);
        if (root.left == null && root.right == null) {
            if (sum == 0) {
                result.add(new ArrayList<>(list));
            }
        } else {
            if (root.left != null) {
                pathSum(root.left, sum, result, list);
            }
            if (root.right != null) {
                pathSum(root.right, sum, result, list);
            }
        }
        list.remove(list.size() - 1);
    }
}

# 2ms 61.15%
class Solution {
    public List<List<Integer>> pathSum(TreeNode root, int sum) {
        List<List<Integer>> res = new ArrayList<>();
        List<Integer> path = new ArrayList<>();
        dfs(root, sum, res, path);
        return res;
    }

    public void dfs(TreeNode root, int sum, List<List<Integer>> res, List<Integer> path) {
        if (root == null) return;
        path.add(root.val);
        if (root.left == null && root.right == null) {
            if (root.val == sum)
                res.add(new ArrayList<Integer>(path));
            return;
        }
        if (root.left != null) {
            dfs(root.left, sum - root.val, res, path);
            path.remove(path.size() - 1);
        }
        if (root.right != null) {
            dfs(root.right, sum - root.val, res, path);
            path.remove(path.size() - 1);
        }
    }
}

# Iterative DFS (postorder with an explicit stack)
# 6ms 9.98%
class Solution {
    public List<List<Integer>> pathSum(TreeNode root, int sum) {
        List<List<Integer>> res = new ArrayList<>();
        List<Integer> path = new ArrayList<>();
        Stack<TreeNode> stack = new Stack<TreeNode>();
        int SUM = 0;
        TreeNode cur = root;
        TreeNode pre = null;
        while (cur != null || !stack.isEmpty()) {
            while (cur != null) {
                stack.push(cur);
                path.add(cur.val);
                SUM += cur.val;
                cur = cur.left;
            }
            cur = stack.peek();
            if (cur.right != null && cur.right != pre) {
                cur = cur.right;
                continue;
            }
            if (cur.left == null && cur.right == null && SUM == sum)
                res.add(new ArrayList<Integer>(path));
            pre = cur;
            stack.pop();
            path.remove(path.size() - 1);
            SUM -= cur.val;
            cur = null;
        }
        return res;
    }
}
''' | 27.653659 | 104 | 0.534486 | 703 | 5,669 | 4.285918 | 0.193457 | 0.039496 | 0.039827 | 0.029871 | 0.449054 | 0.427149 | 0.408563 | 0.364421 | 0.352473 | 0.335214 | 0 | 0.0169 | 0.331981 | 5,669 | 205 | 105 | 27.653659 | 0.778717 | 0.152584 | 0 | 0.297297 | 0 | 0.013514 | 0.625656 | 0.048499 | 0 | 0 | 0 | 0 | 0.006757 | 0 | null | null | 0 | 0.006757 | null | null | 0.013514 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
a4a3d71ac4674c035b1819fec69de00c63ab1f96 | 3,327 | py | Python | autoflow/opt/result_logger.py | auto-flow/autoflow | f5903424ad8694d57741a0bd6dfeaba320ea6517 | [
"BSD-3-Clause"
] | 49 | 2020-04-16T11:17:28.000Z | 2020-05-06T01:32:44.000Z | autoflow/opt/result_logger.py | auto-flow/autoflow | f5903424ad8694d57741a0bd6dfeaba320ea6517 | [
"BSD-3-Clause"
] | null | null | null | autoflow/opt/result_logger.py | auto-flow/autoflow | f5903424ad8694d57741a0bd6dfeaba320ea6517 | [
"BSD-3-Clause"
] | 3 | 2020-04-17T00:53:24.000Z | 2020-04-23T03:04:26.000Z | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Author : qichun tang
# @Contact : tqichun@gmail.com
import json
import os
from autoflow.resource_manager.base import ResourceManager
from .structure import Job
class DatabaseResultLogger():
    def __init__(self, resource_manager: ResourceManager):
        self.resource_manager = resource_manager

    def __call__(self, job: Job):
        self.resource_manager.init_trial_table()
        self.resource_manager._finish_trial_update_info(job.result["info"]["trial_id"], job.timestamps)

    def new_config(self, config_id, config, config_info):
        pass

class JsonResultLogger(object):
    def __init__(self, directory, overwrite=False):
        """
        convenience logger for 'semi-live-results'

        Logger that writes job results into two files (configs.json and results.json).
        Both files contain proper JSON objects, one per line.

        This version opens and closes the files for each result.
        This might be very slow if individual runs are fast and the
        filesystem is rather slow (e.g. a NFS).

        Parameters
        ----------
        directory: string
            the directory where the two files 'configs.json' and
            'results.json' are stored
        overwrite: bool
            In case the files already exist, this flag controls the
            behavior:

            * True: The existing files will be overwritten. Potential risk of deleting previous results
            * False: A FileExistsError is raised and the files are not modified.
        """
        os.makedirs(directory, exist_ok=True)
        self.config_fn = os.path.join(directory, 'configs.json')
        self.results_fn = os.path.join(directory, 'results.json')

        try:
            with open(self.config_fn, 'x') as fh:
                pass
        except FileExistsError:
            if overwrite:
                with open(self.config_fn, 'w') as fh:
                    pass
            else:
                raise FileExistsError('The file %s already exists.' % self.config_fn)

        try:
            with open(self.results_fn, 'x') as fh:
                pass
        except FileExistsError:
            if overwrite:
                with open(self.results_fn, 'w') as fh:
                    pass
            else:
                raise FileExistsError('The file %s already exists.' % self.results_fn)

        self.config_ids = set()

    def new_config(self, config_id, config, config_info):
        if not config_id in self.config_ids:
            self.config_ids.add(config_id)
            with open(self.config_fn, 'a') as fh:
                fh.write(json.dumps([config_id, config, config_info]))
                fh.write('\n')

    def __call__(self, job):
        if not job.id in self.config_ids:
            # should never happen! TODO: log warning here!
            self.config_ids.add(job.id)
            with open(self.config_fn, 'a') as fh:
                fh.write(json.dumps([job.id, job.kwargs['config'], {}]))
                fh.write('\n')
        with open(self.results_fn, 'a') as fh:
            fh.write(json.dumps([job.id, job.kwargs['budget'], job.timestamps, job.result, job.exception]))
fh.write("\n") | 33.94898 | 109 | 0.593027 | 411 | 3,327 | 4.664234 | 0.355231 | 0.073031 | 0.043818 | 0.037559 | 0.375587 | 0.287428 | 0.287428 | 0.252999 | 0.252999 | 0.211268 | 0 | 0.000436 | 0.31049 | 3,327 | 98 | 110 | 33.94898 | 0.835222 | 0.263601 | 0 | 0.454545 | 0 | 0 | 0.049935 | 0 | 0 | 0 | 0 | 0.010204 | 0 | 1 | 0.109091 | false | 0.090909 | 0.072727 | 0 | 0.218182 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
f10899d3f28f6f365bef8bd595adf20d76f31721 | 139 | py | Python | script-07.py | gsc92/bottle-tutorial | af8255786d39a59a4663e31c240ede0434b2ebd9 | [
"MIT"
] | null | null | null | script-07.py | gsc92/bottle-tutorial | af8255786d39a59a4663e31c240ede0434b2ebd9 | [
"MIT"
] | null | null | null | script-07.py | gsc92/bottle-tutorial | af8255786d39a59a4663e31c240ede0434b2ebd9 | [
"MIT"
] | null | null | null | from bottle import run, request
for choice in request.forms.getall('multiple_choice'):
print(choice)
run(host='localhost', port=8080) | 23.166667 | 54 | 0.755396 | 20 | 139 | 5.2 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032787 | 0.122302 | 139 | 6 | 55 | 23.166667 | 0.819672 | 0 | 0 | 0 | 0 | 0 | 0.171429 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f1110f5f03872ab4e7022e2935f318968624294b | 1,037 | py | Python | ServeEasyApp/form_validation.py | ServeEasy/ServeEasy | 9b76bff647af9d4bf49915f86d1c4e3b344409bb | [
"MIT"
] | null | null | null | ServeEasyApp/form_validation.py | ServeEasy/ServeEasy | 9b76bff647af9d4bf49915f86d1c4e3b344409bb | [
"MIT"
] | 12 | 2020-04-27T10:19:21.000Z | 2020-10-11T16:32:32.000Z | ServeEasyApp/form_validation.py | ServeEasy/ServeEasy | 9b76bff647af9d4bf49915f86d1c4e3b344409bb | [
"MIT"
] | 8 | 2020-04-27T10:10:02.000Z | 2020-10-09T20:00:10.000Z | from flask_wtf import FlaskForm
from wtforms import StringField, PasswordField, SubmitField,IntegerField
from wtforms.validators import DataRequired, Email, EqualTo, Length, Optional, NumberRange
import re
from wtforms import validators, ValidationError
class SignupForm(FlaskForm):
    name = StringField('name', validators=[DataRequired(message="name is required")])
    email = StringField('Email', validators=[Email(message="Enter a valid email"), DataRequired()])
    password = PasswordField('Password', validators=[DataRequired()])

    def validate_password(form, field):
        if not re.match(r"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)[a-zA-Z\d]{8,}$", field.data):
            raise ValidationError("Enter Strong Password")

    username = StringField('username', validators=[DataRequired()])
    phone = StringField('phone', validators=[DataRequired()])

    def validate_phone(form, field):
        if not re.match(r'^[0][1-9]\d{9}$|^[1-9]\d{9}$', field.data):
            raise ValidationError("enter valid phone number")
submit = SubmitField('Submit') | 51.85 | 98 | 0.713597 | 124 | 1,037 | 5.943548 | 0.41129 | 0.119403 | 0.046133 | 0.089552 | 0.151967 | 0.059701 | 0.059701 | 0 | 0 | 0 | 0 | 0.008811 | 0.124397 | 1,037 | 20 | 99 | 51.85 | 0.802863 | 0 | 0 | 0 | 0 | 0.052632 | 0.183044 | 0.071291 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0.210526 | 0.263158 | 0 | 0.736842 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 2 |
f1117d9a38d80867c27b7f855539e84484c731ac | 989 | py | Python | majora2/migrations/0071_auto_20200504_1706.py | CLIMB-COVID/majora2 | 46ea1809a61e4a768f8cbacaf54cba5c4d82e1f2 | [
"MIT"
] | 29 | 2019-04-04T18:03:43.000Z | 2022-02-09T12:47:30.000Z | majora2/migrations/0071_auto_20200504_1706.py | CLIMB-COVID/majora2 | 46ea1809a61e4a768f8cbacaf54cba5c4d82e1f2 | [
"MIT"
] | 66 | 2019-04-02T16:18:40.000Z | 2022-01-25T16:15:42.000Z | majora2/migrations/0071_auto_20200504_1706.py | CLIMB-COVID/majora2 | 46ea1809a61e4a768f8cbacaf54cba5c4d82e1f2 | [
"MIT"
] | 6 | 2020-04-10T14:15:32.000Z | 2022-01-18T13:08:35.000Z | # Generated by Django 2.2.10 on 2020-05-04 17:06
from django.db import migrations, models
class Migration(migrations.Migration):

    dependencies = [
        ('majora2', '0070_publishedartifactgroup_public_timestamp'),
    ]

    operations = [
        migrations.AddField(
            model_name='institute',
            name='gisaid_addr',
            field=models.CharField(blank=True, max_length=512, null=True),
        ),
        migrations.AddField(
            model_name='institute',
            name='gisaid_list',
            field=models.CharField(blank=True, max_length=2048, null=True),
        ),
        migrations.AddField(
            model_name='institute',
            name='gisaid_mail',
            field=models.EmailField(blank=True, max_length=254, null=True),
        ),
        migrations.AddField(
            model_name='institute',
            name='gisaid_user',
            field=models.CharField(blank=True, max_length=100, null=True),
        ),
    ]
| 29.088235 | 75 | 0.591507 | 101 | 989 | 5.643564 | 0.455446 | 0.126316 | 0.161404 | 0.189474 | 0.564912 | 0.564912 | 0.564912 | 0.284211 | 0.284211 | 0 | 0 | 0.048641 | 0.293225 | 989 | 33 | 76 | 29.969697 | 0.76681 | 0.046512 | 0 | 0.444444 | 1 | 0 | 0.139214 | 0.046759 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.037037 | 0 | 0.148148 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f1127db5dca3099cb5d5e9db32930ba2c851a22f | 54 | py | Python | about.py | astrid-project/cubebeat | 74d825cff15ac952f2190966a1e94c41ec5bec2c | [
"MIT"
] | 1 | 2020-05-14T00:36:07.000Z | 2020-05-14T00:36:07.000Z | about.py | astrid-project/cubebeat | 74d825cff15ac952f2190966a1e94c41ec5bec2c | [
"MIT"
] | null | null | null | about.py | astrid-project/cubebeat | 74d825cff15ac952f2190966a1e94c41ec5bec2c | [
"MIT"
] | null | null | null | title = 'Cubebeat'
version = '1.1.0'
description = ''
| 13.5 | 18 | 0.62963 | 7 | 54 | 4.857143 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.066667 | 0.166667 | 54 | 3 | 19 | 18 | 0.688889 | 0 | 0 | 0 | 0 | 0 | 0.240741 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f1326de4101deec6a22bf02b848f891f932c6db8 | 237 | py | Python | tests/measures/test_energy.py | makinacorpus/python-measurement | 2fb3b953d38a05843279988c59c3e71e509c34d7 | [
"MIT"
] | 91 | 2015-02-27T22:32:33.000Z | 2022-03-15T13:38:02.000Z | tests/measures/test_energy.py | makinacorpus/python-measurement | 2fb3b953d38a05843279988c59c3e71e509c34d7 | [
"MIT"
] | 56 | 2015-02-12T15:32:16.000Z | 2022-02-23T18:29:10.000Z | tests/measures/test_energy.py | makinacorpus/python-measurement | 2fb3b953d38a05843279988c59c3e71e509c34d7 | [
"MIT"
] | 49 | 2015-06-03T17:17:02.000Z | 2021-06-22T08:59:44.000Z | from measurement.measures import Energy
class TestEnergy:
    def test_dietary_calories_kwarg(self):
        calories = Energy(Calorie=2000)
        kilojoules = Energy(kJ=8368)

        assert calories.si_value == kilojoules.si_value
| 23.7 | 55 | 0.721519 | 28 | 237 | 5.928571 | 0.75 | 0.084337 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 0.206751 | 237 | 9 | 56 | 26.333333 | 0.840426 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.166667 | 1 | 0.166667 | false | 0 | 0.166667 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f1381704ef9c3a8e6bcb73b80f4f49c118e3e45b | 344 | py | Python | likes/serializers.py | mccuyper/SocialMedia-DRF | 5908a2da59904368d5e2dbcfe19b3d7c8563c792 | [
"MIT"
] | null | null | null | likes/serializers.py | mccuyper/SocialMedia-DRF | 5908a2da59904368d5e2dbcfe19b3d7c8563c792 | [
"MIT"
] | null | null | null | likes/serializers.py | mccuyper/SocialMedia-DRF | 5908a2da59904368d5e2dbcfe19b3d7c8563c792 | [
"MIT"
] | null | null | null | from .models import Likes
from rest_framework import serializers
class LikeSerializer(serializers.ModelSerializer):
    like = serializers.ReadOnlyField(source="liked_by.username")
    dislike = serializers.ReadOnlyField(source="disliked_by.username")

    class Meta:
        model = Likes
        fields = ("id", "post", "like", "dislike")
| 28.666667 | 70 | 0.723837 | 36 | 344 | 6.833333 | 0.638889 | 0.195122 | 0.243902 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.168605 | 344 | 11 | 71 | 31.272727 | 0.86014 | 0 | 0 | 0 | 0 | 0 | 0.156977 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f14fed9832126bf0f83ff7695577093934a3dd30 | 3,229 | py | Python | pyGeno/Protein.py | ealong/pyGeno | b397bf36d49419ecc4c217a64ea64fa90f5a0392 | [
"Apache-2.0"
] | null | null | null | pyGeno/Protein.py | ealong/pyGeno | b397bf36d49419ecc4c217a64ea64fa90f5a0392 | [
"Apache-2.0"
] | null | null | null | pyGeno/Protein.py | ealong/pyGeno | b397bf36d49419ecc4c217a64ea64fa90f5a0392 | [
"Apache-2.0"
] | null | null | null | from . import configuration as conf
from .pyGenoObjectBases import *
from .SNP import SNP_INDEL
import rabaDB.fields as rf
from .tools import UsefulFunctions as uf
from .tools.BinarySequence import AABinarySequence
import copy
class Protein_Raba(pyGenoRabaObject) :
    """The wrapped Raba object that really holds the data"""

    _raba_namespace = conf.pyGeno_RABA_NAMESPACE

    id = rf.Primitive()
    name = rf.Primitive()
    genome = rf.RabaObject('Genome_Raba')
    chromosome = rf.RabaObject('Chromosome_Raba')
    gene = rf.RabaObject('Gene_Raba')
    transcript = rf.RabaObject('Transcript_Raba')

    def _curate(self) :
        if self.name != None :
            self.name = self.name.upper()

class Protein(pyGenoRabaObjectWrapper) :
    """The wrapper for playing with Proteins"""

    _wrapped_class = Protein_Raba

    def __init__(self, *args, **kwargs) :
        pyGenoRabaObjectWrapper.__init__(self, *args, **kwargs)
        self._load_sequencesTriggers = set(["sequence"])

    def _makeLoadQuery(self, objectType, *args, **coolArgs) :
        if issubclass(objectType, SNP_INDEL) :
            f = RabaQuery(objectType, namespace = self._wrapped_class._raba_namespace)
            coolArgs['species'] = self.genome.species
            coolArgs['chromosomeNumber'] = self.chromosome.number
            coolArgs['start >='] = self.transcript.start
            coolArgs['start <'] = self.transcript.end

            if len(args) > 0 and type(args[0]) is list :
                for a in args[0] :
                    if type(a) is dict :
                        f.addFilter(**a)
            else :
                f.addFilter(*args, **coolArgs)

            return f

        return pyGenoRabaObjectWrapper._makeLoadQuery(self, objectType, *args, **coolArgs)

    def _load_sequences(self) :
        if self.chromosome.number != 'MT':
            self.sequence = uf.translateDNA(self.transcript.cDNA).rstrip('*')
        else:
            self.sequence = uf.translateDNA(self.transcript.cDNA, translTable_id='mt').rstrip('*')

    def getSequence(self):
        return self.sequence

    def _load_bin_sequence(self) :
        self.bin_sequence = AABinarySequence(self.sequence)

    def getDefaultSequence(self) :
        """Returns a str version of the sequence where only the last allele of each polymorphism is shown"""
        return self.bin_sequence.defaultSequence

    def getPolymorphisms(self) :
        """Returns a list of all polymorphisms contained in the protein"""
        return self.bin_sequence.getPolymorphisms()

    def find(self, sequence):
        """Returns the position of the first occurrence of sequence, taking polymorphisms into account"""
        return self.bin_sequence.find(sequence)

    def findAll(self, sequence):
        """Returns all positions of the occurrences of sequence, taking polymorphisms into account"""
        return self.bin_sequence.findAll(sequence)

    def findString(self, sequence) :
        """Returns the first occurrence of sequence using simple string search, ignoring polymorphisms"""
        return self.sequence.find(sequence)

    def findStringAll(self, sequence):
        """Returns all occurrences of sequence using simple string search, ignoring polymorphisms"""
        return uf.findAll(self.sequence, sequence)

    def __getitem__(self, i) :
        return self.bin_sequence.getChar(i)

    def __len__(self) :
        return len(self.bin_sequence)

    def __str__(self) :
        return "Protein, id: %s > %s" % (self.id, str(self.transcript))
| 31.656863 | 125 | 0.740477 | 412 | 3,229 | 5.667476 | 0.31068 | 0.051392 | 0.044968 | 0.044968 | 0.183298 | 0.107066 | 0.107066 | 0.069379 | 0.069379 | 0.069379 | 0 | 0.001093 | 0.150201 | 3,229 | 101 | 126 | 31.970297 | 0.849854 | 0.20161 | 0 | 0.030769 | 0 | 0 | 0.048088 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.107692 | 0.061538 | 0.676923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f15cca27d82646ccd6e758ce3a844dd96cf2cde3 | 912 | py | Python | aioowm/types/weather.py | vladislavkovalskyi/aioowm | dffd7a8adf4810110f98e056eb9d48320f2d187a | [
"MIT"
] | 5 | 2020-09-12T10:58:48.000Z | 2022-02-07T15:27:05.000Z | aioowm/types/weather.py | vladislavkovalskyi/aioowm | dffd7a8adf4810110f98e056eb9d48320f2d187a | [
"MIT"
] | null | null | null | aioowm/types/weather.py | vladislavkovalskyi/aioowm | dffd7a8adf4810110f98e056eb9d48320f2d187a | [
"MIT"
] | 4 | 2020-09-14T11:46:30.000Z | 2022-03-25T10:18:35.000Z | from typing import Optional
from pydantic import BaseModel
class PressureModel(BaseModel):
sea_level: Optional[int] = None
ground_level: Optional[int] = None
value: Optional[int] = None
class WindModel(BaseModel):
speed: Optional[float] = None
direction: Optional[int] = None
gust: Optional[float] = None
class TemperatureModel(BaseModel):
now: Optional[float] = None
minimal: Optional[float] = None
maximal: Optional[float] = None
feels_like: Optional[float] = None
class WeatherModel(BaseModel):
id: Optional[int] = None
main: Optional[str] = None
description: Optional[str] = None
clouds: Optional[int] = None
rain: Optional[float] = None
snow: Optional[float] = None
humidity: Optional[int] = None
pressure: Optional[PressureModel] = None
wind: Optional[WindModel] = None
temperature: Optional[TemperatureModel] = None
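Because every field above is `Optional` with a `None` default, a partial API payload validates cleanly and nested dicts are parsed into the corresponding sub-models. A trimmed stand-in model (assuming pydantic is installed; the field values are invented for illustration) shows the behaviour:

```python
from typing import Optional
from pydantic import BaseModel

class PressureModel(BaseModel):
    sea_level: Optional[int] = None
    value: Optional[int] = None

class ReportModel(BaseModel):  # trimmed stand-in for WeatherModel
    humidity: Optional[int] = None
    pressure: Optional[PressureModel] = None

# pydantic parses the nested dict into a PressureModel automatically;
# missing fields simply stay None.
report = ReportModel(**{"humidity": 62, "pressure": {"value": 1013}})
print(report.pressure.value)      # 1013
print(report.pressure.sea_level)  # None
```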
| 25.333333 | 50 | 0.696272 | 103 | 912 | 6.135922 | 0.359223 | 0.164557 | 0.21519 | 0.063291 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.202851 | 912 | 35 | 51 | 26.057143 | 0.869326 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.076923 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
f160c608b9dc7f090ecdf7c37bc7eb24fe1a01ba | 308 | py | Python | app/__init__.py | JenBanks8585/Spotmefy | a4aacfa77e36c970ccd7ee3a31feb11eadf62591 | [
"MIT"
] | null | null | null | app/__init__.py | JenBanks8585/Spotmefy | a4aacfa77e36c970ccd7ee3a31feb11eadf62591 | [
"MIT"
] | null | null | null | app/__init__.py | JenBanks8585/Spotmefy | a4aacfa77e36c970ccd7ee3a31feb11eadf62591 | [
"MIT"
] | null | null | null | import os
from flask import Flask
from app.model import model
from app.appli import appli
def create_app():
app = Flask(__name__)
app.register_blueprint(model)
app.register_blueprint(appli)
return app
if __name__ == '__main__':
my_app = create_app()
my_app.run(debug=True) | 12.833333 | 33 | 0.701299 | 44 | 308 | 4.5 | 0.431818 | 0.070707 | 0.20202 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214286 | 308 | 24 | 34 | 12.833333 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.02589 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.333333 | 0 | 0.5 | 0.166667 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
f182703b161fef369141c510ee0c9e438ca1188a | 1,618 | py | Python | pytpp/attributes/bulk_application_base.py | Venafi/pytpp | 42af655b2403b8c9447c86962abd4aaa0201f646 | [
"MIT"
] | 4 | 2022-02-04T23:58:55.000Z | 2022-02-15T18:53:08.000Z | pytpp/attributes/bulk_application_base.py | Venafi/pytpp | 42af655b2403b8c9447c86962abd4aaa0201f646 | [
"MIT"
] | null | null | null | pytpp/attributes/bulk_application_base.py | Venafi/pytpp | 42af655b2403b8c9447c86962abd4aaa0201f646 | [
"MIT"
] | null | null | null | from pytpp.attributes._helper import IterableMeta, Attribute
from pytpp.attributes.connection_base import ConnectionBaseAttributes
from pytpp.attributes.driver_base import DriverBaseAttributes
from pytpp.attributes.schedule_base import ScheduleBaseAttributes
class BulkApplicationBaseAttributes(ConnectionBaseAttributes, DriverBaseAttributes, ScheduleBaseAttributes, metaclass=IterableMeta):
__config_class__ = "Bulk Application Base"
batch_size = Attribute('Batch Size', min_version='20.1')
certificate_thumbprint = Attribute('Certificate Thumbprint', min_version='18.3')
device = Attribute('Device', min_version='18.3')
exclude_expired_certificates = Attribute('Exclude Expired Certificates', min_version='18.3')
exclude_historical_certificates = Attribute('Exclude Historical Certificates', min_version='18.3')
exclude_revoked_certificates = Attribute('Exclude Revoked Certificates', min_version='18.3')
grouping_id = Attribute('Grouping Id', min_version='18.3')
in_error = Attribute('In Error', min_version='18.3')
in_progress = Attribute('In Progress', min_version='18.3')
last_run = Attribute('Last Run', min_version='18.3')
last_update = Attribute('Last Update', min_version='18.3')
light_run = Attribute('Light Run', min_version='18.3')
light_run_new_certificates_threshold = Attribute('Light Run New Certificates Threshold', min_version='18.3')
maximum_days_expired = Attribute('Maximum Days Expired', min_version='18.3')
policydn = Attribute('PolicyDN', min_version='18.3')
status = Attribute('Status', min_version='18.3')
stop_requested = Attribute('Stop Requested', min_version='18.3')
| 62.230769 | 132 | 0.799135 | 201 | 1,618 | 6.199005 | 0.263682 | 0.136437 | 0.154093 | 0.166934 | 0.222311 | 0.085072 | 0 | 0 | 0 | 0 | 0 | 0.034413 | 0.084054 | 1,618 | 25 | 133 | 64.72 | 0.806343 | 0 | 0 | 0 | 0 | 0 | 0.220025 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.173913 | 0 | 1 | 0.043478 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
f191cd1fd07b8c22a57fda13f9848a468ca5f766 | 3,058 | py | Python | backend-service/visits-service/app/app/tests/crud/test_visit.py | abhishek70/python-petclinic-microservices | e15a41a668958f35f1b962487cd2360c5c150f0b | [
"MIT"
] | 2 | 2021-05-19T07:21:59.000Z | 2021-09-15T17:30:08.000Z | backend-service/visits-service/app/app/tests/crud/test_visit.py | abhishek70/python-petclinic-microservices | e15a41a668958f35f1b962487cd2360c5c150f0b | [
"MIT"
] | null | null | null | backend-service/visits-service/app/app/tests/crud/test_visit.py | abhishek70/python-petclinic-microservices | e15a41a668958f35f1b962487cd2360c5c150f0b | [
"MIT"
] | null | null | null | from sqlalchemy.orm import Session
from ... import crud
from ...main import app
from ...schemas.visit import VisitCreate, VisitUpdate
from ..utils.utils import random_datetime, random_integer, random_lower_string
def test_create_visit(db: Session) -> None:
pet_id = random_integer()
description = random_lower_string()
visit_date = random_datetime()
visit_in = VisitCreate(
pet_id=pet_id, description=description, visit_date=visit_date
)
visit = crud.visit.create(db=db, obj_in=visit_in)
assert visit.pet_id == pet_id
assert visit.description == description
assert visit.visit_date == visit_date
def test_get_visit(db: Session) -> None:
pet_id = random_integer()
description = random_lower_string()
visit_date = random_datetime()
visit_in = VisitCreate(
pet_id=pet_id, description=description, visit_date=visit_date
)
visit = crud.visit.create(db=db, obj_in=visit_in)
store_db_visit = crud.visit.get(db=db, id=visit.id)
assert store_db_visit
assert store_db_visit.pet_id == pet_id
assert store_db_visit.visit_date == visit_date
def test_update_visit(db: Session) -> None:
pet_id = random_integer()
description = random_lower_string()
visit_date = random_datetime()
visit_in = VisitCreate(
pet_id=pet_id, description=description, visit_date=visit_date
)
visit = crud.visit.create(db=db, obj_in=visit_in)
description2 = random_lower_string()
visit_date2 = random_datetime()
visit_update = VisitUpdate(description=description2, visit_date=visit_date2)
visit2 = crud.visit.update(db=db, db_obj=visit, obj_in=visit_update)
store_db_visit = crud.visit.get(db=db, id=visit.id)
assert store_db_visit
assert store_db_visit.description == description2
assert store_db_visit.visit_date == visit_date2
def test_delete_visit(db: Session) -> None:
pet_id = random_integer()
description = random_lower_string()
visit_date = random_datetime()
visit_in = VisitCreate(
pet_id=pet_id, description=description, visit_date=visit_date
)
visit = crud.visit.create(db=db, obj_in=visit_in)
delete_visit = crud.visit.remove(db=db, id=visit.id)
get_visit = crud.visit.get(db=db, id=visit.id)
assert delete_visit.pet_id == pet_id
assert delete_visit.visit_date == visit_date
assert get_visit is None
def test_get_multi_by_pet(db: Session) -> None:
pet_id = random_integer()
description = random_lower_string()
visit_date = random_datetime()
visit_in = VisitCreate(
pet_id=pet_id, description=description, visit_date=visit_date
)
visit = crud.visit.create(db=db, obj_in=visit_in)
description2 = random_lower_string()
visit_date2 = random_datetime()
visit_in2 = VisitCreate(
pet_id=pet_id, description=description2, visit_date=visit_date2
)
visit2 = crud.visit.create(db=db, obj_in=visit_in2)
get_visits = crud.visit.get_multi_by_pet(db=db, pet_id=pet_id, skip=0, limit=2)
assert len(get_visits) == 2
| 35.976471 | 83 | 0.727927 | 440 | 3,058 | 4.731818 | 0.109091 | 0.060038 | 0.107589 | 0.048031 | 0.780019 | 0.74928 | 0.714217 | 0.663785 | 0.654179 | 0.599424 | 0 | 0.006743 | 0.175605 | 3,058 | 84 | 84 | 36.404762 | 0.819119 | 0 | 0 | 0.520548 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.178082 | 1 | 0.068493 | false | 0 | 0.068493 | 0 | 0.136986 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
74afc926895781a450d884338dc7e3323f2df97c | 119 | py | Python | code/csv-as-wrapper.py | swcarpentry/web-data-python | 20b1421f26237811f49739d579a9688d3ecd3e8f | [
"CC-BY-4.0"
] | 10 | 2015-03-31T02:10:38.000Z | 2019-11-07T11:28:04.000Z | code/csv-as-wrapper.py | dbreddyAI/web-data-python | 20b1421f26237811f49739d579a9688d3ecd3e8f | [
"CC-BY-4.0"
] | 5 | 2015-04-28T09:24:00.000Z | 2015-06-02T01:36:16.000Z | code/csv-as-wrapper.py | dbreddyAI/web-data-python | 20b1421f26237811f49739d579a9688d3ecd3e8f | [
"CC-BY-4.0"
] | 16 | 2015-03-06T15:10:33.000Z | 2019-11-07T11:28:05.000Z | import csv
raw = open('test01.csv', 'r')
cooked = csv.reader(raw)
for record in cooked:
print(record)
raw.close()
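The script above relies on an explicit `raw.close()`; a `with`-statement variant closes the file even if reading raises. This sketch first writes a throwaway `demo.csv` (a placeholder name, not from the original script) so it is self-contained:

```python
import csv

# Create a small CSV so the example is self-contained.
with open("demo.csv", "w", newline="") as fh:
    fh.write("a,b\n1,2\n")

# The with-statement closes the file automatically, even on error.
with open("demo.csv", "r", newline="") as raw:
    rows = list(csv.reader(raw))

print(rows)  # [['a', 'b'], ['1', '2']]
```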
| 14.875 | 29 | 0.672269 | 19 | 119 | 4.210526 | 0.684211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020202 | 0.168067 | 119 | 7 | 30 | 17 | 0.787879 | 0 | 0 | 0 | 0 | 0 | 0.092437 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.166667 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
74b76893b26ca6aae085e54c5ebcce2345a040b7 | 1,660 | py | Python | wrappers.py | ZeroTwo36/werkzeug | bc1e97ccc436ea058dcee4be244d62bd0c2ee06a | [
"MIT"
] | 1 | 2022-01-18T09:53:31.000Z | 2022-01-18T09:53:31.000Z | wrappers.py | ZeroTwo36/werkzeug | bc1e97ccc436ea058dcee4be244d62bd0c2ee06a | [
"MIT"
] | null | null | null | wrappers.py | ZeroTwo36/werkzeug | bc1e97ccc436ea058dcee4be244d62bd0c2ee06a | [
"MIT"
] | null | null | null | import timeit
class ABC:
def __init__(self) -> None:
self.__slots__ = ()
class AbstractSink:
def __init__(self) -> None:
pass
class Model:
def __init__(self,iterable = None) -> None:
"""THe Class, most DeploymentModels() inherit from
:param: Iterable. Defaults to None.
Example Usage:
```py
from werkzeug.wrappers import Model
from werkzeug.deployment import dir2list,CustomPush,PushType
class Pusher(CustomPush):
def __init__(self,*args):
super().__init__(*args)
self.config(pushType=PushType.MODEL)
pushData = Model(dir2list())
Pusher(pushData).commit('test.wkzg.cmt')
```
"""
self.iter = iterable
def kill(self,__obj):
if hasattr(self,__obj):
self.__setattr__(__obj,None)
def fetch(self,iterable,**optn):
for i in iterable:
if i == optn['key']:
return i
def fetch_iterable(self):
return self.iter if self.iter else ('Null','Unspecified')
@property
def initialized(self):
return not not self.iter
def init_iterable(self,it):
self.kill('iter')
self.init()
self.iter = it
def init(self):
if self.initialized:
return False
self.iter = ()
class BaseDeploymentModel(Model):
def __init__(self, iterable=()) -> None:
super().__init__(iterable=iterable)
def init_deployment(self,__dp):
self.set_deployment = __dp
| 25.151515 | 69 | 0.54759 | 171 | 1,660 | 5.023392 | 0.368421 | 0.065192 | 0.076834 | 0.034924 | 0.065192 | 0.065192 | 0 | 0 | 0 | 0 | 0 | 0.001847 | 0.34759 | 1,660 | 66 | 70 | 25.151515 | 0.79132 | 0.250602 | 0 | 0.057143 | 0 | 0 | 0.020893 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.314286 | false | 0.028571 | 0.028571 | 0.057143 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
74c6ed6eb27c1b2ed358e77b24ca204d9ef91460 | 83 | py | Python | Topics/Nested lists/A very nested list/main.py | soukalli/jetbrain-accademy | fc486d439b4b54a58956e1186eb69c56b85f85f1 | [
"MIT"
] | null | null | null | Topics/Nested lists/A very nested list/main.py | soukalli/jetbrain-accademy | fc486d439b4b54a58956e1186eb69c56b85f85f1 | [
"MIT"
] | null | null | null | Topics/Nested lists/A very nested list/main.py | soukalli/jetbrain-accademy | fc486d439b4b54a58956e1186eb69c56b85f85f1 | [
"MIT"
] | null | null | null | str_1 = input()
str_2 = input()
str_3 = input()
print([str_1, [str_2], [[str_3]]])
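With fixed strings in place of `input()`, the bracket pattern produced by the final `print` is easy to see (the values `"a"`, `"b"`, `"c"` are invented for illustration):

```python
# Same nesting as the script above, with deterministic inputs:
str_1, str_2, str_3 = "a", "b", "c"
nested = [str_1, [str_2], [[str_3]]]
print(nested)  # ['a', ['b'], [['c']]]
```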
| 16.6 | 34 | 0.60241 | 16 | 83 | 2.75 | 0.375 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084507 | 0.144578 | 83 | 4 | 35 | 20.75 | 0.535211 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
74cdc3f2c9b10c516a7ce658f08109d921e62eec | 1,062 | py | Python | school/__manifest__.py | kyaranusa/School-Management-Systems | d6cd71037fe46c08feff32f42af61f56eb25a7c7 | [
"MIT"
] | null | null | null | school/__manifest__.py | kyaranusa/School-Management-Systems | d6cd71037fe46c08feff32f42af61f56eb25a7c7 | [
"MIT"
] | null | null | null | school/__manifest__.py | kyaranusa/School-Management-Systems | d6cd71037fe46c08feff32f42af61f56eb25a7c7 | [
"MIT"
] | 1 | 2020-11-17T03:25:10.000Z | 2020-11-17T03:25:10.000Z | {
'name': 'School',
'version': '12.0.1.0.0',
'author': 'Serpent Consulting Services Pvt. Ltd.',
'website': 'http://www.serpentcs.com',
'category': 'School Management',
'license': "AGPL-3",
'complexity': 'easy',
    'summary': 'A Module For School Management',
'images': ['static/description/EMS.jpg'],
'depends': ['hr', 'crm', 'account'],
'data': ['security/school_security.xml',
'security/ir.model.access.csv',
'wizard/terminate_reason_view.xml',
'wizard/wiz_send_email_view.xml',
'views/student_view.xml',
'views/school_view.xml',
'views/teacher_view.xml',
'views/parent_view.xml',
'data/student_sequence.xml',
'wizard/assign_roll_no_wizard.xml',
'wizard/move_standards_view.xml',
'views/report_view.xml',
'views/identity_card.xml',
'views/template_view.xml'],
'demo': ['demo/school_demo.xml'],
'installable': True,
'application': True
} | 36.62069 | 54 | 0.565913 | 115 | 1,062 | 5.052174 | 0.6 | 0.108434 | 0.123924 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008951 | 0.263653 | 1,062 | 29 | 55 | 36.62069 | 0.734015 | 0 | 0 | 0 | 0 | 0 | 0.610536 | 0.361242 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
74dfadabd7da62f5d8aac038f314d9ae2df6b217 | 263 | py | Python | src/betting_agent/interfaces.py | Ciro-Taranto/betting-arbitrage | df46a9ffb07e814281d01e6dcf9430ddd4b02d87 | [
"MIT"
] | null | null | null | src/betting_agent/interfaces.py | Ciro-Taranto/betting-arbitrage | df46a9ffb07e814281d01e6dcf9430ddd4b02d87 | [
"MIT"
] | null | null | null | src/betting_agent/interfaces.py | Ciro-Taranto/betting-arbitrage | df46a9ffb07e814281d01e6dcf9430ddd4b02d87 | [
"MIT"
] | null | null | null | from typing import Callable, NamedTuple
class ElementParsers(NamedTuple):
teams: Callable
quotes: Callable
class ElementClasses(NamedTuple):
teams: str
quotes: str
class Waits(NamedTuple):
implicit: float = 10.
explicit: float = 5.
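The `NamedTuple` subclasses above are immutable records whose defaults apply when a field is omitted. A small sketch of `Waits` (the override value is invented) illustrates this:

```python
from typing import NamedTuple

class Waits(NamedTuple):
    implicit: float = 10.
    explicit: float = 5.

# Defaults fill in omitted fields; instances are plain tuples underneath.
w = Waits()
print(w.implicit, w.explicit)  # 10.0 5.0

custom = Waits(explicit=2.0)
print(custom)  # Waits(implicit=10.0, explicit=2.0)
```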
| 18.785714 | 39 | 0.703422 | 28 | 263 | 6.607143 | 0.607143 | 0.162162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014634 | 0.220532 | 263 | 13 | 40 | 20.230769 | 0.887805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
74f905dd8703478e2c6588a2ed3b798f5c7ea65f | 166 | py | Python | img_upload/config.py | LeoZ123/image-repository | 4ea2b7382a6cd3fd84b9dfeef7e70aa90b0a8d0b | [
"MIT"
] | null | null | null | img_upload/config.py | LeoZ123/image-repository | 4ea2b7382a6cd3fd84b9dfeef7e70aa90b0a8d0b | [
"MIT"
] | null | null | null | img_upload/config.py | LeoZ123/image-repository | 4ea2b7382a6cd3fd84b9dfeef7e70aa90b0a8d0b | [
"MIT"
] | null | null | null | import os
S3_BUCKET = ""
S3_KEY = ""
S3_SECRET = ""
S3_LOCATION = 'http://{}.s3.amazonaws.com/'.format(S3_BUCKET)
DEBUG = True
PORT = 5000 | 18.444444 | 64 | 0.560241 | 21 | 166 | 4.190476 | 0.714286 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.084034 | 0.283133 | 166 | 9 | 65 | 18.444444 | 0.655462 | 0 | 0 | 0 | 0 | 0 | 0.161677 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.142857 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
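`S3_LOCATION` is derived from `S3_BUCKET` by string formatting, so it is empty until a bucket name is configured. With a hypothetical bucket name the computed endpoint looks like this:

```python
# "my-bucket" is a placeholder, not a real bucket name.
S3_BUCKET = "my-bucket"
S3_LOCATION = 'http://{}.s3.amazonaws.com/'.format(S3_BUCKET)
print(S3_LOCATION)  # http://my-bucket.s3.amazonaws.com/
```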
2d01a318c18fc49ecc8095f884d485ba9ecee818 | 2,098 | py | Python | pyopensprinkler/device.py | tetienne/py-opensprinkler | 35ab69748d4916a76fa6deb053c357eba575dfc1 | [
"MIT"
] | null | null | null | pyopensprinkler/device.py | tetienne/py-opensprinkler | 35ab69748d4916a76fa6deb053c357eba575dfc1 | [
"MIT"
] | null | null | null | pyopensprinkler/device.py | tetienne/py-opensprinkler | 35ab69748d4916a76fa6deb053c357eba575dfc1 | [
"MIT"
] | null | null | null | """Device module handling /device/ API calls."""
class Device(object):
"""Device class with /device/ API calls."""
def __init__(self, opensprinkler):
"""Device class initializer."""
self._opensprinkler = opensprinkler
def _getOption(self, option):
"""Retrieve option"""
(resp, content) = self._opensprinkler._request('jo')
return content[option]
def _getVariable(self, option):
"""Retrieve option"""
(resp, content) = self._opensprinkler._request('jc')
return content[option]
def _setVariable(self, option, value):
"""Retrieve option"""
params = {}
params[option] = value
(resp, content) = self._opensprinkler._request('cv', params)
return content['result']
def getFirmwareVersion(self):
"""Retrieve firmware version"""
return self._getOption('fwv')
def getHardwareVersion(self):
"""Retrieve hardware version"""
return self._getOption('hwv')
def getLastRun(self):
"""Retrieve hardware version"""
return self._getVariable('lrun')[3]
def getRainDelay(self):
"""Retrieve rain delay"""
return self._getVariable('rd')
def getRainDelayStopTime(self):
"""Retrieve rain delay stop time"""
return self._getVariable('rdst')
def getRainSensor1(self):
"""Retrieve hardware version"""
return self._getVariable('sn1')
def getRainSensor2(self):
"""Retrieve hardware version"""
return self._getVariable('sn2')
def getRainSensorLegacy(self):
"""Retrieve hardware version"""
return self._getVariable('rs')
def getOperationEnabled(self):
"""Retrieve operation enabled"""
return self._getVariable('en')
def getWaterLevel(self):
"""Retrieve water level"""
return self._getOption('wl')
def enable(self):
"""Enable operation"""
return self._setVariable('en', 1)
def disable(self):
"""Disable operation"""
return self._setVariable('en', 0)
| 27.973333 | 68 | 0.612011 | 199 | 2,098 | 6.321608 | 0.321608 | 0.09539 | 0.116852 | 0.107313 | 0.354531 | 0.275835 | 0.246423 | 0.0938 | 0.0938 | 0 | 0 | 0.00447 | 0.253575 | 2,098 | 74 | 69 | 28.351351 | 0.798851 | 0.211153 | 0 | 0.052632 | 0 | 0 | 0.028133 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.421053 | false | 0 | 0 | 0 | 0.842105 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
2d22b86062f1e1fd1ea5135b062003331a2f7a1d | 478 | py | Python | tests/test_modality.py | nomel/hartigan_diptest | 2d591bc50b37fff6d3c2fb7c0401feab87bd6282 | [
"MIT"
] | null | null | null | tests/test_modality.py | nomel/hartigan_diptest | 2d591bc50b37fff6d3c2fb7c0401feab87bd6282 | [
"MIT"
] | null | null | null | tests/test_modality.py | nomel/hartigan_diptest | 2d591bc50b37fff6d3c2fb7c0401feab87bd6282 | [
"MIT"
] | null | null | null | from __future__ import unicode_literals
from __future__ import print_function
import time
import unittest
import numpy as np
from hartigan_diptest import dip
class testModality(unittest.TestCase):
def setUp(self):
self.data = np.random.randn(1000)
def test_hartigan_diptest(self):
t0 = time.time()
dip(self.data)
t1 = time.time()
print("Hartigan diptest: {}".format(t1-t0))
if __name__ == '__main__':
unittest.main()
| 18.384615 | 51 | 0.6841 | 61 | 478 | 5.016393 | 0.52459 | 0.147059 | 0.104575 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021448 | 0.219665 | 478 | 25 | 52 | 19.12 | 0.798928 | 0 | 0 | 0 | 0 | 0 | 0.058577 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.5625 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
2d398cd0e0d8b0c16eba04d2186f5126f906e274 | 98 | py | Python | src/hcrystalball/compose/__init__.py | betatim/hcrystalball | 693b9b406f05afa23cfc4647c43260166a7076fe | [
"MIT"
] | 139 | 2020-06-29T16:36:16.000Z | 2022-01-25T21:49:10.000Z | src/hcrystalball/compose/__init__.py | betatim/hcrystalball | 693b9b406f05afa23cfc4647c43260166a7076fe | [
"MIT"
] | 34 | 2020-06-29T12:31:26.000Z | 2022-03-18T13:56:21.000Z | src/hcrystalball/compose/__init__.py | betatim/hcrystalball | 693b9b406f05afa23cfc4647c43260166a7076fe | [
"MIT"
] | 28 | 2020-06-30T06:00:39.000Z | 2022-03-18T13:27:58.000Z | from ._ts_column_transformer import TSColumnTransformer
__all__ = [
"TSColumnTransformer",
]
| 16.333333 | 55 | 0.785714 | 8 | 98 | 8.75 | 0.875 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.142857 | 98 | 5 | 56 | 19.6 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0.193878 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
2d3d41466f41aa0f64fce7842731ee737ec1438b | 787 | py | Python | llrws/__init__.py | irahorecka/LLRWS | 9050853815d39a0c84f7f750809d462d2fc312e6 | [
"MIT"
] | null | null | null | llrws/__init__.py | irahorecka/LLRWS | 9050853815d39a0c84f7f750809d462d2fc312e6 | [
"MIT"
] | null | null | null | llrws/__init__.py | irahorecka/LLRWS | 9050853815d39a0c84f7f750809d462d2fc312e6 | [
"MIT"
] | null | null | null | """
/llrws/__init__.py
Concerns all things LLR Web Suite.
"""
from flask import Flask
from flask_cors import CORS
from flask_restful import Api
from llrws.config import Config
from llrws.api.routes import initialize_routes
application = Flask(__name__)
def create_app(config_class=Config):
"""Creates Flask application instance."""
application.config.from_object(config_class)
CORS(application)
from llrws.main.routes import main
from llrws.errors.handlers import errors
application.register_blueprint(main)
application.register_blueprint(errors)
# Register RESTful API
from llrws.api.routes import api_bp
api = Api(api_bp)
initialize_routes(api)
application.register_blueprint(api_bp, url_prefix="/api")
return application
| 21.861111 | 61 | 0.761118 | 103 | 787 | 5.592233 | 0.349515 | 0.078125 | 0.145833 | 0.0625 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163914 | 787 | 35 | 62 | 22.485714 | 0.87538 | 0.142313 | 0 | 0 | 0 | 0 | 0.006042 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.055556 | false | 0 | 0.444444 | 0 | 0.555556 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
741a1e9d7ee4c6d639268c31ca68c42233a13cca | 955 | py | Python | auth_service/app/__init__.py | stavrosgreece/WebApplication | c41c2912cc647c6ff5f5c58ba77bbcb1f836673f | [
"Apache-2.0"
] | null | null | null | auth_service/app/__init__.py | stavrosgreece/WebApplication | c41c2912cc647c6ff5f5c58ba77bbcb1f836673f | [
"Apache-2.0"
] | null | null | null | auth_service/app/__init__.py | stavrosgreece/WebApplication | c41c2912cc647c6ff5f5c58ba77bbcb1f836673f | [
"Apache-2.0"
] | null | null | null | ''' flask web_service with mongo '''
import os
import json
import datetime
from bson.objectid import ObjectId
from flask import Flask
from flask_pymongo import PyMongo
import connexion
from connexion import NoContent
from flask_jwt_extended import JWTManager
from flask_bcrypt import Bcrypt
from flask_cors import CORS
class JSONEncoder(json.JSONEncoder):
''' extend json-encoder class'''
def default(self, o):
if isinstance(o, ObjectId):
return str(o)
if isinstance(o, datetime.datetime):
return str(o)
return json.JSONEncoder.default(self, o)
app = connexion.App(__name__)
app.add_api('openapi.yaml')
cors = CORS(app.app)
# use the modified encoder class to handle ObjectId & datetime object while jsonifying the response.
app.app.json_encoder = JSONEncoder
# Debug
app.app.config['JWT_SECRET_KEY'] = "ironman"
app.app.config['JWT_ACCESS_TOKEN_EXPIRES'] = datetime.timedelta(days=1)
| 22.209302 | 100 | 0.739267 | 131 | 955 | 5.259542 | 0.442748 | 0.065312 | 0.034833 | 0.040639 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001271 | 0.175916 | 955 | 42 | 101 | 22.738095 | 0.874206 | 0.168586 | 0 | 0.083333 | 0 | 0 | 0.073265 | 0.030848 | 0 | 0 | 0 | 0 | 0 | 1 | 0.041667 | false | 0 | 0.458333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
742370e5f4867c1f2529fe6c174dfaf613bd5b88 | 2,392 | py | Python | Raspberry Pi_UDP/ujiUDP.py | aguspray001/Interface-Communication-of-Flight-Controller-for-Hardware-in-The-Loop-Simulation-Application | 926bcb90da35a808c61669cf8bde11a2a61f4fd2 | [
"MIT"
] | 5 | 2020-08-19T08:49:11.000Z | 2020-11-13T13:16:09.000Z | Raspberry Pi_UDP/ujiUDP.py | aguspray001/Interface-Communication-of-Flight-Controller-for-Hardware-in-The-Loop-Simulation-Application | 926bcb90da35a808c61669cf8bde11a2a61f4fd2 | [
"MIT"
] | null | null | null | Raspberry Pi_UDP/ujiUDP.py | aguspray001/Interface-Communication-of-Flight-Controller-for-Hardware-in-The-Loop-Simulation-Application | 926bcb90da35a808c61669cf8bde11a2a61f4fd2 | [
"MIT"
] | null | null | null | import serial
import time
import struct
import socket
from multiprocessing import Process
import sys
import select
ser = serial.Serial('/dev/serial0',
38400,
parity=serial.PARITY_NONE,
stopbits=serial.STOPBITS_ONE,
bytesize=serial.EIGHTBITS,
timeout=1)
IP_Send = "192.168.137.1"
Port_Send = 49000
IP_Receive = "192.168.137.21"
Port_Receive = 49005
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((IP_Receive, Port_Receive))
packet_size = 1
while True:
data, addr = sock.recvfrom(1024)
packet_size = len(data)
#print("Byte Length ", packet_size)
if packet_size == 329:
header = struct.unpack_from('<4s', data, 0)
# print(header) # DATA
if header == (b'DATA',):
# ID offset = 5
id_data = struct.unpack_from('B', data, 5)
# print(id_data) # ID Data = 18 pitch, roll, yaw
timer = struct.unpack_from('f', data, 17)
timer = round(timer[0], 3)
airspeed = struct.unpack_from('f', data, 53)
airspeed = round(airspeed[0], 3)
groundspeed = struct.unpack_from('f', data, 57)
groundspeed = round(groundspeed[0], 3)
pitch = struct.unpack_from('f', data, 153)
pitch = round(pitch[0], 3)
roll = struct.unpack_from('f', data, 157)
roll = round(roll[0], 2)
yaw = struct.unpack_from('f', data, 161)
yaw = round(yaw[0], 3)
mag = struct.unpack_from('f', data, 165)
mag = round(mag[0], 3)
lat = struct.unpack_from('f', data, 225)
lat = round(lat[0], 3)
lon = struct.unpack_from('f', data, 229)
lon = round(lon[0], 3)
alt = struct.unpack_from('f', data, 237)
alt = round(alt[0], 3)
x = struct.unpack_from('f', data, 261)
x = round(x[0], 3)
y = struct.unpack_from('f', data, 265)
y = round(y[0], 3)
vx = struct.unpack_from('f', data, 273)
vx = round(vx[0], 3)
vy = struct.unpack_from('f', data, 277)
vy = round(vy[0], 3)
vz = struct.unpack_from('f', data, 281)
vz = round(vz[0], 3)
throttle = struct.unpack_from('f', data, 297)
throttle = round(throttle[0], 3)
print(airspeed, pitch, roll, yaw, lat, lon, alt, vx, vy, vz)
elevator = pitch*-0.01
aileron = (roll)*-0.01
rudder = 0
message = struct.pack('<4sBi8fi8f', b'DATA', 0, 25, 1, 0, 0, 0, 0, 0, 0, 0, 11, elevator, aileron, rudder, 0, -999, 0, 0, 0)
sock.sendto(message, (IP_Send, Port_Send)) | 30.666667 | 127 | 0.614548 | 370 | 2,392 | 3.875676 | 0.289189 | 0.150628 | 0.200837 | 0.189679 | 0.239191 | 0.004881 | 0 | 0 | 0 | 0 | 0 | 0.085622 | 0.223662 | 2,392 | 78 | 128 | 30.666667 | 0.686591 | 0.047659 | 0 | 0 | 0 | 0 | 0.033876 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.107692 | 0 | 0.107692 | 0.015385 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |

# --- tests/util.py (marcovannoord/simple_zpl2, MIT) ---
from simple_zpl2 import ZPLDocument
def add_to_zdoc(upc):
zdoc = ZPLDocument()
zdoc.add_barcode(upc)
return zdoc

# --- setup.py (ev3dev-python-tools/ev3devlogging, MIT) ---
from setuptools import setup
import os.path
import sys
setup(
name="ev3devlogging",
version="1.0.1",
description="easy logging library for ev3dev",
long_description="""
easy logging library for ev3dev
For more info: https://github.com/ev3dev-python-tools/ev3devlogging
""",
url="https://github.com/ev3dev-python-tools/ev3devlogging",
author="Harco Kuppens",
author_email="h.kuppens@cs.ru.nl",
license="MIT",
classifiers=[
"Environment :: MacOS X",
"Environment :: Win32 (MS Windows)",
"Environment :: X11 Applications",
"Intended Audience :: Developers",
"Intended Audience :: Education",
"Intended Audience :: End Users/Desktop",
"License :: Freeware",
"License :: OSI Approved :: MIT License",
"Natural Language :: English",
"Operating System :: MacOS",
"Operating System :: Microsoft :: Windows",
"Operating System :: POSIX",
"Operating System :: POSIX :: Linux",
"Programming Language :: Python",
"Programming Language :: Python :: 3",
"Programming Language :: Python :: 3 :: Only",
"Programming Language :: Python :: 3.4",
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
"Topic :: Education",
"Topic :: Software Development",
],
keywords="IDE education programming EV3 mindstorms lego",
platforms=["Windows", "macOS", "Linux"],
python_requires=">=3.6",
# no packages required; 'logging' already in standard distribution
py_modules=["ev3devlogging"]
)

# --- setup.py (Mywayking/python_stallion, Apache-2.0) ---
# -*- coding: utf-8 -*-
"""
python setup.py sdist upload -r pypi
"""
from setuptools import setup, find_packages
from stallions import __version__
VERSION = __version__
readability_lxml = "readability-lxml"
setup(
name='stallions',
version=VERSION,
description='Extract the content of the web page.',
license='',
author='galen',
author_email='galen.wang@rtbasia.com',
classifiers=[
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
],
url='https://github.com/Mywayking/python_stallion',
keywords='Web spider',
packages=find_packages(),
install_requires=[
readability_lxml,
'lxml',
'requests',
'fake-useragent',
],
)

# --- meta_agents/modules/__init__.py (zhanpenghe/meta_agents, MIT) ---
"""Pytorch modules."""
from meta_agents.modules.gaussian_mlp_module import \
GaussianMLPIndependentStdModule, GaussianMLPModule, \
GaussianMLPTwoHeadedModule
from meta_agents.modules.mlp_module import MLPModule
from meta_agents.modules.multi_headed_mlp_module import MultiHeadedMLPModule
__all__ = [
'MLPModule', 'MultiHeadedMLPModule', 'GaussianMLPModule',
'GaussianMLPIndependentStdModule', 'GaussianMLPTwoHeadedModule'
]
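`__all__` as used above only restricts star imports; a quick self-contained check (using a throwaway in-memory module rather than `meta_agents` itself, so nothing here is part of that package):

```python
import sys
import types

# Build a throwaway module whose __all__ exports only 'a'
mod = types.ModuleType("demo_mod")
exec("__all__ = ['a']\na = 1\nb = 2", mod.__dict__)
sys.modules["demo_mod"] = mod

# Star-import from it: 'b' is hidden, 'a' comes through
ns = {}
exec("from demo_mod import *", ns)
print(sorted(k for k in ns if not k.startswith("__")))
```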

# --- tree.py (juhyun0/python_turtle2, Unlicense) ---
import turtle

t = turtle.Turtle()  # 't' was undefined in the original file; standard turtle setup

def tree(length):
if length>5:
t.forward(length)
t.right(20)
tree(length-15)
t.left(40)
tree(length-15)
t.right(20)
t.backward(length)
t.left(90)
t.color("green")
t.speed(1)
tree(90)
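The recursion above doubles at every level; a turtle-free sketch of the same structure counts how many branch segments `tree(90)` draws (this counter is illustrative and not part of the original script):

```python
def count_segments(length):
    # Mirrors tree(): one segment drawn, then two sub-branches 15 shorter
    if length > 5:
        return 1 + 2 * count_segments(length - 15)
    return 0

print(count_segments(90))  # 63 branch segments
```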

# --- uploadfile/views.py (parthrao/Django-Rest, Unlicense) ---
from django.shortcuts import render, get_object_or_404
from django.http import HttpResponse
from django.http import Http404
from django.template import loader
from .models import File
from django.views.generic.edit import CreateView
from django.views import generic
# Create your views here.
def index(request):
all_files = File.objects.all()
# template = loader.get_template('music/index.html')
context = {
'all_files': all_files,
}
# return HttpResponse(template.render(context, request))
return render(request, 'uploadfile/index.html', context)
class FileFormCreate(CreateView):
model = File
    fields = ['name', 'company', 'file_type', 'file_url', 'file_title']

# --- secpy/core/mixins/base_network_client_mixin.py (McKalvan/secpy, MIT) ---
from os import path
from secpy.core.endpoint_enum import EndpointEnum
from secpy.core.network_client import NetworkClient
class BaseNetworkClientMixin:
def __init__(self, user_agent, **kwargs):
"""
        Mixin for adding a network client to a class
@param user_agent: email address to use in headers when making requests to SEC REST API
@param kwargs:
"""
self.__network_client = NetworkClient(user_agent, **kwargs)
def _validate_args_and_make_request(self, endpoint, **kwargs):
assert EndpointEnum.validate_endpoint_kwargs(**kwargs)
return self.__network_client.make_request_json(endpoint, **kwargs)
def _validate_path_and_download_file(self, endpoint, target_path, **kwargs):
assert not path.exists(target_path), "target_path {} already exists!".format(target_path)
return self.__network_client.download_file(endpoint, target_path, **kwargs)
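The composition used by `BaseNetworkClientMixin` can be sketched with stubs; none of the names below come from secpy, and the endpoint/parameter values are made up for illustration:

```python
class StubNetworkClient:
    """Stand-in for a network client, for illustration only."""
    def make_request_json(self, endpoint, **kwargs):
        return {"endpoint": endpoint, "params": kwargs}

class StubClientMixin:
    def __init__(self):
        self.__client = StubNetworkClient()  # name-mangled, like the original

    def _request(self, endpoint, **kwargs):
        return self.__client.make_request_json(endpoint, **kwargs)

class FactsApi(StubClientMixin):
    pass

result = FactsApi()._request("/company-facts", cik="0000320193")
print(result["endpoint"])
```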

# --- example/pyctp2/trader/strategy.py (mmmaaaggg/pyctp_lovelylain, MIT) ---
'''
The ideal form of a strategy.
The trading side must obtain the current day's price-limit (limit-up/limit-down)
levels of the contracts it tracks.
'''
import json
import logging
from ..common.base import BaseObject,inverse_direction
from .position import (POSITION_APPROVE_STATUS,
Position,
Closer,ContractCloser,GlobalCloser,
)
class POPEN(object):
"""
    Data object.
"""
def __init__(self,contract,base_price,direction,closers,extra_hops=30):
self.contract = contract
self.base_price = base_price
self.direction = direction
self.closers = closers
self.extra_hops = extra_hops
self.unit = 1
self.planned = 0
class PCLOSE(object):
"""
    Close positions based on a Position object.
"""
def __init__(self,position,base_price,volume,extra_hops=30):
self.position = position
self.base_price = base_price
        self.unit = volume  # all-or-nothing: succeeds or fails as a whole
self.planned = volume
self.extra_hops = extra_hops
@property
def contract(self):
return self.position.contract
@property
def direction(self):
return inverse_direction(self.position.direction)
class PCLOSE2(object):
"""
    Close by contract; requires iterating over all positions in the prescribed order.
"""
def __init__(self,contract,base_price,direction,volume,extra_hops=30):
self.contract = contract
self.base_price = base_price
self.direction = direction
self.extra_hops = extra_hops
self.volume = volume
class BaseStrategy(object):
def __init__(self,holder):
self._holder = holder
@classmethod
def sname(cls):
return cls.__name__
@property
def name(self):
return self.__class__.__name__
@property
def available_balance(self):
return self._holder.available_balance
def prepare(self,ctick,*args):
        '''Implemented by subclasses only when necessary, to perform shared computation before check_open/close.
'''
pass
def check_open(self,ctick,*contracts):
        ''' Must be implemented by subclasses.
        The open check driven by market ticks.
        Returns instrs and their corresponding closers.
'''
raise NotImplementedError()
def on_approved(self,command,astate):
"""
        Hook point for subclasses.
"""
pass
def on_reject(self,command):
"""
        Hook for subclasses.
"""
pass
def on_done(self,command):
"""
        Hook for subclasses.
"""
pass
def on_progress(self,command):
"""
        Hook for subclasses.
"""
pass
def day_finalize(self):
"""
        Hook for subclasses.
"""
        pass
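`BaseStrategy` is a template-method base: subclasses override `check_open` and the `on_*` hooks. A self-contained sketch of that pattern with stub names (none of these classes, contracts, or commands come from pyctp2):

```python
class StubHolder:
    available_balance = 100000.0

class StubStrategy:
    """Minimal stand-in mirroring the BaseStrategy hook layout."""
    def __init__(self, holder):
        self._holder = holder
        self.events = []

    def check_open(self, ctick, *contracts):
        # Subclass responsibility (the real base class raises NotImplementedError)
        return [c for c in contracts if ctick > 0]

    def on_done(self, command):
        # One of the on_* hooks the base class leaves as a no-op
        self.events.append(("done", command))

s = StubStrategy(StubHolder())
opened = s.check_open(1, "IF2206", "IC2206")
s.on_done("cmd-1")
print(opened, s.events)
```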

# --- mediasiteXMLGen/generator.py (sdsu-its/mediasiteXMLGen, MIT) ---
import pandas as pd
import xml.dom.minidom
import sys
def generate(fileName):
classesDf = pd.read_csv(fileName, sep='\t', header=1)
classesDf.rename(columns={'Start Time': 'startTime', 'End Time': 'endTime', 'Schedule #': 'Schedule'}, inplace=True)
arrValues = classesDf[classesDf['Days'] == 'ARR'].index
classesDf.drop(arrValues, inplace=True)
classesDf.dropna(subset=['Days'], inplace=True)
# Days of the week
Sunday = 'false'
Monday = 'false'
Tuesday = 'false'
Wednesday = 'false'
Thursday = 'false'
Friday = 'false'
Saturday = 'false'
    def convertStartTime(x):
        # Times arrive as floats in HHMM form (e.g. 800.0 -> "08", 1530.0 -> "15").
        # The original split('0') trick broke on hours containing a zero (e.g. 1030),
        # so derive the hour by dropping the two minute digits instead.
        return str(int(float(x)) // 100).zfill(2)
    def convertTime(x):
        # x is an HHMM-style duration string (e.g. "150" means 1h50m);
        # the original int(x[0]) only handled single-digit hours.
        hours, minutes = divmod(int(x), 100)
        return hours * 60 + minutes
recurenceArray = []
scheduleArray = []
subjectArray = []
for row in classesDf.itertuples():
schedule = row.Schedule
subject = row.Subject
days = row.Days
startTime = row.startTime
endTime = row.endTime
duration = str(int(endTime - startTime))
timeInMinutes = convertTime(duration)
beginDateTime = convertStartTime(startTime)
        # Map the meeting-day codes onto per-day flags in the order
        # (Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday).
        day_map = {
            'MWF':  ('false', 'true', 'false', 'true', 'false', 'true', 'false'),
            'TTH':  ('false', 'false', 'true', 'false', 'true', 'false', 'false'),
            'TTHF': ('false', 'false', 'true', 'false', 'true', 'true', 'false'),
            'M':    ('false', 'true', 'false', 'false', 'false', 'false', 'false'),
            'T':    ('false', 'false', 'true', 'false', 'false', 'false', 'false'),
            'W':    ('false', 'false', 'false', 'true', 'false', 'false', 'false'),
            'Th':   ('false', 'false', 'false', 'false', 'true', 'false', 'false'),
            'F':    ('false', 'false', 'false', 'false', 'false', 'true', 'false'),
        }
        if days in day_map:
            Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday = day_map[days]
recurrences = '<?xml version="1.0" encoding="utf-8" ?><RecorderScheduleImport xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema"><RecorderSchedules><RecorderSchedule><RecorderName>My Recorder</RecorderName><PresentationNamingFormat>ScheduleNameAndNumber</PresentationNamingFormat><AdvanceCreationTimeInMinutes>60</AdvanceCreationTimeInMinutes><AdvanceLoadTimeInMinutes>5</AdvanceLoadTimeInMinutes><ScheduledOperations>None</ScheduledOperations><NotifyPresenters>true</NotifyPresenters><NotificationEmailAddresses><NotificationEmailAddress>notification_1@example.com</NotificationEmailAddress><NotificationEmailAddress>notification_2@example.com</NotificationEmailAddress></NotificationEmailAddresses><DeleteInactive>true</DeleteInactive><Modules><ModuleOverride><ModuleId>test::module::1</ModuleId><ModuleName>Test Module From Import</ModuleName></ModuleOverride><ModuleOverride><ModuleId>test::module::2</ModuleId><Permissions><AceEntry><DirectoryEntry>schedule::import::role</DirectoryEntry><Permission>3</Permission></AceEntry></Permissions></ModuleOverride></Modules><Recurrences><Recurrence><BeginDateTime>2022-01-01T{}:00:00</BeginDateTime><EndDateTime>2022-05-30T18:00:00</EndDateTime><RecordingDurationInMinutes>{}</RecordingDurationInMinutes><AlwaysExcludeHolidays>true</AlwaysExcludeHolidays><WeeklySchedule><RecurrenceFrequency>1</RecurrenceFrequency><Sunday>{}</Sunday><Monday>{}</Monday><Tuesday>{}</Tuesday><Wednesday>{}</Wednesday><Thursday>{}</Thursday><Friday>{}</Friday><Saturday>{}</Saturday></WeeklySchedule></Recurrence></Recurrences></RecorderSchedule></RecorderSchedules></RecorderScheduleImport>'.format(beginDateTime, timeInMinutes, Sunday, Monday, Tuesday, Wednesday, Thursday, Friday, Saturday)
        myXML = xml.dom.minidom.parseString(recurrences)
        recurrencData = myXML.toprettyxml()
        print(recurrencData)
        # Write each schedule's recurrence XML to its own file
        with open('{}_{}.xml'.format(subject, schedule), 'w') as f:
            f.write(recurrencData)
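The HHMM-style time arithmetic used in `generate` (subtracting clock times like 950 - 800, then converting the result to minutes) can be exercised in isolation. This standalone helper mirrors that conversion logic rather than calling `generate`, which needs a TSV file on disk:

```python
def hhmm_to_minutes(x):
    # "150" (1h50m) -> 110 minutes; "50" -> 50 minutes
    hours, minutes = divmod(int(x), 100)
    return hours * 60 + minutes

print(hhmm_to_minutes("150"), hhmm_to_minutes("50"))
```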

# --- ctreport_selenium/ctreport_html/detail_view.py (naveens33/ctreport-selenium, MIT) ---
from ctreport_selenium.ctreport_html.testdetail import summary, details
def content(status, tests, reference):
c = '''
<div id="test-view" class="wrapper" style="display: none;">
<div class="container-fluid">
<section class="pading">
<div class="row mt-3">
<!--test summary-->
''' + summary.content(status, tests,reference) + '''
<!-- test details-->
''' + details.content(tests) + '''
</div>
</section>
</div>
</div>
'''
    return c

# --- accounts/models.py (olamijinadebayo/hood-watch, MIT) ---
from django.db import models
from django.contrib import auth
# Create your models here.
class User(auth.models.User, auth.models.PermissionsMixin):
    '''
    Custom user model inheriting from django.contrib.auth's
    User and PermissionsMixin classes.
    '''
def __str__(self):
'''
        This is a string representation of each user
'''
return "@{}".format(self.username)

# --- supernodes/urls.py (ockibagusp/cloud-platform, MIT) ---
from django.conf.urls import url
from rest_framework.urlpatterns import format_suffix_patterns
from supernodes import views as supernode_views
from nodes import views as node_views
from sensors import views as sensor_views
urlpatterns = [
url(r'^$', supernode_views.SuperNodesList.as_view(), name="supernodes-all"),
url(r'^(?P<pk>\w+)/$', supernode_views.SupernodeDetail.as_view(), name="supernodes-detail"),
url(r'^(?P<pk>\w+)/nodes/$', node_views.NodesList.as_view(), name="supernodes-node-list"),
url(r'^(?P<pk>\w+)/sensors/$', sensor_views.SupernodeSensorsList.as_view(), name="supernodes-sensors-list"),
url(r'^(?P<pk>\w+)/sensors/(?P<sensorid>\w+)/$', sensor_views.SupernodeSensorDetail.as_view(), name="supernodes-sensors-detail"),
]
urlpatterns = format_suffix_patterns(urlpatterns)

# --- phenotype/__init__.py (kamalm87/phenotype, MIT) ---
# -*- coding: utf-8 -*-
"""Top-level package for phenotype."""
__author__ = """Kamal McDermott"""
__email__ = 'kamal.mcdermott@gmail.com'
__version__ = '0.1.0'
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
# __all__ = [ 'Access', 'Assignment', 'Func' ]
from sys import path as __sys_path__
from os.path import abspath as __abs_path__
__sys_path__.insert(0, __abs_path__('.'))
# import phenotype.Access as Access
# import phenotype.Assignment as Assignment
# import phenotype.Collection as Collection
# import phenotype.Core as Core
# import phenotype.Func as Func
# import phenotype.Interfaces as Interfaces
# import phenotype.Predicate as Predicate
# import phenotype.Result as Result
# import phenotype.States as States
# import phenotype.System as System
# import phenotype.Text as Text

# --- tasks/task_09.py (AlexRogalskiy/Python, MIT) ---
# Take the user's input
words = input("Enter some text to translate to pig latin: ")
print("You entered:", words)
# Now I need to break apart the words into a list
words = words.split(' ')
# Now words is a list, so I can manipulate each one using a loop
for i in words:
    if len(i) >= 3:  # I only want to translate words greater than 3 characters
        i = i + "%say" % (i[0])
        i = i[1:]
        print(i)
    else:
        pass
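The translation rule above is easier to test as a function; this sketch reproduces it in Python 3 (the function name is mine, not from the original file):

```python
def pig_latin(word):
    # Same rule as the loop above: words of 3+ letters move their
    # first letter to the end and gain "ay"; shorter words pass through.
    if len(word) >= 3:
        return word[1:] + word[0] + "ay"
    return word

print(pig_latin("hello"))  # ellohay
```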

# --- lib/crd.py (TierMobility/aws-auth-operator, MIT) ---
from typing import List, Dict
from lib.constants import CRD_GROUP, CRD_VERSION, CRD_KIND
def build_aws_auth_mapping(mappings: List, name: str) -> Dict:
return {
"apiVersion": CRD_GROUP + "/" + CRD_VERSION,
"kind": CRD_KIND,
        "metadata": {"annotations": {}, "labels": {}, "name": name},
"spec": {"mappings": mappings},
}
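The returned body can be checked standalone by stubbing the constants (the group/version/kind values below are illustrative; the real ones live in `lib.constants`):

```python
from typing import Dict, List

# Stub constants standing in for lib.constants (illustrative values)
CRD_GROUP, CRD_VERSION, CRD_KIND = "example.dev", "v1", "AwsAuthMapping"

def build_aws_auth_mapping(mappings: List, name: str) -> Dict:
    return {
        "apiVersion": CRD_GROUP + "/" + CRD_VERSION,
        "kind": CRD_KIND,
        "metadata": {"annotations": {}, "labels": {}, "name": name},
        "spec": {"mappings": mappings},
    }

body = build_aws_auth_mapping([{"role": "dev"}], "team-a")
print(body["apiVersion"])  # example.dev/v1
```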

# --- migrations/20210714_01_kvALS-add-artist-last-post-imported-at.py (ThatOneAnimeGuy/seiso, BSD-3-Clause) ---
"""
add artist last post imported at
"""
from yoyo import step
__depends__ = {'20210712_02_9n9bA-make-display-name-unique'}
steps = [
step(
"""
ALTER TABLE artist ADD COLUMN last_post_imported_at timestamp NULL;
UPDATE artist SET last_post_imported_at = last_indexed;
CREATE INDEX ON artist (last_post_imported_at);
"""
)
]

# --- test.py (ssghost/MyStockPredictor, MIT) ---
from stock_predictor import Predictor
from train import symbols, pkey
def main():
prd = Predictor(symbol=symbols, key=pkey)
print(symbols, prd.predict())
if __name__ == "__main__":
    main()

# --- inspector/checks/engine/executors/python_executor.py (yoyowallet/inspector, Apache-2.0) ---
from . import CheckExecutor
from ...constants import CHECK_TYPES
class PythonExecutor(CheckExecutor):
    supported_check_types = (
        CHECK_TYPES.NUMBER,
        CHECK_TYPES.STRING,
        CHECK_TYPES.DATE,
    )
def execute(self, check_logic):
if self.check_type == CHECK_TYPES.STRING:
return str(check_logic)
if self.check_type == CHECK_TYPES.NUMBER:
return float(check_logic)
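The `execute` dispatch above coerces a value by its declared check type. A self-contained sketch of the same dispatch with the constants stubbed (this `CHECK_TYPES` enum is illustrative, not inspector's real constants module):

```python
from enum import Enum

class CHECK_TYPES(Enum):  # stub standing in for inspector's constants
    NUMBER = "number"
    STRING = "string"
    DATE = "date"

def execute(check_type, check_logic):
    # Mirrors PythonExecutor.execute: coerce by declared type.
    # Note the original implicitly returns None for DATE checks.
    if check_type == CHECK_TYPES.STRING:
        return str(check_logic)
    if check_type == CHECK_TYPES.NUMBER:
        return float(check_logic)

print(execute(CHECK_TYPES.NUMBER, "3.5"))  # 3.5
```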
| 26.875 | 49 | 0.669767 | 50 | 430 | 5.5 | 0.42 | 0.254545 | 0.116364 | 0.116364 | 0.254545 | 0.254545 | 0.254545 | 0.254545 | 0 | 0 | 0 | 0 | 0.251163 | 430 | 15 | 50 | 28.666667 | 0.854037 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.083333 | false | 0 | 0.166667 | 0 | 0.583333 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
# File: nardis/http/utils.py (yoongkang/nardis, MIT)

from typing import List, Tuple
from urllib.parse import parse_qs as libparse_qs

from nardis.utils import decode_bytes


def parse_cookie(cookie: str) -> dict:
    if cookie:
        key_values = (x.split("=", 1) for x in cookie.split("; "))
        return {
            (x[0] if len(x) > 1 else ""): (x[1] if len(x) > 1 else x[0])
            for x in key_values
        }
    return {}


def parse_headers(headers: List[Tuple[bytes, bytes]]):
    return {decode_bytes(k): decode_bytes(v) for (k, v) in headers}


def parse_qs(qs: bytes):
    return {
        decode_bytes(k): [*map(decode_bytes, v)]
        for k, v in libparse_qs(qs).items()
    }
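The cookie parser above can be exercised on its own, since it needs neither `decode_bytes` nor the rest of nardis. A standalone copy, to show its behavior (including the edge case of a valueless cookie segment, which gets an empty-string key):

```python
# Standalone copy of parse_cookie from the file above, so the behavior can
# be checked without installing nardis.
def parse_cookie(cookie: str) -> dict:
    if cookie:
        key_values = (x.split("=", 1) for x in cookie.split("; "))
        return {
            (x[0] if len(x) > 1 else ""): (x[1] if len(x) > 1 else x[0])
            for x in key_values
        }
    return {}


print(parse_cookie("session=abc123; theme=dark"))
# {'session': 'abc123', 'theme': 'dark'}
print(parse_cookie(""))         # {}
print(parse_cookie("flagonly")) # {'': 'flagonly'}
```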
# File: src/poliastro/earth/plotting/utils.py (niharsalunke/poliastro, MIT)

"""Holds utilities related to Earth plotting."""

EARTH_PALETTE = {
    "land_color": "#9fc164",
    "ocean_color": "#b2d9ff",
    "lake_color": "#e9eff9",
    "dessert_color": "#d8c596",  # "#" prefix restored so all values are valid hex colors
}
"""A color palette based on Earth colors."""
# File: src/oci/data_safe/models/discovery_job_result.py (LaudateCorpus1/oci-python-sdk, Apache-2.0 / BSD-3-Clause)
] | null | null | null | # coding: utf-8
# Copyright (c) 2016, 2022, Oracle and/or its affiliates. All rights reserved.
# This software is dual-licensed to you under the Universal Permissive License (UPL) 1.0 as shown at https://oss.oracle.com/licenses/upl or Apache License 2.0 as shown at http://www.apache.org/licenses/LICENSE-2.0. You may choose either license.
from oci.util import formatted_flat_dict, NONE_SENTINEL, value_allowed_none_or_none_sentinel # noqa: F401
from oci.decorators import init_model_state_from_kwargs


@init_model_state_from_kwargs
class DiscoveryJobResult(object):
"""
A discovery job result representing a sensitive column. It can be one of the following three types:
NEW: A new sensitive column in the target database that is not in the sensitive data model.
DELETED: A column that is present in the sensitive data model but has been deleted from the target database.
MODIFIED: A column that is present in the target database as well as the sensitive data model but some of its attributes have been modified.
"""
#: A constant which can be used with the discovery_type property of a DiscoveryJobResult.
#: This constant has a value of "NEW"
DISCOVERY_TYPE_NEW = "NEW"
#: A constant which can be used with the discovery_type property of a DiscoveryJobResult.
#: This constant has a value of "MODIFIED"
DISCOVERY_TYPE_MODIFIED = "MODIFIED"
#: A constant which can be used with the discovery_type property of a DiscoveryJobResult.
#: This constant has a value of "DELETED"
DISCOVERY_TYPE_DELETED = "DELETED"
#: A constant which can be used with the object_type property of a DiscoveryJobResult.
#: This constant has a value of "TABLE"
OBJECT_TYPE_TABLE = "TABLE"
#: A constant which can be used with the object_type property of a DiscoveryJobResult.
#: This constant has a value of "EDITIONING_VIEW"
OBJECT_TYPE_EDITIONING_VIEW = "EDITIONING_VIEW"
#: A constant which can be used with the relation_type property of a DiscoveryJobResult.
#: This constant has a value of "NONE"
RELATION_TYPE_NONE = "NONE"
#: A constant which can be used with the relation_type property of a DiscoveryJobResult.
#: This constant has a value of "APP_DEFINED"
RELATION_TYPE_APP_DEFINED = "APP_DEFINED"
#: A constant which can be used with the relation_type property of a DiscoveryJobResult.
#: This constant has a value of "DB_DEFINED"
RELATION_TYPE_DB_DEFINED = "DB_DEFINED"
#: A constant which can be used with the planned_action property of a DiscoveryJobResult.
#: This constant has a value of "NONE"
PLANNED_ACTION_NONE = "NONE"
#: A constant which can be used with the planned_action property of a DiscoveryJobResult.
#: This constant has a value of "ACCEPT"
PLANNED_ACTION_ACCEPT = "ACCEPT"
#: A constant which can be used with the planned_action property of a DiscoveryJobResult.
#: This constant has a value of "INVALIDATE"
PLANNED_ACTION_INVALIDATE = "INVALIDATE"
#: A constant which can be used with the planned_action property of a DiscoveryJobResult.
#: This constant has a value of "REJECT"
PLANNED_ACTION_REJECT = "REJECT"
    def __init__(self, **kwargs):
        """
        Initializes a new DiscoveryJobResult object with values from keyword arguments.
        The following keyword arguments are supported (corresponding to the getters/setters of this class):

        :param key:
            The value to assign to the key property of this DiscoveryJobResult.
        :type key: str

        :param discovery_type:
            The value to assign to the discovery_type property of this DiscoveryJobResult.
            Allowed values for this property are: "NEW", "MODIFIED", "DELETED", 'UNKNOWN_ENUM_VALUE'.
            Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
        :type discovery_type: str

        :param sensitive_columnkey:
            The value to assign to the sensitive_columnkey property of this DiscoveryJobResult.
        :type sensitive_columnkey: str

        :param app_name:
            The value to assign to the app_name property of this DiscoveryJobResult.
        :type app_name: str

        :param schema_name:
            The value to assign to the schema_name property of this DiscoveryJobResult.
        :type schema_name: str

        :param object_name:
            The value to assign to the object_name property of this DiscoveryJobResult.
        :type object_name: str

        :param column_name:
            The value to assign to the column_name property of this DiscoveryJobResult.
        :type column_name: str

        :param object_type:
            The value to assign to the object_type property of this DiscoveryJobResult.
            Allowed values for this property are: "TABLE", "EDITIONING_VIEW", 'UNKNOWN_ENUM_VALUE'.
            Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
        :type object_type: str

        :param data_type:
            The value to assign to the data_type property of this DiscoveryJobResult.
        :type data_type: str

        :param sensitive_type_id:
            The value to assign to the sensitive_type_id property of this DiscoveryJobResult.
        :type sensitive_type_id: str

        :param parent_column_keys:
            The value to assign to the parent_column_keys property of this DiscoveryJobResult.
        :type parent_column_keys: list[str]

        :param relation_type:
            The value to assign to the relation_type property of this DiscoveryJobResult.
            Allowed values for this property are: "NONE", "APP_DEFINED", "DB_DEFINED", 'UNKNOWN_ENUM_VALUE'.
            Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
        :type relation_type: str

        :param estimated_data_value_count:
            The value to assign to the estimated_data_value_count property of this DiscoveryJobResult.
        :type estimated_data_value_count: int

        :param sample_data_values:
            The value to assign to the sample_data_values property of this DiscoveryJobResult.
        :type sample_data_values: list[str]

        :param app_defined_child_column_keys:
            The value to assign to the app_defined_child_column_keys property of this DiscoveryJobResult.
        :type app_defined_child_column_keys: list[str]

        :param db_defined_child_column_keys:
            The value to assign to the db_defined_child_column_keys property of this DiscoveryJobResult.
        :type db_defined_child_column_keys: list[str]

        :param planned_action:
            The value to assign to the planned_action property of this DiscoveryJobResult.
            Allowed values for this property are: "NONE", "ACCEPT", "INVALIDATE", "REJECT", 'UNKNOWN_ENUM_VALUE'.
            Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.
        :type planned_action: str

        :param is_result_applied:
            The value to assign to the is_result_applied property of this DiscoveryJobResult.
        :type is_result_applied: bool

        :param modified_attributes:
            The value to assign to the modified_attributes property of this DiscoveryJobResult.
        :type modified_attributes: oci.data_safe.models.ModifiedAttributes
        """
        self.swagger_types = {
            'key': 'str',
            'discovery_type': 'str',
            'sensitive_columnkey': 'str',
            'app_name': 'str',
            'schema_name': 'str',
            'object_name': 'str',
            'column_name': 'str',
            'object_type': 'str',
            'data_type': 'str',
            'sensitive_type_id': 'str',
            'parent_column_keys': 'list[str]',
            'relation_type': 'str',
            'estimated_data_value_count': 'int',
            'sample_data_values': 'list[str]',
            'app_defined_child_column_keys': 'list[str]',
            'db_defined_child_column_keys': 'list[str]',
            'planned_action': 'str',
            'is_result_applied': 'bool',
            'modified_attributes': 'ModifiedAttributes'
        }

        self.attribute_map = {
            'key': 'key',
            'discovery_type': 'discoveryType',
            'sensitive_columnkey': 'sensitiveColumnkey',
            'app_name': 'appName',
            'schema_name': 'schemaName',
            'object_name': 'objectName',
            'column_name': 'columnName',
            'object_type': 'objectType',
            'data_type': 'dataType',
            'sensitive_type_id': 'sensitiveTypeId',
            'parent_column_keys': 'parentColumnKeys',
            'relation_type': 'relationType',
            'estimated_data_value_count': 'estimatedDataValueCount',
            'sample_data_values': 'sampleDataValues',
            'app_defined_child_column_keys': 'appDefinedChildColumnKeys',
            'db_defined_child_column_keys': 'dbDefinedChildColumnKeys',
            'planned_action': 'plannedAction',
            'is_result_applied': 'isResultApplied',
            'modified_attributes': 'modifiedAttributes'
        }

        self._key = None
        self._discovery_type = None
        self._sensitive_columnkey = None
        self._app_name = None
        self._schema_name = None
        self._object_name = None
        self._column_name = None
        self._object_type = None
        self._data_type = None
        self._sensitive_type_id = None
        self._parent_column_keys = None
        self._relation_type = None
        self._estimated_data_value_count = None
        self._sample_data_values = None
        self._app_defined_child_column_keys = None
        self._db_defined_child_column_keys = None
        self._planned_action = None
        self._is_result_applied = None
        self._modified_attributes = None
    @property
    def key(self):
        """
        **[Required]** Gets the key of this DiscoveryJobResult.
        The unique key that identifies the discovery result.

        :return: The key of this DiscoveryJobResult.
        :rtype: str
        """
        return self._key

    @key.setter
    def key(self, key):
        """
        Sets the key of this DiscoveryJobResult.
        The unique key that identifies the discovery result.

        :param key: The key of this DiscoveryJobResult.
        :type: str
        """
        self._key = key

    @property
    def discovery_type(self):
        """
        **[Required]** Gets the discovery_type of this DiscoveryJobResult.
        The type of the discovery result. It can be one of the following three types:
        NEW: A new sensitive column in the target database that is not in the sensitive data model.
        DELETED: A column that is present in the sensitive data model but has been deleted from the target database.
        MODIFIED: A column that is present in the target database as well as the sensitive data model but some of its attributes have been modified.

        Allowed values for this property are: "NEW", "MODIFIED", "DELETED", 'UNKNOWN_ENUM_VALUE'.
        Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.

        :return: The discovery_type of this DiscoveryJobResult.
        :rtype: str
        """
        return self._discovery_type

    @discovery_type.setter
    def discovery_type(self, discovery_type):
        """
        Sets the discovery_type of this DiscoveryJobResult.
        The type of the discovery result. It can be one of the following three types:
        NEW: A new sensitive column in the target database that is not in the sensitive data model.
        DELETED: A column that is present in the sensitive data model but has been deleted from the target database.
        MODIFIED: A column that is present in the target database as well as the sensitive data model but some of its attributes have been modified.

        :param discovery_type: The discovery_type of this DiscoveryJobResult.
        :type: str
        """
        allowed_values = ["NEW", "MODIFIED", "DELETED"]
        if not value_allowed_none_or_none_sentinel(discovery_type, allowed_values):
            discovery_type = 'UNKNOWN_ENUM_VALUE'
        self._discovery_type = discovery_type
    @property
    def sensitive_columnkey(self):
        """
        Gets the sensitive_columnkey of this DiscoveryJobResult.
        The unique key that identifies the sensitive column represented by the discovery result.

        :return: The sensitive_columnkey of this DiscoveryJobResult.
        :rtype: str
        """
        return self._sensitive_columnkey

    @sensitive_columnkey.setter
    def sensitive_columnkey(self, sensitive_columnkey):
        """
        Sets the sensitive_columnkey of this DiscoveryJobResult.
        The unique key that identifies the sensitive column represented by the discovery result.

        :param sensitive_columnkey: The sensitive_columnkey of this DiscoveryJobResult.
        :type: str
        """
        self._sensitive_columnkey = sensitive_columnkey

    @property
    def app_name(self):
        """
        Gets the app_name of this DiscoveryJobResult.
        The name of the application. An application is an entity that is identified by a schema and stores sensitive information for that schema. Its value will be same as schemaName, if no value is passed.

        :return: The app_name of this DiscoveryJobResult.
        :rtype: str
        """
        return self._app_name

    @app_name.setter
    def app_name(self, app_name):
        """
        Sets the app_name of this DiscoveryJobResult.
        The name of the application. An application is an entity that is identified by a schema and stores sensitive information for that schema. Its value will be same as schemaName, if no value is passed.

        :param app_name: The app_name of this DiscoveryJobResult.
        :type: str
        """
        self._app_name = app_name

    @property
    def schema_name(self):
        """
        **[Required]** Gets the schema_name of this DiscoveryJobResult.
        The database schema that contains the sensitive column.

        :return: The schema_name of this DiscoveryJobResult.
        :rtype: str
        """
        return self._schema_name

    @schema_name.setter
    def schema_name(self, schema_name):
        """
        Sets the schema_name of this DiscoveryJobResult.
        The database schema that contains the sensitive column.

        :param schema_name: The schema_name of this DiscoveryJobResult.
        :type: str
        """
        self._schema_name = schema_name
    @property
    def object_name(self):
        """
        **[Required]** Gets the object_name of this DiscoveryJobResult.
        The database object that contains the sensitive column.

        :return: The object_name of this DiscoveryJobResult.
        :rtype: str
        """
        return self._object_name

    @object_name.setter
    def object_name(self, object_name):
        """
        Sets the object_name of this DiscoveryJobResult.
        The database object that contains the sensitive column.

        :param object_name: The object_name of this DiscoveryJobResult.
        :type: str
        """
        self._object_name = object_name

    @property
    def column_name(self):
        """
        **[Required]** Gets the column_name of this DiscoveryJobResult.
        The name of the sensitive column.

        :return: The column_name of this DiscoveryJobResult.
        :rtype: str
        """
        return self._column_name

    @column_name.setter
    def column_name(self, column_name):
        """
        Sets the column_name of this DiscoveryJobResult.
        The name of the sensitive column.

        :param column_name: The column_name of this DiscoveryJobResult.
        :type: str
        """
        self._column_name = column_name

    @property
    def object_type(self):
        """
        **[Required]** Gets the object_type of this DiscoveryJobResult.
        The type of the database object that contains the sensitive column.

        Allowed values for this property are: "TABLE", "EDITIONING_VIEW", 'UNKNOWN_ENUM_VALUE'.
        Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.

        :return: The object_type of this DiscoveryJobResult.
        :rtype: str
        """
        return self._object_type

    @object_type.setter
    def object_type(self, object_type):
        """
        Sets the object_type of this DiscoveryJobResult.
        The type of the database object that contains the sensitive column.

        :param object_type: The object_type of this DiscoveryJobResult.
        :type: str
        """
        allowed_values = ["TABLE", "EDITIONING_VIEW"]
        if not value_allowed_none_or_none_sentinel(object_type, allowed_values):
            object_type = 'UNKNOWN_ENUM_VALUE'
        self._object_type = object_type
    @property
    def data_type(self):
        """
        **[Required]** Gets the data_type of this DiscoveryJobResult.
        The data type of the sensitive column.

        :return: The data_type of this DiscoveryJobResult.
        :rtype: str
        """
        return self._data_type

    @data_type.setter
    def data_type(self, data_type):
        """
        Sets the data_type of this DiscoveryJobResult.
        The data type of the sensitive column.

        :param data_type: The data_type of this DiscoveryJobResult.
        :type: str
        """
        self._data_type = data_type

    @property
    def sensitive_type_id(self):
        """
        Gets the sensitive_type_id of this DiscoveryJobResult.
        The OCID of the sensitive type associated with the sensitive column.

        :return: The sensitive_type_id of this DiscoveryJobResult.
        :rtype: str
        """
        return self._sensitive_type_id

    @sensitive_type_id.setter
    def sensitive_type_id(self, sensitive_type_id):
        """
        Sets the sensitive_type_id of this DiscoveryJobResult.
        The OCID of the sensitive type associated with the sensitive column.

        :param sensitive_type_id: The sensitive_type_id of this DiscoveryJobResult.
        :type: str
        """
        self._sensitive_type_id = sensitive_type_id

    @property
    def parent_column_keys(self):
        """
        Gets the parent_column_keys of this DiscoveryJobResult.
        Unique keys identifying the columns that are parents of the sensitive column. At present, it tracks a single parent only.

        :return: The parent_column_keys of this DiscoveryJobResult.
        :rtype: list[str]
        """
        return self._parent_column_keys

    @parent_column_keys.setter
    def parent_column_keys(self, parent_column_keys):
        """
        Sets the parent_column_keys of this DiscoveryJobResult.
        Unique keys identifying the columns that are parents of the sensitive column. At present, it tracks a single parent only.

        :param parent_column_keys: The parent_column_keys of this DiscoveryJobResult.
        :type: list[str]
        """
        self._parent_column_keys = parent_column_keys
    @property
    def relation_type(self):
        """
        **[Required]** Gets the relation_type of this DiscoveryJobResult.
        The type of referential relationship the sensitive column has with its parent. NONE indicates that the sensitive
        column does not have a parent. DB_DEFINED indicates that the relationship is defined in the database dictionary.
        APP_DEFINED indicates that the relationship is defined at the application level and not in the database dictionary.

        Allowed values for this property are: "NONE", "APP_DEFINED", "DB_DEFINED", 'UNKNOWN_ENUM_VALUE'.
        Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.

        :return: The relation_type of this DiscoveryJobResult.
        :rtype: str
        """
        return self._relation_type

    @relation_type.setter
    def relation_type(self, relation_type):
        """
        Sets the relation_type of this DiscoveryJobResult.
        The type of referential relationship the sensitive column has with its parent. NONE indicates that the sensitive
        column does not have a parent. DB_DEFINED indicates that the relationship is defined in the database dictionary.
        APP_DEFINED indicates that the relationship is defined at the application level and not in the database dictionary.

        :param relation_type: The relation_type of this DiscoveryJobResult.
        :type: str
        """
        allowed_values = ["NONE", "APP_DEFINED", "DB_DEFINED"]
        if not value_allowed_none_or_none_sentinel(relation_type, allowed_values):
            relation_type = 'UNKNOWN_ENUM_VALUE'
        self._relation_type = relation_type

    @property
    def estimated_data_value_count(self):
        """
        **[Required]** Gets the estimated_data_value_count of this DiscoveryJobResult.
        The estimated number of data values the column has in the associated database.

        :return: The estimated_data_value_count of this DiscoveryJobResult.
        :rtype: int
        """
        return self._estimated_data_value_count

    @estimated_data_value_count.setter
    def estimated_data_value_count(self, estimated_data_value_count):
        """
        Sets the estimated_data_value_count of this DiscoveryJobResult.
        The estimated number of data values the column has in the associated database.

        :param estimated_data_value_count: The estimated_data_value_count of this DiscoveryJobResult.
        :type: int
        """
        self._estimated_data_value_count = estimated_data_value_count

    @property
    def sample_data_values(self):
        """
        Gets the sample_data_values of this DiscoveryJobResult.
        Original data values collected for the sensitive column from the associated database. Sample data helps review
        the column and ensure that it actually contains sensitive data. Note that sample data is retrieved by a data
        discovery job only if the isSampleDataCollectionEnabled attribute is set to true. At present, only one data
        value is collected per sensitive column.

        :return: The sample_data_values of this DiscoveryJobResult.
        :rtype: list[str]
        """
        return self._sample_data_values

    @sample_data_values.setter
    def sample_data_values(self, sample_data_values):
        """
        Sets the sample_data_values of this DiscoveryJobResult.
        Original data values collected for the sensitive column from the associated database. Sample data helps review
        the column and ensure that it actually contains sensitive data. Note that sample data is retrieved by a data
        discovery job only if the isSampleDataCollectionEnabled attribute is set to true. At present, only one data
        value is collected per sensitive column.

        :param sample_data_values: The sample_data_values of this DiscoveryJobResult.
        :type: list[str]
        """
        self._sample_data_values = sample_data_values
    @property
    def app_defined_child_column_keys(self):
        """
        Gets the app_defined_child_column_keys of this DiscoveryJobResult.
        Unique keys identifying the columns that are application-level (non-dictionary) children of the sensitive column.

        :return: The app_defined_child_column_keys of this DiscoveryJobResult.
        :rtype: list[str]
        """
        return self._app_defined_child_column_keys

    @app_defined_child_column_keys.setter
    def app_defined_child_column_keys(self, app_defined_child_column_keys):
        """
        Sets the app_defined_child_column_keys of this DiscoveryJobResult.
        Unique keys identifying the columns that are application-level (non-dictionary) children of the sensitive column.

        :param app_defined_child_column_keys: The app_defined_child_column_keys of this DiscoveryJobResult.
        :type: list[str]
        """
        self._app_defined_child_column_keys = app_defined_child_column_keys

    @property
    def db_defined_child_column_keys(self):
        """
        Gets the db_defined_child_column_keys of this DiscoveryJobResult.
        Unique keys identifying the columns that are database-level (dictionary-defined) children of the sensitive column.

        :return: The db_defined_child_column_keys of this DiscoveryJobResult.
        :rtype: list[str]
        """
        return self._db_defined_child_column_keys

    @db_defined_child_column_keys.setter
    def db_defined_child_column_keys(self, db_defined_child_column_keys):
        """
        Sets the db_defined_child_column_keys of this DiscoveryJobResult.
        Unique keys identifying the columns that are database-level (dictionary-defined) children of the sensitive column.

        :param db_defined_child_column_keys: The db_defined_child_column_keys of this DiscoveryJobResult.
        :type: list[str]
        """
        self._db_defined_child_column_keys = db_defined_child_column_keys

    @property
    def planned_action(self):
        """
        **[Required]** Gets the planned_action of this DiscoveryJobResult.
        Specifies how to process the discovery result. It's set to NONE by default. Use the PatchDiscoveryJobResults operation to update this attribute. You can choose one of the following options:
        ACCEPT: To accept the discovery result and update the sensitive data model to reflect the changes.
        REJECT: To reject the discovery result so that it doesn't change the sensitive data model.
        INVALIDATE: To invalidate a newly discovered column. It adds the column to the sensitive data model but marks it as invalid. It helps track false positives and ensure that they aren't reported by future discovery jobs.
        After specifying the planned action, you can use the ApplyDiscoveryJobResults operation to automatically process the discovery results.

        Allowed values for this property are: "NONE", "ACCEPT", "INVALIDATE", "REJECT", 'UNKNOWN_ENUM_VALUE'.
        Any unrecognized values returned by a service will be mapped to 'UNKNOWN_ENUM_VALUE'.

        :return: The planned_action of this DiscoveryJobResult.
        :rtype: str
        """
        return self._planned_action

    @planned_action.setter
    def planned_action(self, planned_action):
        """
        Sets the planned_action of this DiscoveryJobResult.
        Specifies how to process the discovery result. It's set to NONE by default. Use the PatchDiscoveryJobResults operation to update this attribute. You can choose one of the following options:
        ACCEPT: To accept the discovery result and update the sensitive data model to reflect the changes.
        REJECT: To reject the discovery result so that it doesn't change the sensitive data model.
        INVALIDATE: To invalidate a newly discovered column. It adds the column to the sensitive data model but marks it as invalid. It helps track false positives and ensure that they aren't reported by future discovery jobs.
        After specifying the planned action, you can use the ApplyDiscoveryJobResults operation to automatically process the discovery results.

        :param planned_action: The planned_action of this DiscoveryJobResult.
        :type: str
        """
        allowed_values = ["NONE", "ACCEPT", "INVALIDATE", "REJECT"]
        if not value_allowed_none_or_none_sentinel(planned_action, allowed_values):
            planned_action = 'UNKNOWN_ENUM_VALUE'
        self._planned_action = planned_action
    @property
    def is_result_applied(self):
        """
        **[Required]** Gets the is_result_applied of this DiscoveryJobResult.
        Indicates if the discovery result has been processed. You can update this attribute using the PatchDiscoveryJobResults
        operation to track whether the discovery result has already been processed and applied to the sensitive data model.

        :return: The is_result_applied of this DiscoveryJobResult.
        :rtype: bool
        """
        return self._is_result_applied

    @is_result_applied.setter
    def is_result_applied(self, is_result_applied):
        """
        Sets the is_result_applied of this DiscoveryJobResult.
        Indicates if the discovery result has been processed. You can update this attribute using the PatchDiscoveryJobResults
        operation to track whether the discovery result has already been processed and applied to the sensitive data model.

        :param is_result_applied: The is_result_applied of this DiscoveryJobResult.
        :type: bool
        """
        self._is_result_applied = is_result_applied

    @property
    def modified_attributes(self):
        """
        Gets the modified_attributes of this DiscoveryJobResult.

        :return: The modified_attributes of this DiscoveryJobResult.
        :rtype: oci.data_safe.models.ModifiedAttributes
        """
        return self._modified_attributes

    @modified_attributes.setter
    def modified_attributes(self, modified_attributes):
        """
        Sets the modified_attributes of this DiscoveryJobResult.

        :param modified_attributes: The modified_attributes of this DiscoveryJobResult.
        :type: oci.data_safe.models.ModifiedAttributes
        """
        self._modified_attributes = modified_attributes

    def __repr__(self):
        return formatted_flat_dict(self)

    def __eq__(self, other):
        if other is None:
            return False

        return self.__dict__ == other.__dict__

    def __ne__(self, other):
        return not self == other
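The enum-guarded setters in the class above all follow one pattern: a value outside the allowed list is silently coerced to the sentinel `'UNKNOWN_ENUM_VALUE'` rather than raising, so newer service responses never break older SDK clients. A self-contained sketch of that pattern; `coerce_enum` is an illustrative stand-in, not the OCI SDK's actual `value_allowed_none_or_none_sentinel` helper:

```python
# Illustrative re-creation of the setter guard used throughout the class
# above: None passes through, allowed values pass through, anything else
# becomes the sentinel.
def coerce_enum(value, allowed, sentinel="UNKNOWN_ENUM_VALUE"):
    """Return value if it is None or in allowed; otherwise the sentinel."""
    if value is None or value in allowed:
        return value
    return sentinel


PLANNED_ACTIONS = ["NONE", "ACCEPT", "INVALIDATE", "REJECT"]

print(coerce_enum("ACCEPT", PLANNED_ACTIONS))  # ACCEPT
print(coerce_enum("DEFER", PLANNED_ACTIONS))   # UNKNOWN_ENUM_VALUE
```

The design choice here trades strictness for forward compatibility: a client deserializing a response from a newer API version degrades to the sentinel instead of failing the whole call.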
# File: nautobot/dcim/navigation.py (psmware-ltd/nautobot, Apache-2.0)

from nautobot.core.apps import NavMenuAddButton, NavMenuGroup, NavMenuItem, NavMenuImportButton, NavMenuTab
menu_items = (
    NavMenuTab(
        name="Organization",
        weight=100,
        groups=(
            NavMenuGroup(
                name="Sites",
                weight=100,
                items=(
                    NavMenuItem(
                        link="dcim:site_list",
                        name="Sites",
                        weight=100,
                        permissions=[
                            "dcim.view_site",
                        ],
                        buttons=(
                            NavMenuAddButton(
                                link="dcim:site_add",
                                permissions=[
                                    "dcim.add_site",
                                ],
                            ),
                            NavMenuImportButton(
                                link="dcim:site_import",
                                permissions=[
                                    "dcim.add_site",
                                ],
                            ),
                        ),
                    ),
                    NavMenuItem(
                        link="dcim:region_list",
                        name="Regions",
                        weight=200,
                        permissions=[
                            "dcim.view_region",
                        ],
                        buttons=(
                            NavMenuAddButton(
                                link="dcim:region_add",
                                permissions=[
                                    "dcim.add_region",
                                ],
                            ),
                            NavMenuImportButton(
                                link="dcim:region_import",
                                permissions=[
                                    "dcim.add_region",
                                ],
                            ),
                        ),
                    ),
                ),
            ),
            NavMenuGroup(
                name="Racks",
                weight=200,
                items=(
                    NavMenuItem(
                        link="dcim:rack_list",
                        name="Racks",
                        weight=100,
                        permissions=[
                            "dcim.view_rack",
                        ],
                        buttons=(
                            NavMenuAddButton(
                                link="dcim:rack_add",
                                permissions=[
                                    "dcim.add_rack",
                                ],
                            ),
                            NavMenuImportButton(
                                link="dcim:rack_import",
                                permissions=[
                                    "dcim.add_rack",
                                ],
                            ),
                        ),
                    ),
                    NavMenuItem(
                        link="dcim:rackgroup_list",
                        name="Rack Groups",
                        weight=200,
                        permissions=[
                            "dcim.view_rackgroup",
                        ],
                        buttons=(
                            NavMenuAddButton(
                                link="dcim:rackgroup_add",
                                permissions=[
                                    "dcim.add_rackgroup",
                                ],
                            ),
                            NavMenuImportButton(
                                link="dcim:rackgroup_import",
                                permissions=[
                                    "dcim.add_rackgroup",
                                ],
                            ),
                        ),
                    ),
                    NavMenuItem(
                        link="dcim:rackrole_list",
                        name="Rack Roles",
                        weight=300,
                        permissions=[
                            "dcim.view_rackrole",
                        ],
                        buttons=(
                            NavMenuAddButton(
                                link="dcim:rackrole_add",
                                permissions=[
                                    "dcim.add_rackrole",
                                ],
                            ),
                            NavMenuImportButton(
                                link="dcim:rackrole_import",
                                permissions=[
                                    "dcim.add_rackrole",
                                ],
                            ),
                        ),
                    ),
                    NavMenuItem(
                        link="dcim:rackreservation_list",
                        name="Reservations",
                        weight=400,
                        permissions=[
                            "dcim.view_rackreservation",
                        ],
                        buttons=(
                            NavMenuAddButton(
                                link="dcim:rackreservation_add",
                                permissions=[
                                    "dcim.add_rackreservation",
                                ],
                            ),
                            NavMenuImportButton(
                                link="dcim:rackreservation_import",
                                permissions=[
                                    "dcim.add_rackreservation",
                                ],
                            ),
                        ),
                    ),
                    NavMenuItem(
                        link="dcim:rack_elevation_list",
                        name="Elevations",
                        weight=500,
                        permissions=[
                            "dcim.view_rack",
                        ],
                        buttons=(),
                    ),
                ),
            ),
        ),
    ),
NavMenuTab(
name="Devices",
weight=200,
groups=(
NavMenuGroup(
name="Devices",
weight=100,
items=(
NavMenuItem(
link="dcim:device_list",
name="Devices",
weight=100,
permissions=[
"dcim.view_device",
],
buttons=(
NavMenuAddButton(
link="dcim:device_add",
permissions=[
"dcim.add_device",
],
),
NavMenuImportButton(
link="dcim:device_import",
permissions=[
"dcim.add_device",
],
),
),
),
NavMenuItem(
link="dcim:devicerole_list",
name="Device Roles",
weight=200,
permissions=[
"dcim.view_devicerole",
],
buttons=(
NavMenuAddButton(
link="dcim:devicerole_add",
permissions=[
"dcim.add_devicerole",
],
),
NavMenuImportButton(
link="dcim:devicerole_import",
permissions=[
"dcim.add_devicerole",
],
),
),
),
NavMenuItem(
link="dcim:platform_list",
name="Platforms",
weight=300,
permissions=[
"dcim.view_platform",
],
buttons=(
NavMenuAddButton(
link="dcim:platform_add",
permissions=[
"dcim.add_platform",
],
),
NavMenuImportButton(
link="dcim:platform_import",
permissions=[
"dcim.add_platform",
],
),
),
),
NavMenuItem(
link="dcim:virtualchassis_list",
name="Virtual Chassis",
weight=400,
permissions=[
"dcim.view_virtualchassis",
],
buttons=(
NavMenuAddButton(
link="dcim:virtualchassis_add",
permissions=[
"dcim.add_virtualchassis",
],
),
NavMenuImportButton(
link="dcim:virtualchassis_import",
permissions=[
"dcim.add_virtualchassis",
],
),
),
),
),
),
NavMenuGroup(
name="Device Types",
weight=200,
items=(
NavMenuItem(
link="dcim:devicetype_list",
name="Device Types",
weight=100,
permissions=[
"dcim.view_devicetype",
],
buttons=(
NavMenuAddButton(
link="dcim:devicetype_add",
permissions=[
"dcim.add_devicetype",
],
),
NavMenuImportButton(
link="dcim:devicetype_import",
permissions=[
"dcim.add_devicetype",
],
),
),
),
NavMenuItem(
link="dcim:manufacturer_list",
name="Manufacturers",
weight=200,
permissions=[
"dcim.view_manufacturer",
],
buttons=(
NavMenuAddButton(
link="dcim:manufacturer_add",
permissions=[
"dcim.add_manufacturer",
],
),
NavMenuImportButton(
link="dcim:manufacturer_import",
permissions=[
"dcim.add_manufacturer",
],
),
),
),
),
),
NavMenuGroup(
name="Connections",
weight=300,
items=(
NavMenuItem(
link="dcim:cable_list",
name="Cables",
weight=100,
permissions=[
"dcim.view_cable",
],
buttons=(
NavMenuImportButton(
link="dcim:cable_import",
permissions=[
"dcim.add_cable",
],
),
),
),
NavMenuItem(
link="dcim:console_connections_list",
name="Console Connections",
weight=200,
permissions=[
"dcim.view_consoleport",
"dcim.view_consoleserverport",
],
buttons=(),
),
NavMenuItem(
link="dcim:power_connections_list",
name="Power Connections",
weight=300,
permissions=[
"dcim.view_powerport",
"dcim.view_poweroutlet",
],
buttons=(),
),
NavMenuItem(
link="dcim:interface_connections_list",
name="Interface Connections",
weight=400,
permissions=[
"dcim.view_interface",
],
buttons=(),
),
),
),
NavMenuGroup(
name="Device Components",
weight=400,
items=(
NavMenuItem(
link="dcim:interface_list",
name="Interfaces",
weight=100,
permissions=[
"dcim.view_interface",
],
buttons=(
NavMenuImportButton(
link="dcim:interface_import",
permissions=[
"dcim.add_interface",
],
),
),
),
NavMenuItem(
link="dcim:frontport_list",
name="Front Ports",
weight=200,
permissions=[
"dcim.view_frontport",
],
buttons=(
NavMenuImportButton(
link="dcim:frontport_import",
permissions=[
"dcim.add_frontport",
],
),
),
),
NavMenuItem(
link="dcim:rearport_list",
name="Rear Ports",
weight=300,
permissions=[
"dcim.view_rearport",
],
buttons=(
NavMenuImportButton(
link="dcim:rearport_import",
permissions=[
"dcim.add_rearport",
],
),
),
),
NavMenuItem(
link="dcim:consoleport_list",
name="Console Ports",
weight=400,
permissions=[
"dcim.view_consoleport",
],
buttons=(
NavMenuImportButton(
link="dcim:consoleport_import",
permissions=[
"dcim.add_consoleport",
],
),
),
),
NavMenuItem(
link="dcim:consoleserverport_list",
name="Console Server Ports",
weight=500,
permissions=[
"dcim.view_consoleserverport",
],
buttons=(
NavMenuImportButton(
link="dcim:consoleserverport_import",
permissions=[
"dcim.add_consoleserverport",
],
),
),
),
NavMenuItem(
link="dcim:powerport_list",
name="Power Ports",
weight=600,
permissions=[
"dcim.view_powerport",
],
buttons=(
NavMenuImportButton(
link="dcim:powerport_import",
permissions=[
"dcim.add_powerport",
],
),
),
),
NavMenuItem(
link="dcim:poweroutlet_list",
name="Power Outlets",
weight=700,
permissions=[
"dcim.view_poweroutlet",
],
buttons=(
NavMenuImportButton(
link="dcim:poweroutlet_import",
permissions=[
"dcim.add_poweroutlet",
],
),
),
),
NavMenuItem(
link="dcim:devicebay_list",
name="Device Bays",
weight=800,
permissions=[
"dcim.view_devicebay",
],
buttons=(
NavMenuImportButton(
link="dcim:devicebay_import",
permissions=[
"dcim.add_devicebay",
],
),
),
),
NavMenuItem(
link="dcim:inventoryitem_list",
name="Inventory Items",
weight=900,
permissions=[
"dcim.view_inventoryitem",
],
buttons=(
NavMenuImportButton(
link="dcim:inventoryitem_import",
permissions=[
"dcim.add_inventoryitem",
],
),
),
),
),
),
),
),
NavMenuTab(
name="Power",
weight=600,
groups=(
NavMenuGroup(
name="Power",
weight=100,
items=(
NavMenuItem(
link="dcim:powerfeed_list",
name="Power Feeds",
permissions=[
"dcim.view_powerfeed",
],
buttons=(
NavMenuAddButton(
link="dcim:powerfeed_add",
permissions=[
"dcim.add_powerfeed",
],
),
NavMenuImportButton(
link="dcim:powerfeed_import",
permissions=[
"dcim.add_powerfeed",
],
),
),
),
NavMenuItem(
link="dcim:powerpanel_list",
name="Power Panels",
permissions=[
"dcim.view_powerpanel",
],
buttons=(
NavMenuAddButton(
link="dcim:powerpanel_add",
permissions=[
"dcim.add_powerpanel",
],
),
NavMenuImportButton(
link="dcim:powerpanel_import",
permissions=[
"dcim.add_powerpanel",
],
),
),
),
),
),
),
),
)
| 39.087719 | 107 | 0.260413 | 833 | 22,280 | 6.798319 | 0.10084 | 0.093237 | 0.120784 | 0.101713 | 0.147448 | 0.029137 | 0 | 0 | 0 | 0 | 0 | 0.014833 | 0.673205 | 22,280 | 569 | 108 | 39.156415 | 0.762945 | 0 | 0 | 0.776014 | 0 | 0 | 0.136266 | 0.050583 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.08642 | 0 | 0.08642 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
77ea86bbf98ba89d0603b9754c062abc4a743b05 | 68 | py | Python | akari/__init__.py | mananapr/akari | 77d4f34da64e1e68b76ef7b4ee87bd703adfb8e0 | [
"MIT"
] | 36 | 2019-07-25T14:36:19.000Z | 2022-01-17T02:24:50.000Z | akari/__init__.py | mananapr/akari | 77d4f34da64e1e68b76ef7b4ee87bd703adfb8e0 | [
"MIT"
] | 1 | 2019-07-21T13:27:38.000Z | 2019-07-21T13:27:38.000Z | akari/__init__.py | mananapr/akari | 77d4f34da64e1e68b76ef7b4ee87bd703adfb8e0 | [
"MIT"
] | 2 | 2019-07-26T06:08:29.000Z | 2020-10-27T16:08:54.000Z | __version__ = '0.6'
__license__ = 'MIT'
__author__ = 'Manan Singh'
| 17 | 27 | 0.691176 | 8 | 68 | 4.375 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035088 | 0.161765 | 68 | 3 | 28 | 22.666667 | 0.578947 | 0 | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
77f2c05f2367f2bda2c4e4a6d6138e049c9970f4 | 2,456 | py | Python | hack/temp1/a.py | SwordYoung/cutprob | f022b6bc23d80d5d214b54c49f372af49c837855 | [
"Artistic-2.0"
] | null | null | null | hack/temp1/a.py | SwordYoung/cutprob | f022b6bc23d80d5d214b54c49f372af49c837855 | [
"Artistic-2.0"
] | null | null | null | hack/temp1/a.py | SwordYoung/cutprob | f022b6bc23d80d5d214b54c49f372af49c837855 | [
"Artistic-2.0"
] | null | null | null | #!/usr/bin/env python
# Enter your code here. Read input from STDIN. Print output to STDOUT
class TreeNode:
def __init__(self, n):
self.n = n
self.prev = set()
def addPrev(self, p):
# p.next.append(self)
self.prev.add(p.n)
assert isinstance(self.prev, set)
def removePrev(self, prevs):
assert isinstance(self.prev, set)
print "before: %s" % (list(self.prev))
print "remove: %s" % (list(prevs))
self.prev = self.prev - prevs
print "after: %s" % (list(self.prev))
assert isinstance(self.prev, set)
def input_empty(self):
return len(self.prev) == 0
class Solution:
def __init__(self):
self.node_dict = {}
def add_connect(self, n1, n2):
node1 = self.node_dict[n1]
node2 = self.node_dict[n2]
node2.addPrev(node1)
def addsequence(self, seq):
if len(seq) == 0:
return None
if not self.node_dict.has_key(seq[0]):
self.node_dict[seq[0]] = TreeNode(seq[0])
for i in xrange(1, len(seq)):
if not self.node_dict.has_key(seq[i]):
self.node_dict[seq[i]] = TreeNode(seq[i])
self.add_connect(seq[i-1], seq[i])
def getresult(self):
nodes = self.node_dict.values()
result = []
while len(nodes) != 0:
sub_result = []
new_nodes = []
for n in nodes:
print "%d input is %d" % (n.n, len(n.prev))
if n.input_empty():
sub_result.append(n.n)
else:
new_nodes.append(n)
sub_result.sort()
nodes = new_nodes
sub_set = set(sub_result)
for n in nodes:
n.removePrev(sub_set)
result.extend(sub_result)
return result
def read_num():
line = raw_input()
return int(line)
def read_nums():
line = raw_input()
strnums = line.split(' ')
nums = []
for s in strnums:
nums.append(int(s))
return nums
if __name__ == "__main__":
n = read_num()
sol = Solution()
for i in xrange(n):
p = read_num()
nums = read_nums()
sol.addsequence(nums)
print 'INPUT DONE'
result = sol.getresult()
str_result = []
for r in result:
str_result.append("%d" % (r))
print ' '.join(str_result)
| 27.288889 | 69 | 0.522394 | 319 | 2,456 | 3.865204 | 0.275862 | 0.064882 | 0.077859 | 0.034063 | 0.112733 | 0.090835 | 0.042174 | 0.042174 | 0 | 0 | 0 | 0.01005 | 0.351792 | 2,456 | 89 | 70 | 27.595506 | 0.764447 | 0.043974 | 0 | 0.094595 | 0 | 0 | 0.02773 | 0 | 0 | 0 | 0 | 0.011236 | 0.040541 | 0 | null | null | 0 | 0 | null | null | 0.081081 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7ac3315994d79bc024ab349ba0f6ccf05cb36189 | 1,987 | py | Python | tests/test_issues/test_issue_41.py | vemonet/PyShEx | 0004641fbfefc069be615067dd7e78b19e0d7967 | [
"Apache-2.0"
] | null | null | null | tests/test_issues/test_issue_41.py | vemonet/PyShEx | 0004641fbfefc069be615067dd7e78b19e0d7967 | [
"Apache-2.0"
] | null | null | null | tests/test_issues/test_issue_41.py | vemonet/PyShEx | 0004641fbfefc069be615067dd7e78b19e0d7967 | [
"Apache-2.0"
] | 1 | 2019-03-08T15:38:22.000Z | 2019-03-08T15:38:22.000Z | import unittest
from pprint import pprint
from rdflib import Graph, Namespace
from pyshex import ShExEvaluator
rdf = """
@prefix : <http://example.org/model/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .
@prefix xml: <http://www.w3.org/XML/1998/namespace> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
<http://example.org/context/42> a :Person ;
foaf:age 43 ;
foaf:firstName "Bob",
"Joe" ;
foaf:lastName "smith" .
"""
shex = """
<http://example.org/sample/example1/String> <http://www.w3.org/2001/XMLSchema#string>
<http://example.org/sample/example1/Int> <http://www.w3.org/2001/XMLSchema#integer>
<http://example.org/sample/example1/Boolean> <http://www.w3.org/2001/XMLSchema#boolean>
<http://example.org/sample/example1/Person> CLOSED {
( <http://xmlns.com/foaf/0.1/firstName> @<http://example.org/sample/example1/String> * ;
<http://xmlns.com/foaf/0.1/lastName> @<http://example.org/sample/example1/String> ;
<http://xmlns.com/foaf/0.1/age> @<http://example.org/sample/example1/Int> ? ;
<http://example.org/model/living> @<http://example.org/sample/example1/Boolean> ? ;
<http://xmlns.com/foaf/0.1/knows> @<http://example.org/sample/example1/Person> *
)
}
"""
EXC = Namespace("http://example.org/context/")
EXE = Namespace("http://example.org/sample/example1/")
class Issue41TestCase(unittest.TestCase):
def test_closed(self):
""" Test closed definition """
e = ShExEvaluator(rdf=rdf, schema=shex, focus=EXC['42'], start=EXE.Person)
# This causes issue 42
# pprint(e.evaluate())
self.assertFalse(e.evaluate()[0].result)
from pyshex.evaluate import evaluate
g = Graph()
g.parse(data=rdf, format="turtle")
pprint(evaluate(g, shex, focus=EXC['42'], start=EXE.Person))
if __name__ == '__main__':
unittest.main()
| 34.258621 | 93 | 0.655762 | 272 | 1,987 | 4.757353 | 0.305147 | 0.119011 | 0.151468 | 0.15456 | 0.448223 | 0.426584 | 0.267388 | 0.080371 | 0.080371 | 0.080371 | 0 | 0.043529 | 0.144439 | 1,987 | 57 | 94 | 34.859649 | 0.717647 | 0.033216 | 0 | 0.047619 | 0 | 0.238095 | 0.673811 | 0 | 0 | 0 | 0 | 0 | 0.02381 | 1 | 0.02381 | false | 0 | 0.119048 | 0 | 0.166667 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7accb185f976f9474f2fb7f4e27d3cc1ac5d1c95 | 68 | py | Python | plume/__init__.py | KEVINYZY/plume | ae7b688d0d7b5f3dd5eeb975a302e8a5524d4255 | [
"MIT"
] | 22 | 2018-03-06T18:01:22.000Z | 2020-11-09T16:17:02.000Z | plume/__init__.py | liuslnlp/plume | dbd523861bfb9abad8a52b1de28de85c0f128807 | [
"MIT"
] | null | null | null | plume/__init__.py | liuslnlp/plume | dbd523861bfb9abad8a52b1de28de85c0f128807 | [
"MIT"
] | 21 | 2017-08-12T10:24:17.000Z | 2020-11-27T03:15:30.000Z | __author__ = 'Liu'
__email__ = 'wisedoge@outlook.com'
# Explanations of some commonly used variable names
| 13.6 | 34 | 0.735294 | 7 | 68 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132353 | 68 | 4 | 35 | 17 | 0.711864 | 0.147059 | 0 | 0 | 0 | 0 | 0.410714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
7ad03c0779016c0e8fa6e742ab6d90af3b24cd28 | 322 | py | Python | CognosService.py | gbryant-dev/CApy | 8f0c09ae81d024f0d6a3848525cd49c079241e2a | [
"MIT"
] | null | null | null | CognosService.py | gbryant-dev/CApy | 8f0c09ae81d024f0d6a3848525cd49c079241e2a | [
"MIT"
] | null | null | null | CognosService.py | gbryant-dev/CApy | 8f0c09ae81d024f0d6a3848525cd49c079241e2a | [
"MIT"
] | null | null | null | from GroupService import GroupService
from RESTService import RESTService
from UserService import UserService
class CognosService:
def __init__(self, **kwargs) -> None:
self._cognos_rest = RESTService(**kwargs)
self.groups = GroupService(self._cognos_rest)
self.users = UserService(self._cognos_rest) | 32.2 | 51 | 0.770186 | 36 | 322 | 6.611111 | 0.444444 | 0.12605 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.152174 | 322 | 10 | 52 | 32.2 | 0.871795 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.375 | 0 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
7ad11832461ee95fa19f878953fa8a2d2ca87d23 | 321 | py | Python | test_app/tests/test_models/test_validators.py | mpasternak/django-flexible-reports | cdf62590efb2937b30e19952a67afbc3a3e1c192 | [
"MIT"
] | 2 | 2017-08-31T11:55:26.000Z | 2018-07-14T19:39:05.000Z | test_app/tests/test_models/test_validators.py | mpasternak/django-flexible-reports | cdf62590efb2937b30e19952a67afbc3a3e1c192 | [
"MIT"
] | 1 | 2017-08-24T07:04:46.000Z | 2017-09-23T14:39:06.000Z | test_app/tests/test_models/test_validators.py | mpasternak/django-flexible-reports | cdf62590efb2937b30e19952a67afbc3a3e1c192 | [
"MIT"
] | null | null | null | # -*- encoding: utf-8 -*-
import pytest
from django.core.exceptions import ValidationError
from flexible_reports.models.validators import TemplateValidator
@pytest.mark.django_db
def test_validators():
with pytest.raises(ValidationError):
TemplateValidator("{% for a in b %}")
TemplateValidator("hi")
| 22.928571 | 64 | 0.744548 | 36 | 321 | 6.555556 | 0.722222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00365 | 0.146417 | 321 | 13 | 65 | 24.692308 | 0.857664 | 0.071651 | 0 | 0 | 0 | 0 | 0.060811 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | true | 0 | 0.375 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 2 |
bb03fa7dffbe3326b36d9ae9faedf51cd1d8899a | 131 | py | Python | Chapter 01/coding_question_15.py | bpbpublications/Advance-Core-Python-Programming | 8902ceb270f55c04c12e818032f90d641c14d7b1 | [
"MIT"
] | null | null | null | Chapter 01/coding_question_15.py | bpbpublications/Advance-Core-Python-Programming | 8902ceb270f55c04c12e818032f90d641c14d7b1 | [
"MIT"
] | null | null | null | Chapter 01/coding_question_15.py | bpbpublications/Advance-Core-Python-Programming | 8902ceb270f55c04c12e818032f90d641c14d7b1 | [
"MIT"
] | null | null | null | def funny(x):
if (x%2 == 1):
return x+1
else:
return funny(x-1)
print(funny(7))
print(funny(6))
| 16.375 | 26 | 0.465649 | 21 | 131 | 2.904762 | 0.52381 | 0.196721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072289 | 0.366412 | 131 | 7 | 27 | 18.714286 | 0.662651 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.428571 | 0.285714 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
bb09f2b5e15dc7583793cec5192518a1321aeaee | 735 | py | Python | TwitterModule/tweet_complaint.py | CFGIndia20/team-19 | e2b27ad8009303d262c2dc60551d6fcc4645b3b5 | [
"MIT"
] | null | null | null | TwitterModule/tweet_complaint.py | CFGIndia20/team-19 | e2b27ad8009303d262c2dc60551d6fcc4645b3b5 | [
"MIT"
] | null | null | null | TwitterModule/tweet_complaint.py | CFGIndia20/team-19 | e2b27ad8009303d262c2dc60551d6fcc4645b3b5 | [
"MIT"
] | null | null | null | class TweetComplaint:
def __init__(self, complain_text, city, state, tweet_id, username, image_url):
self.complain_text = complain_text
self.tweet_id = tweet_id
self.username = username
self.city = city
self.state = state
self.image_url = image_url
self.status = "reported"
def __str__(self):
        return '\n'.join([self.complain_text, str(self.tweet_id), self.username, self.city, self.state, self.image_url])
def to_dict(self):
return {'category': '', 'city': self.city, 'state': self.state, 'image_url': self.image_url,
'status': self.status, 'text': self.complain_text, 'tweet_id': self.tweet_id, 'username': self.username} | 43.235294 | 137 | 0.638095 | 98 | 735 | 4.520408 | 0.22449 | 0.094808 | 0.14447 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.216327 | 735 | 17 | 138 | 43.235294 | 0.769097 | 0 | 0 | 0 | 0 | 0 | 0.095109 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0 | 0.142857 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
bb35148b1e2ac78d0791fc69955350fb99a60361 | 37,397 | py | Python | nova/objects/image_meta.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/objects/image_meta.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/objects/image_meta.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2014 Red Hat, Inc'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'import'
name|'copy'
newline|'\n'
nl|'\n'
name|'from'
name|'oslo_utils'
name|'import'
name|'versionutils'
newline|'\n'
nl|'\n'
name|'from'
name|'nova'
name|'import'
name|'exception'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'objects'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'base'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'objects'
name|'import'
name|'fields'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'utils'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'virt'
name|'import'
name|'hardware'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|variable|NULLABLE_STRING_FIELDS
name|'NULLABLE_STRING_FIELDS'
op|'='
op|'['
string|"'name'"
op|','
string|"'checksum'"
op|','
string|"'owner'"
op|','
nl|'\n'
string|"'container_format'"
op|','
string|"'disk_format'"
op|']'
newline|'\n'
DECL|variable|NULLABLE_INTEGER_FIELDS
name|'NULLABLE_INTEGER_FIELDS'
op|'='
op|'['
string|"'size'"
op|','
string|"'virtual_size'"
op|']'
newline|'\n'
nl|'\n'
nl|'\n'
op|'@'
name|'base'
op|'.'
name|'NovaObjectRegistry'
op|'.'
name|'register'
newline|'\n'
DECL|class|ImageMeta
name|'class'
name|'ImageMeta'
op|'('
name|'base'
op|'.'
name|'NovaObject'
op|')'
op|':'
newline|'\n'
comment|'# Version 1.0: Initial version'
nl|'\n'
comment|'# Version 1.1: updated ImageMetaProps'
nl|'\n'
comment|'# Version 1.2: ImageMetaProps version 1.2'
nl|'\n'
comment|'# Version 1.3: ImageMetaProps version 1.3'
nl|'\n'
comment|'# Version 1.4: ImageMetaProps version 1.4'
nl|'\n'
comment|'# Version 1.5: ImageMetaProps version 1.5'
nl|'\n'
comment|'# Version 1.6: ImageMetaProps version 1.6'
nl|'\n'
comment|'# Version 1.7: ImageMetaProps version 1.7'
nl|'\n'
comment|'# Version 1.8: ImageMetaProps version 1.8'
nl|'\n'
DECL|variable|VERSION
indent|' '
name|'VERSION'
op|'='
string|"'1.8'"
newline|'\n'
nl|'\n'
comment|'# These are driven by what the image client API returns'
nl|'\n'
comment|'# to Nova from Glance. This is defined in the glance'
nl|'\n'
comment|'# code glance/api/v2/images.py get_base_properties()'
nl|'\n'
comment|'# method. A few things are currently left out:'
nl|'\n'
comment|'# self, file, schema - Nova does not appear to ever use'
nl|'\n'
comment|'# these fields; locations - modelling the arbitrary'
nl|'\n'
comment|"# data in the 'metadata' subfield is non-trivial as"
nl|'\n'
comment|"# there's no clear spec."
nl|'\n'
comment|'#'
nl|'\n'
comment|'# TODO(ft): In version 2.0, these fields should be nullable:'
nl|'\n'
comment|'# name, checksum, owner, size, virtual_size, container_format, disk_format'
nl|'\n'
comment|'#'
nl|'\n'
DECL|variable|fields
name|'fields'
op|'='
op|'{'
nl|'\n'
string|"'id'"
op|':'
name|'fields'
op|'.'
name|'UUIDField'
op|'('
op|')'
op|','
nl|'\n'
string|"'name'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'status'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'visibility'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'protected'"
op|':'
name|'fields'
op|'.'
name|'FlexibleBooleanField'
op|'('
op|')'
op|','
nl|'\n'
string|"'checksum'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'owner'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'size'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
op|')'
op|','
nl|'\n'
string|"'virtual_size'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
op|')'
op|','
nl|'\n'
string|"'container_format'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'disk_format'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'created_at'"
op|':'
name|'fields'
op|'.'
name|'DateTimeField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'updated_at'"
op|':'
name|'fields'
op|'.'
name|'DateTimeField'
op|'('
name|'nullable'
op|'='
name|'True'
op|')'
op|','
nl|'\n'
string|"'tags'"
op|':'
name|'fields'
op|'.'
name|'ListOfStringsField'
op|'('
op|')'
op|','
nl|'\n'
string|"'direct_url'"
op|':'
name|'fields'
op|'.'
name|'StringField'
op|'('
op|')'
op|','
nl|'\n'
string|"'min_ram'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
op|')'
op|','
nl|'\n'
string|"'min_disk'"
op|':'
name|'fields'
op|'.'
name|'IntegerField'
op|'('
op|')'
op|','
nl|'\n'
string|"'properties'"
op|':'
name|'fields'
op|'.'
name|'ObjectField'
op|'('
string|"'ImageMetaProps'"
op|')'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
op|'@'
name|'classmethod'
newline|'\n'
DECL|member|from_dict
name|'def'
name|'from_dict'
op|'('
name|'cls'
op|','
name|'image_meta'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create instance from image metadata dict\n\n :param image_meta: image metadata dictionary\n\n Creates a new object instance, initializing from the\n properties associated with the image metadata instance\n\n :returns: an ImageMeta instance\n """'
newline|'\n'
name|'if'
name|'image_meta'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'image_meta'
op|'='
op|'{'
op|'}'
newline|'\n'
nl|'\n'
comment|"# We must turn 'properties' key dict into an object"
nl|'\n'
comment|'# so copy image_meta to avoid changing original'
nl|'\n'
dedent|''
name|'image_meta'
op|'='
name|'copy'
op|'.'
name|'deepcopy'
op|'('
name|'image_meta'
op|')'
newline|'\n'
name|'image_meta'
op|'['
string|'"properties"'
op|']'
op|'='
name|'objects'
op|'.'
name|'ImageMetaProps'
op|'.'
name|'from_dict'
op|'('
nl|'\n'
name|'image_meta'
op|'.'
name|'get'
op|'('
string|'"properties"'
op|','
op|'{'
op|'}'
op|')'
op|')'
newline|'\n'
nl|'\n'
comment|'# Some fields are nullable in Glance DB schema, but was not marked that'
nl|'\n'
comment|'# in ImageMeta initially by mistake. To keep compatibility with compute'
nl|'\n'
comment|'# nodes which are run with previous versions these fields are still'
nl|'\n'
comment|'# not nullable in ImageMeta, but the code below converts None to'
nl|'\n'
comment|'# appropriate empty values.'
nl|'\n'
name|'for'
name|'fld'
name|'in'
name|'NULLABLE_STRING_FIELDS'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'fld'
name|'in'
name|'image_meta'
name|'and'
name|'image_meta'
op|'['
name|'fld'
op|']'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'image_meta'
op|'['
name|'fld'
op|']'
op|'='
string|"''"
newline|'\n'
dedent|''
dedent|''
name|'for'
name|'fld'
name|'in'
name|'NULLABLE_INTEGER_FIELDS'
op|':'
newline|'\n'
indent|' '
name|'if'
name|'fld'
name|'in'
name|'image_meta'
name|'and'
name|'image_meta'
op|'['
name|'fld'
op|']'
name|'is'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'image_meta'
op|'['
name|'fld'
op|']'
op|'='
number|'0'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
name|'return'
name|'cls'
op|'('
op|'**'
name|'image_meta'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'classmethod'
newline|'\n'
DECL|member|from_instance
name|'def'
name|'from_instance'
op|'('
name|'cls'
op|','
name|'instance'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create instance from instance system metadata\n\n :param instance: Instance object\n\n Creates a new object instance, initializing from the\n system metadata "image_*" properties associated with\n instance\n\n :returns: an ImageMeta instance\n """'
newline|'\n'
name|'sysmeta'
op|'='
name|'utils'
op|'.'
name|'instance_sys_meta'
op|'('
name|'instance'
op|')'
newline|'\n'
name|'image_meta'
op|'='
name|'utils'
op|'.'
name|'get_image_from_system_metadata'
op|'('
name|'sysmeta'
op|')'
newline|'\n'
name|'return'
name|'cls'
op|'.'
name|'from_dict'
op|'('
name|'image_meta'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'classmethod'
newline|'\n'
DECL|member|from_image_ref
name|'def'
name|'from_image_ref'
op|'('
name|'cls'
op|','
name|'context'
op|','
name|'image_api'
op|','
name|'image_ref'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create instance from glance image\n\n :param context: the request context\n :param image_api: the glance client API\n :param image_ref: the glance image identifier\n\n Creates a new object instance, initializing from the\n properties associated with a glance image\n\n :returns: an ImageMeta instance\n """'
newline|'\n'
nl|'\n'
name|'image_meta'
op|'='
name|'image_api'
op|'.'
name|'get'
op|'('
name|'context'
op|','
name|'image_ref'
op|')'
newline|'\n'
name|'image'
op|'='
name|'cls'
op|'.'
name|'from_dict'
op|'('
name|'image_meta'
op|')'
newline|'\n'
name|'setattr'
op|'('
name|'image'
op|','
string|'"id"'
op|','
name|'image_ref'
op|')'
newline|'\n'
name|'return'
name|'image'
newline|'\n'
nl|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'base'
op|'.'
name|'NovaObjectRegistry'
op|'.'
name|'register'
newline|'\n'
DECL|class|ImageMetaProps
name|'class'
name|'ImageMetaProps'
op|'('
name|'base'
op|'.'
name|'NovaObject'
op|')'
op|':'
newline|'\n'
comment|'# Version 1.0: Initial version'
nl|'\n'
comment|'# Version 1.1: added os_require_quiesce field'
nl|'\n'
comment|'# Version 1.2: added img_hv_type and img_hv_requested_version fields'
nl|'\n'
comment|'# Version 1.3: HVSpec version 1.1'
nl|'\n'
comment|'# Version 1.4: added hw_vif_multiqueue_enabled field'
nl|'\n'
comment|'# Version 1.5: added os_admin_user field'
nl|'\n'
comment|"# Version 1.6: Added 'lxc' and 'uml' enum types to DiskBusField"
nl|'\n'
comment|'# Version 1.7: added img_config_drive field'
nl|'\n'
comment|"# Version 1.8: Added 'lxd' to hypervisor types"
nl|'\n'
comment|'# Version 1.9: added hw_cpu_thread_policy field'
nl|'\n'
comment|'# Version 1.10: added hw_cpu_realtime_mask field'
nl|'\n'
comment|'# Version 1.11: Added hw_firmware_type field'
nl|'\n'
    # Version 1.12: Added properties for image signature verification
    VERSION = '1.12'

    def obj_make_compatible(self, primitive, target_version):
        super(ImageMetaProps, self).obj_make_compatible(primitive,
                                                        target_version)
        target_version = versionutils.convert_version_to_tuple(target_version)
        if target_version < (1, 11):
            primitive.pop('hw_firmware_type', None)
        if target_version < (1, 10):
            primitive.pop('hw_cpu_realtime_mask', None)
        if target_version < (1, 9):
            primitive.pop('hw_cpu_thread_policy', None)
        if target_version < (1, 7):
            primitive.pop('img_config_drive', None)
        if target_version < (1, 5):
            primitive.pop('os_admin_user', None)
        if target_version < (1, 4):
            primitive.pop('hw_vif_multiqueue_enabled', None)
        if target_version < (1, 2):
            primitive.pop('img_hv_type', None)
            primitive.pop('img_hv_requested_version', None)
        if target_version < (1, 1):
            primitive.pop('os_require_quiesce', None)

        if target_version < (1, 6):
            bus = primitive.get('hw_disk_bus', None)
            if bus in ('lxc', 'uml'):
                raise exception.ObjectActionError(
                    action='obj_make_compatible',
                    reason='hw_disk_bus=%s not supported in version %s' % (
                        bus, target_version))

    # Maximum number of NUMA nodes permitted for the guest topology
    NUMA_NODES_MAX = 128

    # 'hw_' - settings affecting the guest virtual machine hardware
    # 'img_' - settings affecting the use of images by the compute node
    # 'os_' - settings affecting the guest operating system setup

    fields = {
        # name of guest hardware architecture eg i686, x86_64, ppc64
        'hw_architecture': fields.ArchitectureField(),

        # used to decide to expand root disk partition and fs to full size of
        # root disk
        'hw_auto_disk_config': fields.StringField(),

        # whether to display BIOS boot device menu
        'hw_boot_menu': fields.FlexibleBooleanField(),

        # name of the CDROM bus to use eg virtio, scsi, ide
        'hw_cdrom_bus': fields.DiskBusField(),

        # preferred number of CPU cores per socket
        'hw_cpu_cores': fields.IntegerField(),

        # preferred number of CPU sockets
        'hw_cpu_sockets': fields.IntegerField(),

        # maximum number of CPU cores per socket
        'hw_cpu_max_cores': fields.IntegerField(),

        # maximum number of CPU sockets
        'hw_cpu_max_sockets': fields.IntegerField(),

        # maximum number of CPU threads per core
        'hw_cpu_max_threads': fields.IntegerField(),

        # CPU allocation policy
        'hw_cpu_policy': fields.CPUAllocationPolicyField(),

        # CPU thread allocation policy
        'hw_cpu_thread_policy': fields.CPUThreadAllocationPolicyField(),

        # CPU mask indicates which vCPUs will have realtime enabled,
        # example ^0-1 means that all vCPUs except 0 and 1 will have a
        # realtime policy.
        'hw_cpu_realtime_mask': fields.StringField(),

        # preferred number of CPU threads per core
        'hw_cpu_threads': fields.IntegerField(),

        # guest ABI version for guest xentools either 1 or 2 (or 3 - depends on
        # Citrix PV tools version installed in image)
        'hw_device_id': fields.IntegerField(),

        # name of the hard disk bus to use eg virtio, scsi, ide
        'hw_disk_bus': fields.DiskBusField(),

        # allocation mode eg 'preallocated'
        'hw_disk_type': fields.StringField(),

        # name of the floppy disk bus to use eg fd, scsi, ide
        'hw_floppy_bus': fields.DiskBusField(),

        # This indicates the guest needs UEFI firmware
        'hw_firmware_type': fields.FirmwareTypeField(),

        # boolean - used to trigger code to inject networking when booting a CD
        # image with a network boot image
        'hw_ipxe_boot': fields.FlexibleBooleanField(),

        # There are sooooooooooo many possible machine types in
        # QEMU - several new ones with each new release - that it
        # is not practical to enumerate them all. So we use a free
        # form string
        'hw_machine_type': fields.StringField(),

        # One of the magic strings 'small', 'any', 'large'
        # or an explicit page size in KB (eg 4, 2048, ...)
        'hw_mem_page_size': fields.StringField(),

        # Number of guest NUMA nodes
        'hw_numa_nodes': fields.IntegerField(),

        # Each list entry corresponds to a guest NUMA node and the
        # set members indicate CPUs for that node
        'hw_numa_cpus': fields.ListOfSetsOfIntegersField(),

        # Each list entry corresponds to a guest NUMA node and the
        # list value indicates the memory size of that node.
        'hw_numa_mem': fields.ListOfIntegersField(),

        # boolean 'yes' or 'no' to enable QEMU guest agent
        'hw_qemu_guest_agent': fields.FlexibleBooleanField(),

        # name of the RNG device type eg virtio
        'hw_rng_model': fields.RNGModelField(),

        # number of serial ports to create
        'hw_serial_port_count': fields.IntegerField(),

        # name of the SCSI bus controller eg 'virtio-scsi', 'lsilogic', etc
        'hw_scsi_model': fields.SCSIModelField(),

        # name of the video adapter model to use, eg cirrus, vga, xen, qxl
        'hw_video_model': fields.VideoModelField(),

        # MB of video RAM to provide eg 64
        'hw_video_ram': fields.IntegerField(),

        # name of a NIC device model eg virtio, e1000, rtl8139
        'hw_vif_model': fields.VIFModelField(),

        # "xen" vs "hvm"
        'hw_vm_mode': fields.VMModeField(),

        # action to take when watchdog device fires eg reset, poweroff, pause,
        # none
        'hw_watchdog_action': fields.WatchdogActionField(),

        # boolean - If true, this will enable the virtio-multiqueue feature
        'hw_vif_multiqueue_enabled': fields.FlexibleBooleanField(),

        # if true download using bittorrent
        'img_bittorrent': fields.FlexibleBooleanField(),

        # Which data format the 'img_block_device_mapping' field is
        # using to represent the block device mapping
        'img_bdm_v2': fields.FlexibleBooleanField(),

        # Block device mapping - the data may be in one of two completely
        # different formats. The 'img_bdm_v2' field determines whether
        # it is in legacy format, or the new current format. Ideally
        # we would have a formal data type for this field instead of a
        # dict, but with 2 different formats to represent this is hard.
        # See nova/block_device.py from_legacy_mapping() for the complex
        # conversion code. So for now leave it as a dict and continue
        # to use existing code that is able to convert dict into the
        # desired internal BDM formats
        'img_block_device_mapping':
            fields.ListOfDictOfNullableStringsField(),

        # boolean - if True, and image cache set to "some" decides if image
        # should be cached on host when server is booted on that host
        'img_cache_in_nova': fields.FlexibleBooleanField(),

        # Compression level for images. (1-9)
        'img_compression_level': fields.IntegerField(),

        # hypervisor supported version, eg. '>=2.6'
        'img_hv_requested_version': fields.VersionPredicateField(),

        # type of the hypervisor, eg kvm, ironic, xen
        'img_hv_type': fields.HVTypeField(),

        # Whether the image needs/expects a config drive
        'img_config_drive': fields.ConfigDrivePolicyField(),

        # boolean flag to set space-saving or performance behavior on the
        # Datastore
        'img_linked_clone': fields.FlexibleBooleanField(),

        # Image mappings - related to Block device mapping data - mapping
        # of virtual image names to device names. This could be represented
        # as a formal data type, but is left as a dict for the same reason as
        # the img_block_device_mapping field. It would arguably make sense for
        # the two to be combined into a single field and data type in the
        # future.
        'img_mappings': fields.ListOfDictOfNullableStringsField(),

        # image project id (set on upload)
        'img_owner_id': fields.StringField(),

        # root device name, used in snapshotting eg /dev/<blah>
        'img_root_device_name': fields.StringField(),

        # boolean - if false don't talk to nova agent
        'img_use_agent': fields.FlexibleBooleanField(),

        # integer value 1
        'img_version': fields.IntegerField(),

        # base64 encoding of the image signature
        'img_signature': fields.StringField(),

        # string indicating hash method used to compute image signature
        'img_signature_hash_method': fields.ImageSignatureHashTypeField(),

        # string indicating Castellan uuid of certificate
        # used to compute the image's signature
        'img_signature_certificate_uuid': fields.UUIDField(),

        # string indicating type of key used to compute image signature
        'img_signature_key_type': fields.ImageSignatureKeyTypeField(),

        # string of username with admin privileges
        'os_admin_user': fields.StringField(),

        # string of boot time command line arguments for the guest kernel
        'os_command_line': fields.StringField(),

        # the name of the specific guest operating system distro. This
        # is not done as an Enum since the list of operating systems is
        # growing incredibly fast, and valid values can be arbitrarily
        # user defined. Nova has no real need for strict validation so
        # leave it freeform
        'os_distro': fields.StringField(),

        # boolean - if true, then guest must support disk quiesce
        # or snapshot operation will be denied
        'os_require_quiesce': fields.FlexibleBooleanField(),

        # boolean - if using agent don't inject files, assume someone else is
        # doing that (cloud-init)
        'os_skip_agent_inject_files_at_boot': fields.FlexibleBooleanField(),

        # boolean - if using agent don't try to inject ssh key, assume someone
        # else is doing that (cloud-init)
        'os_skip_agent_inject_ssh': fields.FlexibleBooleanField(),

        # The guest operating system family such as 'linux', 'windows' - this
        # is a fairly generic type. For a detailed type consider os_distro
        # instead
        'os_type': fields.OSTypeField(),
    }

    # The keys are the legacy property names and
    # the values are the current preferred names
    _legacy_property_map = {
        'architecture': 'hw_architecture',
        'owner_id': 'img_owner_id',
        'vmware_disktype': 'hw_disk_type',
        'vmware_image_version': 'img_version',
        'vmware_ostype': 'os_distro',
        'auto_disk_config': 'hw_auto_disk_config',
        'ipxe_boot': 'hw_ipxe_boot',
        'xenapi_device_id': 'hw_device_id',
        'xenapi_image_compression_level': 'img_compression_level',
        'vmware_linked_clone': 'img_linked_clone',
        'xenapi_use_agent': 'img_use_agent',
        'xenapi_skip_agent_inject_ssh': 'os_skip_agent_inject_ssh',
        'xenapi_skip_agent_inject_files_at_boot':
            'os_skip_agent_inject_files_at_boot',
        'cache_in_nova': 'img_cache_in_nova',
        'vm_mode': 'hw_vm_mode',
        'bittorrent': 'img_bittorrent',
        'mappings': 'img_mappings',
        'block_device_mapping': 'img_block_device_mapping',
        'bdm_v2': 'img_bdm_v2',
        'root_device_name': 'img_root_device_name',
        'hypervisor_version_requires': 'img_hv_requested_version',
        'hypervisor_type': 'img_hv_type',
    }

    # TODO(berrange): Need to run this from a data migration
    # at some point so we can eventually kill off the compat
    def _set_attr_from_legacy_names(self, image_props):
        for legacy_key in self._legacy_property_map:
            new_key = self._legacy_property_map[legacy_key]

            if legacy_key not in image_props:
                continue

            setattr(self, new_key, image_props[legacy_key])

        vmware_adaptertype = image_props.get("vmware_adaptertype")
        if vmware_adaptertype == "ide":
            setattr(self, "hw_disk_bus", "ide")
        elif vmware_adaptertype:
            setattr(self, "hw_disk_bus", "scsi")
            setattr(self, "hw_scsi_model", vmware_adaptertype)

    def _set_numa_mem(self, image_props):
        hw_numa_mem = []
        hw_numa_mem_set = False
        for cellid in range(ImageMetaProps.NUMA_NODES_MAX):
            memprop = "hw_numa_mem.%d" % cellid
            if memprop not in image_props:
                break
            hw_numa_mem.append(int(image_props[memprop]))
            hw_numa_mem_set = True
            del image_props[memprop]

        if hw_numa_mem_set:
            self.hw_numa_mem = hw_numa_mem

    def _set_numa_cpus(self, image_props):
        hw_numa_cpus = []
        hw_numa_cpus_set = False
        for cellid in range(ImageMetaProps.NUMA_NODES_MAX):
            cpuprop = "hw_numa_cpus.%d" % cellid
            if cpuprop not in image_props:
                break
            hw_numa_cpus.append(
                hardware.parse_cpu_spec(image_props[cpuprop]))
            hw_numa_cpus_set = True
            del image_props[cpuprop]

        if hw_numa_cpus_set:
            self.hw_numa_cpus = hw_numa_cpus

    def _set_attr_from_current_names(self, image_props):
        for key in self.fields:
            # The two NUMA fields need special handling to
            # un-stringify them correctly
            if key == "hw_numa_mem":
                self._set_numa_mem(image_props)
            elif key == "hw_numa_cpus":
                self._set_numa_cpus(image_props)
            else:
                if key not in image_props:
                    continue

                setattr(self, key, image_props[key])

    @classmethod
    def from_dict(cls, image_props):
        """Create instance from image properties dict

        :param image_props: dictionary of image metadata properties

        Creates a new object instance, initializing from a
        dictionary of image metadata properties

        :returns: an ImageMetaProps instance
        """
        obj = cls()
        # We look to see if the dict has entries for any
        # of the legacy property names first. Then we use
        # the current property names. That way if both the
        # current and legacy names are set, the value
        # associated with the current name takes priority
        obj._set_attr_from_legacy_names(image_props)
        obj._set_attr_from_current_names(image_props)
        return obj

    def get(self, name, defvalue=None):
        """Get the value of an attribute

        :param name: the attribute to request
        :param defvalue: the default value if not set

        This returns the value of an attribute if it is currently
        set, otherwise it will return the default value.

        This differs from accessing props.attrname, because that
        will raise an exception if the attribute has no value set.

        So instead of

          if image_meta.properties.obj_attr_is_set("some_attr"):
              val = image_meta.properties.some_attr
          else:
              val = None

        callers can rely on unconditional access

          val = image_meta.properties.get("some_attr")

        :returns: the attribute value or the default
        """

        if not self.obj_attr_is_set(name):
            return defvalue

        return getattr(self, name)
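The precedence rule in `from_dict()` above (legacy property names are applied first, current names second, so the current name wins when both are present) can be illustrated with a minimal stand-in. The names `LEGACY_MAP` and `resolve_props` are hypothetical, and only a two-entry excerpt of `_legacy_property_map` is used; this is a sketch of the precedence logic, not Nova's object machinery.

```python
# Hypothetical two-entry excerpt of ImageMetaProps._legacy_property_map
LEGACY_MAP = {
    'architecture': 'hw_architecture',
    'owner_id': 'img_owner_id',
}


def resolve_props(image_props):
    """Return {current_name: value}, with current names taking priority.

    Mirrors from_dict(): legacy keys are consumed first, then current
    keys overwrite any value a legacy key already supplied.
    """
    out = {}
    for legacy_key, new_key in LEGACY_MAP.items():
        if legacy_key in image_props:
            out[new_key] = image_props[legacy_key]
    for new_key in LEGACY_MAP.values():
        if new_key in image_props:
            out[new_key] = image_props[new_key]
    return out
```

When only the legacy name is given it is translated; when both are given, the current name's value survives.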
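The `_set_numa_mem` / `_set_numa_cpus` helpers above un-stringify indexed image properties such as `hw_numa_mem.0`, `hw_numa_mem.1`, stopping at the first missing index. A stand-alone sketch of that loop on plain dicts (the function name `collect_numa_mem` is hypothetical):

```python
NUMA_NODES_MAX = 128  # mirrors ImageMetaProps.NUMA_NODES_MAX


def collect_numa_mem(image_props):
    """Gather 'hw_numa_mem.N' keys (N = 0, 1, ...) into a list of ints.

    Stops at the first missing index and removes consumed keys,
    like ImageMetaProps._set_numa_mem does.
    """
    hw_numa_mem = []
    for cellid in range(NUMA_NODES_MAX):
        memprop = "hw_numa_mem.%d" % cellid
        if memprop not in image_props:
            break
        hw_numa_mem.append(int(image_props.pop(memprop)))
    return hw_numa_mem
```

Note that an index gap hides everything after it: if `hw_numa_mem.0` is absent, `hw_numa_mem.1` is never read.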
bb3e54849c08e39f608b4bac6e8233d52852dce2 | 710 | py | Python | pydcmjpeg/config.py | scaramallion/pydcmjpeg | 2f31979c2858d34db8fcbc42f139b75fbeaaf024 | ["MIT"] | null | null | null | pydcmjpeg/config.py | scaramallion/pydcmjpeg | 2f31979c2858d34db8fcbc42f139b75fbeaaf024 | ["MIT"] | null | null | null | pydcmjpeg/config.py | scaramallion/pydcmjpeg | 2f31979c2858d34db8fcbc42f139b75fbeaaf024 | ["MIT"] | null | null | null
JPEG_10918 = (
'SOF0', 'SOF1', 'SOF2', 'SOF3', 'SOF5',
'SOF6', 'SOF7', 'SOF9', 'SOF10',
'SOF11', 'SOF13', 'SOF14', 'SOF15',
)
JPEG_14495 = ('SOF55', 'LSE', )
JPEG_15444 = ('SOC', )
PARSE_SUPPORTED = {
'10918' : [
'Process 1',
'Process 2',
'Process 4',
'Process 14',
]
}
DECODE_SUPPORTED = {}
ENCODE_SUPPORTED = {}
ZIGZAG = [ 0, 1, 5, 6, 14, 15, 27, 28,
2, 4, 7, 13, 16, 26, 29, 42,
3, 8, 12, 17, 25, 30, 41, 43,
9, 11, 18, 24, 31, 40, 44, 53,
10, 19, 23, 32, 39, 45, 52, 54,
20, 22, 33, 38, 46, 51, 55, 60,
21, 34, 37, 47, 50, 56, 59, 61,
35, 36, 48, 49, 57, 58, 62, 63]
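A short usage sketch (not part of pydcmjpeg's public API; `to_zigzag` is a hypothetical helper): `ZIGZAG[i]` gives the position of row-major coefficient `i` in the JPEG zigzag scan, so a flat 8x8 block can be reordered like this. The table is repeated inside the sketch so it is self-contained.

```python
# ZIGZAG[natural_index] = position of that coefficient in zigzag scan order
ZIGZAG = [ 0,  1,  5,  6, 14, 15, 27, 28,
           2,  4,  7, 13, 16, 26, 29, 42,
           3,  8, 12, 17, 25, 30, 41, 43,
           9, 11, 18, 24, 31, 40, 44, 53,
          10, 19, 23, 32, 39, 45, 52, 54,
          20, 22, 33, 38, 46, 51, 55, 60,
          21, 34, 37, 47, 50, 56, 59, 61,
          35, 36, 48, 49, 57, 58, 62, 63]


def to_zigzag(block):
    """Reorder a flat, row-major 8x8 block (64 values) into zigzag order."""
    out = [0] * 64
    for natural_idx, zz_idx in enumerate(ZIGZAG):
        out[zz_idx] = block[natural_idx]
    return out
```

Feeding in `list(range(64))` yields the familiar zigzag walk `0, 1, 8, 16, 9, 2, ...` down the block's anti-diagonals.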
bb4ca687798a9571d8879b4c03dd5d7167a01e5c | 434 | py | Python | src/tac/core/wavenet_vocoder/models/__init__.py | stefantaubert/Tacotron-2 | 710a3b39b48147307fa8eef2c9f635562f48d49a | ["MIT"] | null | null | null | src/tac/core/wavenet_vocoder/models/__init__.py | stefantaubert/Tacotron-2 | 710a3b39b48147307fa8eef2c9f635562f48d49a | ["MIT"] | null | null | null | src/tac/core/wavenet_vocoder/models/__init__.py | stefantaubert/Tacotron-2 | 710a3b39b48147307fa8eef2c9f635562f48d49a | ["MIT"] | null | null | null
from warnings import warn
from src.tac.core.wavenet_vocoder.util import is_mulaw_quantize
def create_model(hparams, init=False):
if is_mulaw_quantize(hparams.input_type):
if hparams.out_channels != hparams.quantize_channels:
raise RuntimeError(
"out_channels must equal to quantize_chennels if input_type is 'mulaw-quantize'")
return WaveNet(hparams, init)
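For context, `is_mulaw_quantize` refers to mu-law companded, quantized input: each sample becomes one of `quantize_channels` integer classes, which is why `out_channels` must match it above. A rough sketch of the transform follows; it is illustrative only, and the library's own implementation and rounding may differ.

```python
import math


def mulaw(x, mu=255):
    """Mu-law companding: squash x in [-1, 1] non-linearly into [-1, 1]."""
    return math.copysign(math.log1p(mu * abs(x)) / math.log1p(mu), x)


def mulaw_quantize(x, mu=255):
    """Map x in [-1, 1] to an integer class in [0, mu] (mu + 1 classes)."""
    y = mulaw(x, mu)
    return int(round((y + 1) / 2 * mu))
```

With `mu=255` this yields 256 classes, matching the common `quantize_channels=256` / `out_channels=256` configuration.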
24719f23ab554f1c0d959e732d43c4175ac06221 | 8,058 | py | Python | autograph/core/plot.py | crillab/autograph | afae58e3b8ac265d71cd981e25adcb13197b5492 | ["MIT"] | 2 | 2021-04-19T15:16:25.000Z | 2021-04-21T18:15:18.000Z | autograph/core/plot.py | crillab/autograph | afae58e3b8ac265d71cd981e25adcb13197b5492 | ["MIT"] | 1 | 2021-05-30T17:57:37.000Z | 2021-05-30T17:57:37.000Z | autograph/core/plot.py | crillab/autograph | afae58e3b8ac265d71cd981e25adcb13197b5492 | ["MIT"] | null | null | null
# Copyright © 2021 Univ Artois & CNRS, Exakis Nelite #
# #
# Permission is hereby granted, free of charge, to any person #
# obtaining a copy of this software and associated documentation #
# files (the “Software”), to deal in the Software without #
# restriction, including without limitation the rights to use, #
# copy, modify, merge, publish, distribute, sublicense, and/or sell #
# copies of the Software, and to permit persons to whom the #
# Software is furnished to do so, subject to the following #
# conditions: #
# #
# The above copyright notice and this permission notice shall be #
# included in all copies or substantial portions of the Software. #
# #
# THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, #
# EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES #
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND #
# NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT #
# HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, #
# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING #
# FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR #
# OTHER DEALINGS IN THE SOFTWARE. #
# ##############################################################################
from abc import abstractmethod
from collections import defaultdict
from typing import Optional
from autograph.core.enumstyle import LineType, MarkerShape
from autograph.core.style import TextStyle, TextPosition, PlotStyle, LegendStyle
class Plot:
name = 'plot'
def __init__(self):
self._data = defaultdict(dict)
@property
def figure_size(self):
return self._data.get('figure_size')
@figure_size.setter
def figure_size(self, value):
self._data['figure_size'] = value
@property
def latex(self):
return self._data.get('latex')
@latex.setter
def latex(self, value):
self._data['latex'] = value
self._set_latex(value)
@property
def title(self):
return self._data.get('title', {}).get('text')
@title.setter
def title(self, value):
self._data['title']['text'] = value
@property
def title_style(self) -> TextStyle:
        return self._data.get('title', {}).get('style', TextStyle())
@title_style.setter
def title_style(self, value: TextStyle):
self._data['title']['style'] = value
self._set_title_style(value)
@property
def title_position(self) -> TextPosition:
        return self._data.get('title', {}).get('position', TextPosition())
@title_position.setter
def title_position(self, value):
self._data['title']['position'] = value
self._set_title_position(value)
@property
def x_label(self):
        return self._data.get('x_axis', {}).get('label')
@x_label.setter
def x_label(self, value):
self._data['x_axis']['label'] = value
@property
def x_label_style(self):
        return self._data.get('x_axis', {}).get('style')
@x_label_style.setter
def x_label_style(self, value):
self._data['x_axis']['style'] = value
self._set_x_label_style(value)
@property
def y_label_style(self):
        return self._data.get('y_axis', {}).get('style')
@y_label_style.setter
def y_label_style(self, value):
self._data['y_axis']['style'] = value
self._set_y_label_style(value)
@property
def x_label_position(self):
        return self._data.get('x_axis', {}).get('position')
@x_label_position.setter
def x_label_position(self, value):
self._data['x_axis']['position'] = value
self._set_x_label_position(value)
@property
def y_label_position(self):
        return self._data.get('y_axis', {}).get('position')
@y_label_position.setter
def y_label_position(self, value):
self._data['y_axis']['position'] = value
self._set_y_label_position(value)
@property
def y_label(self):
        return self._data.get('y_axis', {}).get('label')
@y_label.setter
def y_label(self, value):
self._data['y_axis']['label'] = value
@property
def log_x(self):
return self._data.get('x_axis', {}).get('log', False)
@log_x.setter
def log_x(self, value: bool):
self._data['x_axis']['log'] = value
@property
def log_y(self):
return self._data.get('y_axis', {}).get('log', False)
@log_y.setter
def log_y(self, value: bool):
self._data['y_axis']['log'] = value
@property
def x_min(self):
return self._data.get("x_axis", {}).get("min")
@x_min.setter
def x_min(self, value):
self.set_x_lim(left=value)
@property
def x_max(self):
return self._data.get("x_axis", {}).get("max")
@x_max.setter
def x_max(self, value):
self.set_x_lim(right=value)
@property
def y_min(self):
return self._data.get("y_axis", {}).get("min")
@y_min.setter
def y_min(self, value):
self.set_y_lim(bottom=value)
@property
def y_max(self):
return self._data.get("y_axis", {}).get("max")
@y_max.setter
def y_max(self, value):
self.set_y_lim(up=value)
@property
def x_lim(self):
return self.x_min, self.x_max
@x_lim.setter
def x_lim(self, value):
self.set_x_lim(left=value[0], right=value[1])
@property
def y_lim(self):
return self.y_min, self.y_max
@y_lim.setter
def y_lim(self, value):
self.set_y_lim(bottom=value[0], up=value[1])
@property
def legend(self):
return self._data.get('legend')
@legend.setter
def legend(self, value: LegendStyle):
self._data['legend'] = value
self._set_legend(value)
def set_x_lim(self, left=None, right=None):
self._data['x_axis']['min'] = left
self._data['x_axis']['max'] = right
def set_y_lim(self, bottom=None, up=None):
self._data['y_axis']['min'] = bottom
self._data['y_axis']['max'] = up
    def _set_legend(self, value: LegendStyle):
        value.set_plot(self)

    def _set_title_style(self, value: TextStyle):
        value.set_plot(self)

    def _set_x_label_style(self, value: TextStyle):
        value.set_plot(self)

    def _set_y_label_style(self, value: TextStyle):
        value.set_plot(self)

    def _set_title_position(self, value: TextPosition):
        value.set_plot(self)

    def _set_x_label_position(self, value):
        value.set_plot(self)

    def _set_y_label_position(self, value):
        value.set_plot(self)
@abstractmethod
def _set_latex(self, value):
raise NotImplementedError
@abstractmethod
def plot(self, x, y, label=None, style: Optional[PlotStyle] = None):
raise NotImplementedError
@abstractmethod
def scatter(self, x, y, label=None, style: Optional[PlotStyle] = None):
raise NotImplementedError
def boxplot(self, x):
raise NotImplementedError
@abstractmethod
def show(self):
raise NotImplementedError
@abstractmethod
def save(self, output, **kwargs):
raise NotImplementedError
@abstractmethod
def _line_type_as_string(self, line_type: LineType):
raise NotImplementedError
@abstractmethod
def _marker_shape_as_string(self, shape: MarkerShape):
raise NotImplementedError

# === deploy/DeployConfig.py (repo: equinor/equinor-trading-plotly-dash-template, license: MIT) ===
from typing import List
from pydantic import BaseModel
from pydantic.fields import Field
class Role(BaseModel):
allowed_member_types: List[str] = Field(alias="allowedMemberTypes")
description: str
display_name: str = Field(alias="displayName")
id: str
is_enabled: bool = Field(alias="isEnabled")
origin: str
value: str
class DeployConfig(BaseModel):
roles: List[Role]
redirect_path: str = Field(alias="redirectPath")

# === Python3.x/226-Invert Binary Tree.py (repo: ranchlin/Leetcode, license: MIT) ===
# Recursion
# Runtime: 32 ms, faster than 100.00% of Python3 online submissions for Invert Binary Tree.
# Memory Usage: 12.4 MB, less than 0.96% of Python3 online submissions for Invert Binary Tree.
# Definition for a binary tree node.
# class TreeNode:
# def __init__(self, x):
# self.val = x
# self.left = None
# self.right = None
class Solution:
def invertTree(self, root: 'TreeNode') -> 'TreeNode':
if root: root.left, root.right = self.invertTree(root.right), self.invertTree(root.left)
return root
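A quick self-contained check of the recursive inversion above (the `TreeNode` class mirrors the commented-out LeetCode definition; the three-node tree is just sample data):

```python
class TreeNode:
    def __init__(self, x):
        self.val = x
        self.left = None
        self.right = None

class Solution:
    def invertTree(self, root: 'TreeNode') -> 'TreeNode':
        if root:
            root.left, root.right = self.invertTree(root.right), self.invertTree(root.left)
        return root

# Build   4        invert to   4
#        / \                  / \
#       2   7                7   2
root = TreeNode(4)
root.left, root.right = TreeNode(2), TreeNode(7)
inverted = Solution().invertTree(root)
print(inverted.left.val, inverted.right.val)  # 7 2
```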

# === setup.py (repo: SHZ66/enm_package, license: MIT) ===
from setuptools import find_packages, setup
setup(
name='enm',
packages=find_packages(),
version='0.1.0',
description='ENM applications on genetic networks',
author='Omer Acar',
license='MIT',
)

# === Udemy Material/Airline Satisfaction/exercise.py (repo: karkir0003/DSGT-Bootcamp-Material, license: MIT) ===
# Add key imports here!
TRAIN_URL = "https://raw.githubusercontent.com/karkir0003/DSGT-Bootcamp-Material/main/Udemy%20Material/Airline%20Satisfaction/train.csv"
def read_train_dataset():
"""
This function should read in the train.csv and return it
in whatever representation you like
"""
###YOUR CODE HERE####
raise NotImplementedError("Did not implement read_train_dataset() function")
#####################
def preprocess_dataset(dataset):
"""
Given the raw dataset read in from your read_train_dataset() function,
process the dataset accordingly
"""
###YOUR CODE HERE####
    raise NotImplementedError("Did not implement preprocess_dataset() function")
#####################
def train_model():
"""
Given your cleaned data, train your Machine Learning model on it and return the
model
MANDATORY FUNCTION TO IMPLEMENT
"""
####YOUR CODE HERE#####
raise NotImplementedError("Did not implement the train_model() function")
#######################

# === b2flow/python/tools/storage.py (repo: allanbatista/b2flow-python-tools, license: MIT) ===
import io
import os
import re
from b2flow.python.tools.handler import Handlers
from b2flow.python.tools.driver import Driver
from b2flow.python.tools.compress import gzip, ungzip
def remove_first_back_slash(path: str):
if path.startswith('/'):
return path[1:]
else:
return path
def to_in_memory_io(data: bytes):
in_memory_io = io.BytesIO(data)
in_memory_io.seek(0)
return in_memory_io
class Storage:
"""
This class abstract storage layer
"""
def __init__(self, directory: str, driver: Driver = None):
self.directory = remove_first_back_slash(directory)
self.handlers = Handlers(self)
self.driver = driver
def path(self, directory: str):
"""
create a directory scope
string path
return Storage
"""
return Storage(self.join(directory), driver=self.driver)
@property
def parent(self):
"""
go to parent path
return Storage
"""
parent_path = "/".join(self.directory.split("/")[:-1])
return Storage(parent_path, driver=self.driver)
def join(self, *args):
"""
join a filepath with current path of storage
string filepath1, filepathN, ...
return string
"""
return os.path.join(self.directory, *args)
def write(self, data: bytes, name: str, compress: bool = False):
"""
write bytes to a remote file
byte[] data
string filename
boolean compress=False
return None
"""
filepath = self.join(name)
if compress:
self.driver.write(gzip(data), filepath)
else:
self.driver.write(data, filepath)
def read(self, filename: str, as_memory_io: bool = False):
"""
        read a remote file as bytes
string filename
return byte[]
"""
data = ungzip(self.driver.read(self.join(filename)))
if as_memory_io:
data = to_in_memory_io(data)
return data
def list(self):
"""
should list all files in a current prefix path
return Entry[]
"""
entries = []
regex = re.compile('^%s' % self.directory)
for obj in self.driver.list(self.directory):
path, name = os.path.split(obj['Key'])
entries.append(Entry(self.path(regex.sub("", path)), name))
return entries
class Entry:
def __init__(self, storage: Storage, name: str):
self.name = name
self.storage = storage
def read(self):
return self.storage.read(self.name)
def write(self, data: bytes, compress: bool = False):
        self.storage.write(data, self.name, compress)
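The `Storage`/`Entry` pair only assumes a driver with `write`, `read`, and `list`. A toy in-memory driver sketching that contract (the class and key names are hypothetical, and `gzip` here is the stdlib module rather than the b2flow helper):

```python
import gzip

class DictDriver:
    """Hypothetical in-memory driver exposing the write/read/list surface Storage expects."""
    def __init__(self):
        self.blobs = {}

    def write(self, data: bytes, filepath: str):
        self.blobs[filepath] = data

    def read(self, filepath: str) -> bytes:
        return self.blobs[filepath]

    def list(self, prefix: str):
        # mimic the S3-style {'Key': ...} dicts the real driver appears to return
        return [{'Key': k} for k in self.blobs if k.startswith(prefix)]

driver = DictDriver()
driver.write(gzip.compress(b'hello'), 'models/a.bin')
print(gzip.decompress(driver.read('models/a.bin')))  # b'hello'
print(driver.list('models/'))  # [{'Key': 'models/a.bin'}]
```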

# === additional_codes/migrations/0001_initial.py (repo: uktrade/tamato, license: MIT) ===
# Generated by Django 3.1 on 2021-01-06 15:33
import django.core.validators
import django.db.models.deletion
from django.db import migrations
from django.db import models
import common.fields
class Migration(migrations.Migration):
initial = True
dependencies = [
("footnotes", "0001_initial"),
("common", "0001_initial"),
]
operations = [
migrations.CreateModel(
name="AdditionalCode",
fields=[
(
"trackedmodel_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="common.trackedmodel",
),
),
("valid_between", common.fields.TaricDateTimeRangeField(db_index=True)),
("sid", common.fields.SignedIntSID()),
(
"code",
models.CharField(
max_length=3,
validators=[
django.core.validators.RegexValidator(
"^[A-Z0-9][A-Z0-9][A-Z0-9]$"
)
],
),
),
],
options={
"abstract": False,
"base_manager_name": "objects",
},
bases=("common.trackedmodel", models.Model),
),
migrations.CreateModel(
name="AdditionalCodeType",
fields=[
(
"trackedmodel_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="common.trackedmodel",
),
),
("valid_between", common.fields.TaricDateTimeRangeField(db_index=True)),
(
"sid",
models.CharField(
db_index=True,
max_length=1,
validators=[
django.core.validators.RegexValidator("^[A-Z0-9]$")
],
),
),
("description", common.fields.ShortDescription()),
(
"application_code",
models.PositiveSmallIntegerField(
choices=[
(0, "Export refund nomenclature"),
(1, "Additional codes"),
(3, "Meursing additional codes"),
(4, "Export refund for processed agricultural goods"),
]
),
),
],
options={
"abstract": False,
"base_manager_name": "objects",
},
bases=("common.trackedmodel", models.Model),
),
migrations.CreateModel(
name="FootnoteAssociationAdditionalCode",
fields=[
(
"trackedmodel_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="common.trackedmodel",
),
),
("valid_between", common.fields.TaricDateTimeRangeField(db_index=True)),
(
"additional_code",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to="additional_codes.additionalcode",
),
),
(
"associated_footnote",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to="footnotes.footnote",
),
),
],
options={
"abstract": False,
"base_manager_name": "objects",
},
bases=("common.trackedmodel", models.Model),
),
migrations.CreateModel(
name="AdditionalCodeDescription",
fields=[
(
"trackedmodel_ptr",
models.OneToOneField(
auto_created=True,
on_delete=django.db.models.deletion.CASCADE,
parent_link=True,
primary_key=True,
serialize=False,
to="common.trackedmodel",
),
),
("valid_between", common.fields.TaricDateTimeRangeField(db_index=True)),
("description_period_sid", common.fields.SignedIntSID()),
("description", models.TextField()),
(
"described_additional_code",
models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
related_name="descriptions",
to="additional_codes.additionalcode",
),
),
],
options={
"abstract": False,
"base_manager_name": "objects",
},
bases=("common.trackedmodel", models.Model),
),
migrations.AddField(
model_name="additionalcode",
name="type",
field=models.ForeignKey(
on_delete=django.db.models.deletion.PROTECT,
to="additional_codes.additionalcodetype",
),
),
]

# === chap03/list0311.py (repo: ytianjin/GitTest, license: MIT) ===
# Determine how many digits an integer has (zero, single digit, or multiple digits)
n = int(input('Integer: '))
if n == 0:  # zero
    print('The value is zero.')
elif n >= -9 and n <= 9:  # single digit
    print('The value is a single digit.')
else:  # multiple digits
    print('The value has multiple digits.')
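The same branching logic, refactored into a function so it can be exercised without `input()` (the function name and return strings are my own):

```python
def classify(n: int) -> str:
    # mirror the script's three branches: zero, single digit, multiple digits
    if n == 0:
        return 'zero'
    elif -9 <= n <= 9:
        return 'single digit'
    else:
        return 'multiple digits'

print(classify(0), classify(7), classify(-42))  # zero single digit multiple digits
```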

# === examples/custom_model_continuos_training.py (repo: Tufail114/ImageAi-Project-Software, license: MIT) ===
from imageai.Classification.Custom import ClassificationModelTrainer
import os
trainer = ClassificationModelTrainer()
trainer.setModelTypeAsDenseNet121()
trainer.setDataDirectory("idenprof")
trainer.trainModel(num_objects=10, num_experiments=50, enhance_data=True, batch_size=8, show_network_summary=True, continue_from_model="idenprof_densenet-0.763500.h5") # Download the model via this link https://github.com/OlafenwaMoses/ImageAI/releases/tag/models-v3

# === biserici_inlemnite/biserici/migrations/0028_auto_20210803_1551.py (repo: ck-tm/biserici-inlemnite, license: MIT) ===
# Generated by Django 3.1.13 on 2021-08-03 12:51
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('biserici', '0027_auto_20210803_1546'),
]
operations = [
migrations.RenameField(
model_name='historicalpicturainterioara',
old_name='tehnica',
new_name='tehnica_pictura',
),
migrations.RenameField(
model_name='picturainterioara',
old_name='tehnica',
new_name='tehnica_pictura',
),
]

# === pysnmp/CISCO-WBX-MEETING-MIB.py (repo: agustinhenze/mibs.snmplabs.com, license: Apache-2.0) ===
#
# PySNMP MIB module CISCO-WBX-MEETING-MIB (http://snmplabs.com/pysmi)
# ASN.1 source file:///Users/davwang4/Dev/mibs.snmplabs.com/asn1/CISCO-WBX-MEETING-MIB
# Produced by pysmi-0.3.4 at Mon Apr 29 18:04:58 2019
# On host DAVWANG4-M-1475 platform Darwin version 18.5.0 by user davwang4
# Using Python version 3.7.3 (default, Mar 27 2019, 09:23:15)
#
OctetString, Integer, ObjectIdentifier = mibBuilder.importSymbols("ASN1", "OctetString", "Integer", "ObjectIdentifier")
NamedValues, = mibBuilder.importSymbols("ASN1-ENUMERATION", "NamedValues")
ValueRangeConstraint, ValueSizeConstraint, SingleValueConstraint, ConstraintsUnion, ConstraintsIntersection = mibBuilder.importSymbols("ASN1-REFINEMENT", "ValueRangeConstraint", "ValueSizeConstraint", "SingleValueConstraint", "ConstraintsUnion", "ConstraintsIntersection")
ciscoMgmt, = mibBuilder.importSymbols("CISCO-SMI", "ciscoMgmt")
InetAddressType, InetAddress = mibBuilder.importSymbols("INET-ADDRESS-MIB", "InetAddressType", "InetAddress")
SnmpAdminString, = mibBuilder.importSymbols("SNMP-FRAMEWORK-MIB", "SnmpAdminString")
ModuleCompliance, ObjectGroup, NotificationGroup = mibBuilder.importSymbols("SNMPv2-CONF", "ModuleCompliance", "ObjectGroup", "NotificationGroup")
Counter64, NotificationType, IpAddress, Integer32, Counter32, Gauge32, MibIdentifier, ModuleIdentity, Bits, MibScalar, MibTable, MibTableRow, MibTableColumn, Unsigned32, TimeTicks, iso, ObjectIdentity = mibBuilder.importSymbols("SNMPv2-SMI", "Counter64", "NotificationType", "IpAddress", "Integer32", "Counter32", "Gauge32", "MibIdentifier", "ModuleIdentity", "Bits", "MibScalar", "MibTable", "MibTableRow", "MibTableColumn", "Unsigned32", "TimeTicks", "iso", "ObjectIdentity")
AutonomousType, TextualConvention, DisplayString = mibBuilder.importSymbols("SNMPv2-TC", "AutonomousType", "TextualConvention", "DisplayString")
ciscoWebExMeetingMIB = ModuleIdentity((1, 3, 6, 1, 4, 1, 9, 9, 809))
ciscoWebExMeetingMIB.setRevisions(('2013-05-29 00:00',))
if mibBuilder.loadTexts: ciscoWebExMeetingMIB.setLastUpdated('201305290000Z')
if mibBuilder.loadTexts: ciscoWebExMeetingMIB.setOrganization('Cisco Systems Inc.')
ciscoWebExMeetingMIBNotifs = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 809, 0))
ciscoWebExMeetingMIBObjects = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 809, 1))
ciscoWebExMeetingMIBConform = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 809, 2))
class CiscoWebExCommSysResource(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1, 2, 3, 4))
namedValues = NamedValues(("cpu", 0), ("memory", 1), ("swap", 2), ("fileDescriptor", 3), ("disk", 4))
class CiscoWebExCommSysResMonitoringStatus(TextualConvention, Integer32):
status = 'current'
subtypeSpec = Integer32.subtypeSpec + ConstraintsUnion(SingleValueConstraint(0, 1))
namedValues = NamedValues(("closed", 0), ("open", 1))
ciscoWebExCommInfo = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 1))
ciscoWebExCommSystemResource = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2))
cwCommSystemVersion = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 1, 1), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(0, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommSystemVersion.setStatus('current')
cwCommSystemObjectID = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 1, 2), AutonomousType()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommSystemObjectID.setStatus('current')
cwCommCPUUsageObject = ObjectIdentity((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1))
if mibBuilder.loadTexts: cwCommCPUUsageObject.setStatus('current')
cwCommCPUTotalUsage = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 1), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUTotalUsage.setStatus('current')
cwCommCPUUsageWindow = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 2), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(1, 60))).setUnits('Minute').setMaxAccess("readwrite")
if mibBuilder.loadTexts: cwCommCPUUsageWindow.setStatus('current')
cwCommCPUTotalNumber = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 3), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 64))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUTotalNumber.setStatus('current')
cwCommCPUUsageTable = MibTable((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4), )
if mibBuilder.loadTexts: cwCommCPUUsageTable.setStatus('current')
cwCommCPUUsageEntry = MibTableRow((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1), ).setIndexNames((0, "CISCO-WBX-MEETING-MIB", "cwCommCPUIndex"))
if mibBuilder.loadTexts: cwCommCPUUsageEntry.setStatus('current')
cwCommCPUIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 128)))
if mibBuilder.loadTexts: cwCommCPUIndex.setStatus('current')
cwCommCPUName = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 2), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(1, 255))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUName.setStatus('current')
cwCommCPUUsage = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 3), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsage.setStatus('current')
cwCommCPUUsageUser = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 4), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageUser.setStatus('current')
cwCommCPUUsageNice = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 5), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageNice.setStatus('current')
cwCommCPUUsageSystem = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 6), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageSystem.setStatus('current')
cwCommCPUUsageIdle = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 7), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageIdle.setStatus('current')
cwCommCPUUsageIOWait = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 8), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageIOWait.setStatus('current')
cwCommCPUUsageIRQ = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 9), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageIRQ.setStatus('current')
cwCommCPUUsageSoftIRQ = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 10), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageSoftIRQ.setStatus('current')
cwCommCPUUsageSteal = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 11), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageSteal.setStatus('current')
cwCommCPUUsageCapacitySubTotal = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 4, 1, 12), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUUsageCapacitySubTotal.setStatus('current')
cwCommCPUMonitoringStatus = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 5), CiscoWebExCommSysResMonitoringStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUMonitoringStatus.setStatus('current')
cwCommCPUCapacityTotal = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 1, 6), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KHz').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommCPUCapacityTotal.setStatus('current')
cwCommMEMUsageObject = ObjectIdentity((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 2))
if mibBuilder.loadTexts: cwCommMEMUsageObject.setStatus('current')
cwCommMEMUsage = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 2, 1), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommMEMUsage.setStatus('current')
cwCommMEMMonitoringStatus = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 2, 2), CiscoWebExCommSysResMonitoringStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommMEMMonitoringStatus.setStatus('current')
cwCommMEMTotal = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 2, 3), Gauge32()).setUnits('MBytes').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommMEMTotal.setStatus('current')
cwCommMEMSwapUsageObject = ObjectIdentity((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 3))
if mibBuilder.loadTexts: cwCommMEMSwapUsageObject.setStatus('current')
cwCommMEMSwapUsage = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 3, 1), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommMEMSwapUsage.setStatus('current')
cwCommMEMSwapMonitoringStatus = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 3, 2), CiscoWebExCommSysResMonitoringStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommMEMSwapMonitoringStatus.setStatus('current')
cwCommSysResourceNotificationObject = ObjectIdentity((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 4))
if mibBuilder.loadTexts: cwCommSysResourceNotificationObject.setStatus('current')
cwCommNotificationHostAddressType = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 4, 1), InetAddressType()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: cwCommNotificationHostAddressType.setStatus('current')
cwCommNotificationHostAddress = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 4, 2), InetAddress()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: cwCommNotificationHostAddress.setStatus('current')
cwCommNotificationResName = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 4, 3), CiscoWebExCommSysResource()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: cwCommNotificationResName.setStatus('current')
cwCommNotificationResValue = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 4, 4), Unsigned32()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: cwCommNotificationResValue.setStatus('current')
cwCommNotificationSeqNum = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 4, 5), Counter32()).setMaxAccess("accessiblefornotify")
if mibBuilder.loadTexts: cwCommNotificationSeqNum.setStatus('current')
cwCommDiskUsageObject = ObjectIdentity((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5))
if mibBuilder.loadTexts: cwCommDiskUsageObject.setStatus('current')
cwCommDiskUsageCount = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 1), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 65535))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommDiskUsageCount.setStatus('current')
cwCommDiskUsageTable = MibTable((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 2), )
if mibBuilder.loadTexts: cwCommDiskUsageTable.setStatus('current')
cwCommDiskUsageEntry = MibTableRow((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 2, 1), ).setIndexNames((0, "CISCO-WBX-MEETING-MIB", "cwCommDiskUsageIndex"))
if mibBuilder.loadTexts: cwCommDiskUsageEntry.setStatus('current')
cwCommDiskUsageIndex = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 2, 1, 1), Unsigned32().subtype(subtypeSpec=ValueRangeConstraint(1, 128)))
if mibBuilder.loadTexts: cwCommDiskUsageIndex.setStatus('current')
cwCommDiskPartitionName = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 2, 1, 2), SnmpAdminString().subtype(subtypeSpec=ValueSizeConstraint(0, 128))).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommDiskPartitionName.setStatus('current')
cwCommDiskUsage = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 2, 1, 3), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 100))).setUnits('percent').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommDiskUsage.setStatus('current')
cwCommDiskTotal = MibTableColumn((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 2, 1, 4), Gauge32().subtype(subtypeSpec=ValueRangeConstraint(0, 4294967295))).setUnits('KB').setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommDiskTotal.setStatus('current')
cwCommDiskMonitoringStatus = MibScalar((1, 3, 6, 1, 4, 1, 9, 9, 809, 1, 2, 5, 3), CiscoWebExCommSysResMonitoringStatus()).setMaxAccess("readonly")
if mibBuilder.loadTexts: cwCommDiskMonitoringStatus.setStatus('current')
cwCommSystemResourceUsageNormalEvent = NotificationType((1, 3, 6, 1, 4, 1, 9, 9, 809, 0, 1)).setObjects(("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddressType"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddress"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResName"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResValue"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationSeqNum"))
if mibBuilder.loadTexts: cwCommSystemResourceUsageNormalEvent.setStatus('current')
cwCommSystemResourceUsageMinorEvent = NotificationType((1, 3, 6, 1, 4, 1, 9, 9, 809, 0, 2)).setObjects(("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddressType"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddress"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResName"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResValue"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationSeqNum"))
if mibBuilder.loadTexts: cwCommSystemResourceUsageMinorEvent.setStatus('current')
cwCommSystemResourceUsageMajorEvent = NotificationType((1, 3, 6, 1, 4, 1, 9, 9, 809, 0, 3)).setObjects(("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddressType"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddress"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResName"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResValue"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationSeqNum"))
if mibBuilder.loadTexts: cwCommSystemResourceUsageMajorEvent.setStatus('current')
cwCommMIBCompliances = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 809, 2, 1))
cwCommMIBCompliance = ModuleCompliance((1, 3, 6, 1, 4, 1, 9, 9, 809, 2, 1, 1)).setObjects(("CISCO-WBX-MEETING-MIB", "ciscoWebExCommInfoGroup"), ("CISCO-WBX-MEETING-MIB", "ciscoWebExCommSystemResourceGroup"), ("CISCO-WBX-MEETING-MIB", "ciscoWebExMeetingMIBNotifsGroup"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    cwCommMIBCompliance = cwCommMIBCompliance.setStatus('current')
cwCommMIBGroups = MibIdentifier((1, 3, 6, 1, 4, 1, 9, 9, 809, 2, 2))
ciscoWebExCommInfoGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 9, 9, 809, 2, 2, 1)).setObjects(("CISCO-WBX-MEETING-MIB", "cwCommSystemVersion"), ("CISCO-WBX-MEETING-MIB", "cwCommSystemObjectID"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ciscoWebExCommInfoGroup = ciscoWebExCommInfoGroup.setStatus('current')
ciscoWebExCommSystemResourceGroup = ObjectGroup((1, 3, 6, 1, 4, 1, 9, 9, 809, 2, 2, 2)).setObjects(("CISCO-WBX-MEETING-MIB", "cwCommCPUTotalUsage"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageWindow"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUTotalNumber"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUName"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsage"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUMonitoringStatus"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageUser"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageNice"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageSystem"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageIdle"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageIOWait"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageIRQ"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageSoftIRQ"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageSteal"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUUsageCapacitySubTotal"), ("CISCO-WBX-MEETING-MIB", "cwCommCPUCapacityTotal"), ("CISCO-WBX-MEETING-MIB", "cwCommMEMUsage"), ("CISCO-WBX-MEETING-MIB", "cwCommMEMMonitoringStatus"), ("CISCO-WBX-MEETING-MIB", "cwCommMEMSwapUsage"), ("CISCO-WBX-MEETING-MIB", "cwCommMEMSwapMonitoringStatus"), ("CISCO-WBX-MEETING-MIB", "cwCommMEMTotal"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddressType"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationHostAddress"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResName"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationResValue"), ("CISCO-WBX-MEETING-MIB", "cwCommNotificationSeqNum"), ("CISCO-WBX-MEETING-MIB", "cwCommDiskUsageCount"), ("CISCO-WBX-MEETING-MIB", "cwCommDiskPartitionName"), ("CISCO-WBX-MEETING-MIB", "cwCommDiskUsage"), ("CISCO-WBX-MEETING-MIB", "cwCommDiskTotal"), ("CISCO-WBX-MEETING-MIB", "cwCommDiskMonitoringStatus"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ciscoWebExCommSystemResourceGroup = ciscoWebExCommSystemResourceGroup.setStatus('current')
ciscoWebExMeetingMIBNotifsGroup = NotificationGroup((1, 3, 6, 1, 4, 1, 9, 9, 809, 2, 2, 3)).setObjects(("CISCO-WBX-MEETING-MIB", "cwCommSystemResourceUsageNormalEvent"), ("CISCO-WBX-MEETING-MIB", "cwCommSystemResourceUsageMinorEvent"), ("CISCO-WBX-MEETING-MIB", "cwCommSystemResourceUsageMajorEvent"))
if getattr(mibBuilder, 'version', (0, 0, 0)) > (4, 4, 0):
    ciscoWebExMeetingMIBNotifsGroup = ciscoWebExMeetingMIBNotifsGroup.setStatus('current')
mibBuilder.exportSymbols("CISCO-WBX-MEETING-MIB", CiscoWebExCommSysResource=CiscoWebExCommSysResource, cwCommMEMUsage=cwCommMEMUsage, ciscoWebExCommInfo=ciscoWebExCommInfo, cwCommCPUUsageIRQ=cwCommCPUUsageIRQ, cwCommCPUUsageCapacitySubTotal=cwCommCPUUsageCapacitySubTotal, cwCommNotificationHostAddressType=cwCommNotificationHostAddressType, CiscoWebExCommSysResMonitoringStatus=CiscoWebExCommSysResMonitoringStatus, cwCommSystemResourceUsageNormalEvent=cwCommSystemResourceUsageNormalEvent, cwCommCPUUsageNice=cwCommCPUUsageNice, ciscoWebExMeetingMIBNotifs=ciscoWebExMeetingMIBNotifs, cwCommNotificationResValue=cwCommNotificationResValue, cwCommCPUUsageUser=cwCommCPUUsageUser, cwCommSystemVersion=cwCommSystemVersion, cwCommDiskPartitionName=cwCommDiskPartitionName, cwCommMEMMonitoringStatus=cwCommMEMMonitoringStatus, cwCommCPUUsageIdle=cwCommCPUUsageIdle, cwCommMIBCompliances=cwCommMIBCompliances, cwCommDiskUsageCount=cwCommDiskUsageCount, cwCommCPUCapacityTotal=cwCommCPUCapacityTotal, cwCommMIBGroups=cwCommMIBGroups, cwCommMEMTotal=cwCommMEMTotal, cwCommMEMSwapUsage=cwCommMEMSwapUsage, cwCommSystemResourceUsageMinorEvent=cwCommSystemResourceUsageMinorEvent, cwCommDiskMonitoringStatus=cwCommDiskMonitoringStatus, cwCommMEMSwapUsageObject=cwCommMEMSwapUsageObject, cwCommSystemObjectID=cwCommSystemObjectID, cwCommCPUUsageSystem=cwCommCPUUsageSystem, cwCommCPUUsageWindow=cwCommCPUUsageWindow, cwCommCPUIndex=cwCommCPUIndex, cwCommSystemResourceUsageMajorEvent=cwCommSystemResourceUsageMajorEvent, cwCommCPUUsageSoftIRQ=cwCommCPUUsageSoftIRQ, cwCommDiskUsageTable=cwCommDiskUsageTable, cwCommCPUUsageObject=cwCommCPUUsageObject, cwCommCPUUsageEntry=cwCommCPUUsageEntry, ciscoWebExMeetingMIB=ciscoWebExMeetingMIB, cwCommMEMUsageObject=cwCommMEMUsageObject, cwCommNotificationHostAddress=cwCommNotificationHostAddress, cwCommNotificationResName=cwCommNotificationResName, ciscoWebExMeetingMIBNotifsGroup=ciscoWebExMeetingMIBNotifsGroup, cwCommDiskTotal=cwCommDiskTotal, 
ciscoWebExCommSystemResourceGroup=ciscoWebExCommSystemResourceGroup, cwCommDiskUsageIndex=cwCommDiskUsageIndex, cwCommDiskUsage=cwCommDiskUsage, ciscoWebExCommSystemResource=ciscoWebExCommSystemResource, cwCommMEMSwapMonitoringStatus=cwCommMEMSwapMonitoringStatus, cwCommCPUMonitoringStatus=cwCommCPUMonitoringStatus, cwCommDiskUsageEntry=cwCommDiskUsageEntry, ciscoWebExMeetingMIBConform=ciscoWebExMeetingMIBConform, cwCommSysResourceNotificationObject=cwCommSysResourceNotificationObject, cwCommCPUTotalNumber=cwCommCPUTotalNumber, cwCommCPUUsageTable=cwCommCPUUsageTable, cwCommCPUUsageIOWait=cwCommCPUUsageIOWait, cwCommMIBCompliance=cwCommMIBCompliance, ciscoWebExCommInfoGroup=ciscoWebExCommInfoGroup, cwCommDiskUsageObject=cwCommDiskUsageObject, cwCommCPUUsageSteal=cwCommCPUUsageSteal, ciscoWebExMeetingMIBObjects=ciscoWebExMeetingMIBObjects, PYSNMP_MODULE_ID=ciscoWebExMeetingMIB, cwCommCPUUsage=cwCommCPUUsage, cwCommNotificationSeqNum=cwCommNotificationSeqNum, cwCommCPUTotalUsage=cwCommCPUTotalUsage, cwCommCPUName=cwCommCPUName)
24e0f2740a751f652bc308fe8498ecf5cab7ff33 | 93 | py | Python | timer.py | Proteus555/learning | 0aa6fcc7bb0996c51f656fd11286e67aea538c56 | [
"MIT"
] | null | null | null | timer.py | Proteus555/learning | 0aa6fcc7bb0996c51f656fd11286e67aea538c56 | [
"MIT"
] | null | null | null | timer.py | Proteus555/learning | 0aa6fcc7bb0996c51f656fd11286e67aea538c56 | [
"MIT"
] | null | null | null | import time
count = 0
while count < 10:
    print(count)
    count += 1
    time.sleep(2)
print("Boom!")
24e28b775fc53c737acb81f35bb2e75bc3e42a3d | 651 | py | Python | tests/unit_tests/data_steward/curation_logging/deprecation_logging_test.py | lrwb-aou/curation | e80447e56d269dc2c9c8bc79e78218d4b0dc504c | [
"MIT"
] | 16 | 2017-06-30T20:05:05.000Z | 2022-03-08T21:03:19.000Z | tests/unit_tests/data_steward/curation_logging/deprecation_logging_test.py | lrwb-aou/curation | e80447e56d269dc2c9c8bc79e78218d4b0dc504c | [
"MIT"
] | 342 | 2017-06-23T21:37:40.000Z | 2022-03-30T16:44:16.000Z | tests/unit_tests/data_steward/curation_logging/deprecation_logging_test.py | lrwb-aou/curation | e80447e56d269dc2c9c8bc79e78218d4b0dc504c | [
"MIT"
] | 33 | 2017-07-01T00:12:20.000Z | 2022-01-26T18:06:53.000Z | import unittest
from deprecated import deprecated
class DeprecationLoggingTest(unittest.TestCase):

    @classmethod
    def setUpClass(cls):
        print('**************************************************************')
        print(cls.__name__)
        print('**************************************************************')

    def setUp(self):
        pass

    def test_deprecation_warning(self):
        with self.assertWarns(DeprecationWarning) as warn:
            self.deprecated_method()
        self.assertIn('@@@@@', str(warn.warnings[0].message))

    @deprecated(reason='@@@@@')
    def deprecated_method(self):
        pass
24e3217507fd6a3e000936dac991684723efab1e | 250 | py | Python | metalgrafica/www/pruebaangular.py | Nirchains/metal | c6e4d5abac15750c6b33287e16034e83d08d8243 | [
"MIT"
] | null | null | null | metalgrafica/www/pruebaangular.py | Nirchains/metal | c6e4d5abac15750c6b33287e16034e83d08d8243 | [
"MIT"
] | null | null | null | metalgrafica/www/pruebaangular.py | Nirchains/metal | c6e4d5abac15750c6b33287e16034e83d08d8243 | [
"MIT"
] | null | null | null | # Copyright (c) 2015, Frappe Technologies Pvt. Ltd. and Contributors
# License: GNU General Public License v3. See license.txt
from __future__ import unicode_literals
import frappe
def get_context(context):
    context.prueba = "esto es una prueba"  # Spanish: "this is a test"
24f853b142a67e8d893120f2e477a85ffd168ebe | 539 | py | Python | src/coordinators/admin.py | mrts/foodbank-campaign | fbb059f3ebe44dccde4895964242b105421a69d1 | [
"MIT"
] | 1 | 2021-03-20T10:14:21.000Z | 2021-03-20T10:14:21.000Z | src/coordinators/admin.py | mrts/foodbank-campaign | fbb059f3ebe44dccde4895964242b105421a69d1 | [
"MIT"
] | 4 | 2018-03-24T21:49:02.000Z | 2021-01-13T21:31:44.000Z | src/coordinators/admin.py | mrts/foodbank-campaign | fbb059f3ebe44dccde4895964242b105421a69d1 | [
"MIT"
] | 3 | 2018-04-15T16:34:46.000Z | 2019-11-13T16:38:05.000Z | from django.utils.translation import ugettext_lazy as _
from django.contrib import admin
from django.contrib.auth.admin import UserAdmin as BaseUserAdmin
from django.contrib.auth.models import User
from coordinators.models import Coordinator
class CoordinatorInline(admin.StackedInline):
    model = Coordinator
    can_delete = False
    verbose_name_plural = _('coordinator')


class UserAdmin(BaseUserAdmin):
    inlines = (CoordinatorInline, )


# Re-register UserAdmin
admin.site.unregister(User)
admin.site.register(User, UserAdmin)
70073df449852f40e20f23a1b250d705d62234ae | 5,517 | py | Python | GEMS_modules/opal/faultSim/misc/version00/faultSim.py | ma3mool/Approxilyzer | e05a38d0222cd31a59c8fb49ac7078eacfce7494 | [
"Ruby"
] | 4 | 2017-02-25T13:17:35.000Z | 2021-04-27T06:30:24.000Z | GEMS_modules/opal/faultSim/misc/version00/faultSim.py | ma3mool/Approxilyzer | e05a38d0222cd31a59c8fb49ac7078eacfce7494 | [
"Ruby"
] | null | null | null | GEMS_modules/opal/faultSim/misc/version00/faultSim.py | ma3mool/Approxilyzer | e05a38d0222cd31a59c8fb49ac7078eacfce7494 | [
"Ruby"
] | 4 | 2017-03-19T02:13:45.000Z | 2019-12-20T05:37:49.000Z | import time, sim_commands, string, sys, getpass, os
from cli import *
LOG_FILE_DIR = "./logs/"
#SIM_STEPS = "11000000"
SIM_STEPS = "11000"
def get_rand_simics_fd(app,phase):
    print "Get Simics Random forwarding checkpoint\n"
    run_sim_command('read-configuration ../sim_phases/%s_%s' % (app, phase))
    run_sim_command('c 4431287')
    run_sim_command('run-python-file check_privilege.py')
    run_sim_command('write-configuration ../sim_phases/%s_%s-1' % (app, phase))
    run_sim_command('c 2410963')
    run_sim_command('run-python-file check_privilege.py')
    run_sim_command('write-configuration ../sim_phases/%s_%s-2' % (app, phase))
def get_rand_super(app):
    print "Get Simics Random forwarding checkpoint\n"
    run_sim_command('read-configuration ../sim_phases/%s' % (app))
    run_sim_command('c 4431287')
    run_sim_command('run-python-file check_privilege.py')
    run_sim_command('write-configuration ../sim_phases/%s_p1-1' % (app))
    run_sim_command('c 2410963')
    run_sim_command('run-python-file check_privilege.py')
    run_sim_command('write-configuration ../sim_phases/%s_p1-2' % (app))
def get_rand_simics_chkpt(app,phase):
    print "Get Random Simics checkpoint\n"
    run_sim_command('read-configuration ../sarek_chkpt/%s_%s' % (app, phase))
    run_sim_command('c 1000000')
    run_sim_command('run-python-file check_privilege.py')
    run_sim_command('write-configuration ../sarek_phases/%s_%s' % (app, phase))
def get_rand_cache_chkpt(app,phase):
    print "Get Random Cache Checkpoint\n"
    run_sim_command('read-configuration ../sarek_chkpt/%s_%s' % (app, phase))
    run_sim_command("instruction-fetch-mode instruction-fetch-trace")
    run_sim_command("istc-disable")
    run_sim_command("dstc-disable")
    run_sim_command('load-module ruby')
    run_sim_command('ruby0.init')
    run_sim_command('c 1000000')
    run_sim_command('run-python-file check_privilege.py')
    run_sim_command('ruby0.save-caches ../sarek_phases/%s_%s.caches.gz' % (app, phase))
def run_all(app, phase, type=-1, bit=-1, stuck=-1, faultreg=64, injinst=0, seqnum=0):
    print "Running simulation for 10M instructions\n"
    run_sim_command('read-configuration ../sarek_phases/%s_%s' % (app, phase))
    run_sim_command('instruction-fetch-mode instruction-fetch-trace')
    run_sim_command('istc-disable')
    run_sim_command('dstc-disable')
    run_sim_command('load-module ruby')
    run_sim_command('load-module opal')
    run_sim_command('ruby0.setparam g_NUM_PROCESSORS 1')
    run_sim_command('ruby0.init')
    run_sim_command('opal0.init')
    run_sim_command('ruby0.load-caches ../sarek_phases/%s_%s.caches.gz' % (app, phase))
    run_sim_command('opal0.sim-start "/dev/null"')
    run_sim_command('opal0.fault-log "%s/%s_%s.t%s.i%s.s%s.fault_log"' % (LOG_FILE_DIR, app, phase, type, seqnum, stuck))
    run_sim_command('opal0.fault-type %s' % (type))
    run_sim_command('opal0.fault-bit %s' % (bit))
    run_sim_command('opal0.fault-stuck-at %s' % (stuck))
    run_sim_command('opal0.faulty-reg-no %s' % (faultreg))
    run_sim_command('opal0.fault-inj-inst %s' % (injinst))
    run_sim_command('opal0.sim-step %s' % (SIM_STEPS))
    run_sim_command('opal0.stats')
    run_sim_command('opal0.fault-stats')
def run_golden(app,phase):
    print "Golden run of 10M instructions\n"
    run_sim_command('read-configuration ../sarek_phases/%s_%s' % (app, phase))
    # all instr fetches visible:
    run_sim_command("instruction-fetch-mode instruction-fetch-trace")
    # disable simulation translation caches (STCs) of SIMICS:
    run_sim_command("istc-disable")
    run_sim_command("dstc-disable")
    run_sim_command('load-module ruby')
    run_sim_command('load-module opal')
    run_sim_command('ruby0.setparam g_NUM_PROCESSORS 1')
    run_sim_command('ruby0.init')
    run_sim_command('opal0.init')
    run_sim_command('ruby0.load-caches ../sarek_phases/%s_%s.caches.gz' % (app, phase))
    run_sim_command('opal0.sim-start "/dev/null"')
    run_sim_command('opal0.fault-log "%s/%s_%s.golden.fault_log"' % (LOG_FILE_DIR, app, phase))
    run_sim_command('opal0.sim-step %s' % (SIM_STEPS))
    run_sim_command('opal0.stats')
    run_sim_command('opal0.fault-stats')
def run_sim_command(cmd):
    print '### Executing "%s"' % cmd
    try:
        # run() returns None if no error occurred
        return run(cmd)
    except:
        run("quit 666")
        return
# NOTES:
# a module is a file containing python definitions and statements - the file name is module name.py
# within this file the name is available as the value of the global variable __name__
# use import to import the defs
# the imported module names are placed in the importing module's global symbol table
# to import all names that a module defines: from module import * (all names except those beginning with an
# underscore)
# when a module m is imported, the interpreter searches for a file name called m.py in the current directory
# then in the dirs specified by PYTHONPATH, then the default path for python
# modules are searched in the list of directories given by the variable sys.path
# the script should not have the same name as the std module - it is on the search path
# if a file m.pyc exists for a file m.py, this is assumed to contain a byte compiled version of the module m
# the modification time for m.py when m.pyc was created is recorded, should a mismatch found,
# the .pyc file gets ignored
# whenever the .py file is successfully compiled, a .pyc file is tried to be automatically generated
# the speed-up in using .pyc files stems from faster loading, the program itself does not run faster
# std modules -> sys
# ->os ...
| 45.595041 | 119 | 0.741526 | 884 | 5,517 | 4.40724 | 0.239819 | 0.097023 | 0.210216 | 0.078542 | 0.61499 | 0.575462 | 0.559548 | 0.540811 | 0.536191 | 0.513347 | 0 | 0.021228 | 0.129056 | 5,517 | 120 | 120 | 45.975 | 0.789594 | 0.248142 | 0 | 0.541176 | 0 | 0.011765 | 0.44571 | 0.105187 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.011765 | 0.023529 | null | null | 0.082353 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
703baed81d0b01f1a355b2fc03593196fa0da9b5 | 748 | py | Python | Dataset/Leetcode/train/4/156.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/4/156.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | Dataset/Leetcode/train/4/156.py | kkcookies99/UAST | fff81885aa07901786141a71e5600a08d7cb4868 | [
"MIT"
] | null | null | null | def XXX(self, nums1: List[int], nums2: List[int]) -> float:
def findKthElement(arr1,arr2,k):
len1,len2 = len(arr1),len(arr2)
if len1 > len2:
return findKthElement(arr2,arr1,k)
if not arr1:
return arr2[k-1]
if k == 1:
return min(arr1[0],arr2[0])
i,j = min(k//2,len1)-1,min(k//2,len2)-1
if arr1[i] > arr2[j]:
return findKthElement(arr1,arr2[j+1:],k-j-1)
else:
return findKthElement(arr1[i+1:],arr2,k-i-1)
l1,l2 = len(nums1),len(nums2)
left,right = (l1+l2+1)//2,(l1+l2+2)//2
return (findKthElement(nums1,nums2,left)+findKthElement(nums1,nums2,right))/2
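For reference, the same kth-smallest recursion can be lifted out of the class into a standalone function (the name `find_median_sorted` is mine, not from the source) and checked against the classic median-of-two-sorted-arrays cases:

```python
from typing import List

def find_median_sorted(nums1: List[int], nums2: List[int]) -> float:
    # Same recursion as above: shrink the shorter array by roughly k//2
    # elements per step until one array is empty or k reaches 1.
    def find_kth(arr1, arr2, k):
        if len(arr1) > len(arr2):
            return find_kth(arr2, arr1, k)
        if not arr1:
            return arr2[k - 1]
        if k == 1:
            return min(arr1[0], arr2[0])
        i = min(k // 2, len(arr1)) - 1
        j = min(k // 2, len(arr2)) - 1
        if arr1[i] > arr2[j]:
            return find_kth(arr1, arr2[j + 1:], k - j - 1)
        return find_kth(arr1[i + 1:], arr2, k - i - 1)

    total = len(nums1) + len(nums2)
    left = find_kth(nums1, nums2, (total + 1) // 2)
    right = find_kth(nums1, nums2, (total + 2) // 2)
    return (left + right) / 2

print(find_median_sorted([1, 3], [2]))     # 2.0
print(find_median_sorted([1, 2], [3, 4]))  # 2.5
```

Averaging the `(n+1)//2`-th and `(n+2)//2`-th smallest elements handles odd and even totals with one code path.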
70536e78518c88f594195b781091ae2f269b7be9 | 403 | py | Python | NCPWD/apps/comments/models.py | fossabot/NCPWD | e46776ecf0cb4f263ff4e7883cb2402a3fc58717 | [
"Apache-2.0"
] | 1 | 2019-08-22T23:36:07.000Z | 2019-08-22T23:36:07.000Z | NCPWD/apps/comments/models.py | C3real-kill3r/H-digest | afce0a6e8b0fa4d2684550fe9f484ab9c6e76560 | [
"Apache-2.0"
] | 10 | 2019-12-12T13:59:43.000Z | 2021-09-22T18:21:44.000Z | NCPWD/apps/comments/models.py | C3real-kill3r/H-digest | afce0a6e8b0fa4d2684550fe9f484ab9c6e76560 | [
"Apache-2.0"
] | 2 | 2019-12-12T13:56:33.000Z | 2019-12-26T11:47:05.000Z | from django.db import models
from NCPWD.apps.topics.models import Topic
from datetime import datetime
from NCPWD.apps.authentication.models import User
class Comments(models.Model):
author = models.ForeignKey(User, on_delete=models.CASCADE)
topic = models.ForeignKey(Topic, on_delete=models.CASCADE)
body = models.TextField()
created_at = models.DateTimeField(default=datetime.now)
| 26.866667 | 62 | 0.781638 | 53 | 403 | 5.886792 | 0.509434 | 0.057692 | 0.083333 | 0.134615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131514 | 403 | 14 | 63 | 28.785714 | 0.891429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.444444 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
705e709d3fd6b8dc521563ca6dc53eb834daa951 | 394 | py | Python | slack-bot/helpers/message.py | Rolv-Apneseth/slack-bot | 3a11bf7e97d0e2de12a5f7a1bafd7be83987ab43 | [
"MIT"
] | null | null | null | slack-bot/helpers/message.py | Rolv-Apneseth/slack-bot | 3a11bf7e97d0e2de12a5f7a1bafd7be83987ab43 | [
"MIT"
] | null | null | null | slack-bot/helpers/message.py | Rolv-Apneseth/slack-bot | 3a11bf7e97d0e2de12a5f7a1bafd7be83987ab43 | [
"MIT"
] | null | null | null | class Message:
def __init__(self, to_channel):
self.to_channel = to_channel
self.timestamp = ""
self.text = ""
self.blocks = []
self.is_completed = False
def get_message(self):
return {
"ts": self.timestamp,
"channel": self.to_channel,
"text": self.text,
"blocks": self.blocks,
}
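As a quick illustration, the container above can be exercised like this (the channel ID and text are invented for the sketch; the class body is repeated so the snippet runs standalone):

```python
class Message:
    # Minimal copy of the container above, so this sketch is self-contained.
    def __init__(self, to_channel):
        self.to_channel = to_channel
        self.timestamp = ""
        self.text = ""
        self.blocks = []
        self.is_completed = False

    def get_message(self):
        return {
            "ts": self.timestamp,
            "channel": self.to_channel,
            "text": self.text,
            "blocks": self.blocks,
        }

msg = Message(to_channel="C0123456")  # hypothetical channel ID
msg.text = "deploy finished"
print(msg.get_message()["channel"])   # C0123456
```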
7073d5cd42513c5f5dc5a0e41fc955d2db1431b8 | 350 | py | Python | gans/models/gans/gan.py | tlatkowski/gans-2.0 | 974efc5bbcea39c0a7dec9405ba4514ada6dc39c | [
"MIT"
] | 78 | 2019-09-25T15:09:18.000Z | 2022-02-09T09:56:15.000Z | gans/models/gans/gan.py | tlatkowski/gans-2.0 | 974efc5bbcea39c0a7dec9405ba4514ada6dc39c | [
"MIT"
] | 23 | 2019-10-09T21:24:39.000Z | 2022-03-12T00:00:53.000Z | gans/models/gans/gan.py | tlatkowski/gans-2.0 | 974efc5bbcea39c0a7dec9405ba4514ada6dc39c | [
"MIT"
] | 18 | 2020-01-24T13:13:57.000Z | 2022-02-15T18:58:12.000Z | from abc import ABC
from abc import abstractmethod
class GAN(ABC):
@property
@abstractmethod
def generators(self):
raise NotImplementedError
@property
@abstractmethod
def discriminators(self):
raise NotImplementedError
@abstractmethod
def predict(self, inputs):
raise NotImplementedError
707535ec877d2302c09feb8853ced10221a8464c | 336 | py | Python | appserver/neo4japp/schemas/formats/enrichment_tables.py | SBRG/lifelike | a7b715f38b389a585c10e6d0d067345937455c13 | [
"MIT"
] | 8 | 2022-01-28T08:43:07.000Z | 2022-03-23T11:18:10.000Z | appserver/neo4japp/schemas/formats/enrichment_tables.py | SBRG/lifelike | a7b715f38b389a585c10e6d0d067345937455c13 | [
"MIT"
] | 23 | 2022-02-14T15:25:00.000Z | 2022-03-28T15:30:45.000Z | appserver/neo4japp/schemas/formats/enrichment_tables.py | SBRG/lifelike | a7b715f38b389a585c10e6d0d067345937455c13 | [
"MIT"
] | 5 | 2022-01-28T15:45:44.000Z | 2022-03-14T11:36:49.000Z | import importlib.resources as resources
import json
import fastjsonschema
from .. import formats
# noinspection PyTypeChecker
with resources.open_text(formats, 'enrichment_tables_v5.json') as f:
# Use this method to validate the content of an enrichment table
validate_enrichment_table = fastjsonschema.compile(json.load(f))
| 28 | 68 | 0.803571 | 44 | 336 | 6.022727 | 0.659091 | 0.113208 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003448 | 0.136905 | 336 | 11 | 69 | 30.545455 | 0.910345 | 0.264881 | 0 | 0 | 0 | 0 | 0.102459 | 0.102459 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
70866eb2b8314cc3ade86cf87c1c999fa22a33a7 | 3,415 | py | Python | pglet/barchart.py | InesaFitsner/pglet-python | 487575736023421c42de01839c850ebe36244eb7 | [
"MIT"
] | null | null | null | pglet/barchart.py | InesaFitsner/pglet-python | 487575736023421c42de01839c850ebe36244eb7 | [
"MIT"
] | null | null | null | pglet/barchart.py | InesaFitsner/pglet-python | 487575736023421c42de01839c850ebe36244eb7 | [
"MIT"
] | null | null | null | from .utils import encode_attr
from .control import Control
# P
class P(Control):
def __init__(self, id=None, x=None, y=None, legend=None, color=None,
x_tooltip=None, y_tooltip=None):
Control.__init__(self, id=id)
self.x = x
self.y = y
self.legend = legend
self.color = color
self.x_tooltip = x_tooltip
self.y_tooltip = y_tooltip
def _getControlName(self):
return "p"
# x
@property
def x(self):
return self._get_attr("x")
@x.setter
def x(self, value):
assert value == None or isinstance(value, float) or isinstance(value, int), "x must be a float"
self._set_attr("x", value)
# y
@property
def y(self):
return self._get_attr("y")
@y.setter
def y(self, value):
assert value == None or isinstance(value, float) or isinstance(value, int), "y must be a float"
self._set_attr("y", value)
# legend
@property
def legend(self):
return self._get_attr("legend")
@legend.setter
def legend(self, value):
self._set_attr("legend", value)
# color
@property
def color(self):
return self._get_attr("color")
@color.setter
def color(self, value):
self._set_attr("color", value)
# x_tooltip
@property
def x_tooltip(self):
return self._get_attr("xTooltip")
@x_tooltip.setter
def x_tooltip(self, value):
self._set_attr("xTooltip", value)
# y_tooltip
@property
def y_tooltip(self):
return self._get_attr("yTooltip")
@y_tooltip.setter
def y_tooltip(self, value):
self._set_attr("yTooltip", value)
# Data
class Data(Control):
def __init__(self, id=None, points=[]):
Control.__init__(self, id=id)
self._points = []
if points and len(points) > 0:
for point in points:
self.add_point(point)
# points
@property
def points(self):
return self._points
def _getControlName(self):
return "data"
def add_point(self, point):
assert isinstance(point, P), ("data can hold points only")
self._points.append(point)
def _getChildren(self):
return self._points
class BarChart(Control):
    def __init__(self, id=None, tooltips=None, data_mode=None, data=None,
                 width=None, height=None, padding=None, margin=None,
                 visible=None, disabled=None):
        Control.__init__(self, id=id,
                         width=width, height=height, padding=padding,
                         margin=margin, visible=visible, disabled=disabled)
        self._data = Data(points=data)
        self.tooltips = tooltips
        self.data_mode = data_mode

    def _getControlName(self):
        return "barchart"

    # data
    @property
    def data(self):
        return self._data

    # tooltips
    @property
    def tooltips(self):
        return self._get_attr("tooltips")

    @tooltips.setter
    def tooltips(self, value):
        assert value is None or isinstance(value, bool), "tooltips must be a boolean"
        self._set_attr("tooltips", value)

    # data_mode
    @property
    def data_mode(self):
        return self._get_attr("dataMode")

    @data_mode.setter
    def data_mode(self, value):
        self._set_attr("dataMode", value)
def _getChildren(self):
return [self._data] | 24.049296 | 107 | 0.602928 | 431 | 3,415 | 4.563805 | 0.143852 | 0.076258 | 0.085409 | 0.069141 | 0.364006 | 0.240976 | 0.111337 | 0.087951 | 0.067107 | 0.067107 | 0 | 0.00041 | 0.284919 | 3,415 | 142 | 108 | 24.049296 | 0.805078 | 0.021669 | 0 | 0.193878 | 0 | 0 | 0.056473 | 0 | 0 | 0 | 0 | 0 | 0.040816 | 1 | 0.27551 | false | 0 | 0.020408 | 0.153061 | 0.479592 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 |
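The `Data` and `BarChart` constructors above originally used `points=[]` and `data=[]` defaults. A mutable default argument is evaluated once at function definition, so every instance shares the same list. A minimal, self-contained sketch of the pitfall and the `None` idiom (plain classes, no `Control` framework assumed):

```python
class Bad:
    def __init__(self, points=[]):        # one list shared by every Bad instance
        self.points = points


class Good:
    def __init__(self, points=None):      # fresh list per instance
        self.points = list(points) if points else []


a, b = Bad(), Bad()
a.points.append(1)
shared = b.points                          # b sees a's append: [1]

c, d = Good(), Good()
c.points.append(1)
isolated = d.points                        # unaffected: []
```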
709265e0318e6dcbbebf4773ff6a655c2001a37b | 2,801 | py | Python | Myna/models.py | sartho/GreenAnt | 9d46c19612ca0392d73b5f625d35e917076d93ca | [
"MIT"
] | null | null | null | Myna/models.py | sartho/GreenAnt | 9d46c19612ca0392d73b5f625d35e917076d93ca | [
"MIT"
] | null | null | null | Myna/models.py | sartho/GreenAnt | 9d46c19612ca0392d73b5f625d35e917076d93ca | [
"MIT"
] | null | null | null | from datetime import datetime
from Myna import db
from werkzeug.security import generate_password_hash, check_password_hash
from Myna import login
from flask_login import UserMixin
from .Hornbill import IMGresizer
from Myna.config import Config
import os
from Myna import photos

followers = db.Table('followers',
                     db.Column('follower_id', db.Integer, db.ForeignKey('user.id')),
                     db.Column('followed_id', db.Integer, db.ForeignKey('user.id'))
                     )


class User(UserMixin, db.Model):
    id = db.Column(db.Integer, primary_key=True, autoincrement=True)
    username = db.Column(db.String(64), index=True, unique=True)
    email = db.Column(db.String(120), index=True, unique=True)
    stakeholder = db.Column(db.String(64))
    password_hash = db.Column(db.String(128))
    last_seen = db.Column(db.DateTime, index=True, default=datetime.utcnow)
    posts = db.relationship('Post', backref='author', lazy='dynamic')
    avatar_img = db.Column(db.String(128))
    followed = db.relationship(
        'User', secondary=followers,
        primaryjoin=(followers.c.follower_id == id),
        secondaryjoin=(followers.c.followed_id == id),
        backref=db.backref('followers', lazy='dynamic'), lazy='dynamic')

    def __repr__(self):
        return '<User {}>'.format(self.username)

    def set_password(self, password):
        self.password_hash = generate_password_hash(password)

    def check_password(self, password):
        return check_password_hash(self.password_hash, password)

    def update_avatar_img(self, filepath):
        self.avatar_img = filepath

    def avatarIMG(self, size):
        filename = self.avatar_img.split('/')[-1]
        print(filename)
        imgloc = os.path.join(Config.UPLOADED_PHOTOS_DEST, filename)
        print(imgloc)
        img = IMGresizer(size, imgloc, '')
        return photos.url(img.GetIMG())

    def follow(self, user):
        if not self.is_following(user):
            self.followed.append(user)

    def unfollow(self, user):
        if self.is_following(user):
            self.followed.remove(user)

    def is_following(self, user):
        return self.followed.filter(
            followers.c.followed_id == user.id).count() > 0

    def followed_posts(self):
        return Post.query.join(
            followers, (followers.c.followed_id == Post.user_id)).filter(
            followers.c.follower_id == self.id).order_by(Post.timestamp.desc())


class Post(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    body = db.Column(db.String(140))
    timestamp = db.Column(db.DateTime, index=True, default=datetime.utcnow)
    user_id = db.Column(db.Integer, db.ForeignKey('user.id'))

    def __repr__(self):
        return '<Post {}>'.format(self.body)
@login.user_loader
def load_user(id):
return User.query.get(id) | 34.580247 | 83 | 0.681899 | 373 | 2,801 | 4.989276 | 0.276139 | 0.055884 | 0.059108 | 0.051585 | 0.2187 | 0.173563 | 0.125739 | 0.094573 | 0.094573 | 0.042988 | 0 | 0.007902 | 0.186719 | 2,801 | 81 | 84 | 34.580247 | 0.809043 | 0 | 0 | 0.030769 | 1 | 0 | 0.041042 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.169231 | false | 0.092308 | 0.138462 | 0.092308 | 0.646154 | 0.030769 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 2 |
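The `set_password`/`check_password` pair above stores only a salted hash, never the plaintext. A stdlib-only sketch of the same store-then-verify pattern (werkzeug's `generate_password_hash`/`check_password_hash` wrap this idea with their own string format; the function names below are illustrative, not werkzeug's):

```python
import hashlib
import hmac
import os


def hash_password(password: str) -> str:
    # random salt + PBKDF2 digest, stored together as "salt$digest"
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac('sha256', password.encode(), salt, 100_000)
    return salt.hex() + '$' + digest.hex()


def verify_password(stored: str, password: str) -> bool:
    salt_hex, digest_hex = stored.split('$')
    candidate = hashlib.pbkdf2_hmac('sha256', password.encode(),
                                    bytes.fromhex(salt_hex), 100_000)
    # constant-time comparison to avoid timing leaks
    return hmac.compare_digest(candidate.hex(), digest_hex)


stored = hash_password('s3cret')
ok = verify_password(stored, 's3cret')    # True
bad = verify_password(stored, 'wrong')    # False
```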
5608c76257468fd677f1d2f7f27cd68363cbbc58 | 676 | py | Python | pwb/trash_detection/views.py | Mo5mami/PWB | 809f3d67fc7be57fe772bcceea5acfb09a669fe5 | [
"MIT"
] | null | null | null | pwb/trash_detection/views.py | Mo5mami/PWB | 809f3d67fc7be57fe772bcceea5acfb09a669fe5 | [
"MIT"
] | null | null | null | pwb/trash_detection/views.py | Mo5mami/PWB | 809f3d67fc7be57fe772bcceea5acfb09a669fe5 | [
"MIT"
] | null | null | null | from django.shortcuts import render
from rest_framework.decorators import api_view
from rest_framework.views import APIView
from .trash_detection_service import TrashDetectionService
from asgiref.sync import sync_to_async
import asyncio

"""class TrashDetection(APIView):
    def post(self, request, format=None):
        print("am I async : ", asyncio.iscoroutinefunction(self.post))
        async_fn = sync_to_async(TrashDetectionService.inference, thread_sensitive=False)
        return async_fn(request)"""


@api_view(['POST'])
def trash_detection(request):
    if request.method == 'POST':
        return TrashDetectionService.inference(request)
| 29.391304 | 88 | 0.744083 | 79 | 676 | 6.189873 | 0.518987 | 0.03272 | 0.06953 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.171598 | 676 | 23 | 89 | 29.391304 | 0.873214 | 0 | 0 | 0 | 0 | 0 | 0.020202 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.1 | false | 0 | 0.6 | 0 | 0.8 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
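The commented-out `TrashDetection` class hints at running the blocking inference call off the event loop with `sync_to_async`. A stdlib-only sketch of the same idea using `asyncio.to_thread` (Python 3.9+; `blocking_inference` is a stand-in, not the real `TrashDetectionService`):

```python
import asyncio


def blocking_inference(payload):
    # stand-in for a slow, synchronous model call
    return {"detections": len(payload)}


async def handle(payload):
    # run the sync function in a worker thread so the event loop stays responsive
    return await asyncio.to_thread(blocking_inference, payload)


result = asyncio.run(handle([1, 2, 3]))   # {'detections': 3}
```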
56159fa43d791d5adfdd24e872f673f9f310ed1e | 1,873 | py | Python | cnn_segmentation/preprocessing/generate_label_mapping.py | spencerimp/mri_segmentation | 4b1195c4520b09f0759c11c890ebc6331e3ecb06 | [
"Apache-2.0"
] | 8 | 2017-03-21T13:04:53.000Z | 2019-07-24T08:28:57.000Z | cnn_segmentation/preprocessing/generate_label_mapping.py | spencerimp/mri_segmentation | 4b1195c4520b09f0759c11c890ebc6331e3ecb06 | [
"Apache-2.0"
] | 3 | 2017-03-29T05:13:03.000Z | 2018-01-17T09:25:43.000Z | cnn_segmentation/preprocessing/generate_label_mapping.py | spencerimp/mri_segmentation | 4b1195c4520b09f0759c11c890ebc6331e3ecb06 | [
"Apache-2.0"
] | null | null | null | import csv
import numpy as np

ignored_labels = (list(range(1, 4)) + list(range(5, 11)) + list(range(12, 23)) +
                  list(range(24, 30)) + [33, 34] + [42, 43] + [53, 54] +
                  list(range(63, 69)) + [70, 74] + list(range(80, 100)) +
                  [110, 111] + [126, 127] + [130, 131] + [158, 159] + [188, 189])
true_labels = [4, 11, 23, 30, 31, 32, 35, 36, 37, 38, 39, 40, 41, 44, 45, 46,
               47, 48, 49, 50, 51, 52, 55, 56, 57, 58, 59, 60, 61, 62, 69, 71,
               72, 73, 75, 76, 100, 101, 102, 103, 104, 105, 106, 107, 108,
               109, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122,
               123, 124, 125, 128, 129, 132, 133, 134, 135, 136, 137, 138,
               139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150,
               151, 152, 153, 154, 155, 156, 157, 160, 161, 162, 163, 164,
               165, 166, 167, 168, 169, 170, 171, 172, 173, 174, 175, 176,
               177, 178, 179, 180, 181, 182, 183, 184, 185, 186, 187, 190,
               191, 192, 193, 194, 195, 196, 197, 198, 199, 200, 201, 202,
               203, 204, 205, 206, 207]

OASIS_label_path = './MICCAI-Challenge-2012-Label-Information_v2.csv'
MICCAI_label_path = './MICCAI-Challenge-2012-Label-Information_v3.csv'

with open(OASIS_label_path) as f:
    csvReader = csv.reader(f, delimiter=',')
    labels, names = zip(*[row for row in csvReader])

labels = np.array(list(map(int, labels)))
names = np.array(names)
for ignored_label in ignored_labels:
    labels[np.where(labels == ignored_label)] = 0

idx = 1
miccai_names = []
for true_label in true_labels:
    miccai_name = names[np.where(labels == true_label)][0]
    labels[np.where(labels == true_label)] = idx
    miccai_names.append(miccai_name)
    idx += 1

# idx = 135
miccai_names = np.array(miccai_names)[:idx - 1]
miccai_labels = range(1, 1 + len(true_labels))

# output to file
with open(MICCAI_label_path, 'w', newline='') as f:
    csvWriter = csv.writer(f, delimiter=',')
    csvWriter.writerows(zip(miccai_labels, miccai_names))
| 41.622222 | 115 | 0.619861 | 317 | 1,873 | 3.570978 | 0.640379 | 0.048587 | 0.034452 | 0.042403 | 0.116608 | 0.077739 | 0.077739 | 0 | 0 | 0 | 0 | 0.305256 | 0.207688 | 1,873 | 44 | 116 | 42.568182 | 0.457547 | 0.012814 | 0 | 0 | 0 | 0 | 0.054171 | 0.052004 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.0625 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
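The loop above compresses a sparse label set into consecutive indices: ignored labels map to 0, kept labels to 1..N in order. A minimal stdlib sketch of the same remapping on a toy label set (the toy values are illustrative, not the MICCAI labels):

```python
ignored = {3, 7}
kept = [4, 11, 23]                            # sparse labels to keep, in order

# build label -> new-index mapping: ignored -> 0, kept -> 1..N
mapping = {label: 0 for label in ignored}
mapping.update({label: i for i, label in enumerate(kept, start=1)})

raw = [4, 3, 23, 11, 7, 4]
remapped = [mapping.get(v, 0) for v in raw]   # [1, 0, 3, 2, 0, 1]
```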
561b2e390a2e869f0d12756c4ec440d42b06e869 | 187 | py | Python | package/service.py | kived/p4a-boot-service | 260bd455d7f6c698593f43b4eedaa14ce701f170 | [
"MIT"
] | null | null | null | package/service.py | kived/p4a-boot-service | 260bd455d7f6c698593f43b4eedaa14ce701f170 | [
"MIT"
] | null | null | null | package/service.py | kived/p4a-boot-service | 260bd455d7f6c698593f43b4eedaa14ce701f170 | [
"MIT"
] | null | null | null | import time
print('service.py loaded')


def service():
    while True:
        print('time is:', str(time.time()))
        time.sleep(1)


if __name__ == '__main__':
    print('starting service')
    service()
| 13.357143 | 36 | 0.673797 | 26 | 187 | 4.538462 | 0.653846 | 0.135593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006494 | 0.176471 | 187 | 13 | 37 | 14.384615 | 0.75974 | 0 | 0 | 0 | 0 | 0 | 0.263441 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.111111 | null | null | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
561b4b8ba8fb3565d60156cb9bb54b066989730d | 19,202 | py | Python | app/endpoints/groups/views.py | kant/test-api | 2b2ab5b722dbf18cd99906b27fda356d02ae7a52 | [
"MIT"
] | null | null | null | app/endpoints/groups/views.py | kant/test-api | 2b2ab5b722dbf18cd99906b27fda356d02ae7a52 | [
"MIT"
] | null | null | null | app/endpoints/groups/views.py | kant/test-api | 2b2ab5b722dbf18cd99906b27fda356d02ae7a52 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
"""
Group Routes
List Groups
Count Groups
State of Group
Create Group
Get Group
Add User to Group
Remove User to Group
"""
import asyncio
import uuid
from datetime import datetime
from fastapi import APIRouter, Query, status
from fastapi.responses import JSONResponse, ORJSONResponse
from loguru import logger
from com_lib.crud_ops import execute_one_db, fetch_all_db, fetch_one_db
from com_lib.db_setup import database, groups, groups_item
from endpoints.groups.models import (
GroupCreate,
GroupItemDelete,
GroupTypeEnum,
GroupUser,
)
from endpoints.groups.validation import (
check_id_exists,
check_unique_name,
check_user_exists,
check_user_id_exists,
)
router = APIRouter()
title = "Delay in Seconds"
@router.get("/list", tags=["groups"])
async def group_list(
delay: int = Query(
None,
title=title,
description="Seconds to delay (max 121)",
ge=1,
le=121,
alias="delay",
),
qty: int = Query(
None,
title="Quanity",
description="Records to return (max 500)",
ge=1,
le=500,
alias="qty",
),
offset: int = Query(
None, title="Offset", description="Offset increment", ge=0, alias="offset"
),
is_active: bool = Query(None, title="by active status", alias="active"),
group_type: GroupTypeEnum = Query(
None, title="groupType", description="Type of group", alias="groupType"
),
group_name: str = Query(
None,
title="Group Name",
description="Get by the Group Name",
alias="groupName",
),
) -> dict:
"""[summary]
Get list of all groups and associated information
Args:
delay (int, optional): [description]. Defaults to Query( None, title=title,
description="Seconds to delay (max 121)", ge=1, le=121, alias="delay", ).
qty (int, optional): [description]. Defaults to Query( None, title="Quanity",
description="Records to return (max 500)", ge=1, le=500, alias="qty", ).
offset (int, optional): [description]. Defaults to Query( None, title="Offset",
description="Offset increment", ge=0, alias="offset" ).
is_active (bool, optional): [description]. Defaults to Query(None,
title="by active status", alias="active").
group_type (GroupTypeEnum, optional): [description]. Defaults to Query( None,
title="groupType", description="Type of group", alias="groupType" ).
Returns:
dict: [description]
GroupId, Name, Description, active state, dates created & updated
"""
criteria = []
# sleep if delay option is used
if delay is not None:
await asyncio.sleep(delay)
if qty is None:
qty: int = 100
if offset is None:
offset: int = 0
if is_active is not None:
criteria.append((groups.c.is_active, is_active, "equal"))
if group_type is not None:
criteria.append((groups.c.group_type, group_type, "equal"))
if group_name is not None:
criteria.append((groups.c.name, group_name, "ilike"))
query = groups.select().order_by(groups.c.date_create).limit(qty).offset(offset)
count_query = groups.select()
for crit in criteria:
col, val, compare_type = crit
if compare_type == "ilike":
query = query.where(col.ilike(f"%{val}%"))
else:
query = query.where(col == val)
count_query = count_query.where(col == val)
db_result = await database.fetch_all(query)
total_count = await database.fetch_all(count_query)
result = {
"parameters": {
"returned_results": len(db_result),
"qty": qty,
"total_count": len(total_count),
"offset": offset,
"filter": is_active,
"delay": delay,
},
"groups": db_result,
}
return result
@router.get("/list/count", tags=["groups"])
async def group_list_count(
delay: int = Query(
None,
title=title,
description="Seconds to delay (max 121)",
ge=1,
le=121,
alias="delay",
),
is_active: bool = Query(None, title="by active status", alias="active"),
group_type: GroupTypeEnum = Query(
None, title="groupType", description="Type of group", alias="groupType"
),
) -> dict:
"""[summary]
Get a count of groups
Args:
delay (int, optional): [description]. Defaults to Query( None,
title=title, description="Seconds to delay (max 121)", ge=1, le=121, alias="delay", ).
is_active (bool, optional): [description]. Defaults to Query(None,
title="by active status", alias="active").
group_type (GroupTypeEnum, optional): [description]. Defaults to Query( None,
title="groupType", description="Type of group", alias="groupType" ).
Returns:
dict: [description]
count based on filters
"""
criteria = []
# sleep if delay option is used
if delay is not None:
await asyncio.sleep(delay)
if is_active is not None:
criteria.append((groups.c.is_active, is_active))
if group_type is not None:
criteria.append((groups.c.group_type, group_type))
query = groups.select().order_by(groups.c.date_create)
count_query = groups.select()
for crit in criteria:
col, val = crit
query = query.where(col == val)
count_query = count_query.where(col == val)
total_count = await database.fetch_all(count_query)
result = {
"parameters": {
"total_count": len(total_count),
"filter": is_active,
"delay": delay,
},
}
return result
@router.put(
    "/state",
    tags=["groups"],
    response_description="ID Modified",
    response_class=ORJSONResponse,
    status_code=201,
    responses={
        # 302: {"description": "Incorrect URL, redirecting"},
        400: {"description": "Bad Request"},
        422: {"description": "Validation Error"},
        404: {"description": "Not Found"},
        405: {"description": "Method not allowed"},
        500: {"description": "All lines are busy, try again later."},
    },
)
async def group_state(
    *,
    id: str = Query(..., title="group id", description="Group UUID", alias="id"),
    is_active: bool = Query(
        None,
        title="active status",
        description="true or false of status",
        alias="isActive",
    ),
    delay: int = Query(
        None,
        title=title,
        ge=1,
        le=10,
        alias="delay",
        description="integer delay value for simulating delays",
    ),
) -> dict:
    """[summary]
    Activate or deactivate a Group ID

    Args:
        id (str, optional): [description]. Defaults to
        Query(..., title="group id", description="Group UUID", alias="id",).
        is_active (bool, optional): [description]. Defaults to
        Query( None, title="active status", description="true or false of status", alias="isActive", ).
        delay (int, optional): [description]. Defaults to
        Query( None, title=title, ge=1, le=10, alias="delay",
        description="integer delay value for simulating delays", ).

    Returns:
        dict: [id, state]
    """
    # sleep if delay option is used
    if delay is not None:
        logger.info(f"adding a delay of {delay} seconds")
        await asyncio.sleep(delay)

    if is_active is None:
        error: dict = {"error": "isActive must be true or false and cannot be empty"}
        logger.warning(error)
        return JSONResponse(status_code=422, content=error)

    id_exists = await check_id_exists(id)
    if id_exists is False:
        error: dict = {"error": f"Group ID: '{id}' not found"}
        logger.warning(error)
        return JSONResponse(status_code=404, content=error)

    try:
        group_data = {
            "is_active": is_active,
            "date_update": datetime.now(),
        }
        logger.debug(group_data)
        # update group state
        query = groups.update().where(groups.c.id == id)
        group_result = await execute_one_db(query=query, values=group_data)
        logger.debug(str(group_result))
        # if "error" in group_result:
        #     error: dict = group_result
        #     logger.critical(error)
        #     return JSONResponse(status_code=400, content=error)
        # data result
        full_result: dict = {"id": id, "status": is_active}
        logger.debug(full_result)
        return JSONResponse(status_code=status.HTTP_201_CREATED, content=full_result)
    except Exception as e:
        error: dict = {"error": str(e)}
        logger.debug(e)
        logger.critical(error)
        return JSONResponse(status_code=400, content=error)
@router.post(
    "/create",
    tags=["groups"],
    response_description="The created item",
    response_class=ORJSONResponse,
    status_code=201,
    responses={
        # 302: {"description": "Incorrect URL, redirecting"},
        400: {"description": "Bad Request"},
        422: {"description": "Validation Error"},
        # 404: {"description": "Operation forbidden"},
        # 405: {"description": "Method not allowed"},
        500: {"description": "All lines are busy, try again later."},
    },
)
async def create_group(
    *,
    group: GroupCreate,
    delay: int = Query(None, title=title, ge=1, le=10, alias="delay"),
) -> dict:
    """[summary]
    Create a new group

    Args:
        group (GroupCreate): [description]
        delay (int, optional): [description]. Defaults to Query(None,
        title=title, ge=1, le=10, alias="delay",).

    Returns:
        dict: [description]
        Group data
    """
    # sleep if delay option is used
    if delay is not None:
        logger.info(f"adding a delay of {delay} seconds")
        await asyncio.sleep(delay)

    # approval or notification
    group_type_check: list = ["approval", "notification"]
    if group.group_type not in group_type_check:
        error: dict = {
            "error": f"Group Type '{group.group_type}' "
            "is not 'approval' or 'notification'"
        }
        logger.warning(error)
        return JSONResponse(status_code=400, content=error)

    check_name = str(group.name)
    duplicate = await check_unique_name(check_name)
    try:
        if duplicate is False:
            error: dict = {"error": f"Group Name '{group.name}' is a duplicate"}
            logger.warning(error)
            return JSONResponse(status_code=400, content=error)

        group_id = uuid.uuid4()
        group_data = {
            "id": str(group_id),
            "name": group.name,
            "is_active": group.is_active,
            "description": group.description,
            "group_type": group.group_type,
            "date_create": datetime.now(),
            "date_update": datetime.now(),
        }
        logger.debug(group_data)
        # create group
        query = groups.insert()
        group_result = await execute_one_db(query=query, values=group_data)
        # if "error" in group_result:
        #     error: dict = group_result
        #     logger.critical(error)
        #     return JSONResponse(status_code=400, content=error)
        # data result
        full_result: dict = {"id": str(group_id), "data": group_result}
        logger.debug(full_result)
        return JSONResponse(status_code=status.HTTP_201_CREATED, content=full_result)
    except Exception as e:
        error: dict = {"error": str(e)}
        logger.critical(error)
        return JSONResponse(status_code=400, content=error)
@router.get("/group", tags=["groups"])
async def group_id(
group_id: str = Query(
None, title="Group ID", description="Get by the Group UUID", alias="groupId",
),
group_name: str = Query(
None,
title="Group Name",
description="Get by the Group Name",
alias="groupName",
),
delay: int = Query(
None,
title=title,
description="Seconds to delay (max 121)",
ge=1,
le=121,
alias="delay",
),
) -> dict:
"""[summary]
Get individual group data, including users
Args:
group_id (str, optional): [description]. Defaults to Query( None,
title="Group ID", description="Get by the Group UUID", alias="groupId", ).
group_name (str, optional): [description]. Defaults to Query( None,
title="Group Name", description="Get by the Group Name", alias="groupName", ).
delay (int, optional): [description]. Defaults to Query( None,
title=title, description="Seconds to delay (max 121)", ge=1, le=121, alias="delay", ).
Returns:
dict: [description]
Group data and associated users
"""
# sleep if delay option is used
if delay is not None:
await asyncio.sleep(delay)
# if search by ID
if group_id is not None:
id_exists = await check_id_exists(group_id)
if id_exists is False:
error: dict = {"error": f"Group ID: '{group_id}' not found"}
logger.warning(error)
return JSONResponse(status_code=404, content=error)
# elif search by name
elif group_name is not None:
name_exists = await check_unique_name(group_name)
if name_exists is True:
error: dict = {"error": f"Group Name: '{group_name}' not found"}
logger.warning(error)
return JSONResponse(status_code=404, content=error)
query = groups.select().where(groups.c.name == group_name)
name_result = await fetch_one_db(query=query)
group_id = name_result["id"]
# else at least one needs to be selected
else:
error: dict = {"error": "groupId or groupName must be used"}
logger.warning(error)
return JSONResponse(status_code=404, content=error)
query = groups_item.select().where(groups_item.c.group_id == group_id)
db_result = await fetch_all_db(query=query)
users_list: list = []
user_dict: dict = []
for r in db_result:
logger.debug(r)
user_data: dict = {
"id": r["id"],
"user": r["user"],
"date_created": str(r["date_create"]),
}
user_dict.append(user_data)
users_list.append(r["user"])
result = {
"group_id": group_id,
"count": len(users_list),
"users": users_list,
"user_info": user_dict,
}
return JSONResponse(status_code=200, content=result)
@router.post(
    "/user/create",
    tags=["groups"],
    response_description="The created item",
    response_class=ORJSONResponse,
    status_code=201,
    responses={
        # 302: {"description": "Incorrect URL, redirecting"},
        400: {"description": "Bad Request"},
        422: {"description": "Validation Error"},
        # 404: {"description": "Operation forbidden"},
        # 405: {"description": "Method not allowed"},
        500: {"description": "All lines are busy, try again later."},
    },
)
async def create_group_user(
    *,
    group: GroupUser,
    delay: int = Query(
        None,
        title=title,
        description="Seconds to delay (max 121)",
        ge=1,
        le=121,
        alias="delay",
    ),
) -> dict:
    """[summary]
    Add a user to a group

    Args:
        group (GroupUser): [description]
        delay (int, optional): [description]. Defaults to Query( None,
        title=title, description="Seconds to delay (max 121)", ge=1, le=121, alias="delay", ).

    Returns:
        dict: [description]
        Confirmation of user being added
    """
    # sleep if delay option is used
    if delay is not None:
        logger.info(f"adding a delay of {delay} seconds")
        await asyncio.sleep(delay)

    check_id = str(group.group_id)
    group_id_exists = await check_id_exists(id=check_id)
    if group_id_exists is False:
        error: dict = {"error": f"Group ID '{check_id}' does not exist"}
        logger.warning(error)
        return JSONResponse(status_code=404, content=error)

    check_user = str(group.user)
    exist_user = await check_user_exists(user=check_user, group_id=check_id)
    if exist_user is True:
        error: dict = {"error": f"User '{check_user}' already in group"}
        logger.warning(error)
        return JSONResponse(status_code=400, content=error)

    try:
        user_id = str(uuid.uuid4())
        group_data = {"id": user_id, "user": group.user, "group_id": group.group_id}
        logger.debug(group_data)
        # add user to group
        query = groups_item.insert()
        group_result = await execute_one_db(query=query, values=group_data)
        logger.debug(str(group_result))
        # if "error" in group_result:
        #     error: dict = group_result
        #     logger.critical(error)
        #     return JSONResponse(status_code=400, content=error)
        # data result
        full_result: dict = group_data
        # full_result: dict = {"id": str(user_id), "data": group_result}
        logger.debug(full_result)
        return JSONResponse(status_code=status.HTTP_201_CREATED, content=full_result)
    except Exception as e:
        error: dict = {"error": str(e)}
        logger.debug(e)
        logger.critical(f"Critical Error: {e}")
        return JSONResponse(status_code=400, content=error)
@router.delete(
    "/user/delete",
    tags=["groups"],
    response_description="The deleted item",
    responses={
        302: {"description": "Incorrect URL, redirecting"},
        404: {"description": "Not Found"},
        # 405: {"description": "Method not allowed"},
        500: {"description": "Mommy!"},
    },
)
async def delete_group_item_user_id(
    *,
    user: GroupItemDelete,
    delay: int = Query(
        None,
        title=title,
        description="Seconds to delay (max 121)",
        ge=1,
        le=121,
        alias="delay",
    ),
) -> dict:
    """[summary]
    Remove User from Group

    Args:
        user (GroupItemDelete): [description]
        delay (int, optional): [description]. Defaults to Query( None,
        title=title, description="Seconds to delay (max 121)", ge=1, le=121, alias="delay", ).

    Returns:
        dict: [description]
        Confirmation of removal
    """
    # sleep if delay option is used
    if delay is not None:
        logger.info(f"adding a delay of {delay} seconds")
        await asyncio.sleep(delay)

    check_id = str(user.id)
    group_id_exists = await check_user_id_exists(id=check_id)
    if group_id_exists is False:
        error: dict = {"error": f"Group item ID '{check_id}' does not exist"}
        logger.warning(error)
        return JSONResponse(status_code=404, content=error)

    try:
        # delete id
        logger.debug(str(user.id))
        query = groups_item.delete().where(groups_item.c.id == user.id)
        await execute_one_db(query)
        result = {"status": f"{user.id} deleted"}
        return JSONResponse(status_code=200, content=result)
    except Exception as e:
        error: dict = {"error": f"{e}"}
        logger.error(error)
        return JSONResponse(status_code=500, content=error)
| 31.478689 | 98 | 0.604729 | 2,317 | 19,202 | 4.892965 | 0.095814 | 0.025404 | 0.039517 | 0.054335 | 0.744641 | 0.706007 | 0.683602 | 0.658463 | 0.635089 | 0.614713 | 0 | 0.017568 | 0.270753 | 19,202 | 609 | 99 | 31.530378 | 0.792045 | 0.072857 | 0 | 0.56391 | 0 | 0 | 0.148151 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.025063 | 0 | 0.077694 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
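`group_list` builds its query by collecting the optional filters that were actually supplied into a `criteria` list and applying them in a loop. A minimal, dependency-free sketch of that pattern over an in-memory list (the rows and predicates are stand-ins, not the real SQLAlchemy query objects):

```python
rows = [
    {"name": "ops", "is_active": True, "group_type": "approval"},
    {"name": "dev-team", "is_active": False, "group_type": "approval"},
    {"name": "devs", "is_active": True, "group_type": "notification"},
]


def list_groups(is_active=None, group_type=None, name_contains=None):
    criteria = []                      # only filters that were supplied
    if is_active is not None:
        criteria.append(lambda r: r["is_active"] == is_active)
    if group_type is not None:
        criteria.append(lambda r: r["group_type"] == group_type)
    if name_contains is not None:
        # case-insensitive substring match, like the ilike("%...%") branch
        criteria.append(lambda r: name_contains.lower() in r["name"].lower())
    return [r for r in rows if all(check(r) for check in criteria)]


active_approvals = list_groups(is_active=True, group_type="approval")  # just "ops"
dev_groups = list_groups(name_contains="DEV")                          # "dev-team", "devs"
```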
561e7de26114b43ad769d1dcf6d1ac6c46660f56 | 736 | py | Python | tests/integration_tests/points_tests/test_copy.py | skrat/martinez | 86db48324cb50ecb52be8ab2e4278a6d5cdd562b | [
"MIT"
] | 7 | 2020-05-07T08:13:44.000Z | 2021-12-17T07:33:51.000Z | tests/integration_tests/points_tests/test_copy.py | skrat/martinez | 86db48324cb50ecb52be8ab2e4278a6d5cdd562b | [
"MIT"
] | 17 | 2019-11-29T23:17:26.000Z | 2020-12-20T15:47:17.000Z | tests/integration_tests/points_tests/test_copy.py | skrat/martinez | 86db48324cb50ecb52be8ab2e4278a6d5cdd562b | [
"MIT"
] | 1 | 2020-12-17T22:44:21.000Z | 2020-12-17T22:44:21.000Z | import copy
from typing import Tuple
from hypothesis import given
from tests.bind_tests.hints import BoundPoint
from tests.integration_tests.utils import are_bound_ported_points_equal
from tests.port_tests.hints import PortedPoint
from . import strategies
@given(strategies.points_pairs)
def test_shallow(points_pair: Tuple[BoundPoint, PortedPoint]) -> None:
bound, ported = points_pair
assert are_bound_ported_points_equal(copy.copy(bound), copy.copy(ported))
@given(strategies.points_pairs)
def test_deep(points_pair: Tuple[BoundPoint, PortedPoint]) -> None:
bound, ported = points_pair
assert are_bound_ported_points_equal(copy.deepcopy(bound),
copy.deepcopy(ported))
| 29.44 | 77 | 0.763587 | 96 | 736 | 5.614583 | 0.3125 | 0.102041 | 0.157699 | 0.111317 | 0.525046 | 0.478664 | 0.356215 | 0.356215 | 0.356215 | 0.356215 | 0 | 0 | 0.161685 | 736 | 24 | 78 | 30.666667 | 0.873582 | 0 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 1 | 0.125 | false | 0 | 0.4375 | 0 | 0.5625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
5631fc2feedfff7b8e6286cf1cd68434375bb7c3 | 189 | py | Python | wavefront_reader/__init__.py | SimLeek/wavefront_reader | 4504f5b6185a03fcdd1722dbea660f7af35b8b8c | [
"MIT"
] | null | null | null | wavefront_reader/__init__.py | SimLeek/wavefront_reader | 4504f5b6185a03fcdd1722dbea660f7af35b8b8c | [
"MIT"
] | null | null | null | wavefront_reader/__init__.py | SimLeek/wavefront_reader | 4504f5b6185a03fcdd1722dbea660f7af35b8b8c | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
__author__ = """Nicholas A. Del Grosso"""
__email__ = 'delgrosso@bio.lmu.de'
__version__ = '0.1.0'
from .reading import read_objfile, read_mtlfile, read_wavefront
| 23.625 | 63 | 0.703704 | 26 | 189 | 4.538462 | 0.884615 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.02439 | 0.132275 | 189 | 7 | 64 | 27 | 0.695122 | 0.111111 | 0 | 0 | 0 | 0 | 0.283133 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
56415b11dc039381b0e6a1eb50011a38d11ae0ae | 100 | py | Python | bot/cogs/__init__.py | IPLSplatoon/CheckpointBot | 8d4ece8265270aa5d66ebdb430f51c46f720c943 | [
"MIT"
] | 1 | 2021-05-03T19:53:10.000Z | 2021-05-03T19:53:10.000Z | bot/cogs/__init__.py | IPLSplatoon/CheckpointBot | 8d4ece8265270aa5d66ebdb430f51c46f720c943 | [
"MIT"
] | null | null | null | bot/cogs/__init__.py | IPLSplatoon/CheckpointBot | 8d4ece8265270aa5d66ebdb430f51c46f720c943 | [
"MIT"
] | 1 | 2021-05-06T01:31:27.000Z | 2021-05-06T01:31:27.000Z | """List all the extensions."""
names = [
"tourney",
"season",
"refresh",
"misc",
]
| 11.111111 | 30 | 0.49 | 9 | 100 | 5.444444 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.29 | 100 | 8 | 31 | 12.5 | 0.690141 | 0.24 | 0 | 0 | 0 | 0 | 0.342857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
564ea8d237587582d11fe9aa01199d492f2216f3 | 43,484 | py | Python | powerbi_toolkit/toolkit.py | seequality/seequality_powerbi_python_toolkit | 01fa1d984d035819884f8f3f8ab19a91afb5443d | [
"MIT"
] | 1 | 2018-08-20T17:48:10.000Z | 2018-08-20T17:48:10.000Z | powerbi_toolkit/toolkit.py | seequality/seequality_powerbi_python_toolkit | 01fa1d984d035819884f8f3f8ab19a91afb5443d | [
"MIT"
] | null | null | null | powerbi_toolkit/toolkit.py | seequality/seequality_powerbi_python_toolkit | 01fa1d984d035819884f8f3f8ab19a91afb5443d | [
"MIT"
] | 1 | 2019-11-15T22:55:03.000Z | 2019-11-15T22:55:03.000Z | # internal
from powerbi_toolkit.classes import PowerbiApp
from powerbi_toolkit.classes import PowerbiWorkspace
from powerbi_toolkit.classes import PowerbiWorkspaceDashboard
from powerbi_toolkit.classes import PowerbiWorkspaceReport
from powerbi_toolkit.classes import PowerbiWorkspaceReportTab
from powerbi_toolkit.classes import ScreenshotType

# external
import os
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
from selenium.webdriver.common.keys import Keys
from selenium.common.exceptions import TimeoutException
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from bs4 import BeautifulSoup
import re
import time
import urllib.request
import datetime
import uuid
import collections
import logging
import json


class Toolkit:

    def __init__(self, arguments: dict):
        self.CONFIG_FILE_PATH = os.path.abspath(__file__ + "/../../config/config.json")
        with open(self.CONFIG_FILE_PATH) as f:
            config_data = json.load(f)

        self.base_Url = config_data["tech"]["base_url"]
        self.base_url_after_login = config_data["tech"]["base_url_after_login"]
        self.chromium_path = config_data["tech"]["chromium_path"]
        self.time_sleep_normal_seconds = config_data["tech"]["time_sleep_normal_seconds"]
        self.time_sleep_report_seconds = config_data["tech"]["time_sleep_report_seconds"]
        self.page_load_timeout = config_data["tech"]["page_load_timeout"]
        self.powerbi_user_email = config_data["user"]["powerbi_user_email"]
        self.powerbi_user_name = config_data["user"]["powerbi_user_name"]
        self.powerbi_user_password = config_data["user"]["powerbi_user_password"]
        self.log_img_directory_path = config_data["system"]["log_img_directory_path"]
        self.log_save_code_error_screenshot = config_data["system"]["log_save_code_error_screenshot"]
        self.log_app_directory_path = config_data["system"]["log_app_directory_path"]
        self.log_data_directory_path = config_data["system"]["log_data_directory_path"]

        # structures
        self.app_list = []
        self.workspae_list = []
        self.workspace_report_list = []
        self.workspace_dashboard_list = []
        self.workspace_report_tab_list = []
        self.workspace_report_tab_duplicates_list = []
        self.workspace_report_tab_visual_errors_list = []
        self.workspace_dashboard_visual_errors_list = []

        # initialize
        chrome_options = Options()
        chrome_options.add_argument("--headless")
        chrome_options.add_argument("--disable-gpu")
        chrome_options.add_argument("--disable-extensions")
        chrome_options.add_argument("--log-level=2")
        # self.browser = webdriver.Chrome(executable_path=self.chromium_path)  # to run without headless mode
        self.browser = webdriver.Chrome(executable_path=self.chromium_path, chrome_options=chrome_options)
        self.browser.execute_script("document.body.style.zoom='100'")
        self.is_logged_in = False
        self.current_run_guid = uuid.uuid4().hex

        # logging
        logging.basicConfig(level=logging.INFO)
        formatter = logging.Formatter(
            '%(asctime)s - %(name)s - %(funcName)s - %(lineno)s - %(levelname)s - %(message)s')
        self.logger = logging.getLogger(__name__)
        file_handler = logging.FileHandler(self.log_app_directory_path + "applog.txt")
        file_handler.setLevel(logging.INFO)
        file_handler.setFormatter(formatter)
        # for debugging
        # console_handler = logging.StreamHandler()
        # console_handler.setLevel(logging.INFO)
        # console_handler.setFormatter(formatter)
#self.logger.addHandler(console_handler)
self.logger.addHandler(file_handler)
self.logger.propagate = False
# log start and log used arguments
self.logger.info("Starting toolkit with the following parameters. Method: <" + arguments["method"] + ">, output: <" + arguments["output"] + ">")
def dispose(self):
self.browser.quit()
def log(self, log_message: str):
print ("Log message: " + log_message)
full_log_message = datetime.datetime.today().strftime("%Y%m%d_%H%M%S") + " > guid: " + self.current_run_guid + " > message: " + log_message
        current_output_file_name = self.log_app_directory_path + "log_" + self.current_run_guid + ".txt"  # assumed: one log file per run, matching the per-run guid used elsewhere
with open(current_output_file_name, 'a') as output_file:
output_file.write(full_log_message + "\n")
def saveScreenshot(self, screenshotType: ScreenshotType):
        # skip code-error screenshots when disabled in config; compare enum members directly, not .name against a member
        if screenshotType != ScreenshotType.CodeError or self.log_save_code_error_screenshot:
current_file_name = self.log_img_directory_path + screenshotType.value + datetime.datetime.today().strftime("_%Y%m%d_%H%M%S_") + self.current_run_guid + ".png"
self.browser.get_screenshot_as_file(current_file_name)
return current_file_name
else:
return "none"
def printData(self):
for app in self.app_list:
print ("app > " + app.AppName + " > " + app.AppUrl)
for workspace in self.workspae_list:
print ("workspace > " + workspace.WorkspaceName + " > " + workspace.WorkspaceUrl)
for workspace_report in self.workspace_report_list:
print ("workspace report > " + workspace_report.WorkspaceName + " > " + workspace_report.WorkspaceReportName + " > " + workspace_report.WorkspaceReportUrl)
for workspace_dashboard in self.workspace_dashboard_list:
print ("workspace dashboard > " + workspace_dashboard.WorkspaceName + " > " + workspace_dashboard.WorkspaceDashboardName + " > " + workspace_dashboard.WorkspaceDashboardUrl)
for workspace_report_tab in self.workspace_report_tab_list:
print ("workspace report tabs > " + " > " + workspace_report_tab.WorkspaceName + " > " + workspace_report_tab.WorkspaceReportName + " > " + workspace_report_tab.WorkspaceReportUrl + " > " + workspace_report_tab.WorkspaceReportTabName + " > " + workspace_report_tab.WOrkspaceReportTabUrl + "\n")
        for workspace_report_tab in self.workspace_report_tab_duplicates_list:
            print ("workspace report duplicate tabs > " + " > " + workspace_report_tab.WorkspaceName + " > " + workspace_report_tab.WorkspaceReportName + " > " + workspace_report_tab.WorkspaceReportUrl + " > " + workspace_report_tab.WorkspaceReportTabName + " > " + workspace_report_tab.WOrkspaceReportTabUrl + "\n")
for workspace_report_tab in self.workspace_report_tab_visual_errors_list:
print ("workspace report tabs visual error > " + " > " + workspace_report_tab.WorkspaceName + " > " + workspace_report_tab.WorkspaceReportName + " > " + workspace_report_tab.WorkspaceReportUrl + " > " + workspace_report_tab.WorkspaceReportTabName + " > " + workspace_report_tab.WOrkspaceReportTabUrl + "\n")
for workspace_dashboard in self.workspace_dashboard_visual_errors_list:
print ("workspace dashboard visual error > " + workspace_dashboard.WorkspaceName + " > " + workspace_dashboard.WorkspaceDashboardName + " > " + workspace_dashboard.WorkspaceDashboardUrl + "\n")
def saveData(self):
self.logger.info("Saving the data to file started")
current_output_file_name = self.log_data_directory_path + "output_data_" + datetime.datetime.today().strftime("%Y%m%d_%H%M%S_") + self.current_run_guid + ".txt"
with open(current_output_file_name, 'a') as output_file:
for app in self.app_list:
output_file.write("app > " + app.AppName + " > " + app.AppUrl + "\n")
for workspace in self.workspae_list:
output_file.write("workspace > " + workspace.WorkspaceName + " > " + workspace.WorkspaceUrl + "\n")
for workspace_report in self.workspace_report_list:
output_file.write("workspace report > " + workspace_report.WorkspaceName + " > " + workspace_report.WorkspaceReportName + " > " + workspace_report.WorkspaceReportUrl + "\n")
for workspace_dashboard in self.workspace_dashboard_list:
output_file.write("workspace dashboard > " + workspace_dashboard.WorkspaceName + " > " + workspace_dashboard.WorkspaceDashboardName + " > " + workspace_dashboard.WorkspaceDashboardUrl + "\n")
for workspace_report_tab in self.workspace_report_tab_list:
output_file.write("workspace report tabs > " + " > " + workspace_report_tab.WorkspaceName + " > " + workspace_report_tab.WorkspaceReportName + " > " + workspace_report_tab.WorkspaceReportUrl + " > " + workspace_report_tab.WorkspaceReportTabName + " > " + workspace_report_tab.WOrkspaceReportTabUrl + "\n")
for workspace_report_tab in self.workspace_report_tab_duplicates_list:
output_file.write("workspace report duplicate tabs > " + " > " + workspace_report_tab.WorkspaceName + " > " + workspace_report_tab.WorkspaceReportName + " > " + workspace_report_tab.WorkspaceReportUrl + " > " + workspace_report_tab.WorkspaceReportTabName + " > " + workspace_report_tab.WOrkspaceReportTabUrl + "\n")
for workspace_report_tab in self.workspace_report_tab_visual_errors_list:
output_file.write("workspace report tabs visual error > " + " > " + workspace_report_tab.WorkspaceName + " > " + workspace_report_tab.WorkspaceReportName + " > " + workspace_report_tab.WorkspaceReportUrl + " > " + workspace_report_tab.WorkspaceReportTabName + " > " + workspace_report_tab.WOrkspaceReportTabUrl + "\n")
for workspace_dashboard in self.workspace_dashboard_visual_errors_list:
output_file.write("workspace dashboard visual error > " + workspace_dashboard.WorkspaceName + " > " + workspace_dashboard.WorkspaceDashboardName + " > " + workspace_dashboard.WorkspaceDashboardUrl + "\n")
self.logger.info("Saving the data to file done")
def login(self):
# go to the power bi main page
self.browser.get(self.base_Url)
try: # login
self.logger.info("Trying to connect with username " + self.powerbi_user_name)
# click sign in in main page
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//ul[@class='menu-secondary']//a[@ms.cmpnm='Sign in']"))).click()
# fill user email in the first login page
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//input[@name='loginfmt']"))).send_keys(self.powerbi_user_email)
            # approve user email/name and click next to go to the organization login
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//input[@class='btn btn-block btn-primary']"))).click()
# fill user name in the second login page
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//input[@name='UserName']"))).send_keys(self.powerbi_user_name)
# fill user password in the second login page
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//input[@name='Password']"))).send_keys(self.powerbi_user_password)
# login to powerbi.com
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//div[@class='submitMargin']//span[@class='submit']"))).click()
# choose to not remember user in the browser
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//input[@class='btn btn-block' and @value='No']"))).click()
# set the is_logged_in flag to true
self.is_logged_in = True
            self.logger.info("Login to powerbi.com successful")
except TimeoutException:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Timeout while trying to login, screenshot: " + error_screenshot_name)
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Other error while trying to login, screenshot: " + error_screenshot_name)
def getApps(self):
self.logger.info("Getting apps started")
if self.is_logged_in == True:
try: # get all apps addresses
self.logger.info("Getting apps list started")
# click Apps to go to the Apps tab
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//button[@title='Apps']//span[@class='btnLabel' and @localize='Apps_NavPaneTitle']"))).click()
# get list of all app's names
allAppsTitles = WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//html//section[@class='galleryContainer appsGallery']"))).find_elements_by_xpath("//li[@class='galleryItem unselectable']//h1")
self.logger.info("Getting apps list done")
self.logger.info("Getting all apps started")
# prepare
apps_page_path = self.browser.current_url
main_window = self.browser.current_window_handle
for app in allAppsTitles:
try: # get app page url
current_app_name = app.get_attribute("textContent").strip()
self.logger.info("Getting app <" + current_app_name + "> started")
# open new blank tab
self.browser.execute_script("window.open('');")
# switch to new tab
self.browser.switch_to.window(self.browser.window_handles[1])
# go to the app page in the new tab
self.browser.get(apps_page_path)
# wait till the page will load and open - click - new app
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//section[@class='galleryContainer appsGallery']//li[@aria-label='" + current_app_name + "']"))).click()
# get current page url and add to list
self.app_list.append(PowerbiApp(AppName = current_app_name, AppUrl = self.browser.current_url))
# close the current tab
self.browser.close()
# go to main tab
                        self.browser.switch_to.window(main_window)
self.logger.info("Getting app <" + current_app_name + "> done")
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
                        self.logger.error("Error while getting app <" + current_app_name + ">, screenshot: " + error_screenshot_name)
except TimeoutException:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Timed out while getting apps lists, screenshot: " + error_screenshot_name)
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Other error while getting apps lists, screenshot: " + error_screenshot_name)
self.logger.info("Getting all apps done")
else:
self.logger.error("User not logged in")
self.logger.info("Getting apps done")
def getWorkspaces(self):
self.logger.info("Getting workspaces started")
if self.is_logged_in == True:
try: # get all workspaces addresses
self.logger.info("Getting workspaces list started")
# go to main page
self.browser.get(self.base_url_after_login)
# click to expand workspaces list
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//button[@title='Show/hide workspaces']"))).click()
# wait till the list will open
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//group-list[@class='large']//header//span[@localize='NavigationPane_Groups_Workspaces_V2']")))
# get all workspaces
allWorkspaces = self.browser.find_elements_by_xpath("//ul//li[@ng-repeat='folder in $ctrl.appWorkspaces track by folder.uniqueId']//span[@class='workspaceName']")
self.logger.info("Getting workspaces list done")
self.logger.info("Getting all workspaces started")
# prepare
workspace_page_path = self.browser.current_url
main_window = self.browser.current_window_handle
# extract workspaces titles
for workspace in allWorkspaces:
try: # get workspace url
current_workspace_name = workspace.get_attribute("textContent").strip()
self.logger.info("Getting workspace <" + current_workspace_name + "> started")
# open new blank tab
self.browser.execute_script("window.open('');")
# switch to new tab
self.browser.switch_to.window(self.browser.window_handles[1])
# go to the workspace page in the new tab
self.browser.get(workspace_page_path)
# click to expand workspaces list
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//button[@title='Show/hide workspaces']"))).click()
# wait till the list will open
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//group-list[@class='large']//header//span[@localize='NavigationPane_Groups_Workspaces_V2']")))
# wait till the page will load and open - click - new workspace
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//li//button[@title='" + current_workspace_name + "']//span[@class='workspaceName']"))).click()
# get current page url and add to list
self.workspae_list.append(PowerbiWorkspace(WorkspaceName = current_workspace_name, WorkspaceUrl = self.browser.current_url))
# close the current tab
self.browser.close()
# go to main tab
                        self.browser.switch_to.window(main_window)
self.logger.info("Getting workspace <" + current_workspace_name + "> done")
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Error while getting workspace " + current_workspace_name + ", screenshot: " + error_screenshot_name)
except TimeoutException:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Timed out while getting workspaces, screenshot: " + error_screenshot_name)
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Other error while getting workspaces, screenshot: " + error_screenshot_name)
self.logger.info("Getting all workspaces done")
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("User not logged in")
self.logger.info("Getting workspaces done")
def checkReportTabsVisualsErrors(self):
self.logger.info("Getting reports tabs visual errors started")
# check if there is any downloaded reports tabs
if len(self.workspace_report_tab_list) > 0:
# check if user is already logged in
if self.is_logged_in == True:
for workspace_report_tab in self.workspace_report_tab_list:
self.logger.info("Getting report tabs visual errors for: " + workspace_report_tab.WorkspaceReportTabName + " started")
# go to the specific report tab page
self.browser.get(workspace_report_tab.WOrkspaceReportTabUrl)
# wait till content will be loaded
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//div[@role='tab']")))
# wait extra seconds, try to find a better way here....
time.sleep(self.time_sleep_report_seconds)
# search for at least one error
visual_errors = self.browser.find_elements_by_xpath("//a[@class='errorSeeMore']")
if (len(visual_errors) > 0):
# get screenshot with page that contains errors
screenshot_name = self.saveScreenshot(ScreenshotType.VisualError)
                        self.logger.info("Visual errors in report: " + workspace_report_tab.WorkspaceReportUrl + " , tab: " + workspace_report_tab.WorkspaceReportTabName + " , url: " + workspace_report_tab.WOrkspaceReportTabUrl + " , screenshot: " + screenshot_name)
                        # save error
self.workspace_report_tab_visual_errors_list.append(workspace_report_tab)
self.logger.info("Getting report tabs visual errors for: " + workspace_report_tab.WorkspaceReportTabName + " done")
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.info("User not logged in, screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.info("The workspace reports tabs list is empty. Get workspaces first, screenshot: " + error_screenshot_name)
self.logger.info("Getting reports tabs visual errors done")
def checkWorkspaceDashboardVisualsErrors(self):
self.logger.info("Getting workspaces dashboards visual errors started")
# check if there is any downloaded reports tabs
if len(self.workspace_dashboard_list) > 0:
# check if user is already logged in
if self.is_logged_in == True:
for workspace_dashboard in self.workspace_dashboard_list:
self.logger.info("Getting workspace dashboard visual errors for: " + workspace_dashboard.WorkspaceDashboardName + " started")
# go to the specific report tab page
self.browser.get(workspace_dashboard.WorkspaceDashboardUrl)
# wait till content will be loaded
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//div[@class='landingRootContent' and @id='dashboardLandingContainer']")))
# wait extra seconds, try to find a better way here....
time.sleep(self.time_sleep_report_seconds)
# search for at least one error
visual_errors = self.browser.find_elements_by_xpath("//div[@class='errorContainer']")
if (len(visual_errors) > 0):
# get screenshot with page that contains errors
screenshot_name = self.saveScreenshot(ScreenshotType.VisualError)
                        self.logger.info("Visual errors in dashboard: " + workspace_dashboard.WorkspaceDashboardName + " , url: " + workspace_dashboard.WorkspaceDashboardUrl + " , workspace: " + workspace_dashboard.WorkspaceName + " , screenshot: " + screenshot_name)
# save error
self.workspace_dashboard_visual_errors_list.append(workspace_dashboard)
self.logger.info("Getting workspace dashboard visual errors for: " + workspace_dashboard.WorkspaceDashboardName + " done")
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.info("User not logged in, screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.info("The workspace dashboard list is empty. Get workspaces first, screenshot: " + error_screenshot_name)
        self.logger.info("Getting workspaces dashboards visual errors done")
def getWorkspacesDashboards(self):
self.logger.info("Getting workspaces dashboards started")
if len(self.workspae_list) > 0:
if self.is_logged_in == True:
self.browser.get(self.base_url_after_login)
try: # check errors in workspaces
for workspace in self.workspae_list:
current_workspace_name = workspace.WorkspaceName
current_workspace_url = workspace.WorkspaceUrl
self.logger.info("Getting dashboards for workspace " + current_workspace_name + " started")
try: # get workspace dashboards
self.logger.info("Getting dashboards lists for workspace " + current_workspace_name + " started")
### iterate dashboards
current_workspace_dashboard_list_url = current_workspace_url + "/list/dashboards"
# go to workspace dashboard list
self.browser.get(current_workspace_dashboard_list_url)
# wait till the list of dashboard will be loaded
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//div[@role='datagrid' or @class='dataSourceTiles']")))
time.sleep(self.time_sleep_normal_seconds) # additional wait while the previous one is not always working ....
# get all dashboard list
current_workspace_dashboards = self.browser.find_elements_by_xpath("//div[@class='row']//a")
self.logger.info("Getting dashboards lists for workspace " + current_workspace_name + " done")
# get all dashboard list - name + url
for current_workspace_dashboard in current_workspace_dashboards:
current_workspace_dashboard_name = current_workspace_dashboard.get_attribute("textContent").strip()
current_workspace_dashboard_url = current_workspace_dashboard.get_attribute('href')
self.logger.info("Getting dashboards " + current_workspace_dashboard_name + " started")
self.workspace_dashboard_list.append(PowerbiWorkspaceDashboard(WorkspaceName = current_workspace_name, WorkspaceDashboardName = current_workspace_dashboard_name, WorkspaceDashboardUrl = current_workspace_dashboard_url))
self.logger.info("Getting dashboards " + current_workspace_dashboard_name + " done")
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
                            self.logger.error("Error while getting workspace dashboards list for " + current_workspace_name + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting dashboards for workspace " + current_workspace_name + " done")
except TimeoutException:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Timed out while checking errors in workspaces" + ", screenshot: " + error_screenshot_name)
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Other error while checking errors in workspaces" + ", screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("User not logged in" + ", screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("The workspace list is empty. Get workspaces first" + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting workspaces dashboards done")
def getWorkspacesReports(self):
self.logger.info("Getting workspaces reports started")
if len(self.workspae_list) > 0:
if self.is_logged_in == True:
self.browser.get(self.base_url_after_login)
try: # check errors in workspaces
for workspace in self.workspae_list:
current_workspace_name = workspace.WorkspaceName
current_workspace_url = workspace.WorkspaceUrl
self.logger.info("Getting workspaces reports " + current_workspace_name + " started")
try: # get workspace reports
                            self.logger.info("Getting workspaces reports list for " + current_workspace_name + " started")
                            ### iterate reports
current_workspace_report_list_url = current_workspace_url + "/list/reports"
# go to workspace report list
self.browser.get(current_workspace_report_list_url)
# wait till the list of reports will be loaded
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//div[@role='datagrid' or @class='dataSourceTiles']")))
time.sleep(self.time_sleep_normal_seconds) # additional wait while the previous one is not always working ....
# get all reports list
current_workspace_reports = self.browser.find_elements_by_xpath("//div[@class='row']//a")
                            self.logger.info("Getting workspaces reports list for " + current_workspace_name + " done")
# get all reports list - name + url
for current_workspace_report in current_workspace_reports:
current_workspace_report_name = current_workspace_report.get_attribute("textContent").strip()
current_workspace_report_url = current_workspace_report.get_attribute('href')
self.logger.info("Getting report " + current_workspace_report_name + " started")
self.workspace_report_list.append(PowerbiWorkspaceReport(WorkspaceName = current_workspace_name, WorkspaceReportName = current_workspace_report_name, WorkspaceReportUrl = current_workspace_report_url))
self.logger.info("Getting report " + current_workspace_report_name + " done")
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
                            self.logger.error("Error while getting workspace reports list for " + current_workspace_name + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting workspaces reports " + current_workspace_name + " done")
except TimeoutException:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Timed out while checking errors in workspaces" + ", screenshot: " + error_screenshot_name)
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Other error while checking errors in workspaces" + ", screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("User not logged in" + ", screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("The workspace list is empty. Get workspaces first" + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting workspaces reports done")
def getWorkspacesReportsTabs(self):
self.logger.info("Getting workspaces reports tabs started")
if len(self.workspae_list) > 0:
if self.is_logged_in == True:
self.browser.get(self.base_url_after_login)
try: # check errors in workspaces
for workspace in self.workspae_list:
current_workspace_name = workspace.WorkspaceName
self.logger.info("Getting workspaces reports tabs for workspace " + current_workspace_name + " started")
try: # get workspace reports
current_workspace_report_list = [x for x in self.workspace_report_list if x.WorkspaceName == current_workspace_name]
# get all reports list - name + url
for current_workspace_report in current_workspace_report_list:
current_workspace_report_name = current_workspace_report.WorkspaceReportName
current_workspace_report_url = current_workspace_report.WorkspaceReportUrl
self.logger.info("Getting workspaces reports tabs for report " + current_workspace_report_name + " started")
try: # get all tab
self.logger.info("Getting workspaces reports tabs list for report " + current_workspace_report_name + " started")
# go to the single report - first page
self.browser.get(current_workspace_report_url)
# wait till the first report will be loaded to get all reports
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//div[@role='tab']")))
time.sleep(self.time_sleep_report_seconds) # additional wait while the previous one is not always working ....
# get all tabs
current_report_tabs = self.browser.find_elements_by_xpath("//div[@role='tab']")
# prepare
report_page_path = self.browser.current_url
main_window = self.browser.current_window_handle
current_report_tabs_names = []
for report_tab in current_report_tabs:
current_report_tabs_names.append(report_tab.get_attribute("textContent").strip())
# check if there is more than one tab with given name in the report (duplicates)
current_report_tab_name_count = collections.Counter(current_report_tabs_names)
current_report_tab_name_count_single = {x : current_report_tab_name_count[x] for x in current_report_tab_name_count if current_report_tab_name_count[x] == 1 }
current_report_tab_name_count_duplicates = {x : current_report_tab_name_count[x] for x in current_report_tab_name_count if current_report_tab_name_count[x] > 1 }
if (len(current_report_tab_name_count_duplicates) > 0):
for current_report_tab_name in list(current_report_tab_name_count_duplicates.keys()):
self.workspace_report_tab_list.append(PowerbiWorkspaceReportTab(WorkspaceName = current_workspace_name, WorkspaceReportName = current_workspace_report_name, WorkspaceReportUrl = current_workspace_report_url, WorkspaceReportTabName = current_report_tab_name, WOrkspaceReportTabUrl = self.browser.current_url))
self.logger.info("Getting workspaces reports tabs list for report " + current_workspace_report_name + " done")
if (len(current_report_tab_name_count_single) > 0):
for current_report_tab_name in list(current_report_tab_name_count_single.keys()):
self.logger.info("Getting workspaces reports tab " + current_report_tab_name + " started")
try: # get report page tab url
# open new blank tab
self.browser.execute_script("window.open('');")
# switch to new tab
self.browser.switch_to.window(self.browser.window_handles[1])
# go to the app page in the new tab
self.browser.get(report_page_path)
# wait till the page will load and open - click - new app
try:
# click on specific tab and wait till the tab will load
WebDriverWait(self.browser, self.page_load_timeout).until(EC.presence_of_element_located((By.XPATH, "//div[@role='tab']//div[@title='" + current_report_tab_name + "']"))).click()
time.sleep(self.time_sleep_report_seconds) # additional wait while the previous one is not always working ....
# get current page url and add to list
self.workspace_report_tab_list.append(PowerbiWorkspaceReportTab(WorkspaceName = current_workspace_name, WorkspaceReportName = current_workspace_report_name, WorkspaceReportUrl = current_workspace_report_url, WorkspaceReportTabName = current_report_tab_name, WOrkspaceReportTabUrl = self.browser.current_url))
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
                                                    self.logger.error("Error while getting workspace single tab for workspace " + current_workspace_name + ", report " + current_workspace_report_name + ", report tab: " + current_report_tab_name + ", screenshot: " + error_screenshot_name)
# close the current tab
self.browser.close()
# go to main tab
                                                self.browser.switch_to.window(main_window)
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
                                                self.logger.error("Error while getting workspace single tab url for workspace " + current_workspace_name + ", report " + current_workspace_report_name + ", report tab: " + current_report_tab_name + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting workspaces reports tab " + current_report_tab_name + " done")
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
                                    self.logger.error("Error while getting workspace report tabs for workspace " + current_workspace_name + ", report " + current_workspace_report_name + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting workspaces reports tabs for report " + current_workspace_report_name + " done")
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
                            self.logger.error("Error while getting workspace reports list for " + current_workspace_name + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting workspaces reports tabs for workspace " + current_workspace_name + " done")
except TimeoutException:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Timed out while checking errors in workspaces" + ", screenshot: " + error_screenshot_name)
except:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("Other error while checking errors in workspaces" + ", screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("User not logged in" + ", screenshot: " + error_screenshot_name)
else:
error_screenshot_name = self.saveScreenshot(ScreenshotType.CodeError)
self.logger.error("The workspace list is empty. Get workspaces first" + ", screenshot: " + error_screenshot_name)
self.logger.info("Getting workspaces reports tabs done")
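The tab-duplicate check inside `getWorkspacesReportsTabs` reduces to a `collections.Counter` partition over the tab names. A minimal standalone sketch of that logic (the helper name `split_tabs_by_count` is illustrative, not part of the toolkit):

```python
import collections

def split_tabs_by_count(tab_names):
    """Partition report tab names into those that appear once and those
    that appear more than once, mirroring the Counter-based duplicate
    check in Toolkit.getWorkspacesReportsTabs."""
    counts = collections.Counter(tab_names)
    singles = {name: n for name, n in counts.items() if n == 1}
    duplicates = {name: n for name, n in counts.items() if n > 1}
    return singles, duplicates

singles, duplicates = split_tabs_by_count(["Sales", "Sales", "Overview", "Detail"])
print(sorted(singles))     # ['Detail', 'Overview'] - unique tab names
print(sorted(duplicates))  # ['Sales'] - tab names appearing more than once
```

In the toolkit, duplicated names go straight into `workspace_report_tab_list` with the report's current URL, while unique names are clicked one by one to capture each tab's own URL.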
| 52.835966 | 360 | 0.61625 | 4,392 | 43,484 | 5.849954 | 0.082878 | 0.065971 | 0.033784 | 0.040867 | 0.786129 | 0.740241 | 0.691083 | 0.662554 | 0.633597 | 0.611217 | 0 | 0.000756 | 0.300455 | 43,484 | 822 | 361 | 52.900243 | 0.84388 | 0.075637 | 0 | 0.417266 | 0 | 0.007194 | 0.161981 | 0.033454 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033573 | false | 0.004796 | 0.057554 | 0 | 0.098321 | 0.023981 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
5664f7326a70652d873546433602b653bd5cbdc3 | 1,089 | py | Python | setup.py | xxblx/npgameworld | c80222154ffe0ccc1a69f0b8dc3273669b4447f7 | [
"Zlib"
] | 8 | 2017-08-07T07:19:27.000Z | 2022-01-12T12:20:34.000Z | setup.py | xxblx/npgameworld | c80222154ffe0ccc1a69f0b8dc3273669b4447f7 | [
"Zlib"
] | null | null | null | setup.py | xxblx/npgameworld | c80222154ffe0ccc1a69f0b8dc3273669b4447f7 | [
"Zlib"
] | 1 | 2017-08-18T13:58:37.000Z | 2017-08-18T13:58:37.000Z | # -*- coding: utf-8 -*-
from distutils.core import setup
from npgameworld import __version__
setup(
name='npgameworld',
version=__version__,
license='zlib/libpng',
url='https://github.com/xxblx/npgameworld',
author='Oleg Kozlov',
author_email='xxblx@posteo.org',
description='Simple pure python game engine',
long_description="""NpGameWorld is a very simple game engine in pure Python,
created for embedding. It is designed for games such as top-down shooters
where the player controls the hero, sending commands to the world.""",
platforms=['any'],
packages=['npgameworld'],
classifiers=[
'Intended Audience :: Developers',
'License :: OSI Approved :: zlib/libpng License',
'Operating System :: POSIX :: Linux',
'Operating System :: MacOS',
'Operating System :: Microsoft :: Windows',
'Programming Language :: Python :: 3',
'Topic :: Software Development :: Libraries :: Python Modules',
'Topic :: Games/Entertainment'
],
keywords='games game world engine shooter'
)
| 29.432432 | 77 | 0.662075 | 119 | 1,089 | 5.97479 | 0.689076 | 0.063291 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.002347 | 0.217631 | 1,089 | 36 | 78 | 30.25 | 0.83216 | 0.019284 | 0 | 0 | 0 | 0 | 0.61257 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.074074 | 0 | 0.074074 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
566c299d3262fc7e042236873c3a27b6034df432 | 379 | py | Python | LiveFeedPythonScripts/FeedFromOtherDevice/Share PC Screen With Android/screenshot.py | viveksb007/LiveFeed | bfaa0ac0855327c1d691bab3fd6eb8e718efa801 | [
"Unlicense"
] | 10 | 2017-04-16T19:27:14.000Z | 2022-02-17T05:41:51.000Z | LiveFeedPythonScripts/FeedFromOtherDevice/Share PC Screen With Android/screenshot.py | viveksb007/LiveFeed | bfaa0ac0855327c1d691bab3fd6eb8e718efa801 | [
"Unlicense"
] | 2 | 2017-10-01T09:14:46.000Z | 2017-10-01T09:20:23.000Z | LiveFeedPythonScripts/FeedFromOtherDevice/Share PC Screen With Android/screenshot.py | viveksb007/LiveFeed | bfaa0ac0855327c1d691bab3fd6eb8e718efa801 | [
"Unlicense"
] | 3 | 2017-07-19T02:42:33.000Z | 2020-05-16T07:39:19.000Z | import time
import cv2
import pyscreenshot as ImageGrab
import numpy as np
class Screenshot(object):
def get_frame(self):
img = np.array(ImageGrab.grab().convert('RGB'), dtype=np.uint8)
img = cv2.cvtColor(img, cv2.COLOR_RGB2BGR)
ret2, jpeg = cv2.imencode('.jpg', img)
return jpeg.tobytes()  # tostring() is deprecated in NumPy; tobytes() is the replacement
def __del__(self):
    # No camera is ever opened by this class; guard so teardown never raises.
    if hasattr(self, 'cam'):
        self.cam.release()
| 25.266667 | 71 | 0.656992 | 51 | 379 | 4.764706 | 0.666667 | 0.049383 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.023729 | 0.221636 | 379 | 14 | 72 | 27.071429 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0.01847 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 2 |
567fdc691bb946d96c7f49243c0174d039c5bb4a | 695 | py | Python | ue4helpers/ProjectPackager.py | adamrehn/ue4-ci-helpers | a90920ece8896d5210c6b6c0f64b1616b82e1a81 | [
"MIT"
] | 22 | 2019-02-27T18:21:55.000Z | 2022-02-06T07:37:18.000Z | ue4helpers/ProjectPackager.py | adamrehn/ue4-ci-helpers | a90920ece8896d5210c6b6c0f64b1616b82e1a81 | [
"MIT"
] | 1 | 2020-03-27T17:38:10.000Z | 2020-03-28T06:23:27.000Z | ue4helpers/ProjectPackager.py | adamrehn/ue4-ci-helpers | a90920ece8896d5210c6b6c0f64b1616b82e1a81 | [
"MIT"
] | 3 | 2020-07-05T14:42:44.000Z | 2021-07-21T16:16:01.000Z | from .PackagerBase import PackagerBase
class ProjectPackager(PackagerBase):
'''
Provides functionality for packaging an Unreal project.
'''
def __init__(self, root, version, archive='{name}-{version}-{platform}', strip_debug=False, strip_manifests=False, stage=[], verbose=True):
'''
Creates a new ProjectPackager with the specified configuration.
See `PackagerBase.__init__()` for details on the input parameters.
'''
super().__init__(root, version, archive, strip_debug, strip_manifests, stage, verbose)
# "Private" methods
def _extension(self):
'''
Returns the file extension for the descriptor files supported by this packager type
'''
return '.uproject'
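`ProjectPackager` contributes only the descriptor extension; everything else is inherited from `PackagerBase`. A reduced stdlib sketch of this template-method pattern, where `PackagerBase` is stubbed because its real implementation is not shown here, and `PluginPackager` with `.uplugin` is an illustrative sibling rather than necessarily the library's actual class:

```python
class PackagerBase:
    """Stub of the real base class: locates the project descriptor by extension."""

    def __init__(self, root):
        self.root = root

    def find_descriptor(self, filenames):
        # The base class owns the search; each subclass supplies only the extension.
        return [f for f in filenames if f.endswith(self._extension())]


class ProjectPackager(PackagerBase):
    def _extension(self):
        return '.uproject'


class PluginPackager(PackagerBase):  # hypothetical sibling for comparison
    def _extension(self):
        return '.uplugin'
```

Usage: `ProjectPackager('.').find_descriptor(['Game.uproject', 'README.md'])` keeps only `'Game.uproject'`, while a `PluginPackager` on the same list would keep nothing.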
| 28.958333 | 140 | 0.735252 | 80 | 695 | 6.175 | 0.675 | 0.044534 | 0.072874 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153957 | 695 | 23 | 141 | 30.217391 | 0.840136 | 0.417266 | 0 | 0 | 0 | 0 | 0.097561 | 0.073171 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.166667 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
568d62ea58262a9a05ab2d360d3b97e32254813b | 268 | py | Python | setup.py | lrado1/delimag | 1dc00552d920bdcd261144f3943f805a9dad198d | [
"MIT"
] | null | null | null | setup.py | lrado1/delimag | 1dc00552d920bdcd261144f3943f805a9dad198d | [
"MIT"
] | null | null | null | setup.py | lrado1/delimag | 1dc00552d920bdcd261144f3943f805a9dad198d | [
"MIT"
] | null | null | null | from setuptools import setup
setup(name = 'delimag',
version = '0.01',
description = 'Delimag is a tool to analyze Pandas DataFrame objects with multiselect records.',
packages = ['pandas', 'delimag'],
author = 'lrado1',
zip_safe=False)
| 29.777778 | 102 | 0.652985 | 31 | 268 | 5.612903 | 0.870968 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019512 | 0.235075 | 268 | 8 | 103 | 33.5 | 0.829268 | 0 | 0 | 0 | 0 | 0 | 0.406716 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.142857 | 0 | 0.142857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
568f72f1485e0c2289db961863afd116bc2a3509 | 286 | py | Python | python/parsing.py | simonfong6/micro-projects | 5be195ea72ce117df6da041446f11c18e102b5df | [
"MIT"
] | null | null | null | python/parsing.py | simonfong6/micro-projects | 5be195ea72ce117df6da041446f11c18e102b5df | [
"MIT"
] | null | null | null | python/parsing.py | simonfong6/micro-projects | 5be195ea72ce117df6da041446f11c18e102b5df | [
"MIT"
] | null | null | null | myString = "temperature = 89, sound = 65, light = 355, heartrate = 85"
myList = myString.split(", ")
print(myList)
dataJSON = {}
for word in myList:
    print(word)
    key, equals, data = word.split(" ", 2)
    print(key)
    print(equals)
    print(data)
    dataJSON[key] = int(data)
print(dataJSON)
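The same parse can be written more compactly; this sketch assumes every field keeps the exact `key = value` shape used in `myString` (the variable names here are illustrative, not from the original):

```python
my_string = "temperature = 89, sound = 65, light = 355, heartrate = 85"

# Split the record on ", " into fields, then each field on " = " into key/value.
data = {}
for field in my_string.split(", "):
    key, value = field.split(" = ", 1)  # maxsplit=1 guards values containing " = "
    data[key] = int(value)

print(data)  # {'temperature': 89, 'sound': 65, 'light': 355, 'heartrate': 85}
```

Splitting on the full `" = "` delimiter avoids carrying the `=` token as a separate, unused variable.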
| 14.3 | 70 | 0.667832 | 39 | 286 | 4.897436 | 0.538462 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.044248 | 0.20979 | 286 | 19 | 71 | 15.052632 | 0.800885 | 0 | 0 | 0 | 0 | 0 | 0.211268 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 2 |
56959c8a1a254209eeb2ba3a15a248f9502ef3d8 | 11,203 | py | Python | nova/api/openstack/compute/views/images.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/api/openstack/compute/views/images.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | null | null | null | nova/api/openstack/compute/views/images.py | bopopescu/nova-token | ec98f69dea7b3e2b9013b27fd55a2c1a1ac6bfb2 | [
"Apache-2.0"
] | 2 | 2017-07-20T17:31:34.000Z | 2020-07-24T02:42:19.000Z | begin_unit
comment|'# Copyright 2010-2011 OpenStack Foundation'
nl|'\n'
comment|'# Copyright 2013 IBM Corp.'
nl|'\n'
comment|'# All Rights Reserved.'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Licensed under the Apache License, Version 2.0 (the "License"); you may'
nl|'\n'
comment|'# not use this file except in compliance with the License. You may obtain'
nl|'\n'
comment|'# a copy of the License at'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# http://www.apache.org/licenses/LICENSE-2.0'
nl|'\n'
comment|'#'
nl|'\n'
comment|'# Unless required by applicable law or agreed to in writing, software'
nl|'\n'
comment|'# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT'
nl|'\n'
comment|'# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the'
nl|'\n'
comment|'# License for the specific language governing permissions and limitations'
nl|'\n'
comment|'# under the License.'
nl|'\n'
nl|'\n'
name|'from'
name|'nova'
op|'.'
name|'api'
op|'.'
name|'openstack'
name|'import'
name|'common'
newline|'\n'
name|'from'
name|'nova'
op|'.'
name|'image'
name|'import'
name|'glance'
newline|'\n'
name|'from'
name|'nova'
name|'import'
name|'utils'
newline|'\n'
nl|'\n'
nl|'\n'
DECL|class|ViewBuilder
name|'class'
name|'ViewBuilder'
op|'('
name|'common'
op|'.'
name|'ViewBuilder'
op|')'
op|':'
newline|'\n'
nl|'\n'
DECL|variable|_collection_name
indent|' '
name|'_collection_name'
op|'='
string|'"images"'
newline|'\n'
nl|'\n'
DECL|member|basic
name|'def'
name|'basic'
op|'('
name|'self'
op|','
name|'request'
op|','
name|'image'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return a dictionary with basic image attributes."""'
newline|'\n'
name|'return'
op|'{'
nl|'\n'
string|'"image"'
op|':'
op|'{'
nl|'\n'
string|'"id"'
op|':'
name|'image'
op|'.'
name|'get'
op|'('
string|'"id"'
op|')'
op|','
nl|'\n'
string|'"name"'
op|':'
name|'image'
op|'.'
name|'get'
op|'('
string|'"name"'
op|')'
op|','
nl|'\n'
string|'"links"'
op|':'
name|'self'
op|'.'
name|'_get_links'
op|'('
name|'request'
op|','
nl|'\n'
name|'image'
op|'['
string|'"id"'
op|']'
op|','
nl|'\n'
name|'self'
op|'.'
name|'_collection_name'
op|')'
op|','
nl|'\n'
op|'}'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
DECL|member|show
dedent|''
name|'def'
name|'show'
op|'('
name|'self'
op|','
name|'request'
op|','
name|'image'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return a dictionary with image details."""'
newline|'\n'
name|'image_dict'
op|'='
op|'{'
nl|'\n'
string|'"id"'
op|':'
name|'image'
op|'.'
name|'get'
op|'('
string|'"id"'
op|')'
op|','
nl|'\n'
string|'"name"'
op|':'
name|'image'
op|'.'
name|'get'
op|'('
string|'"name"'
op|')'
op|','
nl|'\n'
string|'"minRam"'
op|':'
name|'int'
op|'('
name|'image'
op|'.'
name|'get'
op|'('
string|'"min_ram"'
op|')'
name|'or'
number|'0'
op|')'
op|','
nl|'\n'
string|'"minDisk"'
op|':'
name|'int'
op|'('
name|'image'
op|'.'
name|'get'
op|'('
string|'"min_disk"'
op|')'
name|'or'
number|'0'
op|')'
op|','
nl|'\n'
string|'"metadata"'
op|':'
name|'image'
op|'.'
name|'get'
op|'('
string|'"properties"'
op|','
op|'{'
op|'}'
op|')'
op|','
nl|'\n'
string|'"created"'
op|':'
name|'self'
op|'.'
name|'_format_date'
op|'('
name|'image'
op|'.'
name|'get'
op|'('
string|'"created_at"'
op|')'
op|')'
op|','
nl|'\n'
string|'"updated"'
op|':'
name|'self'
op|'.'
name|'_format_date'
op|'('
name|'image'
op|'.'
name|'get'
op|'('
string|'"updated_at"'
op|')'
op|')'
op|','
nl|'\n'
string|'"status"'
op|':'
name|'self'
op|'.'
name|'_get_status'
op|'('
name|'image'
op|')'
op|','
nl|'\n'
string|'"progress"'
op|':'
name|'self'
op|'.'
name|'_get_progress'
op|'('
name|'image'
op|')'
op|','
nl|'\n'
string|'"links"'
op|':'
name|'self'
op|'.'
name|'_get_links'
op|'('
name|'request'
op|','
nl|'\n'
name|'image'
op|'['
string|'"id"'
op|']'
op|','
nl|'\n'
name|'self'
op|'.'
name|'_collection_name'
op|')'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
name|'instance_uuid'
op|'='
name|'image'
op|'.'
name|'get'
op|'('
string|'"properties"'
op|','
op|'{'
op|'}'
op|')'
op|'.'
name|'get'
op|'('
string|'"instance_uuid"'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'instance_uuid'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'server_ref'
op|'='
name|'self'
op|'.'
name|'_get_href_link'
op|'('
name|'request'
op|','
name|'instance_uuid'
op|','
string|"'servers'"
op|')'
newline|'\n'
name|'image_dict'
op|'['
string|'"server"'
op|']'
op|'='
op|'{'
nl|'\n'
string|'"id"'
op|':'
name|'instance_uuid'
op|','
nl|'\n'
string|'"links"'
op|':'
op|'['
op|'{'
nl|'\n'
string|'"rel"'
op|':'
string|'"self"'
op|','
nl|'\n'
string|'"href"'
op|':'
name|'server_ref'
op|','
nl|'\n'
op|'}'
op|','
nl|'\n'
op|'{'
nl|'\n'
string|'"rel"'
op|':'
string|'"bookmark"'
op|','
nl|'\n'
string|'"href"'
op|':'
name|'self'
op|'.'
name|'_get_bookmark_link'
op|'('
name|'request'
op|','
nl|'\n'
name|'instance_uuid'
op|','
nl|'\n'
string|"'servers'"
op|')'
op|','
nl|'\n'
op|'}'
op|']'
op|','
nl|'\n'
op|'}'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'dict'
op|'('
name|'image'
op|'='
name|'image_dict'
op|')'
newline|'\n'
nl|'\n'
DECL|member|detail
dedent|''
name|'def'
name|'detail'
op|'('
name|'self'
op|','
name|'request'
op|','
name|'images'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Show a list of images with details."""'
newline|'\n'
name|'list_func'
op|'='
name|'self'
op|'.'
name|'show'
newline|'\n'
name|'coll_name'
op|'='
name|'self'
op|'.'
name|'_collection_name'
op|'+'
string|"'/detail'"
newline|'\n'
name|'return'
name|'self'
op|'.'
name|'_list_view'
op|'('
name|'list_func'
op|','
name|'request'
op|','
name|'images'
op|','
name|'coll_name'
op|')'
newline|'\n'
nl|'\n'
DECL|member|index
dedent|''
name|'def'
name|'index'
op|'('
name|'self'
op|','
name|'request'
op|','
name|'images'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Show a list of images with basic attributes."""'
newline|'\n'
name|'list_func'
op|'='
name|'self'
op|'.'
name|'basic'
newline|'\n'
name|'coll_name'
op|'='
name|'self'
op|'.'
name|'_collection_name'
newline|'\n'
name|'return'
name|'self'
op|'.'
name|'_list_view'
op|'('
name|'list_func'
op|','
name|'request'
op|','
name|'images'
op|','
name|'coll_name'
op|')'
newline|'\n'
nl|'\n'
DECL|member|_list_view
dedent|''
name|'def'
name|'_list_view'
op|'('
name|'self'
op|','
name|'list_func'
op|','
name|'request'
op|','
name|'images'
op|','
name|'coll_name'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Provide a view for a list of images.\n\n :param list_func: Function used to format the image data\n :param request: API request\n :param images: List of images in dictionary format\n :param coll_name: Name of collection, used to generate the next link\n for a pagination query\n\n :returns: Image reply data in dictionary format\n """'
newline|'\n'
name|'image_list'
op|'='
op|'['
name|'list_func'
op|'('
name|'request'
op|','
name|'image'
op|')'
op|'['
string|'"image"'
op|']'
name|'for'
name|'image'
name|'in'
name|'images'
op|']'
newline|'\n'
name|'images_links'
op|'='
name|'self'
op|'.'
name|'_get_collection_links'
op|'('
name|'request'
op|','
name|'images'
op|','
name|'coll_name'
op|')'
newline|'\n'
name|'images_dict'
op|'='
name|'dict'
op|'('
name|'images'
op|'='
name|'image_list'
op|')'
newline|'\n'
nl|'\n'
name|'if'
name|'images_links'
op|':'
newline|'\n'
indent|' '
name|'images_dict'
op|'['
string|'"images_links"'
op|']'
op|'='
name|'images_links'
newline|'\n'
nl|'\n'
dedent|''
name|'return'
name|'images_dict'
newline|'\n'
nl|'\n'
DECL|member|_get_links
dedent|''
name|'def'
name|'_get_links'
op|'('
name|'self'
op|','
name|'request'
op|','
name|'identifier'
op|','
name|'collection_name'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return a list of links for this image."""'
newline|'\n'
name|'return'
op|'['
op|'{'
nl|'\n'
string|'"rel"'
op|':'
string|'"self"'
op|','
nl|'\n'
string|'"href"'
op|':'
name|'self'
op|'.'
name|'_get_href_link'
op|'('
name|'request'
op|','
name|'identifier'
op|','
name|'collection_name'
op|')'
op|','
nl|'\n'
op|'}'
op|','
nl|'\n'
op|'{'
nl|'\n'
string|'"rel"'
op|':'
string|'"bookmark"'
op|','
nl|'\n'
string|'"href"'
op|':'
name|'self'
op|'.'
name|'_get_bookmark_link'
op|'('
name|'request'
op|','
nl|'\n'
name|'identifier'
op|','
nl|'\n'
name|'collection_name'
op|')'
op|','
nl|'\n'
op|'}'
op|','
nl|'\n'
op|'{'
nl|'\n'
string|'"rel"'
op|':'
string|'"alternate"'
op|','
nl|'\n'
string|'"type"'
op|':'
string|'"application/vnd.openstack.image"'
op|','
nl|'\n'
string|'"href"'
op|':'
name|'self'
op|'.'
name|'_get_alternate_link'
op|'('
name|'request'
op|','
name|'identifier'
op|')'
op|','
nl|'\n'
op|'}'
op|']'
newline|'\n'
nl|'\n'
DECL|member|_get_alternate_link
dedent|''
name|'def'
name|'_get_alternate_link'
op|'('
name|'self'
op|','
name|'request'
op|','
name|'identifier'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Create an alternate link for a specific image id."""'
newline|'\n'
name|'glance_url'
op|'='
name|'glance'
op|'.'
name|'generate_glance_url'
op|'('
op|')'
newline|'\n'
name|'glance_url'
op|'='
name|'self'
op|'.'
name|'_update_glance_link_prefix'
op|'('
name|'glance_url'
op|')'
newline|'\n'
name|'return'
string|"'/'"
op|'.'
name|'join'
op|'('
op|'['
name|'glance_url'
op|','
nl|'\n'
name|'self'
op|'.'
name|'_collection_name'
op|','
nl|'\n'
name|'str'
op|'('
name|'identifier'
op|')'
op|']'
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_format_date
name|'def'
name|'_format_date'
op|'('
name|'dt'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Return standard format for a given datetime object."""'
newline|'\n'
name|'if'
name|'dt'
name|'is'
name|'not'
name|'None'
op|':'
newline|'\n'
indent|' '
name|'return'
name|'utils'
op|'.'
name|'isotime'
op|'('
name|'dt'
op|')'
newline|'\n'
nl|'\n'
dedent|''
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_get_status
name|'def'
name|'_get_status'
op|'('
name|'image'
op|')'
op|':'
newline|'\n'
indent|' '
string|'"""Update the status field to standardize format."""'
newline|'\n'
name|'return'
op|'{'
nl|'\n'
string|"'active'"
op|':'
string|"'ACTIVE'"
op|','
nl|'\n'
string|"'queued'"
op|':'
string|"'SAVING'"
op|','
nl|'\n'
string|"'saving'"
op|':'
string|"'SAVING'"
op|','
nl|'\n'
string|"'deleted'"
op|':'
string|"'DELETED'"
op|','
nl|'\n'
string|"'pending_delete'"
op|':'
string|"'DELETED'"
op|','
nl|'\n'
string|"'killed'"
op|':'
string|"'ERROR'"
op|','
nl|'\n'
op|'}'
op|'.'
name|'get'
op|'('
name|'image'
op|'.'
name|'get'
op|'('
string|'"status"'
op|')'
op|','
string|"'UNKNOWN'"
op|')'
newline|'\n'
nl|'\n'
dedent|''
op|'@'
name|'staticmethod'
newline|'\n'
DECL|member|_get_progress
name|'def'
name|'_get_progress'
op|'('
name|'image'
op|')'
op|':'
newline|'\n'
indent|' '
name|'return'
op|'{'
nl|'\n'
string|'"queued"'
op|':'
number|'25'
op|','
nl|'\n'
string|'"saving"'
op|':'
number|'50'
op|','
nl|'\n'
string|'"active"'
op|':'
number|'100'
op|','
nl|'\n'
op|'}'
op|'.'
name|'get'
op|'('
name|'image'
op|'.'
name|'get'
op|'('
string|'"status"'
op|')'
op|','
number|'0'
op|')'
newline|'\n'
dedent|''
dedent|''
endmarker|''
end_unit
| 12.818078 | 413 | 0.584933 | 1,674 | 11,203 | 3.838112 | 0.111111 | 0.139144 | 0.046693 | 0.063346 | 0.667704 | 0.620389 | 0.556576 | 0.484358 | 0.426615 | 0.402023 | 0 | 0.002621 | 0.114612 | 11,203 | 873 | 414 | 12.832761 | 0.645126 | 0 | 0 | 0.90378 | 0 | 0.001145 | 0.398465 | 0.007141 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.003436 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 |
56963e125a26f85c1fec1fede95cb46497433416 | 625 | py | Python | Dominant/DominantModel.py | lizhong2613/Awesome-anomaly-detection-baseline | 9a6c0092b564825e5394e4a877b5154a5f76136e | [
"MIT"
] | 10 | 2021-01-28T02:48:24.000Z | 2021-10-03T10:44:26.000Z | Dominant/DominantModel.py | lizhong2613/Awesome-anomaly-detection-baseline | 9a6c0092b564825e5394e4a877b5154a5f76136e | [
"MIT"
] | null | null | null | Dominant/DominantModel.py | lizhong2613/Awesome-anomaly-detection-baseline | 9a6c0092b564825e5394e4a877b5154a5f76136e | [
"MIT"
] | 1 | 2021-04-15T01:32:22.000Z | 2021-04-15T01:32:22.000Z | import torch
import numpy as np
import torch.nn as nn
import torch.nn.functional as F
import GraphConv
class DominantModel(nn.Module):
def __init__(self, A_norm, dim_in, dim_out=16):
super(DominantModel, self).__init__()
self.gcn1 = GraphConv.GCN(A_norm, dim_in, 64)
self.gcn2 = GraphConv.GCN(A_norm, 64, 32)
self.gcn3 = GraphConv.GCN(A_norm, 32, dim_out)
self.deconv = GraphConv.GCN(A_norm, dim_out, dim_in)
def forward(self, X):
X = self.gcn1(X)
X = self.gcn2(X)
Z = self.gcn3(X)
return torch.sigmoid(Z.mm(Z.transpose(0, 1))), self.deconv(Z) | 32.894737 | 69 | 0.648 | 101 | 625 | 3.821782 | 0.376238 | 0.064767 | 0.134715 | 0.176166 | 0.103627 | 0 | 0 | 0 | 0 | 0 | 0 | 0.037267 | 0.2272 | 625 | 19 | 69 | 32.894737 | 0.761905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.294118 | 0 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 2 |
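`DominantModel` expects an already-normalized adjacency `A_norm`, and the `GraphConv` module is not shown. GCN-style layers conventionally use the symmetric normalization D^-1/2 (A + I) D^-1/2; whether `GraphConv.GCN` applies exactly this is an assumption. A stdlib-only sketch of that preprocessing (a real pipeline would do this with torch or scipy on tensors):

```python
import math

def normalize_adjacency(adj):
    """Return D^-1/2 (A + I) D^-1/2 for a dense adjacency given as nested lists."""
    n = len(adj)
    # Add self-loops so each node retains its own features during propagation.
    a_hat = [[adj[i][j] + (1 if i == j else 0) for j in range(n)] for i in range(n)]
    # Node degrees under the self-loop-augmented adjacency.
    deg = [sum(row) for row in a_hat]
    d_inv_sqrt = [1.0 / math.sqrt(d) if d > 0 else 0.0 for d in deg]
    # Scale each entry by the inverse-sqrt degrees of both endpoints.
    return [[d_inv_sqrt[i] * a_hat[i][j] * d_inv_sqrt[j] for j in range(n)]
            for i in range(n)]
```

For the two-node graph `[[0, 1], [1, 0]]` this yields a matrix of 0.5 entries, since every augmented degree is 2.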