hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
9c26d711887f84da99433b770df53c3bffc460c4 | 1,067 | py | Python | Python/Vowel-Substring/solution.py | arpitran/HackerRank_solutions | a3a77c858edd3955ea38530916db9051b1aa93f9 | [
"MIT"
] | null | null | null | Python/Vowel-Substring/solution.py | arpitran/HackerRank_solutions | a3a77c858edd3955ea38530916db9051b1aa93f9 | [
"MIT"
] | null | null | null | Python/Vowel-Substring/solution.py | arpitran/HackerRank_solutions | a3a77c858edd3955ea38530916db9051b1aa93f9 | [
"MIT"
] | null | null | null | #!/bin/python3
import math
import os
import random
import re
import sys
#
# Complete the 'findSubstring' function below.
#
# The function is expected to return a STRING.
# The function accepts following parameters:
# 1. STRING s
# 2. INTEGER k
#
def isVowel(x):
    if x == "a" or x == "e" or x == "i" or x == "o" or x == "u":
        return True
    return False
def vowelcount(x):
    lowercase = x.lower()
    vowel_counts = {}
    for vowel in "aeiou":
        count = lowercase.count(vowel)
        vowel_counts[vowel] = count
    counts = vowel_counts.values()
    total_vowels = sum(counts)
    return total_vowels
def findSubstring(s, k):
    sub_string = {}
    # All substrings of length k (empty when k > len(s)).
    res = [s[i:i + k] for i in range(len(s) - k + 1)]
    for sub in res:
        sub_string[sub] = vowelcount(sub)
    # Guard first: max() over an empty dict would raise ValueError.
    if not sub_string or max(sub_string.values()) == 0:
        return "Not found!"
    return max(sub_string, key=sub_string.get)
# Write your code here
| 20.519231 | 118 | 0.626992 | 167 | 1,067 | 3.904192 | 0.413174 | 0.096626 | 0.046012 | 0.027607 | 0.082822 | 0.082822 | 0.082822 | 0 | 0 | 0 | 0 | 0.008717 | 0.247423 | 1,067 | 51 | 119 | 20.921569 | 0.803238 | 0.181818 | 0 | 0 | 0 | 0 | 0.023175 | 0 | 0 | 0 | 0 | 0.019608 | 0 | 1 | 0.103448 | false | 0 | 0.172414 | 0 | 0.448276 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
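The `findSubstring` in the row above materializes every length-k slice and counts vowels slice by slice, roughly O(n·k) time and memory. A sliding-window pass keeps a single running count instead — this is a sketch of an alternative approach, not part of the stored file, and `find_substring_window` is my own name:

```python
def is_vowel(ch):
    return ch in "aeiou"

def find_substring_window(s, k):
    if k > len(s):
        return "Not found!"
    best, best_count = None, 0
    # Count vowels in the first window of length k.
    count = sum(is_vowel(c) for c in s[:k])
    if count > best_count:
        best, best_count = s[:k], count
    for i in range(k, len(s)):
        # Slide the window: drop s[i - k], take in s[i].
        count += is_vowel(s[i]) - is_vowel(s[i - k])
        if count > best_count:
            best, best_count = s[i - k + 1:i + 1], count
    return best if best_count > 0 else "Not found!"
```

Because only a strictly larger count replaces the current best, the earliest maximal window wins, matching the tie-breaking of `max` over an insertion-ordered dict; the `k > len(s)` guard also avoids the empty-sequence case the brute-force version can hit.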
9c2c850b8212d47e83a1fb645622cfcbef2e844f | 7,385 | py | Python | python/tink/jwt/_raw_jwt.py | cuonglm/tink | df5fa42e45b4d43aac6c3506ceba2956b79a62b8 | [
"Apache-2.0"
] | null | null | null | python/tink/jwt/_raw_jwt.py | cuonglm/tink | df5fa42e45b4d43aac6c3506ceba2956b79a62b8 | [
"Apache-2.0"
] | null | null | null | python/tink/jwt/_raw_jwt.py | cuonglm/tink | df5fa42e45b4d43aac6c3506ceba2956b79a62b8 | [
"Apache-2.0"
] | null | null | null | # Copyright 2021 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""The raw JSON Web Token (JWT)."""
import copy
import datetime
import json
from typing import cast, Mapping, Set, List, Dict, Optional, Text, Union, Any
from tink import core
from tink.jwt import _jwt_error
from tink.jwt import _jwt_format
_REGISTERED_NAMES = frozenset({'iss', 'sub', 'jti', 'aud', 'exp', 'nbf', 'iat'})
_MAX_TIMESTAMP_VALUE = 253402300799 # 31 Dec 9999, 23:59:59 GMT
Claim = Union[None, bool, int, float, Text, List[Any], Dict[Text, Any]]
def _from_datetime(t: datetime.datetime) -> float:
if not t.tzinfo:
raise _jwt_error.JwtInvalidError('datetime must have tzinfo')
return t.timestamp()
def _to_datetime(timestamp: float) -> datetime.datetime:
return datetime.datetime.fromtimestamp(timestamp, datetime.timezone.utc)
def _validate_custom_claim_name(name: Text) -> None:
if name in _REGISTERED_NAMES:
raise _jwt_error.JwtInvalidError(
'registered name %s cannot be custom claim name' % name)
class RawJwt(object):
"""A raw JSON Web Token (JWT).
It can be signed to obtain a compact JWT. It is also used as a parse token
that has not yet been verified.
"""
def __new__(cls):
raise core.TinkError('RawJwt cannot be instantiated directly.')
def __init__(self, type_header: Optional[Text], payload: Dict[Text,
Any]) -> None:
# No need to copy payload, because only create and from_json_payload
# call this method.
if not isinstance(payload, Dict):
raise _jwt_error.JwtInvalidError('payload must be a dict')
self._type_header = type_header
self._payload = payload
self._validate_string_claim('iss')
self._validate_string_claim('sub')
self._validate_string_claim('jti')
self._validate_timestamp_claim('exp')
self._validate_timestamp_claim('nbf')
self._validate_timestamp_claim('iat')
self._validate_audience_claim()
def _validate_string_claim(self, name: Text):
if name in self._payload:
if not isinstance(self._payload[name], Text):
raise _jwt_error.JwtInvalidError('claim %s must be a String' % name)
def _validate_timestamp_claim(self, name: Text):
if name in self._payload:
timestamp = self._payload[name]
if not isinstance(timestamp, (int, float)):
raise _jwt_error.JwtInvalidError('claim %s must be a Number' % name)
if timestamp > _MAX_TIMESTAMP_VALUE or timestamp < 0:
raise _jwt_error.JwtInvalidError(
'timestamp of claim %s is out of range' % name)
def _validate_audience_claim(self):
if 'aud' in self._payload:
audiences = self._payload['aud']
if isinstance(audiences, Text):
self._payload['aud'] = [audiences]
return
if not isinstance(audiences, list) or not audiences:
raise _jwt_error.JwtInvalidError('audiences must be a non-empty list')
if not all(isinstance(value, Text) for value in audiences):
raise _jwt_error.JwtInvalidError('audiences must only contain Text')
# TODO(juerg): Consider adding a raw_ prefix to all access methods
def has_type_header(self) -> bool:
return self._type_header is not None
def type_header(self) -> Text:
if not self.has_type_header():
raise KeyError('type header is not set')
return self._type_header
def has_issuer(self) -> bool:
return 'iss' in self._payload
def issuer(self) -> Text:
return cast(Text, self._payload['iss'])
def has_subject(self) -> bool:
return 'sub' in self._payload
def subject(self) -> Text:
return cast(Text, self._payload['sub'])
def has_audiences(self) -> bool:
return 'aud' in self._payload
def audiences(self) -> List[Text]:
return list(self._payload['aud'])
def has_jwt_id(self) -> bool:
return 'jti' in self._payload
def jwt_id(self) -> Text:
return cast(Text, self._payload['jti'])
def has_expiration(self) -> bool:
return 'exp' in self._payload
def expiration(self) -> datetime.datetime:
return _to_datetime(self._payload['exp'])
def has_not_before(self) -> bool:
return 'nbf' in self._payload
def not_before(self) -> datetime.datetime:
return _to_datetime(self._payload['nbf'])
def has_issued_at(self) -> bool:
return 'iat' in self._payload
def issued_at(self) -> datetime.datetime:
return _to_datetime(self._payload['iat'])
def custom_claim_names(self) -> Set[Text]:
return {n for n in self._payload.keys() if n not in _REGISTERED_NAMES}
def custom_claim(self, name: Text) -> Claim:
_validate_custom_claim_name(name)
value = self._payload[name]
if isinstance(value, (list, dict)):
return copy.deepcopy(value)
else:
return value
def json_payload(self) -> Text:
"""Returns the payload encoded as JSON string."""
return _jwt_format.json_dumps(self._payload)
@classmethod
def create(cls,
*,
type_header: Optional[Text] = None,
issuer: Optional[Text] = None,
subject: Optional[Text] = None,
audiences: Optional[List[Text]] = None,
jwt_id: Optional[Text] = None,
expiration: Optional[datetime.datetime] = None,
not_before: Optional[datetime.datetime] = None,
issued_at: Optional[datetime.datetime] = None,
custom_claims: Mapping[Text, Claim] = None) -> 'RawJwt':
"""Create a new RawJwt instance."""
payload = {}
if issuer:
payload['iss'] = issuer
if subject:
payload['sub'] = subject
if jwt_id is not None:
payload['jti'] = jwt_id
if audiences is not None:
payload['aud'] = copy.copy(audiences)
if expiration:
payload['exp'] = _from_datetime(expiration)
if not_before:
payload['nbf'] = _from_datetime(not_before)
if issued_at:
payload['iat'] = _from_datetime(issued_at)
if custom_claims:
for name, value in custom_claims.items():
_validate_custom_claim_name(name)
if not isinstance(name, Text):
raise _jwt_error.JwtInvalidError('claim name must be Text')
if (value is None or isinstance(value, (bool, int, float, Text))):
payload[name] = value
elif isinstance(value, list):
payload[name] = json.loads(json.dumps(value))
elif isinstance(value, dict):
payload[name] = json.loads(json.dumps(value))
else:
raise _jwt_error.JwtInvalidError('claim %s has unknown type' % name)
raw_jwt = object.__new__(cls)
raw_jwt.__init__(type_header, payload)
return raw_jwt
@classmethod
def from_json(cls, type_header: Optional[Text], payload: Text) -> 'RawJwt':
"""Creates a RawJwt from payload encoded as JSON string."""
raw_jwt = object.__new__(cls)
raw_jwt.__init__(type_header, _jwt_format.json_loads(payload))
return raw_jwt
| 34.189815 | 80 | 0.677319 | 997 | 7,385 | 4.811434 | 0.198596 | 0.057328 | 0.02981 | 0.05837 | 0.211799 | 0.150302 | 0.143215 | 0.07734 | 0.047947 | 0.015843 | 0 | 0.005703 | 0.21652 | 7,385 | 215 | 81 | 34.348837 | 0.823367 | 0.133785 | 0 | 0.10596 | 0 | 0 | 0.075287 | 0 | 0 | 0 | 0 | 0.004651 | 0 | 1 | 0.192053 | false | 0 | 0.046358 | 0.112583 | 0.410596 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
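The `_from_datetime`/`_to_datetime` helpers in the row above refuse naive datetimes, because `timestamp()` on a naive datetime is interpreted in local time and the resulting JWT claim would shift with the machine's timezone. A standalone sketch of the same round-trip using only the stdlib (function names are mine, not tink's public API):

```python
import datetime

_MAX_TIMESTAMP = 253402300799  # 31 Dec 9999, 23:59:59 GMT, as in the row above

def from_datetime(t: datetime.datetime) -> float:
    # Reject naive datetimes: their timestamp() depends on local time.
    if not t.tzinfo:
        raise ValueError("datetime must have tzinfo")
    return t.timestamp()

def to_datetime(timestamp: float) -> datetime.datetime:
    # Always reconstruct as an aware UTC datetime.
    return datetime.datetime.fromtimestamp(timestamp, datetime.timezone.utc)

end_of_time = datetime.datetime(9999, 12, 31, 23, 59, 59,
                                tzinfo=datetime.timezone.utc)
assert from_datetime(end_of_time) == _MAX_TIMESTAMP
```

The round-trip is lossless for aware datetimes: `to_datetime(from_datetime(t)) == t` whenever `t` carries a UTC offset.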
9c2cdfaea02de247b5a0a427743330312fb34eb8 | 16,904 | py | Python | dialogue-engine/test/programytest/config/file/test_json.py | cotobadesign/cotoba-agent-oss | 3833d56e79dcd7529c3e8b3a3a8a782d513d9b12 | [
"MIT"
] | 104 | 2020-03-30T09:40:00.000Z | 2022-03-06T22:34:25.000Z | dialogue-engine/test/programytest/config/file/test_json.py | cotobadesign/cotoba-agent-oss | 3833d56e79dcd7529c3e8b3a3a8a782d513d9b12 | [
"MIT"
] | 25 | 2020-06-12T01:36:35.000Z | 2022-02-19T07:30:44.000Z | dialogue-engine/test/programytest/config/file/test_json.py | cotobadesign/cotoba-agent-oss | 3833d56e79dcd7529c3e8b3a3a8a782d513d9b12 | [
"MIT"
] | 10 | 2020-04-02T23:43:56.000Z | 2021-05-14T13:47:01.000Z | """
Copyright (c) 2020 COTOBA DESIGN, Inc.
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated
documentation files (the "Software"), to deal in the Software without restriction, including without limitation
the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software,
and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO
THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
import os
from programy.config.file.json_file import JSONConfigurationFile
from programy.clients.events.console.config import ConsoleConfiguration
from programy.utils.substitutions.substitues import Substitutions
from programytest.config.file.base_file_tests import ConfigurationBaseFileTests
class JSONConfigurationFileTests(ConfigurationBaseFileTests):
def test_get_methods(self):
config_data = JSONConfigurationFile()
self.assertIsNotNone(config_data)
configuration = config_data.load_from_text("""
{
"brain": {
"overrides": {
"allow_system_aiml": true,
"allow_learn_aiml": true,
"allow_learnf_aiml": true
}
}
}
""", ConsoleConfiguration(), ".")
self.assertIsNotNone(configuration)
section = config_data.get_section("brainx")
self.assertIsNone(section)
section = config_data.get_section("brain")
self.assertIsNotNone(section)
child_section = config_data.get_section("overrides", section)
self.assertIsNotNone(child_section)
keys = list(config_data.get_child_section_keys("overrides", section))
self.assertIsNotNone(keys)
self.assertEqual(3, len(keys))
self.assertTrue("allow_system_aiml" in keys)
self.assertTrue("allow_learn_aiml" in keys)
self.assertTrue("allow_learnf_aiml" in keys)
self.assertIsNone(config_data.get_child_section_keys("missing", section))
self.assertEqual(True, config_data.get_option(child_section, "allow_system_aiml"))
self.assertEqual(True, config_data.get_option(child_section, "missing", missing_value=True))
self.assertEqual(True, config_data.get_bool_option(child_section, "allow_system_aiml"))
self.assertEqual(False, config_data.get_bool_option(child_section, "other_value"))
self.assertEqual(0, config_data.get_int_option(child_section, "other_value"))
def test_load_from_file(self):
config = JSONConfigurationFile()
self.assertIsNotNone(config)
configuration = config.load_from_file(os.path.dirname(__file__) + os.sep + "test_json.json", ConsoleConfiguration(), ".")
self.assertIsNotNone(configuration)
self.assert_configuration(configuration)
def test_load_from_text_multis_one_value(self):
config = JSONConfigurationFile()
self.assertIsNotNone(config)
configuration = config.load_from_text("""
{
"bot": {
"brain": "bot1"
}
}
""", ConsoleConfiguration(), ".")
self.assertIsNotNone(configuration)
self.assertEqual(1, len(configuration.client_configuration.configurations[0].configurations))
def test_load_from_text_multis_multiple_values(self):
config = JSONConfigurationFile()
self.assertIsNotNone(config)
configuration = config.load_from_text("""
{
"console": {
"bot": "bot"
},
"bot": {
"brain": ["bot1", "bot2"]
}
}
""", ConsoleConfiguration(), ".")
self.assertIsNotNone(configuration)
self.assertEqual(2, len(configuration.client_configuration.configurations[0].configurations))
def test_load_from_text(self):
config = JSONConfigurationFile()
self.assertIsNotNone(config)
configuration = config.load_from_text("""
{
"console": {
"bot": "bot",
"prompt": ">>>",
"scheduler": {
"name": "Scheduler1",
"debug_level": 50,
"add_listeners": false,
"remove_all_jobs": false
},
"storage": {
"entities": {
"users": "sql",
"linked_accounts": "sql",
"links": "sql",
"properties": "file",
"conversations": "file",
"categories": "file",
"maps": "file",
"sets": "file",
"rdf": "file",
"denormal": "file",
"normal": "file",
"gender": "file",
"person": "file",
"person2": "file",
"spelling_corpus": "file",
"license_keys": "file",
"nodes": "file",
"binaries": "file",
"braintree": "file",
"preprocessors": "file",
"postprocessors": "file",
"regex_templates": "file",
"usergroups": "file",
"learnf": "file"
},
"stores": {
"sql": {
"type": "sql",
"config": {
"url": "sqlite:///:memory",
"echo": false,
"encoding": "utf-8",
"create_db": true,
"drop_all_first": true
}
},
"mongo": {
"type": "mongo",
"config": {
"url": "mongodb://localhost:27017/",
"database": "programy",
"drop_all_first": true
}
},
"redis": {
"type": "redis",
"config": {
"host": "localhost",
"port": 6379,
"password": null,
"db": 0,
"prefix": "programy",
"drop_all_first": true
}
},
"file": {
"type": "file",
"config": {
"category_storage": {
"files": "./storage/categories"
},
"conversations_storage": {
"files": "./storage/conversations"
},
"sets_storage": {
"files": "./storage/sets",
"extension": ".txt",
"directories": false
},
"maps_storage": {
"files": "./storage/maps",
"extension": ".txt",
"directories": false
},
"regex_templates": {
"files": "./storage/regex"
},
"lookups_storage": {
"files": "./storage/lookups",
"extension": ".txt",
"directories": false
},
"properties_storage": {
"file": "./storage/properties.txt"
},
"defaults_storage": {
"file": "./storage/defaults.txt"
},
"rdf_storage": {
"files": "./storage/rdfs",
"extension": ".txt",
"directories": true
},
"spelling_corpus": {
"file": "./storage/spelling/corpus.txt"
},
"license_keys": {
"file": "./storage/license.keys"
},
"nodes": {
"files": "./storage/nodes"
},
"binaries": {
"files": "./storage/binaries"
},
"braintree": {
"file": "./storage/braintree/braintree.xml",
"format": "xml"
},
"preprocessors": {
"file": "./storage/processing/preprocessors.txt"
},
"postprocessors": {
"file": "./storage/processing/postprocessing.txt"
},
"usergroups": {
"files": "./storage/security/usergroups.txt"
},
"learnf": {
"files": "./storage/categories/learnf"
}
}
}
}
},
"logger": {
"type": "logger",
"config": {
"conversation_logger": "conversation"
}
}
},
"voice": {
"license_keys": "$BOT_ROOT/config/license.keys",
"tts": "osx",
"stt": "azhang",
"osx": {
"classname": "talky.clients.voice.tts.osxsay.OSXSayTextToSpeach"
},
"pytts": {
"classname": "talky.clients.voice.tts.pyttssay.PyTTSSayTextToSpeach",
"rate_adjust": 10
},
"azhang": {
"classname": "talky.clients.voice.stt.azhang.AnthonyZhangSpeechToText",
"ambient_adjust": 3,
"service": "ibm"
}
},
"rest": {
"host": "0.0.0.0",
"port": 8989,
"debug": false,
"workers": 4,
"license_keys": "$BOT_ROOT/config/license.keys"
},
"webchat": {
"host": "0.0.0.0",
"port": 8090,
"debug": false,
"license_keys": "$BOT_ROOT/config/license.keys",
"api": "/api/web/v1.0/ask"
},
"twitter": {
"polling": true,
"polling_interval": 49,
"streaming": false,
"use_status": true,
"use_direct_message": true,
"auto_follow": true,
"storage": "file",
"welcome_message": "Thanks for following me, send me a message and I'll try and help",
"license_keys": "file"
},
"xmpp": {
"server": "talk.google.com",
"port": 5222,
"xep_0030": true,
"xep_0004": true,
"xep_0060": true,
"xep_0199": true,
"license_keys": "file"
},
"socket": {
"host": "127.0.0.1",
"port": 9999,
"queue": 5,
"debug": true,
"license_keys": "file"
},
"telegram": {
"unknown_command": "Sorry, that is not a command I have been taught yet!",
"license_keys": "file"
},
"facebook": {
"host": "127.0.0.1",
"port": 5000,
"debug": false,
"license_keys": "file"
},
"twilio": {
"host": "127.0.0.1",
"port": 5000,
"debug": false,
"license_keys": "file"
},
"slack": {
"polling_interval": 1,
"license_keys": "file"
},
"viber": {
"name": "Servusai",
"avatar": "http://viber.com/avatar.jpg",
"license_keys": "file"
},
"line": {
"host": "127.0.0.1",
"port": 8084,
"debug": false,
"license_keys": "file"
},
"kik": {
"bot_name": "servusai",
"webhook": "https://93638f7a.ngrok.io/api/kik/v1.0/ask",
"host": "127.0.0.1",
"port": 8082,
"debug": false,
"license_keys": "file"
},
"bot": {
"brain": "brain",
"initial_question": "Hi, how can I help you today?",
"initial_question_srai": "YINITIALQUESTION",
"default_response": "Sorry, I don't have an answer for that!",
"default_response_srai": "YEMPTY",
"empty_string": "YEMPTY",
"exit_response": "So long, and thanks for the fish!",
"exit_response_srai": "YEXITRESPONSE",
"override_properties": true,
"max_question_recursion": 1000,
"max_question_timeout": 60,
"max_search_depth": 100,
"max_search_timeout": 60,
"spelling": {
"load": true,
"classname": "programy.spelling.norvig.NorvigSpellingChecker",
"check_before": true,
"check_and_retry": true
},
"conversations": {
"max_histories": 100,
"restore_last_topic": false,
"initial_topic": "TOPIC1",
"empty_on_start": false
}
},
"brain": {
"overrides": {
"allow_system_aiml": true,
"allow_learn_aiml": true,
"allow_learnf_aiml": true
},
"defaults": {
"default-get": "unknown",
"default-property": "unknown",
"default-map": "unknown",
"learnf-path": "file"
},
"binaries": {
"save_binary": true,
"load_binary": true,
"load_aiml_on_binary_fail": true
},
"braintree": {
"create": true
},
"services": {
"REST": {
"classname": "programy.services.rest.GenericRESTService",
"method": "GET",
"host": "0.0.0.0",
"port": 8080
},
"Pannous": {
"classname": "programy.services.pannous.PannousService",
"url": "http://weannie.pannous.com/api"
}
},
"security": {
"authentication": {
"classname": "programy.security.authenticate.passthrough.BasicPassThroughAuthenticationService",
"denied_srai": "AUTHENTICATION_FAILED"
},
"authorisation": {
"classname": "programy.security.authorise.usergroupsauthorisor.BasicUserGroupAuthorisationService",
"denied_srai": "AUTHORISATION_FAILED",
"usergroups": {
"storage": "file"
}
}
},
"oob": {
"default": {
"classname": "programy.oob.defaults.default.DefaultOutOfBandProcessor"
},
"alarm": {
"classname": "programy.oob.defaults.alarm.AlarmOutOfBandProcessor"
},
"camera": {
"classname": "programy.oob.defaults.camera.CameraOutOfBandProcessor"
},
"clear": {
"classname": "programy.oob.defaults.clear.ClearOutOfBandProcessor"
},
"dial": {
"classname": "programy.oob.defaults.dial.DialOutOfBandProcessor"
},
"dialog": {
"classname": "programy.oob.defaults.dialog.DialogOutOfBandProcessor"
},
"email": {
"classname": "programy.oob.defaults.email.EmailOutOfBandProcessor"
},
"geomap": {
"classname": "programy.oob.defaults.map.MapOutOfBandProcessor"
},
"schedule": {
"classname": "programy.oob.defaults.schedule.ScheduleOutOfBandProcessor"
},
"search": {
"classname": "programy.oob.defaults.search.SearchOutOfBandProcessor"
},
"sms": {
"classname": "programy.oob.defaults.sms.SMSOutOfBandProcessor"
},
"url": {
"classname": "programy.oob.defaults.url.URLOutOfBandProcessor"
},
"wifi": {
"classname": "programy.oob.defaults.wifi.WifiOutOfBandProcessor"
}
},
"dynamic": {
"variables": {
"gettime": "programy.dynamic.variables.datetime.GetTime"
},
"sets": {
"numeric": "programy.dynamic.sets.numeric.IsNumeric",
"roman": "programy.dynamic.sets.roman.IsRomanNumeral"
},
"maps": {
"romantodec": "programy.dynamic.maps.roman.MapRomanToDecimal",
"dectoroman": "programy.dynamic.maps.roman.MapDecimalToRoman"
}
}
}
}
""", ConsoleConfiguration(), ".")
self.assertIsNotNone(configuration)
self.assert_configuration(configuration)
def test_load_additionals(self):
config = JSONConfigurationFile()
self.assertIsNotNone(config)
configuration = config.load_from_text("""
{
"console": {
"bot": "bot"
},
"bot": {
"brain": "brain"
},
"brain": {
"security": {
"authentication": {
"classname": "programy.security.authenticate.passthrough.PassThroughAuthenticationService",
"denied_srai": "ACCESS_DENIED"
}
}
}
}
""", ConsoleConfiguration(), ".")
self.assertIsNotNone(configuration)
auth_service = configuration.client_configuration.configurations[0].configurations[0].security.authentication
self.assertIsNotNone(auth_service)
self.assertEqual("ACCESS_DENIED", auth_service.denied_srai)
def test_load_with_subs(self):
subs = Substitutions()
subs.add_substitute("$ALLOW_SYSTEM", True)
config_data = JSONConfigurationFile()
self.assertIsNotNone(config_data)
configuration = config_data.load_from_text("""
{
"brain": {
"overrides": {
"allow_system_aiml": true,
"allow_learn_aiml": true,
"allow_learnf_aiml": true
}
}
}
""", ConsoleConfiguration(), ".")
self.assertIsNotNone(configuration)
section = config_data.get_section("brainx")
self.assertIsNone(section)
section = config_data.get_section("brain")
self.assertIsNotNone(section)
child_section = config_data.get_section("overrides", section)
self.assertIsNotNone(child_section)
self.assertEqual(True, config_data.get_option(child_section, "allow_system_aiml"))
self.assertEqual(True, config_data.get_bool_option(child_section, "allow_system_aiml"))
self.assertEqual(False, config_data.get_bool_option(child_section, "other_value"))
| 31.245841 | 129 | 0.562885 | 1,520 | 16,904 | 6.094079 | 0.290132 | 0.02375 | 0.022455 | 0.039296 | 0.336716 | 0.316852 | 0.272698 | 0.246249 | 0.246249 | 0.244305 | 0 | 0.01316 | 0.29425 | 16,904 | 540 | 130 | 31.303704 | 0.763286 | 0.062825 | 0 | 0.262525 | 0 | 0.004008 | 0.731906 | 0.142857 | 0 | 0 | 0 | 0 | 0.08016 | 1 | 0.014028 | false | 0.006012 | 0.01002 | 0 | 0.026052 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
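The test row above exercises `get_section`/`get_option`-style accessors over a parsed JSON configuration. A minimal standalone reader with the same shape — a sketch that mirrors the behavior the tests assert, not programy's actual implementation:

```python
import json

class JSONConfig:
    def __init__(self, text):
        self._data = json.loads(text)

    def get_section(self, name, parent=None):
        # Look up a top-level section, or a child of `parent`; None if missing.
        source = self._data if parent is None else parent
        return source.get(name)

    def get_option(self, section, key, missing_value=None):
        return section.get(key, missing_value)

    def get_bool_option(self, section, key):
        # Absent keys read as False, matching the tests above.
        return bool(section.get(key, False))

cfg = JSONConfig('{"brain": {"overrides": {"allow_system_aiml": true}}}')
brain = cfg.get_section("brain")
overrides = cfg.get_section("overrides", brain)
assert cfg.get_bool_option(overrides, "allow_system_aiml") is True
assert cfg.get_bool_option(overrides, "other_value") is False
```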
9c2e7f928242ca7ff6d49135cef52a68297eec05 | 3,484 | py | Python | pincer/objects/message/sticker.py | mjneff2/Pincer | a11bc3e4bad319fdf927d913c58c933576ec7c99 | [
"MIT"
] | null | null | null | pincer/objects/message/sticker.py | mjneff2/Pincer | a11bc3e4bad319fdf927d913c58c933576ec7c99 | [
"MIT"
] | null | null | null | pincer/objects/message/sticker.py | mjneff2/Pincer | a11bc3e4bad319fdf927d913c58c933576ec7c99 | [
"MIT"
] | null | null | null | # Copyright Pincer 2021-Present
# Full MIT License can be found in `LICENSE` at the project root.
from __future__ import annotations
from dataclasses import dataclass
from enum import IntEnum
from typing import List, Optional, TYPE_CHECKING
from ...utils.api_object import APIObject
from ...utils.types import MISSING
if TYPE_CHECKING:
    from ..user import User
    from ...utils import APINullable, Snowflake
class StickerType(IntEnum):
    """
    Displays from where the sticker comes from.

    :param STANDARD:
        Sticker is included in the default Discord sticker pack.
    :param GUILD:
        Sticker is a custom sticker from a discord server.
    """
    STANDARD = 1
    GUILD = 2
class StickerFormatType(IntEnum):
    """
    The type of the sticker.

    :param PNG:
        Sticker is of PNG format.
    :param APNG:
        Sticker is animated with APNG format.
    :param LOTTIE:
        Sticker is animated with LOTTIE format. (vector based)
    """
    PNG = 1
    APNG = 2
    LOTTIE = 3
@dataclass
class Sticker(APIObject):
    """
    Represents a Discord sticker.

    :param description:
        description of the sticker
    :param format_type:
        type of sticker format
    :param id:
        id of the sticker
    :param name:
        name of the sticker
    :param tags:
        for guild stickers, the Discord name of a unicode emoji
        representing the sticker's expression. For standard stickers,
        a comma-separated list of related expressions.
    :param type:
        type of sticker
    :param available:
        whether this guild sticker can be used,
        may be false due to loss of Server Boosts
    :param guild_id:
        id of the guild that owns this sticker
    :param pack_id:
        for standard stickers, id of the pack the sticker is from
    :param sort_value:
        the standard sticker's sort order within its pack
    :param user:
        the user that uploaded the guild sticker
    """
    description: Optional[str]
    format_type: StickerFormatType
    id: Snowflake
    name: str
    tags: str
    type: StickerType

    available: APINullable[bool] = MISSING
    guild_id: APINullable[Snowflake] = MISSING
    pack_id: APINullable[Snowflake] = MISSING
    sort_value: APINullable[int] = MISSING
    user: APINullable[User] = MISSING
@dataclass
class StickerItem(APIObject):
    """
    Represents the smallest amount of data required to render a sticker.
    A partial sticker object.

    :param id:
        id of the sticker
    :param name:
        name of the sticker
    :param format_type:
        type of sticker format
    """
    id: Snowflake
    name: str
    format_type: StickerFormatType
@dataclass
class StickerPack(APIObject):
"""
Represents a pack of standard stickers.
:param id:
id of the sticker pack
:param stickers:
the stickers in the pack
:param name:
name of the sticker pack
:param sku_id:
id of the pack's SKU
:param description:
description of the sticker pack
:param cover_sticker_id:
id of a sticker in the pack which is shown as the pack's icon
:param banner_asset_id:
id of the sticker pack's banner image
"""
id: Snowflake
stickers: List[Sticker]
name: str
sku_id: Snowflake
description: str
cover_sticker_id: APINullable[Snowflake] = MISSING
banner_asset_id: APINullable[Snowflake] = MISSING
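The enums above map the raw integers Discord sends on the wire to named members. A minimal, self-contained sketch (re-declaring `StickerFormatType` for illustration, since `APIObject`, `Snowflake`, and friends live elsewhere in the library) shows why `IntEnum` is a good fit for wire formats:

```python
from enum import IntEnum

# Re-declared here for illustration; mirrors StickerFormatType above.
class StickerFormatType(IntEnum):
    PNG = 1
    APNG = 2
    LOTTIE = 3

# IntEnum members compare equal to plain ints, so the integer that
# arrives in an API payload can be used directly as the enum value.
fmt = StickerFormatType(2)
print(fmt.name)   # APNG
print(fmt == 2)   # True
```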
# regenesis/modelgen.py (crijke/regenesis, MIT)
import json
from regenesis.queries import get_cubes, get_all_dimensions, get_dimensions
from pprint import pprint
def generate_dimensions():
dimensions = []
for dimension in get_all_dimensions():
        pprint(dimension)
if dimension.get('measure_type').startswith('W-'):
continue
attrs = ['name', 'label']
if 'ZI' in dimension.get('measure_type'):
attrs = ['text', 'from', 'until']
dim = {
'name': dimension.get('name'),
'label': dimension.get('title_de'),
'description': dimension.get('definition_de'),
'attributes': attrs
}
dimensions.append(dim)
return dimensions
def generate_cubes():
cubes = []
for cube in get_cubes():
dimensions = []
measures = []
joins = []
mappings = {}
cube_name = cube.get('cube_name')
for dim in get_dimensions(cube_name):
dn = dim.get('dim_name')
if dim.get('dim_measure_type').startswith('W-'):
measures.append(dn)
continue
dimensions.append(dn)
if dim.get('dim_measure_type').startswith('ZI-'):
mappings[dn + '.text'] = 'fact_%s.%s' % (cube_name, dn)
mappings[dn + '.from'] = 'fact_%s.%s_from' % (cube_name, dn)
mappings[dn + '.until'] = 'fact_%s.%s_until' % (cube_name, dn)
else:
tn = 'tbl_' + dn
joins.append({
'master': dn,
'detail': 'value.value_id',
'alias': tn
})
mappings[dn + '.name'] = tn + '.name'
mappings[dn + '.label'] = tn + '.title_de'
cubes.append({
'dimensions': dimensions,
'measures': measures,
'mappings': mappings,
'joins': joins,
'fact': 'fact_%s' % cube_name,
'name': cube.get('cube_name'),
'label': cube.get('statistic_title_de'),
'description': cube.get('statistic_description_de'),
})
return cubes
def generate_model():
model = {
'dimensions': generate_dimensions(),
'cubes': generate_cubes(),
'locale': 'de'
}
pprint(model)
return model
if __name__ == '__main__':
    with open('model.json', 'w') as fh:
model = generate_model()
json.dump(model, fh, indent=2)
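The interesting part of `generate_cubes` is how it maps cube model attributes onto SQL columns. A hedged, self-contained sketch of that column-mapping scheme (the cube and dimension names below are hypothetical, not from the real database):

```python
# Sketch of the column-mapping scheme used in generate_cubes() above.
def dimension_mappings(cube_name, dim_name, measure_type):
    if measure_type.startswith('ZI-'):
        # time-interval dimensions map straight onto fact-table columns
        return {
            dim_name + '.text': 'fact_%s.%s' % (cube_name, dim_name),
            dim_name + '.from': 'fact_%s.%s_from' % (cube_name, dim_name),
            dim_name + '.until': 'fact_%s.%s_until' % (cube_name, dim_name),
        }
    # ordinary dimensions are resolved through a joined lookup table
    tn = 'tbl_' + dim_name
    return {dim_name + '.name': tn + '.name',
            dim_name + '.label': tn + '.title_de'}

print(dimension_mappings('11111KJ001', 'stichtag', 'ZI-DATUM'))
print(dimension_mappings('11111KJ001', 'geo', 'RS-GEO'))
```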
# Data_Structures/2d_array_ds.py (csixteen/HackerRank, MIT)
matrix = [list(map(int, input().split())) for _ in range(6)]
max_sum = None
for i in range(4):
for j in range(4):
s = sum(matrix[i][j:j+3]) + matrix[i+1][j+1] + sum(matrix[i+2][j:j+3])
if max_sum is None or s > max_sum:
max_sum = s
print(max_sum)
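The loop above reads a 6×6 grid from stdin and scans every 3×3 "hourglass" (top row, middle cell, bottom row). The same scan as a pure function over a fixed grid, so it can be exercised without stdin:

```python
def max_hourglass(matrix):
    # scan every hourglass anchored at (i, j) in a 6x6 grid
    best = None
    for i in range(4):
        for j in range(4):
            s = sum(matrix[i][j:j + 3]) + matrix[i + 1][j + 1] + sum(matrix[i + 2][j:j + 3])
            if best is None or s > best:
                best = s
    return best

grid = [[1] * 6 for _ in range(6)]
print(max_hourglass(grid))  # 7 -- an hourglass covers exactly seven cells
```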
# async_sched/client/__init__.py (justengel/async_sched, MIT)
from async_sched.client import quit_server as module_quit
from async_sched.client import request_schedules as module_request
from async_sched.client import run_command as module_run
from async_sched.client import schedule_command as module_schedule
from async_sched.client import stop_schedule as module_stop
from async_sched.client import update_server as module_update
from .client import Client, \
quit_server_async, quit_server, update_server_async, update_server, request_schedules_async, \
request_schedules, run_command_async, run_command, schedule_command_async, schedule_command, \
stop_schedule_async, stop_schedule
# The other modules in this package exist for the "-m" python flag
# `python -m async_sched.client.request_schedules --host "12.0.0.1" --port 8000`
__all__ = ['Client',
'quit_server_async', 'quit_server', 'update_server_async', 'update_server', 'request_schedules_async',
'request_schedules', 'run_command_async', 'run_command', 'schedule_command_async', 'schedule_command',
'stop_schedule_async', 'stop_schedule',
'module_quit', 'module_request', 'module_run', 'module_schedule', 'module_stop', 'module_update']
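The `__all__` list above controls exactly which names `from async_sched.client import *` re-exports. A self-contained demonstration using a throwaway synthetic module (all names here are made up for the example):

```python
import sys
import types

# Build a throwaway module whose __all__ hides one of its names.
mod = types.ModuleType('sketch_mod')
mod.__all__ = ['visible']
mod.visible = 1
mod.hidden = 2
sys.modules['sketch_mod'] = mod

ns = {}
exec('from sketch_mod import *', ns)
print(sorted(k for k in ns if not k.startswith('__')))  # ['visible']
```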
# account_processing.py (amitjoshi9627/Playong, MIT)
from selenium.webdriver import Firefox
from selenium.webdriver.firefox.options import Options
import getpass
import time
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.common.action_chains import ActionChains
from utils import *
def login_user(browser, email='', password=''):
print('Redirecting to login page..')
browser.find_element_by_xpath('//*[@id="login-btn"]').click()
    if email == '':
email, password = take_credentials()
browser.find_element_by_id("login_username").send_keys(email)
browser.find_element_by_id("login_password").send_keys(password)
complete_captcha(browser)
time.sleep(4)
browser.find_element_by_xpath('//*[@id="static-login-btn"]').click()
def logout_user(browser):
    print("\nThank you for using the program! Logging you out from jiosaavn...")
show_notificaton("Thank", "You", 0)
action = ActionChains(browser)
menu = browser.find_element_by_class_name('user-name')
action.move_to_element(menu).perform()
menu.click()
browser.find_element_by_xpath(
'/html/body/div[2]/div/div[2]/div[3]/div[3]/ol/li[4]/a').click()
time.sleep(2)
print('Logout..successful...')
def check_credentials(browser):
print('Checking credentials...Please wait..')
time.sleep(5)
try:
close_promo_ad(browser)
accept_cookies(browser)
success = True
except:
success = False
return success
def wrong_credentials_check(browser, counts=1):
    success = False
    while not success:
print("\nWrong username/password entered.Please try again...\n")
email = input("Enter your email for jiosaavn account: ")
password = getpass.getpass(f"Enter password for {email}: ")
email_element = browser.find_element_by_id("login_username")
email_element.clear()
email_element.send_keys(email)
pswd_element = browser.find_element_by_id("login_password")
pswd_element.clear()
pswd_element.send_keys(password)
browser.find_element_by_xpath('//*[@id="static-login-btn"]').click()
success = check_credentials(browser)
counts += 1
if counts > 4:
print('Too many unsuccessful attempts done. Exiting...\n')
break
return counts
def go_without_login(browser):
return False
def take_credentials():
email = input("Enter your email for jiosaavn account: ")
password = getpass.getpass(f"Enter password for {email}: ")
return email, password
def prompt(browser):
# response = int(input("Press 1 to Log in with you account else Press 0: "))
# if response:
# login_user(browser)
# return True
# else:
# go_without_login(browser)
print("Due to some issues.. Login Option is not available currently! Sorry for the inconvenience caused.")
go_without_login(browser)
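The retry loop in `wrong_credentials_check` caps login attempts at 4. That control flow can be isolated from Selenium by injecting the credential check as a callable — `retry_until` below is a hypothetical helper sketching the same attempt-counting pattern, not part of the script:

```python
def retry_until(check, max_attempts=4):
    # mirrors the attempt-counting pattern of wrong_credentials_check,
    # with the browser interaction replaced by an injected callable
    counts = 0
    success = False
    while not success and counts < max_attempts:
        counts += 1
        success = check(counts)
    return success, counts

# succeeds on the third simulated attempt
ok, used = retry_until(lambda attempt: attempt == 3)
print(ok, used)  # True 3

# never succeeds: gives up after max_attempts
ok, used = retry_until(lambda attempt: False)
print(ok, used)  # False 4
```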
# Mundo 3/teste.py (RafaelSdm/Curso-de-Python, MIT)
pessoas = {'nomes': "Rafael","sexo":"macho alfa","idade":19}
print(f"o {pessoas['nomes']} que se considera um {pessoas['sexo']} possui {pessoas['idade']}")
print(pessoas.keys())
print(pessoas.values())
print(pessoas.items())
for c in pessoas.keys():
print(c)
for c in pessoas.values():
print(c)
for c, j in pessoas.items():
print(f"o {c} pertence ao {j}")
del pessoas['sexo']
print(pessoas)
pessoas["sexo"] = "macho alfa"
print(pessoas)
print("outro código daqui pra frente \n\n\n\n\n\n")
estado1 = {'estado': 'minas gerais', 'cidade':'capela nova' }
estado2 = {'estado':'rio de janeiro', 'cidade':"rossinha"}
brasil = []
brasil.append(estado1)
brasil.append(estado2)
print(brasil)
print(f"o brasil possui um estado chamado {brasil[0]['estado']} e a própria possui uma cidade chamada {brasil[0]['cidade']}")
print("-"*45)
es = {}
br = []
for c in range(0,3):
es['estado'] = str(input("informe o seu estado:"))
es['cidade'] = str(input("informe a sua cidade:"))
br.append(es.copy())
for c in br:
for i,j in c.items():
print(f"o campo {i} tem valor {j}")
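The `br.append(es.copy())` above matters: appending the same dict object on every iteration would leave the list holding several references to one dict, all showing the last values written. A side-by-side demonstration:

```python
es = {}
sem_copia, com_copia = [], []
for estado, cidade in [('minas gerais', 'capela nova'), ('rio de janeiro', 'rossinha')]:
    es['estado'] = estado
    es['cidade'] = cidade
    sem_copia.append(es)         # same dict object appended twice
    com_copia.append(es.copy())  # independent snapshot each time

print(sem_copia)  # both entries show the *last* estado/cidade
print(com_copia)  # entries differ, as intended
```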
# examples/basic/findQSpark.py (myriadrf/pyLMS7002M, Apache-2.0)
from pyLMS7002M import *
print("Searching for QSpark...")
try:
    qSpark = QSpark()  # keep the instance in its own name; rebinding QSpark would shadow the class
except Exception:
    print("QSpark not found")
    exit(1)
print("\nQSpark info:")
qSpark.printInfo()  # print the QSpark board info
# qSpark.LMS7002_Reset()  # reset the LMS7002M
lms7002 = qSpark.getLMS7002()  # get the LMS7002M object
ver, rev, mask = lms7002.chipInfo # get the chip info
print("\nLMS7002M info:")
print("VER : "+str(ver))
print("REV : "+str(rev))
print("MASK : "+str(mask))
# tests/test_bindiff.py (Kyle-Kyle/angr, BSD-2-Clause)
import nose
import angr
import logging
l = logging.getLogger("angr.tests.test_bindiff")
import os
test_location = os.path.join(os.path.dirname(os.path.realpath(__file__)), '..', '..', 'binaries', 'tests')
# todo make a better test
def test_bindiff_x86_64():
binary_path_1 = os.path.join(test_location, 'x86_64', 'bindiff_a')
binary_path_2 = os.path.join(test_location, 'x86_64', 'bindiff_b')
b = angr.Project(binary_path_1, load_options={"auto_load_libs": False})
b2 = angr.Project(binary_path_2, load_options={"auto_load_libs": False})
bindiff = b.analyses.BinDiff(b2)
identical_functions = bindiff.identical_functions
differing_functions = bindiff.differing_functions
unmatched_functions = bindiff.unmatched_functions
# check identical functions
nose.tools.assert_in((0x40064c, 0x40066a), identical_functions)
# check differing functions
nose.tools.assert_in((0x400616, 0x400616), differing_functions)
# check unmatched functions
nose.tools.assert_less_equal(len(unmatched_functions[0]), 1)
nose.tools.assert_less_equal(len(unmatched_functions[1]), 2)
# check for no major regressions
nose.tools.assert_greater(len(identical_functions), len(differing_functions))
nose.tools.assert_less(len(differing_functions), 4)
# check a function diff
fdiff = bindiff.get_function_diff(0x400616, 0x400616)
block_matches = { (a.addr, b.addr) for a, b in fdiff.block_matches }
nose.tools.assert_in((0x40064a, 0x400668), block_matches)
nose.tools.assert_in((0x400616, 0x400616), block_matches)
nose.tools.assert_in((0x40061e, 0x40061e), block_matches)
def run_all():
functions = globals()
all_functions = dict(filter((lambda kv: kv[0].startswith('test_')), functions.items()))
for f in sorted(all_functions.keys()):
if hasattr(all_functions[f], '__call__'):
all_functions[f]()
if __name__ == "__main__":
logging.getLogger("angr.analyses.bindiff").setLevel(logging.DEBUG)
import sys
if len(sys.argv) > 1:
globals()['test_' + sys.argv[1]]()
else:
run_all()
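`run_all` above discovers tests by filtering `globals()` for callables whose name starts with `test_`. A minimal reproduction of that discovery rule as a standalone function:

```python
def collect_tests(namespace):
    # same discovery rule as run_all() above: callables named 'test_*'
    return sorted(name for name, obj in namespace.items()
                  if name.startswith('test_') and callable(obj))

def test_alpha():
    pass

def test_beta():
    pass

def helper():
    pass

print(collect_tests(globals()))  # includes test_alpha and test_beta, not helper
```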
# 01-logica-de-programacao-e-algoritmos/Aula 06/01 Tuplas/1.2 Desempacotamento de parametros em funcoes/ex01.py (rafaelbarretomg/Uninter, MIT)
# Unpacking parameters in functions
# summing the values of a tuple
def soma(*num):
soma = 0
    print('Tupla: {}'.format(num))
for i in num:
soma += i
return soma
# Main program
print('Resultado: {}\n'.format(soma(1, 2)))
print('Resultado: {}\n'.format(soma(1, 2, 3, 4, 5, 6, 7, 8, 9)))
# tests/test_decorators.py (stephenfin/django-rest-framework, BSD-3-Clause)
from __future__ import unicode_literals
import pytest
from django.test import TestCase
from rest_framework import status
from rest_framework.authentication import BasicAuthentication
from rest_framework.decorators import (
action, api_view, authentication_classes, detail_route, list_route,
parser_classes, permission_classes, renderer_classes, schema,
throttle_classes
)
from rest_framework.parsers import JSONParser
from rest_framework.permissions import IsAuthenticated
from rest_framework.renderers import JSONRenderer
from rest_framework.response import Response
from rest_framework.schemas import AutoSchema
from rest_framework.test import APIRequestFactory
from rest_framework.throttling import UserRateThrottle
from rest_framework.views import APIView
class DecoratorTestCase(TestCase):
def setUp(self):
self.factory = APIRequestFactory()
def _finalize_response(self, request, response, *args, **kwargs):
response.request = request
return APIView.finalize_response(self, request, response, *args, **kwargs)
def test_api_view_incorrect(self):
"""
If @api_view is not applied correct, we should raise an assertion.
"""
@api_view
def view(request):
return Response()
request = self.factory.get('/')
self.assertRaises(AssertionError, view, request)
def test_api_view_incorrect_arguments(self):
"""
If @api_view is missing arguments, we should raise an assertion.
"""
with self.assertRaises(AssertionError):
@api_view('GET')
def view(request):
return Response()
def test_calling_method(self):
@api_view(['GET'])
def view(request):
return Response({})
request = self.factory.get('/')
response = view(request)
assert response.status_code == status.HTTP_200_OK
request = self.factory.post('/')
response = view(request)
assert response.status_code == status.HTTP_405_METHOD_NOT_ALLOWED
def test_calling_put_method(self):
@api_view(['GET', 'PUT'])
def view(request):
return Response({})
request = self.factory.put('/')
response = view(request)
assert response.status_code == status.HTTP_200_OK
request = self.factory.post('/')
response = view(request)
assert response.status_code == status.HTTP_405_METHOD_NOT_ALLOWED
def test_calling_patch_method(self):
@api_view(['GET', 'PATCH'])
def view(request):
return Response({})
request = self.factory.patch('/')
response = view(request)
assert response.status_code == status.HTTP_200_OK
request = self.factory.post('/')
response = view(request)
assert response.status_code == status.HTTP_405_METHOD_NOT_ALLOWED
def test_renderer_classes(self):
@api_view(['GET'])
@renderer_classes([JSONRenderer])
def view(request):
return Response({})
request = self.factory.get('/')
response = view(request)
assert isinstance(response.accepted_renderer, JSONRenderer)
def test_parser_classes(self):
@api_view(['GET'])
@parser_classes([JSONParser])
def view(request):
assert len(request.parsers) == 1
assert isinstance(request.parsers[0], JSONParser)
return Response({})
request = self.factory.get('/')
view(request)
def test_authentication_classes(self):
@api_view(['GET'])
@authentication_classes([BasicAuthentication])
def view(request):
assert len(request.authenticators) == 1
assert isinstance(request.authenticators[0], BasicAuthentication)
return Response({})
request = self.factory.get('/')
view(request)
def test_permission_classes(self):
@api_view(['GET'])
@permission_classes([IsAuthenticated])
def view(request):
return Response({})
request = self.factory.get('/')
response = view(request)
assert response.status_code == status.HTTP_403_FORBIDDEN
def test_throttle_classes(self):
class OncePerDayUserThrottle(UserRateThrottle):
rate = '1/day'
@api_view(['GET'])
@throttle_classes([OncePerDayUserThrottle])
def view(request):
return Response({})
request = self.factory.get('/')
response = view(request)
assert response.status_code == status.HTTP_200_OK
response = view(request)
assert response.status_code == status.HTTP_429_TOO_MANY_REQUESTS
def test_schema(self):
"""
Checks CustomSchema class is set on view
"""
class CustomSchema(AutoSchema):
pass
@api_view(['GET'])
@schema(CustomSchema())
def view(request):
return Response({})
assert isinstance(view.cls.schema, CustomSchema)
class ActionDecoratorTestCase(TestCase):
def test_defaults(self):
@action(detail=True)
def test_action(request):
"""Description"""
assert test_action.mapping == {'get': 'test_action'}
assert test_action.detail is True
assert test_action.url_path == 'test_action'
assert test_action.url_name == 'test-action'
assert test_action.kwargs == {
'name': 'Test action',
'description': 'Description',
}
def test_detail_required(self):
with pytest.raises(AssertionError) as excinfo:
@action()
def test_action(request):
raise NotImplementedError
assert str(excinfo.value) == "@action() missing required argument: 'detail'"
def test_method_mapping_http_methods(self):
# All HTTP methods should be mappable
@action(detail=False, methods=[])
def test_action():
raise NotImplementedError
for name in APIView.http_method_names:
def method():
raise NotImplementedError
# Python 2.x compatibility - cast __name__ to str
method.__name__ = str(name)
getattr(test_action.mapping, name)(method)
# ensure the mapping returns the correct method name
for name in APIView.http_method_names:
assert test_action.mapping[name] == name
def test_view_name_kwargs(self):
"""
'name' and 'suffix' are mutually exclusive kwargs used for generating
a view's display name.
"""
# by default, generate name from method
@action(detail=True)
def test_action(request):
raise NotImplementedError
assert test_action.kwargs == {
'description': None,
'name': 'Test action',
}
# name kwarg supersedes name generation
@action(detail=True, name='test name')
def test_action(request):
raise NotImplementedError
assert test_action.kwargs == {
'description': None,
'name': 'test name',
}
# suffix kwarg supersedes name generation
@action(detail=True, suffix='Suffix')
def test_action(request):
raise NotImplementedError
assert test_action.kwargs == {
'description': None,
'suffix': 'Suffix',
}
# name + suffix is a conflict.
with pytest.raises(TypeError) as excinfo:
action(detail=True, name='test name', suffix='Suffix')
assert str(excinfo.value) == "`name` and `suffix` are mutually exclusive arguments."
def test_method_mapping(self):
@action(detail=False)
def test_action(request):
raise NotImplementedError
@test_action.mapping.post
def test_action_post(request):
raise NotImplementedError
# The secondary handler methods should not have the action attributes
for name in ['mapping', 'detail', 'url_path', 'url_name', 'kwargs']:
assert hasattr(test_action, name) and not hasattr(test_action_post, name)
def test_method_mapping_already_mapped(self):
@action(detail=True)
def test_action(request):
raise NotImplementedError
msg = "Method 'get' has already been mapped to '.test_action'."
with self.assertRaisesMessage(AssertionError, msg):
@test_action.mapping.get
def test_action_get(request):
raise NotImplementedError
def test_method_mapping_overwrite(self):
@action(detail=True)
def test_action():
raise NotImplementedError
msg = ("Method mapping does not behave like the property decorator. You "
"cannot use the same method name for each mapping declaration.")
with self.assertRaisesMessage(AssertionError, msg):
@test_action.mapping.post
def test_action():
raise NotImplementedError
def test_detail_route_deprecation(self):
with pytest.warns(DeprecationWarning) as record:
@detail_route()
def view(request):
raise NotImplementedError
assert len(record) == 1
assert str(record[0].message) == (
"`detail_route` is deprecated and will be removed in "
"3.10 in favor of `action`, which accepts a `detail` bool. Use "
"`@action(detail=True)` instead."
)
def test_list_route_deprecation(self):
with pytest.warns(DeprecationWarning) as record:
@list_route()
def view(request):
raise NotImplementedError
assert len(record) == 1
assert str(record[0].message) == (
"`list_route` is deprecated and will be removed in "
"3.10 in favor of `action`, which accepts a `detail` bool. Use "
"`@action(detail=False)` instead."
)
def test_route_url_name_from_path(self):
# pre-3.8 behavior was to base the `url_name` off of the `url_path`
with pytest.warns(DeprecationWarning):
@list_route(url_path='foo_bar')
def view(request):
raise NotImplementedError
assert view.url_path == 'foo_bar'
assert view.url_name == 'foo-bar'
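The tests above assert that `@action` records routing metadata (`mapping`, `detail`, `url_path`, `url_name`) as attributes on the decorated function. A simplified stand-in sketch of that attribute-attaching pattern — this is not DRF's actual implementation, just the mechanism the tests exercise:

```python
def action(detail, methods=None):
    # simplified stand-in for rest_framework.decorators.action
    methods = ['get'] if methods is None else [m.lower() for m in methods]
    def decorator(func):
        func.mapping = {m: func.__name__ for m in methods}
        func.detail = detail
        func.url_path = func.__name__
        func.url_name = func.__name__.replace('_', '-')
        return func
    return decorator

@action(detail=True)
def test_action(request):
    """Description"""

print(test_action.mapping)   # {'get': 'test_action'}
print(test_action.url_name)  # test-action
```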
# tamilmorse/morse_encode.py (CRE2525/open-tamil, MIT)
## -*- coding: utf-8 -*-
#(C) 2018 Muthiah Annamalai
# This file is part of Open-Tamil project
# You may use or distribute this file under terms of MIT license
import codecs
import json
import tamil
import sys
import os
#e.g. python morse_encode.py கலைஞர்
CURRDIR = os.path.dirname(os.path.realpath(__file__))
def encode(text):
with codecs.open(os.path.join(CURRDIR,"data","madurai_tamilmorse.json"),"r","utf-8") as fp:
codebook = json.loads(fp.read())
output = [codebook.get(l,l) for l in tamil.utf8.get_letters(text)]
return u" ".join(output)
if __name__ == u"__main__":
encode(u" ".join([i.decode("utf-8") for i in sys.argv[1:]]))
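The `codebook.get(l, l)` lookup in `encode` passes any character without a code point through unchanged. A tiny demonstration with a made-up two-entry codebook (the real one is loaded from `madurai_tamilmorse.json`):

```python
def encode(text, codebook):
    # unmapped characters fall through unchanged, as in encode() above
    return " ".join(codebook.get(ch, ch) for ch in text)

codebook = {'a': '.-', 'b': '-...'}  # hypothetical entries for illustration
print(encode('ab?', codebook))  # .- -... ?
```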
# pyRasp.py (ToninoTarsi/pyRasp, MIT)
# pyRasp
# Copyright (c) Tonino Tarsi 2020. Licensed under MIT.
# requirement :
# Python 3
# pip install pyyaml
# pip install requests
# pip install f90nml
from downloadGFSA import downloadGFSA
from prepare_wps import prepare_wps
from ungrib import ungrib
from metgrid import metgrid
from prepare_wrf import prepare_wrf
from real import real
from wrf import wrf
result = downloadGFSA(True)
prepare_wps(result)
ungrib()
metgrid()
prepare_wrf(result)
real()
wrf()
# Backjoon/1929.py (hanjungwoo1/CodingTest, MIT)
"""
Input example
3 16
Output example
3
5
7
11
13
"""
import math
left, right = map(int, input().split())
array = [True for i in range(right+1)]
array[1] = False
for i in range(2, int(math.sqrt(right)) + 1):
    if array[i]:
j = 2
while i * j <= right:
array[i * j] = False
j += 1
for i in range(left, right+1):
if array[i]:
        print(i)
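The sieve above marks composites in place and prints primes in `[left, right]`. Packaged as a reusable function, it reproduces the docstring's example (`3 16` → `3 5 7 11 13`):

```python
def primes_between(left, right):
    # Sieve of Eratosthenes, same marking scheme as the script above
    sieve = [True] * (right + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(right ** 0.5) + 1):
        if sieve[i]:
            for j in range(i * i, right + 1, i):
                sieve[j] = False
    return [n for n in range(left, right + 1) if sieve[n]]

print(primes_between(3, 16))  # [3, 5, 7, 11, 13]
```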

# File: layerserver/migrations/0001_initial.py (repo: aroiginfraplan/giscube-admin, BSD-3-Clause License)
# -*- coding: utf-8 -*-
# Generated by Django 1.11.10 on 2018-04-26 09:14
import colorfield.fields
from django.db import migrations, models
import django.db.models.deletion
import giscube.utils
class Migration(migrations.Migration):
initial = True
dependencies = [
('giscube', '0002_update'),
]
operations = [
migrations.CreateModel(
name='GeoJsonLayer',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=50, unique=True)),
('title', models.CharField(blank=True, max_length=100, null=True)),
('description', models.TextField(blank=True, null=True)),
('keywords', models.CharField(blank=True, max_length=200, null=True)),
('active', models.BooleanField(default=True)),
('visibility', models.CharField(choices=[('private', 'Private'), ('public', 'Public')], default='private', max_length=10)),
('visible_on_geoportal', models.BooleanField(default=False)),
('shapetype', models.CharField(blank=True, choices=[('marker', 'Marker'), ('line', 'Line'), ('polygon', 'Polygon'), ('Circle', 'Circle')], max_length=20, null=True)),
('shape_radius', models.IntegerField(blank=True, null=True)),
('stroke_color', colorfield.fields.ColorField(blank=True, default=b'#FF3333', max_length=18, null=True)),
('stroke_width', models.IntegerField(blank=True, default=1, null=True)),
('stroke_dash_array', models.CharField(blank=True, default='', max_length=25, null=True)),
('fill_color', colorfield.fields.ColorField(blank=True, default=b'#FFC300', max_length=18, null=True)),
('fill_opacity', models.DecimalField(blank=True, decimal_places=1, default=1, max_digits=2, null=True)),
('url', models.CharField(blank=True, max_length=100, null=True)),
('data_file', models.FileField(blank=True, null=True, upload_to=giscube.utils.unique_service_directory)),
('service_path', models.CharField(max_length=255)),
('cache_time', models.IntegerField(blank=True, null=True)),
('last_fetch_on', models.DateField(blank=True, null=True)),
('category', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, to='giscube.Category')),
],
options={
'verbose_name': 'GeoJSONLayer',
'verbose_name_plural': 'GeoJSONLayers',
},
),
]

# File: utils/get_season_things_price.py (repo: vogelfenx/storagebot, MIT License)
def get_season_things_price(thing, amount, price):
    if thing == 'wheel':
        wheel_price = price[thing]['month'] * amount
        return f'The cost will be {wheel_price}/month'
    else:
        other_thing_price_week = price[thing]['week'] * amount
        other_thing_price_month = price[thing]['month'] * amount
        return f'The cost will be {other_thing_price_week} RUB/week' + \
               f' or {other_thing_price_month} RUB/month'
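A standalone usage sketch of the pricing function above; the shape of `price` is inferred from its lookups (`price[thing]['week'|'month']`), the concrete numbers are invented, and the demo re-declares an English paraphrase of the function so it runs on its own:

```python
def season_price_message(thing, amount, price):
    # English paraphrase of get_season_things_price for a standalone demo
    if thing == 'wheel':
        return f"The cost will be {price[thing]['month'] * amount}/month"
    week = price[thing]['week'] * amount
    month = price[thing]['month'] * amount
    return f"The cost will be {week} RUB/week or {month} RUB/month"

# Hypothetical price table matching the lookups: price[thing]['week'|'month']
price = {'wheel': {'month': 50}, 'skis': {'week': 100, 'month': 300}}
print(season_price_message('wheel', 4, price))  # → The cost will be 200/month
print(season_price_message('skis', 2, price))   # → The cost will be 200 RUB/week or 600 RUB/month
```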

# File: dashboard/urls.py (repo: EdisonBr/MockDados, MIT License)
from django.urls import path

from .views import dashboard_cost, dashboard_energy, MotorDataListView
app_name = 'dashboard'
urlpatterns = [
path('', MotorDataListView.as_view(), name='dashboard_custom'),
#path('', dashboard_custom, name='dashboard_custom'),
path('energy', dashboard_energy, name='dashboard_energy'),
path('cost', dashboard_cost, name='dashboard_cost'),
]

#!/usr/bin/python
# File: docker/autoconfig.py (repo: misc0110/bepasty-server, BSD-2-Clause License)
import os
import sys
SITENAME = os.environ.get("BEPASTY_SITENAME", None)
if SITENAME is None:
print("\n\nEnvironment variable BEPASTY_SITENAME must be set.")
sys.exit(1)
SECRET_KEY = os.environ.get("BEPASTY_SECRET_KEY", None)
if SECRET_KEY is None:
print("\n\nEnvironment variable BEPASTY_SECRET_KEY must be set.")
sys.exit(1)
APP_BASE_PATH = os.environ.get("BEPASTY_APP_BASE_PATH", None)
STORAGE_FILESYSTEM_DIRECTORY = os.environ.get(
"BEPASTY_STORAGE_FILESYSTEM_DIRECTORY", "/app/data",
)
DEFAULT_PERMISSIONS = os.environ.get("BEPASTY_DEFAULT_PERMISSIONS", "create,read")
PERMISSIONS = {}
admin_secret = os.environ.get("BEPASTY_ADMIN_SECRET", None)
if admin_secret is not None:
PERMISSIONS.update({admin_secret: "admin,list,create,modify,read,delete"})
try:
max_allowed_file_size = os.environ.get("BEPASTY_MAX_ALLOWED_FILE_SIZE", 5000000000)
MAX_ALLOWED_FILE_SIZE = int(max_allowed_file_size)
except ValueError as err:
print("\n\nInvalid BEPASTY_MAX_ALLOWED_FILE_SIZE: %s", str(err))
sys.exit(1)
try:
max_body_size = os.environ.get("BEPASTY_MAX_BODY_SIZE", 1040384)
MAX_BODY_SIZE = int(max_body_size)
except ValueError as err:
print("\n\nInvalid BEPASTY_MAX_BODY_SIZE: %s", str(err))
sys.exit(1)
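The get/check/exit pattern above is repeated for `SITENAME` and `SECRET_KEY`; it could be factored into a small helper. A sketch (the helper name is mine):

```python
import os
import sys

def require_env(name, default=None):
    # Fail fast when a mandatory variable is unset, mirroring the checks above
    value = os.environ.get(name, default)
    if value is None:
        print(f"\n\nEnvironment variable {name} must be set.")
        sys.exit(1)
    return value

# A variable with a fallback never hits the exit path
print(require_env("BEPASTY_STORAGE_FILESYSTEM_DIRECTORY", "/app/data"))
```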

# File: qiskit_metal/_gui/elements_ui.py (repo: sarafs1926/qiskit-metal, Apache-2.0 License)
# -*- coding: utf-8 -*-
# Form implementation generated from reading ui file './elements_ui.ui',
# licensing of './elements_ui.ui' applies.
#
# Created: Wed Jun 16 14:29:03 2021
# by: pyside2-uic running on PySide2 5.13.2
#
# WARNING! All changes made in this file will be lost!
from PySide2 import QtCore, QtGui, QtWidgets
class Ui_ElementsWindow(object):
def setupUi(self, ElementsWindow):
ElementsWindow.setObjectName("ElementsWindow")
ElementsWindow.resize(841, 623)
self.centralwidget = QtWidgets.QWidget(ElementsWindow)
self.centralwidget.setObjectName("centralwidget")
self.verticalLayout_2 = QtWidgets.QVBoxLayout(self.centralwidget)
self.verticalLayout_2.setSpacing(0)
self.verticalLayout_2.setContentsMargins(0, 0, 0, 0)
self.verticalLayout_2.setObjectName("verticalLayout_2")
self.verticalLayout = QtWidgets.QVBoxLayout()
self.verticalLayout.setSizeConstraint(
QtWidgets.QLayout.SetDefaultConstraint)
self.verticalLayout.setObjectName("verticalLayout")
self.horizontalLayout = QtWidgets.QHBoxLayout()
self.horizontalLayout.setObjectName("horizontalLayout")
self.btn_refresh = QtWidgets.QPushButton(self.centralwidget)
self.btn_refresh.setCursor(QtCore.Qt.ClosedHandCursor)
self.btn_refresh.setText("")
icon = QtGui.QIcon()
icon.addPixmap(QtGui.QPixmap(":/refresh"), QtGui.QIcon.Normal,
QtGui.QIcon.Off)
self.btn_refresh.setIcon(icon)
self.btn_refresh.setIconSize(QtCore.QSize(20, 20))
self.btn_refresh.setAutoDefault(False)
self.btn_refresh.setDefault(False)
self.btn_refresh.setFlat(True)
self.btn_refresh.setObjectName("btn_refresh")
self.horizontalLayout.addWidget(self.btn_refresh)
self.label = QtWidgets.QLabel(self.centralwidget)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Minimum,
QtWidgets.QSizePolicy.Minimum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(
self.label.sizePolicy().hasHeightForWidth())
self.label.setSizePolicy(sizePolicy)
font = QtGui.QFont()
font.setWeight(75)
font.setBold(True)
self.label.setFont(font)
self.label.setAlignment(QtCore.Qt.AlignRight | QtCore.Qt.AlignTrailing |
QtCore.Qt.AlignVCenter)
self.label.setObjectName("label")
self.horizontalLayout.addWidget(self.label)
self.combo_element_type = QtWidgets.QComboBox(self.centralwidget)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Preferred,
QtWidgets.QSizePolicy.Minimum)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(
self.combo_element_type.sizePolicy().hasHeightForWidth())
self.combo_element_type.setSizePolicy(sizePolicy)
self.combo_element_type.setCurrentText("")
self.combo_element_type.setSizeAdjustPolicy(
QtWidgets.QComboBox.AdjustToContents)
self.combo_element_type.setObjectName("combo_element_type")
self.horizontalLayout.addWidget(self.combo_element_type)
self.line = QtWidgets.QFrame(self.centralwidget)
self.line.setFrameShape(QtWidgets.QFrame.VLine)
self.line.setFrameShadow(QtWidgets.QFrame.Sunken)
self.line.setObjectName("line")
self.horizontalLayout.addWidget(self.line)
self.label_3 = QtWidgets.QLabel(self.centralwidget)
font = QtGui.QFont()
font.setWeight(75)
font.setBold(True)
self.label_3.setFont(font)
self.label_3.setObjectName("label_3")
self.horizontalLayout.addWidget(self.label_3)
self.label_2 = QtWidgets.QLabel(self.centralwidget)
self.label_2.setObjectName("label_2")
self.horizontalLayout.addWidget(self.label_2)
self.lineEdit = QtWidgets.QLineEdit(self.centralwidget)
self.lineEdit.setObjectName("lineEdit")
self.horizontalLayout.addWidget(self.lineEdit)
self.label_4 = QtWidgets.QLabel(self.centralwidget)
self.label_4.setObjectName("label_4")
self.horizontalLayout.addWidget(self.label_4)
self.lineEdit_2 = QtWidgets.QLineEdit(self.centralwidget)
self.lineEdit_2.setObjectName("lineEdit_2")
self.horizontalLayout.addWidget(self.lineEdit_2)
self.line_2 = QtWidgets.QFrame(self.centralwidget)
self.line_2.setFrameShape(QtWidgets.QFrame.VLine)
self.line_2.setFrameShadow(QtWidgets.QFrame.Sunken)
self.line_2.setObjectName("line_2")
self.horizontalLayout.addWidget(self.line_2)
self.verticalLayout.addLayout(self.horizontalLayout)
self.tableElements = QtWidgets.QTableView(self.centralwidget)
sizePolicy = QtWidgets.QSizePolicy(QtWidgets.QSizePolicy.Expanding,
QtWidgets.QSizePolicy.Expanding)
sizePolicy.setHorizontalStretch(0)
sizePolicy.setVerticalStretch(0)
sizePolicy.setHeightForWidth(
self.tableElements.sizePolicy().hasHeightForWidth())
self.tableElements.setSizePolicy(sizePolicy)
self.tableElements.setProperty("showDropIndicator", False)
self.tableElements.setDragDropOverwriteMode(False)
self.tableElements.setAlternatingRowColors(True)
self.tableElements.setSortingEnabled(False)
self.tableElements.setObjectName("tableElements")
self.verticalLayout.addWidget(self.tableElements)
self.verticalLayout_2.addLayout(self.verticalLayout)
ElementsWindow.setCentralWidget(self.centralwidget)
self.menubar = QtWidgets.QMenuBar()
self.menubar.setGeometry(QtCore.QRect(0, 0, 841, 22))
self.menubar.setObjectName("menubar")
ElementsWindow.setMenuBar(self.menubar)
self.statusbar = QtWidgets.QStatusBar(ElementsWindow)
self.statusbar.setEnabled(True)
self.statusbar.setObjectName("statusbar")
ElementsWindow.setStatusBar(self.statusbar)
self.retranslateUi(ElementsWindow)
QtCore.QObject.connect(self.combo_element_type,
QtCore.SIGNAL("currentIndexChanged(QString)"),
ElementsWindow.combo_element_type)
QtCore.QObject.connect(self.btn_refresh, QtCore.SIGNAL("clicked()"),
ElementsWindow.force_refresh)
QtCore.QMetaObject.connectSlotsByName(ElementsWindow)
def retranslateUi(self, ElementsWindow):
ElementsWindow.setWindowTitle(
QtWidgets.QApplication.translate("ElementsWindow", "MainWindow",
None, -1))
self.btn_refresh.setToolTip(
QtWidgets.QApplication.translate("ElementsWindow",
"Force refresh the table ", None,
-1))
self.btn_refresh.setStatusTip(
QtWidgets.QApplication.translate("ElementsWindow",
"Force refresh the table ", None,
-1))
self.btn_refresh.setWhatsThis(
QtWidgets.QApplication.translate("ElementsWindow",
"Force refresh the table ", None,
-1))
self.btn_refresh.setAccessibleDescription(
QtWidgets.QApplication.translate("ElementsWindow",
"Force refresh the table ", None,
-1))
self.label.setText(
QtWidgets.QApplication.translate("ElementsWindow", "Element type: ",
None, -1))
self.combo_element_type.setToolTip(
QtWidgets.QApplication.translate(
"ElementsWindow",
"<html><head/><body><p>Select the element table you wish to view</p></body></html>",
None, -1))
self.label_3.setText(
QtWidgets.QApplication.translate("ElementsWindow", " Filter: ",
None, -1))
self.label_2.setText(
QtWidgets.QApplication.translate("ElementsWindow", "Component: ",
None, -1))
self.label_4.setText(
QtWidgets.QApplication.translate("ElementsWindow", " Layer: ",
None, -1))
from . import main_window_rc_rc

# File: sparkdq/outliers/params/KSigmaParams.py (repo: PasaLab/SparkDQ, Apache-2.0 License)
import json
from sparkdq.outliers.params.OutlierSolverParams import OutlierSolverParams
from sparkdq.outliers.OutlierSolver import OutlierSolver
class KSigmaParams(OutlierSolverParams):
def __init__(self, deviation=1.5):
self.deviation = deviation
def model(self):
return OutlierSolver.kSigma
@staticmethod
def from_json(json_str):
d = json.loads(json_str)
return KSigmaParams(d["deviation"])
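A round-trip sketch of the `from_json` pattern used above, with a minimal stand-in class (names and values here are illustrative, not part of SparkDQ):

```python
import json

class DemoParams:
    # Minimal stand-in for KSigmaParams to show the from_json round trip
    def __init__(self, deviation=1.5):
        self.deviation = deviation

    @staticmethod
    def from_json(json_str):
        d = json.loads(json_str)
        return DemoParams(d["deviation"])

p = DemoParams.from_json('{"deviation": 2.0}')
print(p.deviation)  # → 2.0
```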

# File: atcoder/abc132A_fifty_fifty.py (repo: uninhm/kyopro, BSD-3-Clause License)
# Vicfred
# https://atcoder.jp/contests/abc132/tasks/abc132_a
# implementation
S = list(input())
if len(set(S)) == 2:
if S.count(S[0]) == 2:
print("Yes")
quit()
print("No")
| 16.25 | 51 | 0.574359 | 29 | 195 | 3.827586 | 0.758621 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.058824 | 0.215385 | 195 | 11 | 52 | 17.727273 | 0.666667 | 0.369231 | 0 | 0 | 0 | 0 | 0.042373 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |

# File: visual_perception/Detection/yolov4/__init__.py (repo: SSusantAchary/Visual-Perception, MIT License)
"""
MIT License
Copyright (c) 2020 Susant Achary <sache.meet@yahoo.com>
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
"""
from visual_perception.Detection.yolov4.tf import YOLOv4 as yolo_main
import numpy as np
import cv2
labels = {0: 'person', 1: 'bicycle', 2: 'car', 3: 'motorcycle', 4: 'airplane', 5: 'bus', 6: 'train', 7: 'truck', 8: 'boat',
9: 'traffic light', 10: 'fire hydrant', 11: 'stop sign', 12: 'parking meter', 13: 'bench', 14: 'bird', 15: 'cat', 16: 'dog',
17: 'horse', 18: 'sheep', 19: 'cow', 20: 'elephant', 21: 'bear', 22: 'zebra', 23: 'giraffe', 24: 'backpack', 25: 'umbrella',
26: 'handbag', 27: 'tie', 28: 'suitcase', 29: 'frisbee', 30: 'skis', 31: 'snowboard', 32: 'sports ball', 33: 'kite',
34: 'baseball bat', 35: 'baseball glove', 36: 'skateboard', 37: 'surfboard', 38: 'tennis racket', 39: 'bottle', 40: 'wine glass',
41: 'cup', 42: 'fork', 43: 'knife', 44: 'spoon', 45: 'bowl', 46: 'banana', 47: 'apple', 48: 'sandwich', 49: 'orange',
50: 'broccoli', 51: 'carrot', 52: 'hot dog', 53: 'pizza', 54: 'donut', 55: 'cake', 56: 'chair', 57: 'couch', 58: 'potted plant',
59: 'bed', 60: 'dining table', 61: 'toilet', 62: 'tv', 63: 'laptop', 64: 'mouse', 65: 'remote', 66: 'keyboard', 67: 'cell phone',
68: 'microwave', 69: 'oven', 70: 'toaster', 71: 'sink', 72: 'refrigerator', 73: 'book', 74: 'clock', 75: 'vase', 76: 'scissors',
77: 'teddy bear', 78: 'hair drier', 79: 'toothbrush'}
class YOLOv4:
def __init__(self):
self.weights_path = ""
self.model = None
self.yolo_classes = ""
self.iou = 0
self.score = 0
self.input_shape = 0
self.output_path = ""
def load_model(self, weights_path:str = None, classes_path:str = None, input_shape:int = 608):
if (weights_path is None) or (classes_path is None):
raise RuntimeError ('weights_path AND classes_path should not be None.')
self.yolo_classes = classes_path
self.weights_path = weights_path
self.input_shape = input_shape
self.model = yolo_main(shape = self.input_shape)
self.model.classes = self.yolo_classes
self.model.make_model()
self.model.load_weights(self.weights_path, weights_type = 'yolo')
def predict(self, img:np.ndarray, output_path:str, iou = 0.45, score = 0.25, custom_objects:dict = None,
debug=True):
self.output_path = output_path
self.iou = iou
self.score = score
#img = np.array(Image.open(img))[..., ::-1]
pred_bboxes = self.model.predict(img, iou_threshold = self.iou, score_threshold = self.score)
boxes = []
        if custom_objects is not None:
for i in range(len(pred_bboxes)):
check_name = labels[pred_bboxes[i][4]]
check = custom_objects.get(check_name, 'invalid')
if check == 'invalid':
continue
elif check == 'valid':
boxes.append(list(pred_bboxes[i]))
boxes = np.array(boxes)
res = self.model.draw_bboxes(img, boxes)
if debug:
cv2.imwrite(self.output_path, res)
else:
res = self.model.draw_bboxes(img, pred_bboxes)
if debug:
cv2.imwrite(self.output_path, res)
return res
class TinyYOLOv4:
def __init__(self):
self.weights_path = ""
self.model = None
self.yolo_classes = ""
self.iou = 0
self.score = 0
self.input_shape = 0
self.output_path = ""
def load_model(self, weights_path:str = None, classes_path:str = None, input_shape:int = 0):
if (weights_path is None) or (classes_path is None):
raise RuntimeError ('weights_path AND classes_path should not be None.')
self.yolo_classes = classes_path
self.weights_path = weights_path
self.input_shape = input_shape
self.model = yolo_main(tiny = True, shape = self.input_shape)
self.model.classes = self.yolo_classes
self.model.make_model()
self.model.load_weights(self.weights_path, weights_type = 'yolo')
def predict(self, img:np.ndarray, output_path:str, iou = 0.4, score = 0.07, custom_objects:dict = None,
debug=True):
self.output_path = output_path
self.iou = iou
self.score = score
#img = np.array(Image.open(img))[..., ::-1]
pred_bboxes = self.model.predict(img, iou_threshold = self.iou, score_threshold = self.score)
boxes = []
        if custom_objects is not None:
for i in range(len(pred_bboxes)):
check_name = labels[pred_bboxes[i][4]]
check = custom_objects.get(check_name, 'invalid')
if check == 'invalid':
continue
elif check == 'valid':
boxes.append(list(pred_bboxes[i]))
boxes = np.array(boxes)
res = self.model.draw_bboxes(img, boxes)
if debug:
cv2.imwrite(self.output_path, res)
else:
res = self.model.draw_bboxes(img, pred_bboxes)
if debug:
cv2.imwrite(self.output_path, res)
return res

# File: server/mqtt/handler.py (repo: rishab-rb/MyIOTMap, MIT License)
import paho.mqtt.client as mqtt
HOST = 'localhost'
PORT = 1883


class MQTTConnector:
    def __init__(self, host, port):
        self.host = host
        self.port = port
        self.client = mqtt.Client()

    def connect(self):
        self.client.connect(self.host, self.port, 60)

    def run(self):
        self.client.loop_forever()


class MQTTSubscriber:
    def __init__(self, *args, **kwargs):
        super(MQTTSubscriber, self).__init__(*args, **kwargs)


class MQTTPublisher:
    def __init__(self, host):
        self.host = host

# File: scripts/C189/C189Checkin.py (repo: xiaopowanyi/py_scripts, MIT License)
import requests, time, re, rsa, json, base64, hashlib
from urllib import parse
s = requests.Session()
username = ""
password = ""
if(username == "" or password == ""):
    username = input("Account: ")
    password = input("Password: ")
def main():
login(username, password)
rand = str(round(time.time()*1000))
surl = f'https://api.cloud.189.cn/mkt/userSign.action?rand={rand}&clientType=TELEANDROID&version=8.6.3&model=SM-G930K'
url = f'https://m.cloud.189.cn/v2/drawPrizeMarketDetails.action?taskId=TASK_SIGNIN&activityId=ACT_SIGNIN'
url2 = f'https://m.cloud.189.cn/v2/drawPrizeMarketDetails.action?taskId=TASK_SIGNIN_PHOTOS&activityId=ACT_SIGNIN'
headers = {
'User-Agent':'Mozilla/5.0 (Linux; Android 5.1.1; SM-G930K Build/NRD90M; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/74.0.3729.136 Mobile Safari/537.36 Ecloud/8.6.3 Android/22 clientId/355325117317828 clientModel/SM-G930K imsi/460071114317824 clientChannelId/qq proVersion/1.0.6',
"Referer" : "https://m.cloud.189.cn/zhuanti/2016/sign/index.jsp?albumBackupOpened=1",
"Host" : "m.cloud.189.cn",
"Accept-Encoding" : "gzip, deflate",
}
response = s.get(surl,headers=headers)
netdiskBonus = response.json()['netdiskBonus']
    if response.json()['isSign'] == "false":
        print(f"Not signed in yet; daily sign-in earned {netdiskBonus} MB of space")
    else:
        print(f"Already signed in; sign-in earned {netdiskBonus} MB of space")
response = s.get(url,headers=headers)
try:
        if "errorCode" in response.text:
            print(response.json()['errorCode'])
        elif 'description' in response.json():
            description = response.json()['description']
            print(f"Draw 1 won: {description}")
    except:
        print("Draw 1 finished, but parsing the response failed")
    try:
        response2 = s.get(url2, headers=headers)
        if "errorCode" in response2.text:
            print(response2.json()['errorCode'])
        elif 'description' in response2.json():
            description = response2.json()['description']
            print(f"Draw 2 won: {description}")
    except:
        print("Draw 2 finished, but parsing the response failed")
BI_RM = list("0123456789abcdefghijklmnopqrstuvwxyz")
def int2char(a):
return BI_RM[a]
b64map = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"
def b64tohex(a):
d = ""
e = 0
c = 0
for i in range(len(a)):
if list(a)[i] != "=":
v = b64map.index(list(a)[i])
if 0 == e:
e = 1
d += int2char(v >> 2)
c = 3 & v
elif 1 == e:
e = 2
d += int2char(c << 2 | v >> 4)
c = 15 & v
elif 2 == e:
e = 3
d += int2char(c)
d += int2char(v >> 2)
c = 3 & v
else:
e = 0
d += int2char(c << 2 | v >> 4)
d += int2char(15 & v)
if e == 1:
d += int2char(c << 2)
return d
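The hand-rolled `b64tohex` above converts a base64 string into the lowercase hex encoding of the decoded bytes; for standard padded input it should match this stdlib one-liner (my sketch, kept separate so the original login code is unchanged):

```python
import base64

def b64_to_hex(s):
    # Decode base64, then re-encode the raw bytes as lowercase hex
    return base64.b64decode(s).hex()

print(b64_to_hex("EjQ="))  # → 1234
```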
def rsa_encode(j_rsakey, string):
rsa_key = f"-----BEGIN PUBLIC KEY-----\n{j_rsakey}\n-----END PUBLIC KEY-----"
pubkey = rsa.PublicKey.load_pkcs1_openssl_pem(rsa_key.encode())
result = b64tohex((base64.b64encode(rsa.encrypt(f'{string}'.encode(), pubkey))).decode())
return result
def calculate_md5_sign(params):
return hashlib.md5('&'.join(sorted(params.split('&'))).encode('utf-8')).hexdigest()
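`calculate_md5_sign` sorts the `&`-separated `key=value` pairs before hashing, so parameter order does not change the signature. A quick self-contained check (the example values are mine):

```python
import hashlib

def md5_sign(params):
    # Same scheme as calculate_md5_sign: sort the pairs, rejoin with '&', md5-hex
    return hashlib.md5('&'.join(sorted(params.split('&'))).encode('utf-8')).hexdigest()

print(md5_sign('b=2&a=1') == md5_sign('a=1&b=2'))  # → True
```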
def login(username, password):
url = "https://cloud.189.cn/udb/udb_login.jsp?pageId=1&redirectURL=/main.action"
r = s.get(url)
captchaToken = re.findall(r"captchaToken' value='(.+?)'", r.text)[0]
lt = re.findall(r'lt = "(.+?)"', r.text)[0]
returnUrl = re.findall(r"returnUrl = '(.+?)'", r.text)[0]
paramId = re.findall(r'paramId = "(.+?)"', r.text)[0]
j_rsakey = re.findall(r'j_rsaKey" value="(\S+)"', r.text, re.M)[0]
s.headers.update({"lt": lt})
username = rsa_encode(j_rsakey, username)
password = rsa_encode(j_rsakey, password)
url = "https://open.e.189.cn/api/logbox/oauth2/loginSubmit.do"
headers = {
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:74.0) Gecko/20100101 Firefox/76.0',
'Referer': 'https://open.e.189.cn/',
}
data = {
"appKey": "cloud",
"accountType": '01',
"userName": f"{{RSA}}{username}",
"password": f"{{RSA}}{password}",
"validateCode": "",
"captchaToken": captchaToken,
"returnUrl": returnUrl,
"mailSuffix": "@189.cn",
"paramId": paramId
}
r = s.post(url, data=data, headers=headers, timeout=5)
if(r.json()['result'] == 0):
print(r.json()['msg'])
else:
print(r.json()['msg'])
redirect_url = r.json()['toUrl']
r = s.get(redirect_url)
return s
if __name__ == "__main__":
main()

# File: sympy/assumptions/assume.py (repo: shivangdubey/sympy, BSD-3-Clause License)
import inspect
from sympy.core.cache import cacheit
from sympy.core.singleton import S
from sympy.core.sympify import _sympify
from sympy.logic.boolalg import Boolean
from sympy.utilities.source import get_class
from contextlib import contextmanager
class AssumptionsContext(set):
"""Set representing assumptions.
This is used to represent global assumptions, but you can also use this
class to create your own local assumptions contexts. It is basically a thin
wrapper to Python's set, so see its documentation for advanced usage.
Examples
========
>>> from sympy import Q
>>> from sympy.assumptions.assume import global_assumptions
>>> global_assumptions
AssumptionsContext()
>>> from sympy.abc import x
>>> global_assumptions.add(Q.real(x))
>>> global_assumptions
AssumptionsContext({Q.real(x)})
>>> global_assumptions.remove(Q.real(x))
>>> global_assumptions
AssumptionsContext()
>>> global_assumptions.clear()
"""
def add(self, *assumptions):
"""Add an assumption."""
for a in assumptions:
super().add(a)
def _sympystr(self, printer):
if not self:
return "%s()" % self.__class__.__name__
return "{}({})".format(self.__class__.__name__, printer._print_set(self))
global_assumptions = AssumptionsContext()
class AppliedPredicate(Boolean):
"""The class of expressions resulting from applying a Predicate.
Examples
========
>>> from sympy import Q, Symbol
>>> x = Symbol('x')
>>> Q.integer(x)
Q.integer(x)
>>> type(Q.integer(x))
<class 'sympy.assumptions.assume.AppliedPredicate'>
"""
__slots__ = ()
def __new__(cls, predicate, arg):
arg = _sympify(arg)
return Boolean.__new__(cls, predicate, arg)
is_Atom = True # do not attempt to decompose this
@property
def arg(self):
"""
Return the expression used by this assumption.
Examples
========
>>> from sympy import Q, Symbol
>>> x = Symbol('x')
>>> a = Q.integer(x + 1)
>>> a.arg
x + 1
"""
return self._args[1]
@property
def args(self):
return self._args[1:]
@property
def func(self):
return self._args[0]
@cacheit
def sort_key(self, order=None):
return (self.class_key(), (2, (self.func.name, self.arg.sort_key())),
S.One.sort_key(), S.One)
def __eq__(self, other):
if type(other) is AppliedPredicate:
return self._args == other._args
return False
def __hash__(self):
return super().__hash__()
def _eval_ask(self, assumptions):
return self.func.eval(self.arg, assumptions)
@property
def binary_symbols(self):
from sympy.core.relational import Eq, Ne
if self.func.name in ['is_true', 'is_false']:
i = self.arg
if i.is_Boolean or i.is_Symbol or isinstance(i, (Eq, Ne)):
return i.binary_symbols
return set()
class Predicate(Boolean):
"""A predicate is a function that returns a boolean value.
Predicates merely wrap their argument and remain unevaluated:
>>> from sympy import Q, ask
>>> type(Q.prime)
<class 'sympy.assumptions.assume.Predicate'>
>>> Q.prime.name
'prime'
>>> Q.prime(7)
Q.prime(7)
>>> _.func.name
'prime'
To obtain the truth value of an expression containing predicates, use
the function ``ask``:
>>> ask(Q.prime(7))
True
The tautological predicate ``Q.is_true`` can be used to wrap other objects:
>>> from sympy.abc import x
>>> Q.is_true(x > 1)
Q.is_true(x > 1)
"""
is_Atom = True
def __new__(cls, name, handlers=None):
obj = Boolean.__new__(cls)
obj.name = name
obj.handlers = handlers or []
return obj
def _hashable_content(self):
return (self.name,)
def __getnewargs__(self):
return (self.name,)
def __call__(self, expr):
return AppliedPredicate(self, expr)
def add_handler(self, handler):
self.handlers.append(handler)
def remove_handler(self, handler):
self.handlers.remove(handler)
@cacheit
def sort_key(self, order=None):
return self.class_key(), (1, (self.name,)), S.One.sort_key(), S.One
def eval(self, expr, assumptions=True):
"""
Evaluate self(expr) under the given assumptions.
This uses only direct resolution methods, not logical inference.
"""
res, _res = None, None
mro = inspect.getmro(type(expr))
for handler in self.handlers:
cls = get_class(handler)
for subclass in mro:
eval_ = getattr(cls, subclass.__name__, None)
if eval_ is None:
continue
res = eval_(expr, assumptions)
# Do not stop if value returned is None
# Try to check for higher classes
if res is None:
continue
if _res is None:
_res = res
elif res is None:
# since first resolutor was conclusive, we keep that value
res = _res
else:
# only check consistency if both resolutors have concluded
if _res != res:
raise ValueError('incompatible resolutors')
break
return res
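The `eval` loop above dispatches on the expression's method resolution order: each registered handler class is probed for a method named after a class in the expression's MRO, and the first conclusive (non-`None`) answer is used. A minimal stdlib-only sketch of that dispatch pattern (handler and method names here are hypothetical, not SymPy's):

```python
import inspect

class PositiveHandler:
    # method names mirror class names in the argument's MRO
    @staticmethod
    def int(expr, assumptions):
        return expr > 0

    @staticmethod
    def object(expr, assumptions):
        return None  # inconclusive

def resolve(handler_cls, expr, assumptions=True):
    for subclass in inspect.getmro(type(expr)):
        eval_ = getattr(handler_cls, subclass.__name__, None)
        if eval_ is None:
            continue
        res = eval_(expr, assumptions)
        if res is not None:   # keep walking past inconclusive answers
            return res
    return None

print(resolve(PositiveHandler, 5))    # True
print(resolve(PositiveHandler, -3))   # False
```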
@contextmanager
def assuming(*assumptions):
""" Context manager for assumptions
Examples
========
>>> from sympy.assumptions import assuming, Q, ask
>>> from sympy.abc import x, y
>>> print(ask(Q.integer(x + y)))
None
>>> with assuming(Q.integer(x), Q.integer(y)):
... print(ask(Q.integer(x + y)))
True
"""
old_global_assumptions = global_assumptions.copy()
global_assumptions.update(assumptions)
try:
yield
finally:
global_assumptions.clear()
global_assumptions.update(old_global_assumptions)
# distancematrix/tests/consumer/test_distance_matrix.py (repo: IDLabResearch/seriesdistancematrix, license: MIT)
import numpy as np
from unittest import TestCase
import numpy.testing as npt
from distancematrix.util import diag_indices_of
from distancematrix.consumer.distance_matrix import DistanceMatrix
class TestContextualMatrixProfile(TestCase):
def setUp(self):
self.dist_matrix = np.array([
[8.67, 1.10, 1.77, 1.26, 1.91, 4.29, 6.32, 4.24, 4.64, 5.06, 6.41, 4.07, 4.67, 9.32, 5.09],
[4.33, 4.99, 0.14, 2.79, 2.10, 6.26, 9.40, 4.14, 5.53, 4.26, 8.21, 5.91, 6.83, 9.26, 6.19],
[0.16, 9.05, 1.35, 4.78, 7.01, 4.36, 5.24, 8.81, 7.90, 5.84, 8.90, 7.88, 3.37, 4.70, 6.94],
[0.94, 8.70, 3.87, 6.29, 0.32, 1.79, 5.80, 2.61, 1.43, 6.32, 1.62, 0.20, 2.28, 7.11, 2.15],
[9.90, 4.51, 2.11, 2.83, 5.52, 8.55, 6.90, 0.24, 1.58, 4.26, 8.75, 3.71, 9.93, 8.33, 0.38],
[7.30, 5.84, 9.63, 1.95, 3.76, 3.61, 9.42, 5.56, 5.09, 7.07, 1.90, 4.78, 1.06, 0.69, 3.67],
[2.17, 8.37, 3.99, 4.28, 4.37, 2.86, 8.61, 3.39, 8.37, 6.95, 6.57, 1.79, 7.40, 4.41, 7.64],
[6.26, 0.29, 6.44, 8.84, 1.24, 2.52, 6.25, 3.07, 5.55, 3.19, 8.16, 5.32, 9.01, 0.39, 9.],
[4.67, 8.88, 3.05, 3.06, 2.36, 8.34, 4.91, 5.46, 9.25, 9.78, 0.03, 5.64, 5.10, 3.58, 6.92],
[1.01, 0.91, 6.28, 7.79, 0.68, 5.50, 6.72, 5.11, 0.80, 9.30, 9.77, 4.71, 3.26, 7.29, 6.26]])
def mock_initialise(self, dm):
dm.initialise(1, self.dist_matrix.shape[0], self.dist_matrix.shape[1])
def test_process_diagonal(self):
dm = DistanceMatrix()
self.mock_initialise(dm)
for diag in range(-self.dist_matrix.shape[0] + 1, self.dist_matrix.shape[1]):
diag_ind = diag_indices_of(self.dist_matrix, diag)
dm.process_diagonal(diag, np.atleast_2d(self.dist_matrix[diag_ind]))
npt.assert_equal(dm.distance_matrix, self.dist_matrix)
def test_process_diagonal_partial_calculation(self):
dm = DistanceMatrix()
self.mock_initialise(dm)
correct = np.full_like(self.dist_matrix, np.nan, dtype=float)
for diag in range(-8, self.dist_matrix.shape[1], 3):
diag_ind = diag_indices_of(self.dist_matrix, diag)
dm.process_diagonal(diag, np.atleast_2d(self.dist_matrix[diag_ind]))
correct[diag_ind] = self.dist_matrix[diag_ind]
npt.assert_equal(dm.distance_matrix, correct)
def test_process_column(self):
dm = DistanceMatrix()
self.mock_initialise(dm)
for column in range(0, self.dist_matrix.shape[1]):
dm.process_column(column, np.atleast_2d(self.dist_matrix[:, column]))
npt.assert_equal(dm.distance_matrix, self.dist_matrix)
def test_process_column_partial_calculation(self):
dm = DistanceMatrix()
self.mock_initialise(dm)
correct = np.full_like(self.dist_matrix, np.nan, dtype=float)
for column in [2, 3, 4, 5, 10, 11, 12]:
dm.process_column(column, np.atleast_2d(self.dist_matrix[:, column]))
correct[:, column] = self.dist_matrix[:, column]
npt.assert_equal(dm.distance_matrix, correct)
def test_streaming_process_column(self):
dm = DistanceMatrix()
dm.initialise(1, 5, 5)
dm.process_column(0, np.atleast_2d(self.dist_matrix[0, 0]))
dm.process_column(1, np.atleast_2d(self.dist_matrix[:2, 1]))
expected = np.full((5, 5), np.nan)
expected[0, 0] = self.dist_matrix[0, 0]
expected[:2, 1] = self.dist_matrix[:2, 1]
npt.assert_equal(dm.distance_matrix, expected)
for column in range(0, 5):
dm.process_column(column, np.atleast_2d(self.dist_matrix[:5, :5][:, column]))
npt.assert_equal(dm.distance_matrix, self.dist_matrix[:5, :5])
dm.shift_query(1)
dm.shift_series(3)
correct = np.full((5, 5), np.nan)
correct[0:4, 0:2] = self.dist_matrix[1:5, 3:5]
npt.assert_equal(dm.distance_matrix, correct)
for column in range(0, 5):
dm.process_column(column, np.atleast_2d(self.dist_matrix[1:6, 3:8][:, column]))
npt.assert_equal(dm.distance_matrix, self.dist_matrix[1:6, 3:8])
dm.shift_query(2)
dm.shift_series(1)
dm.process_column(4, np.atleast_2d(self.dist_matrix[3:8, 8]))
correct = np.full((5, 5), np.nan)
correct[0:3, 0:4] = self.dist_matrix[3:6, 4:8]
correct[:, 4] = self.dist_matrix[3:8, 8]
npt.assert_equal(dm.distance_matrix, correct)
def test_streaming_process_diagonal(self):
dm = DistanceMatrix()
dm.initialise(1, 5, 5)
dm.process_diagonal(0, np.atleast_2d(self.dist_matrix[0, 0]))
diag_ind = diag_indices_of(self.dist_matrix[:3, :3], 1)
dm.process_diagonal(1, np.atleast_2d(np.atleast_2d(self.dist_matrix[diag_ind])))
expected = np.full((5, 5), np.nan)
expected[0, 0] = self.dist_matrix[0, 0]
expected[0, 1] = self.dist_matrix[0, 1]
expected[1, 2] = self.dist_matrix[1, 2]
npt.assert_equal(dm.distance_matrix, expected)
for diag in range(-4,5):
diag_ind = diag_indices_of(self.dist_matrix[:5, :5], diag)
dm.process_diagonal(diag, np.atleast_2d(self.dist_matrix[diag_ind]))
npt.assert_equal(dm.distance_matrix, self.dist_matrix[:5, :5])
dm.shift_query(2)
dm.shift_series(1)
expected = self.dist_matrix[2:7, 1:6].copy()
expected[-2:, :] = np.nan
expected[:, -1:] = np.nan
npt.assert_equal(dm.distance_matrix, expected)
for diag in range(-4,5):
diag_ind = diag_indices_of(self.dist_matrix[:5, :5], diag)
dm.process_diagonal(diag, np.atleast_2d(self.dist_matrix[diag_ind]))
npt.assert_equal(dm.distance_matrix, self.dist_matrix[:5, :5])
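The tests drive the consumer diagonal by diagonal via `diag_indices_of`. For a rectangular matrix, diagonal `k` is simply the set of index pairs `(i, i + k)` that stay in bounds. A pure-Python sketch of that assumed semantics (the real helper in `distancematrix.util` takes the array itself and returns NumPy index arrays):

```python
def diag_indices_of(shape, diag):
    """Index pairs (row, col) of diagonal `diag` in a rows x cols matrix."""
    rows, cols = shape
    if diag >= 0:
        length = min(rows, cols - diag)
        return [(i, i + diag) for i in range(length)]
    length = min(rows + diag, cols)
    return [(i - diag, i) for i in range(length)]

print(diag_indices_of((3, 4), 1))    # [(0, 1), (1, 2), (2, 3)]
print(diag_indices_of((3, 4), -1))   # [(1, 0), (2, 1)]
```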
# agsadmin/sharing_admin/community/groups/Group.py (repo: christopherblanchfield/agsadmin, license: BSD-3-Clause)
from __future__ import (absolute_import, division, print_function, unicode_literals)
from builtins import (ascii, bytes, chr, dict, filter, hex, input, int, map, next, oct, open, pow, range, round, str,
super, zip)
from ...._utils import send_session_request
from ..._PortalEndpointBase import PortalEndpointBase
from .CreateUpdateGroupParams import CreateUpdateGroupParams
class Group(PortalEndpointBase):
@property
def id(self):
return self._pdata["id"]
@property
def _url_full(self):
return "{0}/{1}".format(self._url_base, self.id)
def __init__(self, requests_session, url_base, id):
super().__init__(requests_session, url_base)
self._pdata = {"id": id}
def get_properties(self):
"""
Gets the properties of the item.
"""
return self._get()
def update(self, update_group_params, clear_empty_fields=False):
"""
Updates the group properties.
"""
update_group_params = update_group_params._get_params() if isinstance(
update_group_params, CreateUpdateGroupParams) else update_group_params.copy()
        if "clearEmptyFields" not in update_group_params:
update_group_params["clearEmptyFields"] = clear_empty_fields
r = self._create_operation_request(self, "update", method="POST", data=update_group_params)
        return send_session_request(self._session, r).json()

# pix2pix/Discriminator.py (repo: yubin1219/GAN, license: MIT)
import torch
from django.db import migrations, models
def migrate_public_event(apps, schema_editor):
"""Migrate options previously with no contents (displayed as "Other:")
to a new contents ("other").
The field containing these options is in CommonRequest abstract model,
implemented in WorkshopRequest, WorkshopInquiryRequest, and
SelfOrganizedSubmission models."""
WorkshopRequest = apps.get_model('workshops', 'WorkshopRequest')
WorkshopInquiryRequest = apps.get_model('extrequests',
'WorkshopInquiryRequest')
SelfOrganizedSubmission = apps.get_model('extrequests',
'SelfOrganizedSubmission')
WorkshopRequest.objects.filter(public_event="") \
.update(public_event="other")
WorkshopInquiryRequest.objects.filter(public_event="") \
.update(public_event="other")
SelfOrganizedSubmission.objects.filter(public_event="") \
.update(public_event="other")
class Migration(migrations.Migration):
dependencies = [
('workshops', '0190_auto_20190728_1118'),
('extrequests', '0008_auto_20190809_1004'),
]
operations = [
migrations.AlterField(
model_name='workshoprequest',
name='host_responsibilities',
field=models.BooleanField(default=False, verbose_name='I understand <a href="https://docs.carpentries.org/topic_folders/hosts_instructors/hosts_instructors_checklist.html#host-checklist">the responsibilities of the workshop host</a>, including recruiting local helpers to support the workshop (1 helper for every 8-10 learners).'),
),
migrations.AlterField(
model_name='workshoprequest',
name='requested_workshop_types',
field=models.ManyToManyField(help_text='If your learners are new to programming and primarily interested in working with data, Data Carpentry is likely the best choice. If your learners are interested in learning more about programming, including version control and automation, Software Carpentry is likely the best match. If your learners are people working in library and information related roles interested in learning data and software skills, Library Carpentry is the best choice. Please visit the <a href="https://software-carpentry.org/lessons/">Software Carpentry lessons page</a>, <a href="http://www.datacarpentry.org/lessons/">Data Carpentry lessons page</a>, or the <a href="https://librarycarpentry.org/lessons/">Library Carpentry lessons page</a> for more information about any of our lessons.', limit_choices_to={'active': True}, to='workshops.Curriculum', verbose_name='Which Carpentries workshop are you requesting?'),
),
migrations.AlterField(
model_name='workshoprequest',
name='scholarship_circumstances',
field=models.TextField(blank=True, help_text='Required only if you request a scholarship.', verbose_name='Please explain the circumstances for your scholarship request and let us know what budget you have towards The Carpentries workshop fees.'),
),
migrations.AlterField(
model_name='workshoprequest',
name='public_event',
field=models.CharField(blank=True, choices=[('invite', 'This event is open to learners by invitation only.'), ('closed', 'This event is open to learners inside of my institution.'), ('public', 'This event is open to learners outside of my institution.'), ('other', 'Other:')], default='', help_text='Many of our workshops restrict registration to learners from the hosting institution. If your workshop will be open to registrants outside of your institution please let us know below.', max_length=20, verbose_name='Is this workshop open to the public?'),
),
migrations.RunPython(migrate_public_event),
]
| 71.339286 | 949 | 0.702378 | 460 | 3,995 | 6.006522 | 0.406522 | 0.035831 | 0.036193 | 0.041983 | 0.163952 | 0.14658 | 0.049946 | 0.049946 | 0 | 0 | 0 | 0.016688 | 0.205006 | 3,995 | 55 | 950 | 72.636364 | 0.853275 | 0.076596 | 0 | 0.365854 | 1 | 0.097561 | 0.550859 | 0.043905 | 0 | 0 | 0 | 0 | 0 | 1 | 0.02439 | false | 0 | 0.02439 | 0 | 0.121951 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
92ec1a79aa56994e71f763b1fea1ca3f88478806 | 1,278 | py | Python | pix2pix/Discriminator.py | yubin1219/GAN | 8345095f9816e548c968492efbe92b427b0e06a3 | [
"MIT"
] | null | null | null | pix2pix/Discriminator.py | yubin1219/GAN | 8345095f9816e548c968492efbe92b427b0e06a3 | [
"MIT"
] | null | null | null | pix2pix/Discriminator.py | yubin1219/GAN | 8345095f9816e548c968492efbe92b427b0e06a3 | [
"MIT"
] | 1 | 2021-09-17T01:28:50.000Z | 2021-09-17T01:28:50.000Z | import torch
import torch.nn as nn
class Discriminator(nn.Module):
def __init__(self, input_nc, ndf=64, norm_layer=nn.BatchNorm2d, use_sigmoid=False) :
super(Discriminator, self).__init__()
self.conv1 = nn.Sequential(
nn.Conv2d(input_nc, ndf, kernel_size=4, stride=2, padding=1),
nn.LeakyReLU(0.2, True)
)
self.conv2 = nn.Sequential(
nn.Conv2d(ndf, ndf * 2, kernel_size=4, stride=2, padding=1),
norm_layer(ndf * 2),
nn.LeakyReLU(0.2, True)
)
self.conv3 = nn.Sequential(
nn.Conv2d(ndf * 2, ndf * 4, kernel_size=4, stride=2, padding=1),
norm_layer(ndf * 4),
nn.LeakyReLU(0.2, True)
)
self.conv4 = nn.Sequential(
nn.Conv2d(ndf * 4, ndf * 8, kernel_size=4, stride=2, padding=1),
norm_layer(ndf * 8),
nn.LeakyReLU(0.2, True)
)
if use_sigmoid:
self.conv5 = nn.Sequential(
nn.Conv2d(ndf * 8, 1, kernel_size=4, stride=2, padding=1),
nn.Sigmoid()
)
else:
self.conv5 = nn.Sequential(
nn.Conv2d(ndf * 8, 1, kernel_size=4, stride=2, padding=1)
)
def forward(self, x):
x = self.conv1(x)
x = self.conv2(x)
x = self.conv3(x)
x = self.conv4(x)
x = self.conv5(x)
return x
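Every block in the discriminator above uses a 4x4 kernel with stride 2 and padding 1, which halves the spatial size at each of the five convolutions; a 256x256 input therefore ends up as an 8x8 grid of patch logits. A stdlib arithmetic check of that claim (no torch needed):

```python
def conv2d_out(size, kernel=4, stride=2, padding=1):
    # standard conv output-size formula: (W + 2P - K) // S + 1
    return (size + 2 * padding - kernel) // stride + 1

size = 256
for _ in range(5):      # conv1 .. conv5 all use kernel 4, stride 2, padding 1
    size = conv2d_out(size)
print(size)  # 8
```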
# marvel_world/views.py (repo: xiaoranppp/si664-final, license: MIT)
from django.shortcuts import render, redirect
from django.http import HttpResponse, HttpResponseRedirect
from django.views import generic
from django.contrib.auth.decorators import login_required
from django.utils.decorators import method_decorator
from .models import Character, Comic, Power, CharacterPower, CharacterComic
from django_filters.views import FilterView
from .filters import Marvel_worldFilter, Marvel_comicFilter
from .forms import CharacterForm, PowerForm, ComicForm
from django.urls import reverse, reverse_lazy
def index(request):
return HttpResponse("Hello, world. You're at the marvel world super hero")
class AboutPageView(generic.TemplateView):
template_name = 'marvel_world/about.html'
class HomePageView(generic.TemplateView):
template_name = 'marvel_world/home.html'
@method_decorator(login_required, name='dispatch')
class CharacterListView(generic.ListView):
model = Character
context_object_name = 'characters'
template_name = 'marvel_world/characters.html'
paginate_by = 50
def get_queryset(self):
return Character.objects.all().select_related('alignment','eye_color','skin_color','hair_color','race','gender','publisher').order_by('character_name')
@method_decorator(login_required, name='dispatch')
class CharacterDetailView(generic.DetailView):
model = Character
    context_object_name = 'character'
template_name = 'marvel_world/character_information.html'
@method_decorator(login_required, name='dispatch')
class ComicListView(generic.ListView):
model = Comic
context_object_name = 'comics'
template_name = 'marvel_world/comics.html'
paginate_by = 600
def get_queryset(self):
return Comic.objects.all().order_by('comic_name')
@method_decorator(login_required, name='dispatch')
class ComicDetailView(generic.DetailView):
model = Comic
    context_object_name = 'comic'
template_name = 'marvel_world/comic_information.html'
@method_decorator(login_required, name='dispatch')
class PowerListView(generic.ListView):
model = Power
context_object_name = 'powers'
template_name = 'marvel_world/super_power.html'
paginate_by = 50
def get_queryset(self):
return Power.objects.all().order_by('power_name')
@method_decorator(login_required, name='dispatch')
class PowerDetailView(generic.DetailView):
model = Power
    context_object_name = 'power'
template_name = 'marvel_world/super_power_information.html'
@method_decorator(login_required, name='dispatch')
class CharacterFilterView(FilterView):
filterset_class = Marvel_worldFilter
template_name = 'marvel_world/character_filter.html'
@method_decorator(login_required, name='dispatch')
class ComicFilterView(FilterView):
filterset_class = Marvel_comicFilter
template_name = 'marvel_world/comic_filter.html'
@method_decorator(login_required, name='dispatch')
class CharacterCreateView(generic.View):
model = Character
form_class = CharacterForm
success_message = "Character created successfully"
template_name = 'marvel_world/character_new.html'
# fields = '__all__' <-- superseded by form_class
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
def post(self, request):
form = CharacterForm(request.POST)
if form.is_valid():
character = form.save(commit=False)
character.save()
for power in form.cleaned_data['super_power']:
CharacterPower.objects.create(character=character, power=power)
for comic in form.cleaned_data['comics']:
CharacterComic.objects.create(character=character, comic=comic)
return redirect(character) # shortcut to object's get_absolute_url()
return render(request, 'marvel_world/character_new.html', {'form': form})
def get(self, request):
form = CharacterForm()
return render(request, 'marvel_world/character_new.html', {'form': form})
@method_decorator(login_required, name='dispatch')
class PowerCreateView(generic.View):
model = Power
form_class = PowerForm
success_message = "Super power created successfully"
template_name = 'marvel_world/power_new.html'
# fields = '__all__' <-- superseded by form_class
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
def post(self, request):
form = PowerForm(request.POST)
if form.is_valid():
power = form.save(commit=False)
power.save()
for character in form.cleaned_data['character']:
CharacterPower.objects.create(character=character, power=power)
return redirect(power) # shortcut to object's get_absolute_url()
return render(request, 'marvel_world/power_new.html', {'form': form})
def get(self, request):
form = PowerForm()
return render(request, 'marvel_world/power_new.html', {'form': form})
@method_decorator(login_required, name='dispatch')
class ComicCreateView(generic.View):
model = Comic
form_class = ComicForm
success_message = "Comic created successfully"
template_name = 'marvel_world/comic_new.html'
# fields = '__all__' <-- superseded by form_class
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
def post(self, request):
form = ComicForm(request.POST)
if form.is_valid():
comic = form.save(commit=False)
comic.save()
for character in form.cleaned_data['character']:
CharacterComic.objects.create(character=character, comic=comic)
return redirect(comic) # shortcut to object's get_absolute_url()
return render(request, 'marvel_world/comic_new.html', {'form': form})
def get(self, request):
form = ComicForm()
return render(request, 'marvel_world/comic_new.html', {'form': form})
@method_decorator(login_required, name='dispatch')
class CharacterUpdateView(generic.UpdateView):
model = Character
form_class = CharacterForm
# fields = '__all__' <-- superseded by form_class
context_object_name = 'character'
success_message = "Character updated successfully"
template_name = 'marvel_world/character_update.html'
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
    def form_valid(self, form):
        character = form.save(commit=False)
        character.save()
        # Current power_id values linked to this character
        old_ids = CharacterPower.objects \
            .values_list('power_id', flat=True) \
            .filter(character_id=character.character_id)
        # Newly selected powers
        new_powers = form.cleaned_data['super_power']
# TODO can these loops be refactored?
# New ids
new_ids = []
        # Insert links for newly selected powers
for power in new_powers:
new_id = power.power_id
new_ids.append(new_id)
if new_id in old_ids:
continue
else:
CharacterPower.objects \
.create(character=character, power=power)
        # Delete power links that are no longer selected
for old_id in old_ids:
if old_id in new_ids:
continue
else:
CharacterPower.objects \
.filter(character_id=character.character_id, power_id=old_id) \
.delete()
old_ids1 = CharacterComic.objects\
.values_list('comic_id', flat=True)\
.filter(character_id=character.character_id)
        # Newly selected comics
new_comics = form.cleaned_data['comics']
# TODO can these loops be refactored?
# New ids
new_ids1 = []
        # Insert links for newly selected comics
for comic in new_comics:
new_id1 = comic.comic_id
new_ids1.append(new_id1)
if new_id1 in old_ids1:
continue
else:
CharacterComic.objects \
.create(character=character, comic=comic)
        # Delete comic links that are no longer selected
for old_id1 in old_ids1:
if old_id1 in new_ids1:
continue
else:
CharacterComic.objects \
.filter(character_id=character.character_id, comic_id=old_id1) \
.delete()
return HttpResponseRedirect(character.get_absolute_url())
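The `form_valid` methods in these update views all repeat the same reconcile-two-id-sets dance; the core of it is computing which link rows to insert and which to delete. A framework-free sketch of that diff (hypothetical helper, not part of the app):

```python
def diff_link_ids(old_ids, new_ids):
    """Return (to_add, to_remove) so that old_ids becomes new_ids."""
    old, new = set(old_ids), set(new_ids)
    return sorted(new - old), sorted(old - new)

# e.g. character previously linked to powers 1, 2, 3; form now selects 2, 3, 4
to_add, to_remove = diff_link_ids([1, 2, 3], [2, 3, 4])
print(to_add, to_remove)  # [4] [1]
```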
@method_decorator(login_required, name='dispatch')
class PowerUpdateView(generic.UpdateView):
model = Power
form_class = PowerForm
# fields = '__all__' <-- superseded by form_class
context_object_name = 'power'
success_message = "Super power updated successfully"
template_name = 'marvel_world/power_update.html'
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
    def form_valid(self, form):
        power = form.save(commit=False)
        power.save()
        # Current character_id values linked to this power
        old_ids = CharacterPower.objects \
            .values_list('character_id', flat=True) \
            .filter(power_id=power.power_id)
        # Newly selected characters
        new_chs = form.cleaned_data['character']
# TODO can these loops be refactored?
# New ids
new_ids = []
        # Insert links for newly selected characters
for character in new_chs:
new_id = character.character_id
new_ids.append(new_id)
if new_id in old_ids:
continue
else:
CharacterPower.objects \
.create(character=character, power=power)
        # Delete character links that are no longer selected
for old_id in old_ids:
if old_id in new_ids:
continue
else:
CharacterPower.objects \
.filter(character_id=old_id, power_id=power.power_id) \
.delete()
return HttpResponseRedirect(power.get_absolute_url())
@method_decorator(login_required, name='dispatch')
class ComicUpdateView(generic.UpdateView):
model = Comic
form_class = ComicForm
# fields = '__all__' <-- superseded by form_class
context_object_name = 'comic'
success_message = "Comic updated successfully"
template_name = 'marvel_world/comic_update.html'
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
    def form_valid(self, form):
        comic = form.save(commit=False)
        comic.save()
        # Current character_id values linked to this comic
        old_ids = CharacterComic.objects \
            .values_list('character_id', flat=True) \
            .filter(comic_id=comic.comic_id)
        # Newly selected characters
        new_chs = form.cleaned_data['character']
# TODO can these loops be refactored?
# New ids
new_ids = []
        # Insert links for newly selected characters
for character in new_chs:
new_id = character.character_id
new_ids.append(new_id)
            if new_id not in old_ids:
                CharacterComic.objects \
                    .create(character=character, comic=comic)
        # Delete link rows for characters that were removed
for old_id in old_ids:
            if old_id not in new_ids:
                CharacterComic.objects \
                    .filter(character_id=old_id, comic_id=comic.comic_id) \
                    .delete()
return HttpResponseRedirect(comic.get_absolute_url())
@method_decorator(login_required, name='dispatch')
class CharacterDeleteView(generic.DeleteView):
    model = Character
success_message = "Character deleted successfully"
success_url = reverse_lazy('characters')
context_object_name = 'character'
template_name = 'marvel_world/character_delete.html'
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
def delete(self, request, *args, **kwargs):
self.object = self.get_object()
        # Delete CharacterPower and CharacterComic link rows first
CharacterPower.objects \
.filter(character_id=self.object.character_id) \
.delete()
CharacterComic.objects \
.filter(character_id=self.object.character_id) \
.delete()
self.object.delete()
return HttpResponseRedirect(self.get_success_url())
@method_decorator(login_required, name='dispatch')
class PowerDeleteView(generic.DeleteView):
    model = Power
success_message = "Super power deleted successfully"
success_url = reverse_lazy('super_power')
context_object_name = 'power'
template_name = 'marvel_world/power_delete.html'
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
def delete(self, request, *args, **kwargs):
self.object = self.get_object()
        # Delete CharacterPower link rows first
CharacterPower.objects \
.filter(power_id=self.object.power_id) \
.delete()
self.object.delete()
return HttpResponseRedirect(self.get_success_url())
@method_decorator(login_required, name='dispatch')
class ComicDeleteView(generic.DeleteView):
    model = Comic
success_message = "Comic deleted successfully"
success_url = reverse_lazy('comics')
context_object_name = 'comic'
template_name = 'marvel_world/comic_delete.html'
def dispatch(self, *args, **kwargs):
return super().dispatch(*args, **kwargs)
def delete(self, request, *args, **kwargs):
self.object = self.get_object()
        # Delete CharacterComic link rows first
CharacterComic.objects \
.filter(comic_id=self.object.comic_id) \
.delete()
self.object.delete()
        return HttpResponseRedirect(self.get_success_url())
# ---------------------------------------------------------------------------
# File: src/rpi/fwd.py | repo: au-chrismor/selfdrive | license: BSD-3-Clause
# ---------------------------------------------------------------------------
"""Set-up and execute the main loop"""
import RPi.GPIO as GPIO
import time
GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)
# Right motor input A
GPIO.setup(18, GPIO.OUT)
# Right motor input B
GPIO.setup(23, GPIO.OUT)
# Drive the right motor forward: input A high, input B low
GPIO.output(18, GPIO.HIGH)
GPIO.output(23, GPIO.LOW)
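The two `GPIO.output` calls above drive input A high and input B low. Under typical dual-input H-bridge semantics that selects one rotation direction; the actual direction depends on wiring, and `bridge_state` below is a hypothetical helper for illustration, not part of this script:

```python
# Typical dual-input H-bridge truth table; direction labels depend on wiring.
def bridge_state(a, b):
    table = {(1, 0): 'forward', (0, 1): 'reverse',
             (0, 0): 'coast',   (1, 1): 'brake'}
    return table[(a, b)]

print(bridge_state(1, 0))  # the state this script leaves pins 18/23 in
```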
# ---------------------------------------------------------------------------
# File: util/get_from_db.py | repo: Abel-Huang/simple-image-classifier | license: MIT
# ---------------------------------------------------------------------------
import pymysql
# Connection configuration
config = {
'host': '127.0.0.1',
'port': 3306,
'user': 'root',
'password': '',
'db': 'classdata',
'charset': 'utf8',
'cursorclass': pymysql.cursors.DictCursor,
}
def get_summary_db(unitag):
    # Create the connection
conn = pymysql.connect(**config)
cur = conn.cursor()
    # Execute the SQL statement
    try:
        # Run the query for this unitag
        sql = 'SELECT * FROM summary WHERE unitag = %s'
        cur.execute(sql, (unitag,))
        # Fetch all matching rows
result = cur.fetchall()
return result
finally:
cur.close()
conn.close()
def get_result_db(unitag):
    # Create the connection
conn = pymysql.connect(**config)
cur = conn.cursor()
    # Execute the SQL statement
    try:
        # Run the query for this unitag
        sql = 'SELECT * FROM result WHERE unitag = %s'
        cur.execute(sql, (unitag,))
        # Fetch all matching rows
result = cur.fetchall()
return result
finally:
cur.close()
conn.close()
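`get_summary_db` and `get_result_db` differ only in the table name. Table names cannot be bound as query parameters, so a shared helper has to whitelist them; the sketch below is hypothetical and not part of this module:

```python
# Hypothetical shared query builder: whitelist the table, parameterize the value.
ALLOWED_TABLES = {'summary', 'result'}

def build_query(table):
    if table not in ALLOWED_TABLES:
        raise ValueError('unknown table: %s' % table)
    return 'SELECT * FROM %s WHERE unitag = %%s' % table

print(build_query('result'))  # SELECT * FROM result WHERE unitag = %s
```

Each helper could then call `cur.execute(build_query(table), (unitag,))`.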
# ---------------------------------------------------------------------------
# File: jduck/robot.py | repo: luutp/jduck | license: Apache-2.0
# ---------------------------------------------------------------------------
#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
robot.py
Description:
Author: luutp
Contact: luubot2207@gmail.com
Created on: 2021/02/27
"""
# Utilities
# %%
# ================================IMPORT PACKAGES====================================
# Utilities
from traitlets.config.configurable import SingletonConfigurable
# Custom Packages
from jduck.DCMotor import DCMotor
# ================================================================================
class JDuck(SingletonConfigurable):
    def __init__(self, *args, **kwargs):
        # Let SingletonConfigurable initialize its traitlets machinery
        super().__init__(*args, **kwargs)
self.left_motor = DCMotor(32, 36, 38, alpha=1.0)
self.right_motor = DCMotor(33, 35, 37, alpha=1.0)
self.left_motor.set_speed(50)
self.right_motor.set_speed(50)
def set_speeds(self, left_speed, right_speed):
self.left_motor.set_speed(left_speed)
self.right_motor.set_speed(right_speed)
def move_forward(self):
self.left_motor.rotate_forward()
self.right_motor.rotate_forward()
def move_backward(self):
self.left_motor.rotate_backward()
self.right_motor.rotate_backward()
def turn_left(self):
self.left_motor.rotate_backward()
self.right_motor.rotate_forward()
def turn_right(self):
self.left_motor.rotate_forward()
self.right_motor.rotate_backward()
def stop(self):
self.left_motor.stop()
self.right_motor.stop()
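The six motion methods above reduce to a direction sign per wheel. A GPIO-free sketch of that mapping (the names below are illustrative, not part of jduck):

```python
# +1 = rotate_forward, -1 = rotate_backward, 0 = stop, per (left, right) wheel.
COMMANDS = {
    'forward':    (+1, +1),
    'backward':   (-1, -1),
    'turn_left':  (-1, +1),   # left wheel back, right wheel forward
    'turn_right': (+1, -1),
    'stop':       (0, 0),
}
print(COMMANDS['turn_left'])  # (-1, 1)
```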
# ---------------------------------------------------------------------------
# File: src/nebulo/gql/alias.py | repo: olirice/nebulo | license: MIT
# ---------------------------------------------------------------------------
# pylint: disable=missing-class-docstring,invalid-name
import typing
from graphql.language import (
InputObjectTypeDefinitionNode,
InputObjectTypeExtensionNode,
ObjectTypeDefinitionNode,
ObjectTypeExtensionNode,
)
from graphql.type import (
GraphQLArgument,
GraphQLBoolean,
GraphQLEnumType,
GraphQLEnumValue,
GraphQLField,
GraphQLFieldMap,
GraphQLFloat,
GraphQLID,
GraphQLInputFieldMap,
GraphQLInputObjectType,
GraphQLInt,
GraphQLInterfaceType,
GraphQLIsTypeOfFn,
GraphQLList,
GraphQLNonNull,
GraphQLObjectType,
GraphQLResolveInfo,
GraphQLScalarType,
GraphQLSchema,
GraphQLString,
GraphQLType,
Thunk,
)
from graphql.type.definition import GraphQLInputFieldOutType
from nebulo.sql.composite import CompositeType as SQLACompositeType
# Handle name changes from graphql-core and graphql-core-next
try:
from graphql.type import GraphQLInputObjectField as GraphQLInputField
except ImportError:
from graphql.type import GraphQLInputField
Type = GraphQLType
List = GraphQLList
NonNull = GraphQLNonNull
Argument = GraphQLArgument
Boolean = GraphQLBoolean
String = GraphQLString
ScalarType = GraphQLScalarType
ID = GraphQLID
InterfaceType = GraphQLInterfaceType
Int = GraphQLInt
InputField = GraphQLInputField
ResolveInfo = GraphQLResolveInfo
EnumType = GraphQLEnumType
EnumValue = GraphQLEnumValue
Schema = GraphQLSchema
Field = GraphQLField
Float = GraphQLFloat
class HasSQLAModel: # pylint: disable= too-few-public-methods
sqla_table = None
class HasSQLFunction: # pylint: disable= too-few-public-methods
sql_function = None
class HasSQLAComposite: # pylint: disable= too-few-public-methods
sqla_composite: SQLACompositeType
class ObjectType(GraphQLObjectType, HasSQLAModel):
def __init__(
self,
name: str,
fields: Thunk[GraphQLFieldMap],
interfaces: typing.Optional[Thunk[typing.Collection["GraphQLInterfaceType"]]] = None,
is_type_of: typing.Optional[GraphQLIsTypeOfFn] = None,
extensions: typing.Optional[typing.Dict[str, typing.Any]] = None,
description: typing.Optional[str] = None,
ast_node: typing.Optional[ObjectTypeDefinitionNode] = None,
extension_ast_nodes: typing.Optional[typing.Collection[ObjectTypeExtensionNode]] = None,
sqla_model=None,
) -> None:
super().__init__(
name=name,
fields=fields,
interfaces=interfaces,
is_type_of=is_type_of,
extensions=extensions,
description=description,
ast_node=ast_node,
extension_ast_nodes=extension_ast_nodes,
)
self.sqla_model = sqla_model
class ConnectionType(ObjectType):
pass
class EdgeType(ObjectType):
pass
class TableType(ObjectType):
pass
class CompositeType(ObjectType, HasSQLAComposite):
pass
class MutationPayloadType(ObjectType):
pass
class CreatePayloadType(MutationPayloadType):
pass
class UpdatePayloadType(MutationPayloadType):
pass
class DeletePayloadType(MutationPayloadType):
pass
class FunctionPayloadType(MutationPayloadType, HasSQLFunction):
pass
class InputObjectType(GraphQLInputObjectType, HasSQLAModel):
def __init__(
self,
name: str,
fields: Thunk[GraphQLInputFieldMap],
description: typing.Optional[str] = None,
out_type: typing.Optional[GraphQLInputFieldOutType] = None,
extensions: typing.Optional[typing.Dict[str, typing.Any]] = None,
ast_node: typing.Optional[InputObjectTypeDefinitionNode] = None,
extension_ast_nodes: typing.Optional[typing.Collection[InputObjectTypeExtensionNode]] = None,
sqla_model=None,
) -> None:
super().__init__(
name=name,
fields=fields,
description=description,
out_type=out_type,
extensions=extensions,
ast_node=ast_node,
extension_ast_nodes=extension_ast_nodes,
)
self.sqla_model = sqla_model
class CreateInputType(InputObjectType):
pass
class TableInputType(InputObjectType):
pass
class UpdateInputType(InputObjectType):
pass
class DeleteInputType(InputObjectType):
pass
class FunctionInputType(GraphQLInputObjectType):
def __init__(
self,
name: str,
fields: Thunk[GraphQLInputFieldMap],
description: typing.Optional[str] = None,
out_type: typing.Optional[GraphQLInputFieldOutType] = None,
extensions: typing.Optional[typing.Dict[str, typing.Any]] = None,
ast_node: typing.Optional[InputObjectTypeDefinitionNode] = None,
extension_ast_nodes: typing.Optional[typing.Collection[InputObjectTypeExtensionNode]] = None,
sql_function=None,
) -> None:
super().__init__(
name=name,
fields=fields,
description=description,
out_type=out_type,
extensions=extensions,
ast_node=ast_node,
extension_ast_nodes=extension_ast_nodes,
)
self.sql_function = sql_function
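The classes above pair thin graphql-core subclasses with marker mixins that carry SQLAlchemy metadata. A stdlib-only sketch of the same pattern, with illustrative names so it runs without graphql-core installed:

```python
class HasModel:                          # marker mixin carrying ORM metadata
    sqla_model = None

class BaseType:                          # stands in for a graphql-core type
    def __init__(self, name):
        self.name = name

class TableLikeType(BaseType, HasModel):
    def __init__(self, name, sqla_model=None):
        super().__init__(name)
        self.sqla_model = sqla_model

t = TableLikeType('Account', sqla_model='AccountModel')
print(t.name, t.sqla_model)  # Account AccountModel
```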
# ---------------------------------------------------------------------------
# File: suda/1121/12.py | repo: tusikalanse/acm-icpc | license: MIT
# ---------------------------------------------------------------------------
for _ in range(int(input())):
x, y = list(map(int, input().split()))
flag = 1
for i in range(x, y + 1):
n = i * i + i + 41
for j in range(2, n):
if j * j > n:
break
if n % j == 0:
flag = 0
break
if flag == 0:
break
if flag:
print("OK")
else:
        print("Sorry")
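The inner trial-division loop above tests whether `i*i + i + 41` (Euler's prime-generating polynomial) is prime. The same check as a standalone helper, written as a sketch rather than code from the original:

```python
def is_prime(n):
    if n < 2:
        return False
    j = 2
    while j * j <= n:          # trial division up to sqrt(n)
        if n % j == 0:
            return False
        j += 1
    return True

# Euler's polynomial i*i + i + 41 is prime for every i in 0..39:
print(all(is_prime(i * i + i + 41) for i in range(40)))  # True
```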
# ---------------------------------------------------------------------------
# File: ally/instrument.py | repo: platformmaster9/PyAlly | license: MIT
# ---------------------------------------------------------------------------
from . import utils
#################################################
""" INSTRUMENT """
#################################################
def Instrument(symbol):
symbol = str(symbol).upper()
return {
'__symbol' : symbol,
'Sym' : symbol,
'SecTyp' : 'CS',
'__type' : 'equity'
}
#################################################
def Equity(symbol):
return Instrument(symbol)
#################################################
def Option(instrument, maturity_date, strike):
return {
**{
'MatDt' : str(maturity_date) + 'T00:00:00.000-05:00',
'StrkPx' : str(int(strike)),
'SecTyp' : 'OPT',
'__maturity' : str(maturity_date),
'__strike' : str(int(strike))
},
**instrument
}
#################################################
def Call(instrument, maturity_date, strike):
# Let Option do some lifting
x = {
**{ 'CFI':'OC' },
**Option(instrument, maturity_date, strike)
}
x['__underlying'] = x['Sym']
x['__type'] = 'call'
x['__symbol'] = utils.option_format(
symbol = x['Sym'],
exp_date = x['__maturity'],
strike = x['__strike'],
direction = 'C'
)
return x
#################################################
def Put(instrument, maturity_date, strike):
# Let Option do some lifting
x = {
**{ 'CFI':'OP' },
**Option(instrument, maturity_date, strike)
}
x['__underlying'] = x['Sym']
x['__type'] = 'put'
x['__symbol'] = utils.option_format(
symbol = x['Sym'],
exp_date = x['__maturity'],
strike = x['__strike'],
direction = 'P'
)
    return x
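`utils.option_format` is imported but not shown here. OCC option symbols are conventionally the root padded to six characters, the expiry as `yymmdd`, `C` or `P`, and the strike times 1000 zero-padded to eight digits; the helper below sketches that convention and is an assumption about, not a copy of, the library's behavior (it only mirrors the keyword names passed above):

```python
# Hypothetical OCC-style formatter; not the actual utils.option_format.
def occ_symbol(symbol, exp_date, strike, direction):
    """exp_date as 'YYYY-MM-DD'; direction 'C' or 'P'."""
    y, m, d = exp_date.split('-')
    return '%-6s%s%s%s%s%08d' % (symbol, y[2:], m, d, direction,
                                 int(round(float(strike) * 1000)))

print(occ_symbol('SPY', '2021-06-18', 400, 'C'))  # SPY   210618C00400000
```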
# ---------------------------------------------------------------------------
# File: airbyte-integrations/connectors/source-yahoo-finance-price/integration_tests/acceptance.py
# repo: onaio/airbyte | license: MIT
# ---------------------------------------------------------------------------
#
# Copyright (c) 2022 Airbyte, Inc., all rights reserved.
#
import pytest
pytest_plugins = ("source_acceptance_test.plugin",)
@pytest.fixture(scope="session", autouse=True)
def connector_setup():
"""This fixture is a placeholder for external resources that acceptance test might require."""
# TODO: setup test dependencies if needed. otherwise remove the TODO comments
yield
# TODO: clean up test dependencies
| 25.411765 | 98 | 0.738426 | 55 | 432 | 5.727273 | 0.8 | 0.088889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011142 | 0.168981 | 432 | 16 | 99 | 27 | 0.866295 | 0.585648 | 0 | 0 | 0 | 0 | 0.213018 | 0.171598 | 0 | 0 | 0 | 0.0625 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
# ---------------------------------------------------------------------------
# File: hail/python/test/hailtop/utils/test_utils.py | repo: vrautela/hail | license: MIT
# ---------------------------------------------------------------------------
from hailtop.utils import (partition, url_basename, url_join, url_scheme,
url_and_params, parse_docker_image_reference)
def test_partition_zero_empty():
assert list(partition(0, [])) == []
def test_partition_even_small():
assert list(partition(3, range(3))) == [range(0, 1), range(1, 2), range(2, 3)]
def test_partition_even_big():
assert list(partition(3, range(9))) == [range(0, 3), range(3, 6), range(6, 9)]
def test_partition_uneven_big():
assert list(partition(2, range(9))) == [range(0, 5), range(5, 9)]
def test_partition_toofew():
assert list(partition(6, range(3))) == [range(0, 1), range(1, 2), range(2, 3),
range(3, 3), range(3, 3), range(3, 3)]
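The cases above pin down `partition`'s contract: `k` slices, with earlier slices one element larger when the length is not evenly divisible. A standalone sketch consistent with these tests (hailtop's real implementation may differ):

```python
def partition(k, seq):
    n = len(seq)
    q, r = divmod(n, k) if k else (0, 0)
    parts, start = [], 0
    for i in range(k):
        size = q + (1 if i < r else 0)         # first r parts get one extra
        parts.append(seq[start:start + size])  # slicing a range yields a range
        start += size
    return parts

print(partition(2, range(9)))  # [range(0, 5), range(5, 9)]
```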
def test_url_basename():
assert url_basename('/path/to/file') == 'file'
assert url_basename('https://hail.is/path/to/file') == 'file'
def test_url_join():
assert url_join('/path/to', 'file') == '/path/to/file'
assert url_join('/path/to/', 'file') == '/path/to/file'
assert url_join('/path/to/', '/absolute/file') == '/absolute/file'
assert url_join('https://hail.is/path/to', 'file') == 'https://hail.is/path/to/file'
assert url_join('https://hail.is/path/to/', 'file') == 'https://hail.is/path/to/file'
assert url_join('https://hail.is/path/to/', '/absolute/file') == 'https://hail.is/absolute/file'
def test_url_scheme():
assert url_scheme('https://hail.is/path/to') == 'https'
assert url_scheme('/path/to') == ''
def test_url_and_params():
assert url_and_params('https://example.com/') == ('https://example.com/', {})
assert url_and_params('https://example.com/foo?') == ('https://example.com/foo', {})
assert url_and_params('https://example.com/foo?a=b&c=d') == ('https://example.com/foo', {'a': 'b', 'c': 'd'})
def test_parse_docker_image_reference():
x = parse_docker_image_reference('animage')
assert x.domain is None
assert x.path == 'animage'
assert x.tag is None
assert x.digest is None
assert x.name() == 'animage'
assert str(x) == 'animage'
x = parse_docker_image_reference('hailgenetics/animage')
assert x.domain == 'hailgenetics'
assert x.path == 'animage'
assert x.tag is None
assert x.digest is None
assert x.name() == 'hailgenetics/animage'
assert str(x) == 'hailgenetics/animage'
x = parse_docker_image_reference('localhost:5000/animage')
assert x.domain == 'localhost:5000'
assert x.path == 'animage'
assert x.tag is None
assert x.digest is None
assert x.name() == 'localhost:5000/animage'
assert str(x) == 'localhost:5000/animage'
x = parse_docker_image_reference('localhost:5000/a/b/name')
assert x.domain == 'localhost:5000'
assert x.path == 'a/b/name'
assert x.tag is None
assert x.digest is None
assert x.name() == 'localhost:5000/a/b/name'
assert str(x) == 'localhost:5000/a/b/name'
x = parse_docker_image_reference('localhost:5000/a/b/name:tag')
assert x.domain == 'localhost:5000'
assert x.path == 'a/b/name'
assert x.tag == 'tag'
assert x.digest is None
assert x.name() == 'localhost:5000/a/b/name'
assert str(x) == 'localhost:5000/a/b/name:tag'
x = parse_docker_image_reference('localhost:5000/a/b/name:tag@sha256:abc123')
assert x.domain == 'localhost:5000'
assert x.path == 'a/b/name'
assert x.tag == 'tag'
assert x.digest == 'sha256:abc123'
assert x.name() == 'localhost:5000/a/b/name'
assert str(x) == 'localhost:5000/a/b/name:tag@sha256:abc123'
x = parse_docker_image_reference('localhost:5000/a/b/name@sha256:abc123')
assert x.domain == 'localhost:5000'
assert x.path == 'a/b/name'
assert x.tag is None
assert x.digest == 'sha256:abc123'
assert x.name() == 'localhost:5000/a/b/name'
assert str(x) == 'localhost:5000/a/b/name@sha256:abc123'
x = parse_docker_image_reference('name@sha256:abc123')
assert x.domain is None
assert x.path == 'name'
assert x.tag is None
assert x.digest == 'sha256:abc123'
assert x.name() == 'name'
assert str(x) == 'name@sha256:abc123'
x = parse_docker_image_reference('gcr.io/hail-vdc/batch-worker:123fds312')
assert x.domain == 'gcr.io'
assert x.path == 'hail-vdc/batch-worker'
assert x.tag == '123fds312'
assert x.digest is None
assert x.name() == 'gcr.io/hail-vdc/batch-worker'
assert str(x) == 'gcr.io/hail-vdc/batch-worker:123fds312'
x = parse_docker_image_reference('us-docker.pkg.dev/my-project/my-repo/test-image')
assert x.domain == 'us-docker.pkg.dev'
assert x.path == 'my-project/my-repo/test-image'
assert x.tag is None
assert x.digest is None
assert x.name() == 'us-docker.pkg.dev/my-project/my-repo/test-image'
assert str(x) == 'us-docker.pkg.dev/my-project/my-repo/test-image'
# ---------------------------------------------------------------------------
# File: recentjson.py | repo: nydailynews/feedutils | license: MIT
# ---------------------------------------------------------------------------
#!/usr/bin/env python
#
# SPDX-License-Identifier: BSD-3-Clause
# The BSD-3-Clause license for this file can be found in the LICENSE file included with this distribution
# or at https://spdx.org/licenses/BSD-3-Clause.html#licenseText
import os
import pytest
from imx import img
# Used Directories
DATA_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), 'data')
# Test Files
DCD_TXT = os.path.join(DATA_DIR, 'dcd_test.txt')
DCD_BIN = os.path.join(DATA_DIR, 'dcd_test.bin')
def setup_module(module):
# Prepare test environment
pass
def teardown_module(module):
# Clean test environment
pass
def test_txt_parser():
with open(DCD_TXT, 'r') as f:
dcd_obj = img.SegDCD.parse_txt(f.read())
assert dcd_obj is not None
assert len(dcd_obj) == 12
def test_bin_parser():
with open(DCD_BIN, 'rb') as f:
dcd_obj = img.SegDCD.parse(f.read())
assert dcd_obj is not None
assert len(dcd_obj) == 12
| 22.155556 | 105 | 0.691073 | 162 | 997 | 4.092593 | 0.450617 | 0.054299 | 0.045249 | 0.042232 | 0.271493 | 0.271493 | 0.271493 | 0.129713 | 0.129713 | 0.129713 | 0 | 0.01875 | 0.197593 | 997 | 44 | 106 | 22.659091 | 0.81 | 0.316951 | 0 | 0.3 | 0 | 0 | 0.046269 | 0 | 0 | 0 | 0 | 0 | 0.2 | 1 | 0.2 | false | 0.1 | 0.15 | 0 | 0.35 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
1313abf2371aefa93bced19321156374195a59c5 | 9,387 | py | Python | recentjson.py | nydailynews/feedutils | 8cb18b26ebf70033df420f3fece8c2cac363f918 | [
"MIT"
] | null | null | null | recentjson.py | nydailynews/feedutils | 8cb18b26ebf70033df420f3fece8c2cac363f918 | [
"MIT"
] | 1 | 2017-07-11T17:37:50.000Z | 2017-07-11T17:37:50.000Z | recentjson.py | nydailynews/feedutils | 8cb18b26ebf70033df420f3fece8c2cac363f918 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# Return recent items from a json feed. Recent means "In the last X days."
import os
import doctest
import json
import urllib2
import argparse
import types
import gzip
from datetime import datetime, timedelta
from time import mktime
class RecentJson:
""" Methods for ingesting and publishing JSON feeds.
>>> url = 'http://www.nydailynews.com/json/cmlink/aaron-judge-1.3306628'
>>> parser = build_parser()
>>> args = parser.parse_args([url])
>>> rj = RecentJson(args)
"""
def __init__(self, args={}):
self.args = args
if not hasattr(self.args, 'days'):
self.args.days = 0
self.days = self.args.days
self.date_format = '%a, %d %b %Y %X'
def get(self, url):
""" Wrapper for API requests. Take a URL, return a json array.
>>> url = 'http://www.nydailynews.com/json/cmlink/aaron-judge-1.3306628'
>>> parser = build_parser()
>>> args = parser.parse_args([url])
>>> rj = RecentJson(args)
>>> rj.get(url)
True
"""
response = urllib2.urlopen(url)
if int(response.code) >= 400:
if 'verbose' in self.args and self.args.verbose:
print "URL: %s" % url
            raise ValueError("URL %s response: %s" % (url, response.code))
self.xml = response.read()
return True
def parse(self):
""" Turn the xml into an object.
>>> url = 'http://www.nydailynews.com/json/cmlink/aaron-judge-1.3306628'
>>> parser = build_parser()
>>> args = parser.parse_args([url])
>>> rj = RecentJson(args)
>>> rj.get(url)
True
>>> xml = rj.parse()
>>> print len(xml)
50
"""
try:
p = json.loads(self.xml)
except:
# Sometimes we download gzipped documents from the web.
fh = open('json.gz', 'wb')
fh.write(self.xml)
fh.close()
try:
gz = gzip.GzipFile('json.gz', 'r').read()
p = json.loads(gzip.GzipFile('json.gz', 'r').read())
except IOError:
return None
self.p = p
return p
def recently(self):
""" Return a feedparser entry object for the last X days of feed entries.
>>> url = 'http://www.nydailynews.com/json/cmlink/aaron-judge-1.3306628'
>>> parser = build_parser()
>>> args = parser.parse_args([url])
>>> rj = RecentJson(args)
>>> rj.get(url)
True
>>> xml = rj.parse()
>>> articles = rj.recently()
"""
items = []
for item in self.p:
# print item.keys()
# [u'body', u'tags', u'url', u'contentId', u'abstract', u'author', u'lastUpdated', u'mobileTitle', u'mobileUrl', u'publish_date', u'images', u'title', u'type', u'categories']
# print item['publish_date']
# Fri, 7 Jul 2017 15:16:38 -0400
#dt = datetime.strptime(item['publish_date'], '%a, %d %b %Y %X %z')
dt = datetime.strptime(' '.join(item['publish_date'].split(' ')[:5]), self.date_format)
delta = datetime.today() - dt
if delta.days > int(self.days):
continue
items.append(item)
if 'verbose' in self.args and self.args.verbose:
print delta.days, dt
self.items = items
return items
def pretty_date(ago):
""" Process a timedelta object.
From https://stackoverflow.com/questions/1551382/user-friendly-time-format-in-python
"""
second_diff = ago.seconds
day_diff = ago.days
if day_diff < 0:
return ''
if day_diff == 0:
if second_diff < 10:
return "just now"
if second_diff < 60:
return str(second_diff) + " seconds ago"
if second_diff < 120:
return "a minute ago"
if second_diff < 3600:
return str(second_diff / 60) + " minutes ago"
if second_diff < 7200:
return "an hour ago"
if second_diff < 86400:
return str(second_diff / 3600) + " hours ago"
if day_diff == 1:
return "Yesterday"
if day_diff < 7:
return str(day_diff) + " days ago"
if day_diff < 31:
if day_diff / 7 == 1:
return str(day_diff / 7) + " week ago"
return str(day_diff / 7) + " weeks ago"
if day_diff < 365:
if day_diff / 30 == 1:
return str(day_diff / 30) + " month ago"
return str(day_diff / 30) + " months ago"
if day_diff / 365 == 1:
return str(day_diff / 365) + " year ago"
return str(day_diff / 365) + " years ago"
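`pretty_date` buckets a timedelta into a human-readable phrase. A condensed, self-contained variant of its day-level buckets (wording simplified, so not the script's exact output):

```python
from datetime import timedelta

def pretty_days(ago):
    d = ago.days
    if d < 0:
        return ''
    if d == 0:
        return 'today'
    if d == 1:
        return 'Yesterday'
    if d < 7:
        return '%d days ago' % d
    if d < 31:
        w = d // 7
        return '%d week%s ago' % (w, '' if w == 1 else 's')
    m = d // 30
    return '%d month%s ago' % (m, '' if m == 1 else 's')

print(pretty_days(timedelta(days=3)))  # 3 days ago
```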
def main(args):
""" For command-line use.
"""
rj = RecentJson(args)
if args:
articles = []
for arg in args.urls[0]:
if args.verbose:
print arg
rj.get(arg)
try:
p = rj.parse()
except:
continue
if not p:
continue
articles.append(rj.recently())
        if len(articles) == 0:
return None
for i, article in enumerate(articles[0]):
if i >= args.limit and args.limit > 0:
break
dt = datetime.strptime(' '.join(article['publish_date'].split(' ')[:5]), '%a, %d %b %Y %X')
ago = datetime.now() - dt
# print ago
# 2 days, 15:57:48.578638
if args.output == 'html':
if type(article['title']) is types.UnicodeType:
article['title'] = article['title'].encode('utf-8', 'replace')
if args.listitem == True:
print '<li><a href="{0}">{1}</a> <span>({2})</span></li>'.format(article['url'], article['title'], pretty_date(ago).lower())
elif args.nostamp == True:
print '<li><a href="{0}">{1}</a></li>'.format(article['url'], article['title'], pretty_date(ago).lower())
else:
print '<a href="{0}">{1}</a> <span>({2})</span>'.format(article['url'], article['title'], pretty_date(ago).lower())
if args.output == 'js':
if type(article['title']) is types.UnicodeType:
article['title'] = article['title'].encode('utf-8', 'replace')
                print 'var hed = "<a href=\'{0}\'>{1}</a> <span>({2})</span>";'.format(article['url'], article['title'].replace('"', '\\"'), pretty_date(ago).lower())
elif args.output == 'json':
print json.dumps({'title': article['title'],
'id': article['id'],
'description': article['description']})
elif args.output == 'csv':
dt = datetime.strptime(' '.join(article['publish_date'].split(' ')[:5]), '%a, %d %b %Y %X')
article['datetime'] = '%s-%s-%s' % (dt.year, dt.month, dt.day)
if dt.month < 10:
article['datetime'] = '%d-0%d-%d' % (dt.year, dt.month, dt.day)
if dt.day < 10:
article['datetime'] = '%d-0%d-0%d' % (dt.year, dt.month, dt.day)
article['slug'] = article['title'].lower().replace(' ', '-').replace('--', '-').replace(':', '')
article['iframe_url'] = article['media_player']['url']
article['image_url'] = article['media_thumbnail'][0]['url']
article['image_large_url'] = article['media_thumbnail'][1]['url']
article['description'] = article['description'].replace('"', "'")
# date,title,id,slug,player_url,image_url,image_large_url,keywords,description
print '%(datetime)s,"%(title)s",%(id)s,%(slug)s,%(iframe_url)s,%(image_url)s,%(image_large_url)s,"%(media_keywords)s","%(description)s"' % article
def build_parser():
""" We put the argparse in a method so we can test it
outside of the command-line.
"""
parser = argparse.ArgumentParser(usage='$ python recentjson.py http://domain.com/json/',
description='''Takes a list of URLs passed as args.
Returns the items published today unless otherwise specified.''',
epilog='')
parser.add_argument("-v", "--verbose", dest="verbose", default=False, action="store_true")
parser.add_argument("--test", dest="test", default=False, action="store_true")
parser.add_argument("-d", "--days", dest="days", default=0)
parser.add_argument("-l", "--limit", dest="limit", default=0, type=int)
parser.add_argument("-o", "--output", dest="output", default="html", type=str)
parser.add_argument("--li", dest="listitem", default=False, action="store_true")
parser.add_argument("--ns", dest="nostamp", default=False, action="store_true")
parser.add_argument("urls", action="append", nargs="*")
return parser
if __name__ == '__main__':
"""
"""
parser = build_parser()
args = parser.parse_args()
if args.test:
doctest.testmod(verbose=args.verbose)
main(args)
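The CSV branch in main() builds a zero-padded date string by hand; strftime does the padding in one call. A minimal sketch, using a made-up RSS-style timestamp in the same format the script parses:

```python
from datetime import datetime

# Hypothetical publish_date value in the format parsed by main() above.
stamp = 'Tue, 06 Oct 2020 12:30:45 GMT'
dt = datetime.strptime(' '.join(stamp.split(' ')[:5]), '%a, %d %b %Y %X')

# strftime zero-pads the month and day automatically.
print(dt.strftime('%Y-%m-%d'))  # 2020-10-06
```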
# website/migrations/0084_auto_20210215_1401.py (czhu1217/cmimc-online, MIT)
# Generated by Django 3.1.6 on 2021-02-15 19:01
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('website', '0083_remove_aisubmission_code'),
]
operations = [
migrations.AddField(
model_name='exam',
name='division',
field=models.IntegerField(default=1),
preserve_default=False,
),
migrations.CreateModel(
name='ExamPair',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('name', models.CharField(max_length=100, unique=True)),
('contest', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='exampairs', to='website.contest')),
],
),
migrations.AddField(
model_name='exam',
name='exampair',
field=models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='exams', to='website.exampair'),
),
]
#!/opt/az/psf/python/2.7/bin/python
# getUniformSmiles.py (OpenEye-Contrib/Molecular-List-Logic, BSD-3-Clause)
from openeye.oechem import *
import cgi
#creates a list of smiles of the syntax [smiles|molId,smiles|molId]
def process_smiles(smiles):
smiles = smiles.split('\n')
mol = OEGraphMol()
smiles_list=[]
for line in smiles:
if len(line.rstrip())>0:
line = line.split()
smi = line[0]
molId = ""
if len(line)>1:
molId = line[1].replace(" ","|").rstrip()
if(OEParseSmiles(mol,smi)):
smi = OECreateSmiString(mol)
mol.Clear()
smiles_list.append(smi + "|" + molId) #can't send spaces or new lines
return smiles_list
#takes a list of smiles and writes it as sdf using a memory buffer
def write_sdf(smiles_list):
sdfs = []
ofs = oemolostream()
ofs.SetFormat(OEFormat_SDF)
ofs.openstring()
mol = OEGraphMol()
for smiles in smiles_list:
if(OEParseSmiles(mol,smiles.replace("|"," "))):
OEWriteMolecule(ofs,mol)
sdfs.append(ofs.GetString())
mol.Clear()
ofs.SetString("")
return sdfs
#creates a list of smiles of the syntax [smiles|molId,smiles|molId]
def read_sdf(sdf_data):
ifs = oemolistream()
ifs.SetFormat(OEFormat_SDF)
ifs.openstring(sdf_data)
smiles_list = []
for mol in ifs.GetOEGraphMols():
smiles = OECreateSmiString(mol)
smiles_list.append(smiles + "|" + mol.GetTitle())
return smiles_list
if __name__ == "__main__":
print "Content-Type: text/html\r\n\r\n"
form = cgi.FieldStorage()
extension = form.getvalue("extension")
dataA = form.getvalue("dataA")
operator = form.getvalue("smiles_operator")
sdf_output = form.getvalue("sdf_output")
if(extension=="smi"):
list_A = process_smiles(dataA)
else:
list_A = read_sdf(dataA)
outputString = ""
if(operator=="UNI"): #if only one file is supplied
outputString = "*".join(set(list_A)) #removes all doubles using the set() function
else:
dataB = form.getvalue("dataB") #if two files are supplied
if(extension=="smi"):
list_B = process_smiles(dataB)
else:
list_B = read_sdf(dataB)
if(operator=="AND"):
outputString = "*".join(set(list_A) & set(list_B))
elif(operator=="OR"):
outputString = "*".join(set(list_A) | set(list_B))
elif(operator=="NOT"):
outputString = "*".join(set(list_A) - set(list_B))
if(sdf_output=="on"): #if we want the output as sdf
sdfs = write_sdf(outputString.replace("|"," ").split("*"))
outputString = "*".join(sdfs)
outputString = outputString.replace("\n","!").replace(" ","|")
#sends the output to index.html using javascript
print """
<html>
<head>
<input type="text" id="data" value=""" + outputString + """>
<script type="text/javascript">
parent.postMessage(data.value,"*");
</script>
</head>
</html>
"""
# src/plot_timeseries_outstanding_bytes.py (arunksaha/heap_tracker, BSD-3-Clause)
#
# Copyright 2018, Arun Saha <arunksaha@gmail.com>
#
import matplotlib
import matplotlib.pyplot as plt
import matplotlib.dates as md
import datetime as dt
import sys
import os
# Open the file, read the string contents into a list,
# and return the list.
def GetLinesListFromFile(filename):
with open(filename) as f:
content = f.readlines()
return content
# Convert usecs (numeric) to datetime
# >>> ts = 1520189090755278 / 1000000.0
# >>> x = datetime.datetime.fromtimestamp(ts)
# >>> x.strftime('%Y-%m-%d %H:%M:%S.%f')
# '2018-03-04 10:44:50.755278'
def ConvertUsecsEpochToDateTime(usecs):
secs = usecs / 1000000.0
# Attempt to parse usecs throws:
# ValueError: year is out of range
# So, using secs instead. REVISIT.
# datetimeObj = dt.datetime.fromtimestamp(usecs)
datetimeObj = dt.datetime.fromtimestamp(secs)
# print usecs, secs, datetimeObj
return datetimeObj
# Take a list of string tuples (timestamp, metric),
# parses them into numerical values and returns
# separate lists.
def GetTxListFromFile(filename):
lineList = GetLinesListFromFile(filename)
datetimeList = []
outBytesList = []
for line in lineList:
tokens = line.split()
# print tokens
assert(len(tokens) >= 2)
usecs = int(tokens[0])
bytes = int(tokens[1])
datetimeObj = ConvertUsecsEpochToDateTime(usecs)
datetimeList.append(datetimeObj)
outBytesList.append(bytes)
return datetimeList, outBytesList
# Plotting driver program.
def driver(dataFile):
datetimeList, outBytesList = GetTxListFromFile(dataFile)
plt.subplots_adjust(bottom = 0.2)
plt.xticks(rotation = 25)
ax = plt.gca()
# Intended to show micro-seconds, but facing some problem,
# see REVISIT above.
# xfmt = md.DateFormatter('%Y-%m-%d %H:%M:%S.%f')
xfmt = md.DateFormatter('%Y-%m-%d %H:%M:%S')
ax.xaxis.set_major_formatter(xfmt)
# Avoid scientific notatinn, use plain numbers.
ax.get_yaxis().get_major_formatter().set_scientific(False)
# Make the numbers comma separated.
ax.get_yaxis().set_major_formatter(
matplotlib.ticker.FuncFormatter(lambda bytes, p: format(int(bytes), ',')))
# Intended the y-axis numbers on both sides, but not working.
ax.yaxis.set_ticks_position('both')
plt.plot(datetimeList, outBytesList)
plt.title('Outstanding Bytes Timeseries')
plt.ylabel('bytes')
plt.xlabel('timestamp')
plt.grid(True)
plt.show()
# main
if len(sys.argv) == 1:
print "usage: {} <input-text-file>".format(sys.argv[0])
sys.exit(1)
driver(sys.argv[1])
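ConvertUsecsEpochToDateTime() above uses fromtimestamp() without a timezone, so its output depends on the machine's local zone. A minimal sketch of the same conversion pinned to UTC for a reproducible result, using the example value from the file's comment:

```python
import datetime as dt

usecs = 1520189090755278  # example timestamp from the comment above
secs = usecs / 1000000.0
# Passing an explicit tz makes the result machine-independent.
obj = dt.datetime.fromtimestamp(secs, tz=dt.timezone.utc)
print(obj.strftime('%Y-%m-%d %H:%M:%S'))  # 2018-03-04 18:44:50
```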
# import.py (vmariano/meme-classifier, BSD-3-Clause)
from dotenv import load_dotenv
load_dotenv()
import sys
import os
import re
import json
import psycopg2
from meme_classifier.images import process_image
path = sys.argv[1]
data = json.load(open(os.path.join(path, 'result.json'), 'r'))
chat_id = data['id']
conn = psycopg2.connect(os.getenv('POSTGRES_CREDENTIALS'))
for m in data['messages']:
if 'photo' in m:
template, text = process_image(open(os.path.join(path, m['photo']), 'rb'))
message_id = m['id']
print(f'processing message {message_id}')
cur = conn.cursor()
cur.execute("INSERT INTO meme (template, text, chat_id, message_id) VALUES (%s, %s, %s, %s)", (template, text, chat_id, message_id))
conn.commit()
# relocation/depth/setup_relocation_dir.py (ziyixi/SeisScripts, MIT)
"""
setup earthquake depth relocation directory
"""
import obspy
import sh
import numpy as np
import click
from os.path import join
from glob import glob
import copy
def generate_new_cmtsolution_files(cmts_dir, generated_cmts_dir, depth_perturbation_list):
cmt_names = glob(join(cmts_dir, "*"))
for cmt_file in cmt_names:
event = obspy.read_events(cmt_file)[0]
# gcmt_id = event.resource_id.id.split("/")[-2]
# there are some problems in changing names
gcmt_id = cmt_file.split("/")[-1]
# assume dirs like f"{generated_cmts_dir}/d-3" have already been created
for depth_per in depth_perturbation_list:
generated_name = join(generated_cmts_dir, f"d{depth_per}", gcmt_id)
# there are always problem in copy event, so here I'd like to read in the event again
event_this_depth = obspy.read_events(cmt_file)[0]
# event_this_depth = event.copy()
event_this_depth.origins[0].depth += 1000.0*depth_per
# print(generated_name, generated_cmts_dir, f"d{depth_per}", gcmt_id)
event_this_depth.write(generated_name, format="CMTSOLUTION")
def setup_basic_structure(main_dir, ref_dir, cmts_dir, depth_perturbation_list):
# main
sh.mkdir("-p", main_dir)
# ref
sh.cp("-r", ref_dir, join(main_dir, "ref"))
# refine the structure in ref
sh.rm("-rf", join(main_dir, "ref", "DATABASES_MPI"))
sh.rm("-rf", join(main_dir, "ref", "EXAMPLES"))
sh.rm("-rf", join(main_dir, "ref", "OUTPUT_FILES"))
sh.rm("-rf", join(main_dir, "ref", "doc"))
sh.rm("-rf", join(main_dir, "ref", "tests"))
# mv DATA and utils to upper level
sh.mv(join(main_dir, "ref", "DATA"), main_dir)
sh.mv(join(main_dir, "ref", "utils"), main_dir)
# cmts
sh.mkdir("-p", join(main_dir, "cmts"))
sh.cp("-r", cmts_dir, join(main_dir, "cmts", "cmts_raw"))
sh.mkdir("-p", join(main_dir, "cmts", "cmts_generated"))
for depth_per in depth_perturbation_list:
sh.mkdir("-p", join(main_dir, "cmts",
"cmts_generated", f"d{depth_per}"))
# working directory
sh.mkdir("-p", join(main_dir, "work"))
def setup_structure_after_generat_cmts(main_dir, output_dir, depth_perturbation_list):
# get cmts names
cmt_dirs = glob(join(main_dir, "cmts", "cmts_raw", "*"))
cmt_names = [item.split("/")[-1] for item in cmt_dirs]
# mkdirs
for cmt_name in cmt_names:
sh.mkdir(join(main_dir, "work", cmt_name))
for depth_per in depth_perturbation_list:
# sh.mkdir(join(main_dir, "work", cmt_name, f"d{depth_per}"))
# cp ref to working dirs
sh.cp("-r", join(main_dir, "ref"),
join(main_dir, "work", cmt_name, f"d{depth_per}"))
# mv DATA and utils back to ref
sh.mv(join(main_dir, "DATA"), join(main_dir, "ref", "DATA"))
sh.mv(join(main_dir, "utils"), join(main_dir, "ref", "utils"))
# mkdir DATA in work directory
for cmt_name in cmt_names:
for depth_per in depth_perturbation_list:
sh.mkdir(join(main_dir, "work", cmt_name, f"d{depth_per}", "DATA"))
# cp and ln files in DATA
toln = ["cemRequest", "crust1.0", "crust2.0",
"crustmap", "epcrust", "eucrust-07", "GLL", "heterogen", "Lebedev_sea99", "Montagner_model", "old", "PPM", "QRFSI12", "s20rts", "s362ani", "s40rts", "Simons_model", "topo_bathy", "Zhao_JP_model"]
for cmt_name in cmt_names:
for depth_per in depth_perturbation_list:
sh.cp(join(main_dir, "cmts", "cmts_generated",
f"d{depth_per}", cmt_name), join(main_dir, "work", cmt_name, f"d{depth_per}", "DATA", "CMTSOLUTION"))
sh.cp(join(main_dir, "ref", "DATA", "Par_file"), join(
main_dir, "work", cmt_name, f"d{depth_per}", "DATA", "Par_file"))
sh.cp(join(main_dir, "ref", "DATA", "STATIONS"), join(
main_dir, "work", cmt_name, f"d{depth_per}", "DATA", "STATIONS"))
for lnfile in toln:
sh.ln("-s", join(main_dir, "ref", "DATA", lnfile), join(
main_dir, "work", cmt_name, f"d{depth_per}", "DATA", lnfile))
# ln in work files
toln_work = ["utils"]
for lnfile in toln_work:
sh.ln("-s", join(main_dir, "ref", lnfile), join(
main_dir, "work", cmt_name, f"d{depth_per}", lnfile))
# mkdir and ln DATABASE_MPI and OUTPUT_FILES
sh.mkdir("-p", output_dir)
sh.mkdir("-p", join(output_dir, "DATABASES_MPI"))
sh.mkdir("-p", join(output_dir, "OUTPUT_FILES"))
for cmt_name in cmt_names:
for depth_per in depth_perturbation_list:
sh.mkdir("-p", join(output_dir, "DATABASES_MPI",
cmt_name, f"d{depth_per}"))
sh.mkdir("-p", join(output_dir, "OUTPUT_FILES",
cmt_name, f"d{depth_per}"))
sh.ln("-s", join(output_dir, "DATABASES_MPI",
cmt_name, f"d{depth_per}"), join(main_dir, "work", cmt_name, f"d{depth_per}", "DATABASES_MPI"))
sh.ln("-s", join(output_dir, "OUTPUT_FILES",
cmt_name, f"d{depth_per}"), join(main_dir, "work", cmt_name, f"d{depth_per}", "OUTPUT_FILES"))
@click.command()
@click.option('--main_dir', required=True, help="the main working directory", type=str)
@click.option('--output_dir', required=True, help="the output directory in scratch", type=str)
@click.option('--ref_dir', required=True, help="the reference specfem directory", type=str)
@click.option('--cmts_dir', required=True, help="the cmt solution directory", type=str)
@click.option('--depth_perturbation', required=True, help="the depth perturbation, use somthing like -3,-1,5 (in km)", type=str)
def main(main_dir, output_dir, ref_dir, cmts_dir, depth_perturbation):
depth_perturbation_list = [float(item)
for item in depth_perturbation.split(",")]
setup_basic_structure(main_dir, ref_dir, cmts_dir, depth_perturbation_list)
generated_cmts_dir = join(main_dir, "cmts", "cmts_generated")
working_cmts_dir = join(main_dir, "cmts", "cmts_raw")
generate_new_cmtsolution_files(
working_cmts_dir, generated_cmts_dir, depth_perturbation_list)
setup_structure_after_generat_cmts(
main_dir, output_dir, depth_perturbation_list)
if __name__ == "__main__":
main()
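The --depth_perturbation option is parsed into float offsets in kilometres and applied to CMT origin depths stored in metres. A minimal sketch of that parsing and arithmetic (the 12 km reference depth is made up):

```python
# Comma-separated depth offsets in km, as accepted by --depth_perturbation.
depth_perturbation = "-3,-1,5"
depth_list = [float(item) for item in depth_perturbation.split(",")]

ref_depth_m = 12000.0  # hypothetical origin depth in metres
# Same conversion as in generate_new_cmtsolution_files(): km -> m.
new_depths = [ref_depth_m + 1000.0 * d for d in depth_list]
print(new_depths)  # [9000.0, 11000.0, 17000.0]
```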
#!/usr/bin/env python
# src/test-apps/happy/test-templates/WeaveInetDNS.py (aiw-google/openweave-core, Apache-2.0)
#
# Copyright (c) 2016-2017 Nest Labs, Inc.
# All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
#
# @file
# Implements WeaveInet class that tests Weave Inet Layer among Weave Nodes.
#
import os
import sys
import time
from happy.ReturnMsg import ReturnMsg
from happy.Utils import *
from happy.HappyNode import HappyNode
from happy.HappyNetwork import HappyNetwork
from WeaveTest import WeaveTest
# Q: which parameters need to be specified?
options = {}
options["quiet"] = False
options["node_id"] = None
options["tap_if"] = None
options["node_ip"] = None
options["ipv4_gateway"] = None
options["dns"] = None
options["use_lwip"] = False
def option():
return options.copy()
class WeaveInetDNS(HappyNode, HappyNetwork, WeaveTest):
def __init__(self, opts = options):
HappyNode.__init__(self)
HappyNetwork.__init__(self)
WeaveTest.__init__(self)
self.quiet = opts["quiet"]
self.node_id = opts["node_id"]
self.tap_if = opts["tap_if"]
self.prefix = opts["prefix"]
self.ipv4_gateway =opts["ipv4_gateway"]
self.dns = opts["dns"]
self.use_lwip = opts["use_lwip"]
self.node_process_tag = "WEAVE-INET-NODE"
def __log_error_and_exit(self, error):
self.logger.error("[localhost] WeaveInetDNS: %s" % (error))
sys.exit(1)
def __checkNodeExists(self, node, description):
if not self._nodeExists(node):
emsg = "The %s '%s' does not exist in the test topology." % (description, node)
self.__log_error_and_exit(emsg)
def __pre_check(self):
# Check if the name of the new node is given
if not self.node_id:
emsg = "Missing name of the virtual node that should start shell."
self.__log_error_and_exit(emsg)
# Check if virtual node exists
if not self._nodeExists():
emsg = "virtual node %s does not exist." % (self.node_id)
self.__log_error_and_exit(emsg)
# check if prefix
if self.prefix == None:
emsg = "prefix is None, Please specifiy a valid prefix."
self.__log_error_and_exit(emsg)
def __gather_results(self):
"""
gather result from get_test_output()
"""
quiet = True
results = {}
results['status'], results['output'] = self.get_test_output(self.node_id, self.node_process_tag, quiet)
return (results)
def __process_results(self, results):
"""
process results from gather_results()
"""
status = False
output = ""
status = (results['status'] == 0)
output = results['output']
return (status, output)
def __start_node_dnscheck(self):
"""
lwip and socket use different command for now
"""
cmd = "sudo "
cmd += self.getWeaveInetLayerDNSPath()
node_ip = self.getNodeAddressesOnPrefix(self.prefix, self.node_id)[0]
if node_ip == None:
emsg = "Could not find IP address of the node, %s" % (self.node_id)
self.__log_error_and_exit(emsg)
if self.use_lwip:
cmd += " --tap-device " + self.tap_if + " -a " + node_ip + " --ipv4-gateway " + self.ipv4_gateway + \
" --dns-server " + self.dns
print "dns check command : {}".format(cmd)
self.start_weave_process(self.node_id, cmd, self.node_process_tag, sync_on_output=self.ready_to_service_events_str)
def __stop_node(self):
self.stop_weave_process(self.node_id, self.node_process_tag)
def run(self):
self.logger.debug("[localhost] WeaveInetDNS: Run.")
self.__pre_check()
self.__start_node_dnscheck()
emsg = "WeaveInet %s should be running." % (self.node_process_tag)
self.logger.debug("[%s] WeaveInet: %s" % (self.node_id, emsg))
self.__stop_node()
node_output_value, node_output_data = \
self.get_test_output(self.node_id, self.node_process_tag, True)
node_strace_value, node_strace_data = \
self.get_test_strace(self.node_id, self.node_process_tag, True)
results = self.__gather_results()
result, output = self.__process_results(results)
data = {}
data["node_output"] = node_output_data
data["node_strace"] = node_strace_data
self.logger.debug("[localhost] WeaveInetDNSTest: Done.")
return ReturnMsg(result, data)
# funcoes.py (ZezaoDev/Circtrigo, MIT)
import turtle as t
import math
class circTrigo:
def __init__(self):
self.raio = 0
self.grau = 0
self.seno = 0
self.cosseno = 0
self.tangente = 0
self.quadrante = 0
self.tema = ''
t.bgcolor("black")
t.pencolor("white")
def seta(self):
        # DRAW AN ARROWHEAD
t.left(90)
t.forward(5)
t.right(120)
t.forward(10)
t.right(120)
t.forward(10)
t.right(120)
t.forward(5)
t.right(90)
def linha(self, pxls):
        # DRAW A DOTTED LINE
pixels = int(pxls//1)
if pixels % 2 == 0:
pixels = pixels + 1
for x in range(0, pixels//10):
t.pendown()
t.forward(5)
t.penup()
t.forward(5)
t.pendown()
t.forward(pixels%10)
def reset(self):
        # RETURN TO THE STARTING POSITION
t.penup()
t.home()
t.pendown()
t.speed(0)
t.pensize(2)
t.pencolor("white")
def circulo(self, raio):
        # DRAW THE CIRCLE
self.raio = raio
t.right(90)
t.penup()
t.forward(self.raio)
t.left(90)
t.pendown()
t.circle(self.raio)
self.reset()
def eixos(self):
        # X AXIS
t.penup()
t.backward(self.raio + 50)
t.pendown()
self.linha((self.raio*2)+100)
self.seta()
self.reset()
        # Y AXIS
t.left(90)
t.penup()
t.backward(self.raio + 50)
t.pendown()
self.linha((self.raio*2)+100)
self.seta()
self.reset()
def angulo(self, grau):
        # DRAW THE ANGLE
self.grau = grau % 360
t.left(self.grau)
t.forward(self.raio)
self.reset()
        # SET THE VALUES OF SINE, COSINE AND TANGENT
self.seno = math.sin(math.radians(self.grau))
self.cosseno = math.cos(math.radians(self.grau))
self.tangente = math.tan(math.radians(self.grau))
        # DETERMINE THE QUADRANT OF THE ANGLE
vquad = self.grau
if 0 < vquad < 90:
self.quadrante = 1
elif 90 < vquad < 180:
self.quadrante = 2
elif 180 < vquad < 270:
self.quadrante = 3
elif 270 < vquad < 360:
self.quadrante = 4
        if vquad == 0 or vquad == 90 or vquad == 180 or vquad == 270 or vquad == 360: # Quadrant 0 represents angles with undefined results
self.quadrante = 0
def sen(self):
        # DRAW THE SINE
t.left(self.grau)
t.forward(self.raio)
t.pencolor("red")
if self.quadrante == 1:
t.left(180 - self.grau)
self.linha(self.cosseno * self.raio)
t.left(90)
t.forward(self.seno * self.raio)
print (self.seno)
elif self.quadrante == 2:
t.right(self.grau)
self.linha((self.cosseno * self.raio) * -1)
t.right(90)
t.forward(self.seno * self.raio)
print (self.seno)
elif self.quadrante == 3:
t.right(self.grau)
self.linha(self.cosseno * self.raio * -1)
t.left(90)
t.forward(self.seno * self.raio * -1)
print (self.seno)
elif self.quadrante == 4:
t.left(180 - self.grau)
self.linha(self.cosseno * self.raio)
t.left(90)
t.forward(self.seno * self.raio)
print (self.seno)
else:
print("Erro: angulo invalido")
self.reset()
def csen(self):
        # DRAW THE COSINE
t.left(self.grau)
t.forward(self.raio)
t.pencolor("green")
if self.quadrante == 1:
t.right(self.grau + 90)
self.linha(self.seno * self.raio)
t.right(90)
t.forward(self.cosseno * self.raio)
print (self.cosseno)
elif self.quadrante == 2:
t.right(self.grau + 90)
self.linha(self.seno * self.raio)
t.right(90)
t.forward(self.cosseno * self.raio)
print (self.cosseno)
elif self.quadrante == 3:
t.right(self.grau - 90)
self.linha(self.seno * self.raio * -1)
t.right(90)
t.forward(self.cosseno * self.raio * -1)
print (self.cosseno)
elif self.quadrante == 4:
t.right(self.grau - 90)
self.linha(self.seno * self.raio * -1)
t.left(90)
t.forward(self.cosseno * self.raio)
print (self.cosseno)
else:
print("Erro: angulo invalido")
self.reset()
def tan(self):
        # DRAW THE TANGENT
t.left(self.grau)
t.penup()
t.pencolor("blue")
if self.quadrante == 1:
t.forward(self.raio)
t.pendown()
self.linha(math.sqrt(((self.tangente*self.raio)**2) + (self.raio**2)) - self.raio)
t.right(self.grau + 90)
t.forward(self.tangente * self.raio)
print (self.tangente)
elif self.quadrante == 2:
t.left(180)
t.forward(self.raio)
t.pendown()
self.linha(math.sqrt(((self.tangente*self.raio)**2) + (self.raio**2)) - self.raio)
t.left(90 - self.grau)
t.forward(self.tangente * self.raio)
print (self.tangente)
elif self.quadrante == 3:
t.left(180)
t.forward(self.raio)
t.pendown()
self.linha(math.sqrt(((self.tangente*self.raio)**2) + (self.raio**2)) - self.raio)
t.right(self.grau - 90)
t.forward(self.tangente * self.raio)
print (self.tangente)
elif self.quadrante == 4:
t.forward(self.raio)
t.pendown()
self.linha(math.sqrt(((self.tangente*self.raio)**2) + (self.raio**2)) - self.raio)
t.right(90 + self.grau)
t.forward(self.tangente * self.raio)
print (self.tangente)
else:
print("Erro: angulo invalido")
self.reset()
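The quadrant test and trigonometric values computed in angulo() can be checked without any turtle drawing. A minimal sketch of the same math for a 210-degree angle:

```python
import math

grau = 210 % 360
seno = math.sin(math.radians(grau))
cosseno = math.cos(math.radians(grau))

# Same quadrant logic as angulo() above.
if 0 < grau < 90:
    quadrante = 1
elif 90 < grau < 180:
    quadrante = 2
elif 180 < grau < 270:
    quadrante = 3
elif 270 < grau < 360:
    quadrante = 4
else:
    quadrante = 0  # axis angles are treated as undefined

print(quadrante)          # 3
print(round(seno, 4))     # -0.5
print(round(cosseno, 4))  # -0.866
```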
# tests/test_get_set.py (snoopyjc/ssf, Apache-2.0)
from ssf import SSF
ssf = SSF(errors='raise')
def test_get_set_days():
dn = ssf.get_day_names()
assert isinstance(dn, tuple)
assert dn == (('Mon', 'Monday'),
('Tue', 'Tuesday'),
('Wed', 'Wednesday'),
('Thu', 'Thursday'),
('Fri', 'Friday'),
('Sat', 'Saturday'),
('Sun', 'Sunday'))
ssf.set_day_names([['MO', 'MON'],
('TU', 'TUE'), ['WE', 'WED'],
('TH', 'THU'), ['FR', 'FRI'],
('SA', 'SAT'), ['SU', 'SUN']])
assert ssf.format('ddd dddd', '10/3/2020') == 'SA SAT'
assert ssf.format('ddd dddd', '10/4/2020') == 'SU SUN'
assert ssf.format('ddd dddd', '10/5/2020') == 'MO MON'
assert ssf.format('ddd dddd', '10/6/2020') == 'TU TUE'
assert ssf.format('ddd dddd', '10/7/2020') == 'WE WED'
assert ssf.format('ddd dddd', '10/8/2020') == 'TH THU'
assert ssf.format('ddd dddd', '10/9/2020') == 'FR FRI'
try:
ssf.set_day_names(2)
assert False # Failed
except ValueError:
pass
try:
ssf.set_day_names((1, 2, 3, 4, 5, 6, 7))
assert False # Failed
except ValueError:
pass
def test_get_set_months():
mn = ssf.get_month_names()
assert isinstance(mn, tuple)
assert mn == (None, ('J', 'Jan', 'January'), ('F', 'Feb', 'February'), ('M', 'Mar', 'March'),
('A', 'Apr', 'April'), ('M', 'May', 'May'), ('J', 'Jun', 'June'), ('J', 'Jul', 'July'),
('A', 'Aug', 'August'), ('S', 'Sep', 'September'), ('O', 'Oct', 'October'),
('N', 'Nov', 'November'), ('D', 'Dec', 'December'))
ssf.set_month_names(mn[:-1] + (('X', 'DE', 'DEC'),) )
assert ssf.format('mmmmm mmm mmmm', '12/3/2020') == 'X DE DEC'
try:
ssf.set_month_names(2)
assert False # Failed
except ValueError:
pass
try:
ssf.set_month_names((0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12))
assert False # Failed
except ValueError:
pass
def test_get_load_table():
t = ssf.get_table()
assert t[0] == 'General'
assert t[1] == '0'
assert t[14] == 'm/d/yyyy'
assert t[49] == '@'
ssf.load_table({104:'yyyy-mm-dd', 105:'0.0'})
assert ssf.format(104, '10/6/2020') == '2020-10-06'
assert ssf.format(105, 3.4) == '3.4'
assert ssf.load('0') == 1
assert ssf.load('mmm mmmm') == 5 # Will be inserted at 5
assert ssf.load('@') == 49
assert ssf.format(5, '10/6/2020') == 'Oct October'
# tests/test_list.py (amikrop/django-paste, MIT)
import json
from django.urls import reverse
from rest_framework import status
from rest_framework.test import APITestCase
from paste import constants
from tests.mixins import SnippetListTestCaseMixin
from tests.utils import constant, create_snippet, create_user
class SnippetListTestCase(SnippetListTestCaseMixin, APITestCase):
"""Tests for the snippet list view."""
def url(self):
"""Return the snippet list URL."""
return reverse('snippet-list')
def post(self, **kwargs):
"""Send a POST request to the view's URL with data indicated by given
kwargs, as JSON, using the proper content-type, and return the
response.
"""
return self.client.post(
self.url(), data=json.dumps(kwargs),
content_type='application/json')
def test_get_success(self):
"""Snippet list GET must return all the viewable snippets."""
create_snippet('foo')
create_snippet('bar')
response = self.get()
self.assertEqual(response.status_code, status.HTTP_200_OK)
self.assertEqual(len(response.data), 2)
self.assertEqual(response.data[0]['content'], 'foo')
self.assertEqual(response.data[1]['content'], 'bar')
def test_get_private(self):
"""Snippet list GET must return private snippets only to those
authorized to view them.
"""
owner = create_user('owner')
create_snippet('foo', private=True, owner=owner)
expected = [0, 0, 1, 1]
def check(i):
response = self.get()
self.assertEqual(len(response.data), expected[i])
self.check_for_users(check, owner)
def test_get_list_foreign(self):
        """Snippet list GET must not return snippets owned by other users if
        the LIST_FOREIGN setting is False, unless requested by a staff user.
"""
create_snippet('foo')
create_snippet('bar', owner=self.user)
expected = [0, 1, 2]
def check(i):
response = self.get()
self.assertEqual(len(response.data), expected[i])
with constant('LIST_FOREIGN', False):
self.check_for_users(check)
def test_post_success(self):
"""Snippet list POST must create a new snippet."""
response = self.post(
content='foo', style='friendly', embed_title=False)
self.assertEqual(response.status_code, status.HTTP_201_CREATED)
self.assertEqual(response.data['content'], 'foo')
self.assertEqual(response.data['title'], '')
self.assertEqual(response.data['language'], '')
self.assertEqual(response.data['style'], 'friendly')
self.assertEqual(
response.data['line_numbers'], constants.DEFAULT_LINE_NUMBERS)
self.assertFalse(response.data['embed_title'])
self.assertEqual(response.data['private'], constants.DEFAULT_PRIVATE)
self.assertIsNone(response.data['owner'])
def test_post_owner(self):
"""Snippet list POST must store currently authenticated user as the
newly created snippet's owner.
"""
self.client.force_authenticate(self.user)
response = self.post(content='foo')
self.assertEqual(response.data['owner'], self.user.pk)
def test_post_no_content(self):
"""Snippet list POST must return a 400 Bad Request response if no
content field is set.
"""
response = self.post(title='foo')
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_post_oversized_title(self):
"""Snippet list POST must return a 400 Bad Request response if the
title field consists of more characters than the TITLE_MAX_LENGTH
setting indicates.
"""
title = 'a' * (constants.TITLE_MAX_LENGTH + 1)
response = self.post(content='foo', title=title)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_post_invalid(self):
"""Snippet list POST must return a 400 Bad Request response if a value
different than the available choices is set for a multiple choice
field.
"""
for field in ['language', 'style']:
response = self.post(
**{'content': 'foo', field: '123-invalid-abc'})
self.assertEqual(
response.status_code, status.HTTP_400_BAD_REQUEST)
def check_post_forbid_anonymous(self, setting):
"""Check that snippet list POST returns a 403 Forbidden response to
anonymous users if the given setting is True.
"""
expected = (
[status.HTTP_403_FORBIDDEN] + [status.HTTP_400_BAD_REQUEST] * 2)
def check(i):
response = self.post()
self.assertEqual(response.status_code, expected[i])
with constant(setting):
self.check_for_users(check)
def test_post_forbid_anonymous(self):
"""Snippet list POST must return a 403 Forbidden response to anonymous
users if the FORBID_ANONYMOUS setting is True.
"""
self.check_post_forbid_anonymous('FORBID_ANONYMOUS')
def test_post_forbid_anonymous_create(self):
"""Snippet list POST must return a 403 Forbidden response to anonymous
users if the FORBID_ANONYMOUS_CREATE setting is True.
"""
self.check_post_forbid_anonymous('FORBID_ANONYMOUS_CREATE')
def test_post_anonymous_private(self):
"""Snippet list POST must return a 400 Bad Request response to
anonymous users who attempt to create a private snippet.
"""
response = self.post(content='foo', private=True)
self.assertEqual(response.status_code, status.HTTP_400_BAD_REQUEST)
def test_pagination(self):
"""Snippet list must be able to handle pagination."""
self.check_pagination()
# Algorithmic Toolbox/Greedy Algorithms/Maximum Advertisement Revenue/maximum_ad_revenue.py (ganeshbhandarkar/Python-Projects, MIT)
# python3
from itertools import permutations
def max_dot_product_naive(first_sequence, second_sequence):
assert len(first_sequence) == len(second_sequence)
assert len(first_sequence) <= 10 ** 3
assert all(0 <= f <= 10 ** 5 for f in first_sequence)
assert all(0 <= s <= 10 ** 5 for s in second_sequence)
max_product = 0
for permutation in permutations(second_sequence):
dot_product = sum(first_sequence[i] * permutation[i] for i in range(len(first_sequence)))
max_product = max(max_product, dot_product)
return max_product
def max_dot_product(first_sequence, second_sequence):
assert len(first_sequence) == len(second_sequence)
assert len(first_sequence) <= 10 ** 3
assert all(0 <= f <= 10 ** 5 for f in first_sequence)
assert all(0 <= s <= 10 ** 5 for s in second_sequence)
    # Greedy solution: pair the largest price with the largest click count.
    first = sorted(first_sequence, reverse=True)
    second = sorted(second_sequence, reverse=True)
    return sum(f * s for f, s in zip(first, second))
if __name__ == '__main__':
n = int(input())
prices = list(map(int, input().split()))
clicks = list(map(int, input().split()))
assert len(prices) == len(clicks) == n
print(max_dot_product(prices, clicks))
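The greedy strategy relies on the rearrangement inequality: sorting both sequences in the same order maximizes the dot product. A quick self-contained sanity check against a brute-force version (the function names here are illustrative, not part of the original module):

```python
from itertools import permutations


def max_dot_product_greedy(prices, clicks):
    # Sort both sequences descending and pair them elementwise.
    return sum(p * c for p, c in zip(sorted(prices, reverse=True),
                                     sorted(clicks, reverse=True)))


def max_dot_product_brute(prices, clicks):
    # Try every pairing of clicks against the fixed price order.
    return max(sum(p * c for p, c in zip(prices, perm))
               for perm in permutations(clicks))


assert max_dot_product_greedy([23], [39]) == 897
assert max_dot_product_greedy([2, 3, 9], [7, 4, 2]) == 79  # 2*2 + 3*4 + 9*7
for prices, clicks in [([1, 3, 5], [2, 4, 1]), ([0, 0, 9], [1, 1, 1])]:
    assert max_dot_product_greedy(prices, clicks) == \
        max_dot_product_brute(prices, clicks)
```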
# corehq/apps/accounting/migrations/0026_auto_20180508_1956.py (kkrampa/commcare-hq, BSD-3-Clause)
# -*- coding: utf-8 -*-
# Generated by Django 1.11.13 on 2018-05-08 19:56
from __future__ import unicode_literals
from __future__ import absolute_import
from django.db import migrations
def noop(*args, **kwargs):
pass
def _convert_emailed_to_array_field(apps, schema_editor):
BillingRecord = apps.get_model('accounting', 'BillingRecord')
for record in BillingRecord.objects.all():
if record.emailed_to != '':
record.emailed_to_list = record.emailed_to.split(',')
record.save()
WireBillingRecord = apps.get_model('accounting', 'WireBillingRecord')
for wirerecord in WireBillingRecord.objects.all():
if wirerecord.emailed_to != '':
wirerecord.emailed_to_list = wirerecord.emailed_to.split(',')
wirerecord.save()
class Migration(migrations.Migration):
dependencies = [
('accounting', '0025_auto_20180508_1952'),
]
operations = [
migrations.RunPython(_convert_emailed_to_array_field, reverse_code=noop)
]
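The data migration above converts a comma-separated `emailed_to` string into a list field. The core transformation, shown standalone (the helper name is illustrative, and mapping an empty string to an empty list is an assumption of this sketch — the migration itself simply skips empty values):

```python
def emailed_to_as_list(emailed_to):
    # Mirror the migration's conversion of a CSV string to a list;
    # an empty string becomes an empty list in this sketch.
    return emailed_to.split(',') if emailed_to != '' else []


assert emailed_to_as_list('') == []
assert emailed_to_as_list('a@x.com,b@y.com') == ['a@x.com', 'b@y.com']
```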
# rover/rover.py (cloudy/osr-rover-code, Apache-2.0)
from __future__ import print_function
import time
from rover import Robot
from connections import Connections
class Rover(Robot, Connections):
def __init__( self,
config,
bt_flag = 0,
xbox_flag = 0,
unix_flag = 0
):
self.bt_flag = bt_flag
self.xbox_flag = xbox_flag
self.unix_flag = unix_flag
super(Rover,self).__init__(config)
self.prev_cmd = [None,None]
if bt_flag and xbox_flag:
raise Exception( "[Rover init] Cannot initialize with both bluetooth and Xbox, run with only one argument")
elif bt_flag: self.connection_type = "b"
elif xbox_flag: self.connection_type = "x"
self.connectController()
def drive(self):
try:
v,r = self.getDriveVals()
            if (v, r) != self.prev_cmd:
                self.sendCommands(v, r)
                self.prev_cmd = (v, r)
except KeyboardInterrupt:
self.cleanup()
except Exception as e:
print(e)
self.cleanup()
time.sleep(0.5)
self.connectController()
if self.unix_flag:
try:
self.sendUnixData()
except Exception as e:
print(e)
self.unix_flag = 0
def cleanup(self):
self.killMotors()
self.closeConnections()
# scripts/dev/dockerutil.py (axelbarjon/mongodb-kubernetes-operator, RSA-MD)
import docker
from dockerfile_generator import render
import os
import json
from tqdm import tqdm
from typing import Union, Any, Optional
def build_image(repo_url: str, tag: str, path: str) -> None:
"""
build_image builds the image with the given tag
"""
client = docker.from_env()
print(f"Building image: {tag}")
client.images.build(tag=tag, path=path)
print("Successfully built image!")
def push_image(tag: str) -> None:
"""
push_image pushes the given tag. It uses
the current docker environment
"""
client = docker.from_env()
print(f"Pushing image: {tag}")
with tqdm(total=100, ascii=False) as progress_bar:
last_percent = 0.0
for line in client.images.push(tag, stream=True):
percent = get_completion_percentage(line)
if percent:
progress_bar.update(percent - last_percent)
last_percent = percent
def retag_image(
old_repo_url: str,
new_repo_url: str,
old_tag: str,
new_tag: str,
path: str,
labels: Optional[dict] = None,
username: Optional[str] = None,
password: Optional[str] = None,
registry: Optional[str] = None,
) -> None:
with open(f"{path}/Dockerfile", "w") as f:
f.write(f"FROM {old_repo_url}:{old_tag}")
client = docker.from_env()
if all(value is not None for value in [username, password, registry]):
client.login(username=username, password=password, registry=registry)
image, _ = client.images.build(path=f"{path}", labels=labels, tag=new_tag)
image.tag(new_repo_url, new_tag)
os.remove(f"{path}/Dockerfile")
# We do not want to republish an image that has not changed, so we check if the new
# pair repo:tag already exists.
try:
image = client.images.pull(new_repo_url, new_tag)
return
# We also need to catch APIError as if the image has been recently deleted (uncommon, but might happen?)
# we will get this kind of error:
# docker.errors.APIError: 500 Server Error: Internal Server Error
# ("unknown: Tag <tag> was deleted or has expired. To pull, revive via time machine"
except (docker.errors.ImageNotFound, docker.errors.APIError) as e:
pass
print(f"Pushing to {new_repo_url}:{new_tag}")
client.images.push(new_repo_url, new_tag)
def get_completion_percentage(line: Any) -> float:
try:
line = json.loads(line.strip().decode("utf-8"))
except ValueError:
return 0
to_skip = ("Preparing", "Waiting", "Layer already exists")
if "status" in line:
if line["status"] in to_skip:
return 0
if line["status"] == "Pushing":
try:
current = float(line["progressDetail"]["current"])
total = float(line["progressDetail"]["total"])
except KeyError:
return 0
result = (current / total) * 100
if result > 100.0:
return 100.0
return result
return 0
def build_and_push_image(repo_url: str, tag: str, path: str, image_type: str) -> None:
"""
build_and_push_operator creates the Dockerfile for the operator
and pushes it to the target repo
"""
dockerfile_text = render(image_type, ["."])
with open(f"{path}/Dockerfile", "w") as f:
f.write(dockerfile_text)
build_image(repo_url, tag, path)
os.remove(f"{path}/Dockerfile")
push_image(tag)
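`get_completion_percentage` parses the JSON lines streamed by `client.images.push(stream=True)`. A self-contained check of that parsing logic, feeding it a synthetic byte line shaped like the Docker push stream (the sample line is fabricated for illustration; the function body mirrors the one above so the check runs standalone):

```python
import json


def get_completion_percentage(line):
    # Parse one docker push stream line and convert its
    # progressDetail into a percentage (0 when not applicable).
    try:
        line = json.loads(line.strip().decode("utf-8"))
    except ValueError:
        return 0
    to_skip = ("Preparing", "Waiting", "Layer already exists")
    if "status" in line:
        if line["status"] in to_skip:
            return 0
        if line["status"] == "Pushing":
            try:
                current = float(line["progressDetail"]["current"])
                total = float(line["progressDetail"]["total"])
            except KeyError:
                return 0
            return min((current / total) * 100, 100.0)
    return 0


sample = b'{"status": "Pushing", "progressDetail": {"current": 50, "total": 200}}'
assert get_completion_percentage(sample) == 25.0
assert get_completion_percentage(b"not json") == 0
assert get_completion_percentage(b'{"status": "Waiting"}') == 0
```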
# appliance_catalog/migrations/0015_appliance_icon_py3.py (ChameleonCloud/portal, Apache-2.0)
# -*- coding: utf-8 -*-
# Generated by Django 1.11.29 on 2021-02-25 20:32
from __future__ import unicode_literals
from django.db import migrations, models
class Migration(migrations.Migration):
"""Updates ImageField syntax for later version.
"""
dependencies = [
('appliance_catalog', '0014_auto_20180625_1104'),
]
operations = [
migrations.AlterField(
model_name='appliance',
name='appliance_icon',
field=models.ImageField(blank=True, upload_to='appliance_catalog/icons/'),
),
]
# twitoff/twitter.py (ChristopherKchilton/twitoff-ChristopherKchilton, MIT)
"""Retrieve and request tweets from the DS API"""
import requests
import spacy
from .models import DB, Tweet, User
nlp = spacy.load("my_model")
def vectorize_tweet(tweet_text):
return nlp(tweet_text).vector
# Add and updates tweets
def add_or_update_user(username):
    """Adds and updates the user with twitter handle 'username'
    to our database
    """
#TODO: Figure out
try:
r = requests.get(
f"https://lambda-ds-twit-assist.herokuapp.com/user/{username}")
user = r.json()
user_id = user["twitter_handle"]["id"]
# print(user)
        # Either grab the existing user or create a new one for our db
db_user = (User.query.get(user_id)) or User(id=user_id, name=username)
# This adds the db_user to our database
DB.session.add(db_user)
tweets = user["tweets"]
# if tweets:
# db_user.newest_tweet_id = tweets[0].id
for tweet in tweets:
tweet_vector = vectorize_tweet(tweet["full_text"])
tweet_id = tweet["id"]
db_tweet = (Tweet.query.get(tweet_id)) or Tweet(
id=tweet["id"], text=tweet["full_text"], vect=tweet_vector)
db_user.tweets.append(db_tweet)
DB.session.add(db_tweet)
except Exception as e:
print("Error processing {}: {}".format(username, e))
raise e
else:
DB.session.commit()
# deep_disfluency/feature_extraction/wer_calculation_from_final_asr_results.py (askender/deep_disfluency, MIT)
from mumodo.mumodoIO import open_intervalframe_from_textgrid
import numpy
from deep_disfluency.utils.accuracy import wer
final_file = open('wer_test.text', "w")
ranges1 = [line.strip() for line in open(
"/media/data/jh/simple_rnn_disf/rnn_disf_detection/data/disfluency_detection/swda_divisions_disfluency_detection/SWDisfHeldoutASR_ranges.text")]
ranges2 = [line.strip() for line in open(
"/media/data/jh/simple_rnn_disf/rnn_disf_detection/data/disfluency_detection/swda_divisions_disfluency_detection/SWDisfTestASR_ranges.text")]
for ranges in [ranges1, ranges2]:
final_file.write("\n\n")
for r in ranges:
for s in ["A", "B"]:
iframe = open_intervalframe_from_textgrid("{0}{1}.TextGrid"
.format(r, s))
hyp = " ".join(iframe['Hyp']['text'])
ref = " ".join(iframe['Ref']['text'])
            error_rate = wer(ref, hyp)
            cost = wer(ref, hyp, macro=True)
            print r, s, error_rate
            print>>final_file, r, s, error_rate, cost
final_file.close()
# Based on the results, output the 'good' ASR results
results = open("wer_test.text")
no_ho = 0
no_test = 0
ingood = True
file = open("../../../simple_rnn_disf/rnn_disf_detection/data/disfluency_detection/swda_divisions_disfluency_detection/SWDisfHeldoutASRgood_ranges.text", "w")
for l in results:
# print l
if l == "\n":
print no_ho
no_ho = 0
file.close()
file = open(
"../../../simple_rnn_disf/rnn_disf_detection/data/disfluency_detection/swda_divisions_disfluency_detection/SWDisfTestASRgood_ranges.text",
"w")
continue
if float(l.strip('\n').split(" ")[
2]) < 0.4: # both speakers are under 40% error rate- likely half decent separation
# print l
if ingood and "B" in l.strip("\n").split(" ")[1]:
no_ho += 1
#file.write(l.strip('\n').split(" ")[0]+l.strip('\n').split(" ")[1]+"\n")
file.write(l.strip('\n').split(" ")[0] + "\n")
ingood = True
else:
ingood = False
print no_ho
results.close()
file.close()
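The `wer` helper imported above comes from `deep_disfluency.utils.accuracy`. For reference, word error rate is word-level edit distance divided by the reference length; a minimal standalone version is sketched below (this is not the imported implementation and ignores its `macro` option):

```python
def simple_wer(ref, hyp):
    # Word-level Levenshtein distance over the reference length.
    r, h = ref.split(), hyp.split()
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i
    for j in range(len(h) + 1):
        d[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = 0 if r[i - 1] == h[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution or match
    return d[len(r)][len(h)] / float(len(r))


assert simple_wer("a b c", "a b c") == 0.0
assert simple_wer("a b c d", "a x c") == 0.5  # one substitution, one deletion
```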
# newsweec/utils/_dataclasses.py (Adwaith-Rajesh/newsweec, MIT)
from dataclasses import dataclass
from dataclasses import field
from time import time
from typing import Any
from typing import Callable
from typing import Dict
from typing import List
from typing import Optional
from typing import Tuple
@dataclass
class NewUser:
"""Deals with the commands the user is currently sending"""
user_id: int
chat_id: int
command: str
def __repr__(self) -> str:
return f"{self.user_id=} {self.command=}"
@dataclass
class UserCommand:
"""Stores the latest command sent by the user"""
user_id: int
command: str
    insert_time: int = field(default_factory=lambda: int(time()))  # set at creation; used for garbage collection
def __repr__(self) -> str:
return f"{self.user_id=} {self.command=} {self.insert_time=}"
@dataclass
class MessageInfo:
"""Important things in the message"""
user_id: int
chat_id: int
message_id: int
text: str
def __repr__(self) -> str:
return f"{self.user_id=} {self.chat_id=} {self.message_id=} {self.text=}"
@dataclass
class UserDBInfo:
"""Info about the user from the DB"""
feed: bool # if false, the bot will not send any news feeds on a daily basis
user_id: int
db_id: int
topics: List[str] = field(default_factory=lambda: [])
def __repr__(self) -> str:
return f"{self.user_id=} {self.feed=} {self.db_id=} {self.topics=}"
@dataclass
class StagedFunction:
"""For FunctionStagingArea"""
fn: Callable[..., Any]
args: Optional[Tuple[Any, ...]] = None
kwargs: Optional[Dict[str, Any]] = None
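`StagedFunction` stores a callable together with its arguments for deferred execution. A hedged sketch of how such a staged call might be invoked later; the `run_staged` helper is illustrative and not part of the module, and the dataclass is repeated so the sketch runs standalone:

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict, Optional, Tuple


@dataclass
class StagedFunction:
    # Mirrors the dataclass above.
    fn: Callable[..., Any]
    args: Optional[Tuple[Any, ...]] = None
    kwargs: Optional[Dict[str, Any]] = None


def run_staged(staged: StagedFunction) -> Any:
    # Unpack the stored args/kwargs, treating None as "no arguments".
    return staged.fn(*(staged.args or ()), **(staged.kwargs or {}))


staged = StagedFunction(fn=sorted, args=([3, 1, 2],), kwargs={"reverse": True})
assert run_staged(staged) == [3, 2, 1]
assert run_staged(StagedFunction(fn=len, args=("abc",))) == 3
```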
# vivo2notld/definitions/person_definition.py (gwu-libraries/vivo2notld, MIT)
from .document_summary import definition as document_summary_definition
from .organization_summary import definition as organization_summmary_definition
definition = {
"where": "?subj a foaf:Person .",
"fields": {
"name": {
"where": "?subj rdfs:label ?obj ."
},
        # Contact info
"email": {
"where": """
?subj obo:ARG_2000028 ?vc .
?vc a vcard:Kind .
?vc vcard:hasEmail ?vce .
?vce a vcard:Email, vcard:Work .
?vce vcard:email ?obj .
"""
},
"telephone": {
"where": """
?subj obo:ARG_2000028 ?vc .
?vc a vcard:Kind .
?vc vcard:hasTelephone ?vct .
?vct a vcard:Telephone .
?vct vcard:telephone ?obj .
"""
},
"address": {
"where": """
?subj obo:ARG_2000028 ?vc .
?vc a vcard:Kind .
?vc vcard:hasAddress ?obj .
""",
"definition": {
"where": "?subj a vcard:Address .",
"fields": {
"address": {
"where": "?subj vcard:streetAddress ?obj ."
},
"city": {
"where": "?subj vcard:locality ?obj ."
},
"state": {
"where": "?subj vcard:region ?obj ."
},
"zip": {
"where": "?subj vcard:postalCode ?obj ."
}
}
}
},
"website": {
"list": True,
"where": """
?subj obo:ARG_2000028 ?vc .
?vc a vcard:Kind .
?vc vcard:hasURL ?vcu .
?vcu a vcard:URL .
?vcu vcard:url ?obj .
""",
"optional": True
},
"researchArea": {
"where": """
?subj vivo:hasResearchArea ?ra .
?ra rdfs:label ?obj .
""",
"optional": True,
"list": True
},
"geographicFocus": {
"where": """
?subj vivo:geographicFocus ?gf .
?gf rdfs:label ?obj .
""",
"optional": True,
"list": True
},
"overview": {
"where": "?subj vivo:overview ?obj .",
"optional": True,
},
"positions": {
"where": "?subj vivo:relatedBy ?obj .",
"definition": {
"where": "?subj a vivo:Position .",
"fields": {
"title": {
"where": "?subj rdfs:label ?obj ."
},
"organization": {
"where": "?subj vivo:relates ?obj .",
"definition": organization_summmary_definition
}
}
},
"optional": True,
"list": True
},
"publications": {
"where": """
?subj vivo:relatedBy ?aship .
?aship a vivo:Authorship .
?aship vivo:relates ?obj .
""",
"definition": document_summary_definition,
"optional": True,
"list": True
}
}
}
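The nested dict above describes, for each output field, the SPARQL graph pattern that produces it. As a rough illustration of how a flat definition could be compiled into a single SELECT (this is not the actual vivo2notld query builder; `build_select` and its variable-substitution scheme are assumptions of this sketch, which skips nested definitions and list fields):

```python
def build_select(definition, subject_uri):
    # Naively compile top-level scalar fields into one SPARQL SELECT.
    # Nested definitions and list fields are skipped in this sketch.
    patterns = [definition["where"].replace("?subj", subject_uri)]
    select_vars = []
    for name, spec in definition["fields"].items():
        if "definition" in spec or spec.get("list"):
            continue
        var = "?" + name
        clause = spec["where"].replace("?subj", subject_uri).replace("?obj", var)
        if spec.get("optional"):
            clause = "OPTIONAL { %s }" % clause
        patterns.append(clause)
        select_vars.append(var)
    return "SELECT %s WHERE { %s }" % (" ".join(select_vars), " ".join(patterns))


tiny = {
    "where": "?subj a foaf:Person .",
    "fields": {
        "name": {"where": "?subj rdfs:label ?obj ."},
        "overview": {"where": "?subj vivo:overview ?obj .", "optional": True},
    },
}
query = build_select(tiny, "<http://example.org/person1>")
assert query.startswith("SELECT ?name ?overview WHERE {")
assert "OPTIONAL { <http://example.org/person1> vivo:overview ?overview . }" in query
```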
# docs/updatedoc.py (JukeboxPipeline/jukedj, BSD-3-Clause)
#!/usr/bin/env python
"""Builds the documentation. First it runs gendoc to create rst files for the source code. Then it runs sphinx make.
.. Warning:: This will delete the content of the output directory first! So you might loose data.
You can use updatedoc.py -nod.
Usage, just call::
updatedoc.py -h
"""
import argparse
import os
import shutil
import sys
import gendoc
thisdir = os.path.abspath(os.path.dirname(__file__))
def setup_argparse():
"""Sets up the argument parser and returns it
:returns: the parser
:rtype: :class:`argparse.ArgumentParser`
:raises: None
"""
parser = argparse.ArgumentParser(
        description="Builds the documentation. First it runs gendoc to create rst files\
for the source code. Then it runs sphinx make.\
WARNING: this will delete the contents of the output dirs. You can use -nod.")
ipath = os.path.join(thisdir, '../src')
ipath = os.path.abspath(ipath)
idefault = [ipath]
parser.add_argument('-i', '--input', nargs='+', default=idefault,
help='list of input directories. gendoc is called for every\
source dir.\
Default is \'%s\'.' % ', '.join(idefault))
opath = os.path.join(thisdir, 'reference')
opath = os.path.abspath(opath)
odefault = [opath]
parser.add_argument('-o', '--output', nargs='+', default=odefault,
help='list of output directories. if you have multiple source\
                        directories, the corresponding output directory is used.\
if there are less dirs than for source, the last output dir\
is used for the remaining source dirs.\
WARNING: the output directories are emptied by default. See -nod.\
Default is \'%s\'.' % ', '.join(odefault))
gadefault = ['-T', '-f', '-e', '-o']
parser.add_argument('-ga', '--gendocargs', nargs='*', default=gadefault,
help="list of arguments to pass to gendoc. use -gh for info.\
Default is \'%s\'" % ', '.join(gadefault))
parser.add_argument('-nod', '--nodelete', action='store_true',
help='Do not empty the output directories first.')
parser.add_argument('-gh', '--gendochelp', action='store_true',
help='print the help for gendoc and exit')
return parser
def prepare_dir(directory, delete=True):
"""Create apidoc dir, delete contents if delete is True.
:param directory: the apidoc directory. you can use relative paths here
:type directory: str
:param delete: if True, deletes the contents of apidoc. This acts like an override switch.
:type delete: bool
:returns: None
:rtype: None
:raises: None
"""
if os.path.exists(directory):
if delete:
assert directory != thisdir, 'Trying to delete docs! Specify other output dir!'
print 'Deleting %s' % directory
shutil.rmtree(directory)
print 'Creating %s' % directory
os.mkdir(directory)
else:
print 'Creating %s' % directory
os.mkdir(directory)
def run_gendoc(source, dest, args):
"""Starts gendoc which reads source and creates rst files in dest with the given args.
:param source: The python source directory for gendoc. Can be a relative path.
:type source: str
:param dest: The destination for the rst files. Can be a relative path.
:type dest: str
:param args: Arguments for gendoc. See gendoc for more information.
:type args: list
:returns: None
:rtype: None
:raises: SystemExit
"""
args.insert(0, 'gendoc.py')
args.append(dest)
args.append(source)
print 'Running gendoc.main with: %s' % args
gendoc.main(args)
def main(argv=sys.argv[1:]):
"""Parse commandline arguments and run the tool
:param argv: the commandline arguments.
:type argv: list
:returns: None
:rtype: None
:raises: None
"""
parser = setup_argparse()
args = parser.parse_args(argv)
if args.gendochelp:
sys.argv[0] = 'gendoc.py'
genparser = gendoc.setup_parser()
genparser.print_help()
sys.exit(0)
print 'Preparing output directories'
print '='*80
for odir in args.output:
prepare_dir(odir, not args.nodelete)
print '\nRunning gendoc'
print '='*80
for i, idir in enumerate(args.input):
if i >= len(args.output):
odir = args.output[-1]
else:
odir = args.output[i]
run_gendoc(idir, odir, args.gendocargs)
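The input-to-output pairing rule used in main above (when input dirs outnumber output dirs, the last output dir absorbs the surplus) can be sketched in isolation; `pair_dirs` is a hypothetical helper for illustration, not part of this script:

```python
def pair_dirs(inputs, outputs):
    """Pair each input dir with an output dir; the last output
    dir is reused for any surplus input dirs."""
    return [(idir, outputs[min(i, len(outputs) - 1)])
            for i, idir in enumerate(inputs)]

# Three source dirs, two output dirs: 'c' falls back to 'y'.
pairs = pair_dirs(['a', 'b', 'c'], ['x', 'y'])
```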
if __name__ == '__main__':
main() | 35.488722 | 115 | 0.613559 | 594 | 4,720 | 4.828283 | 0.314815 | 0.014644 | 0.029637 | 0.014644 | 0.154812 | 0.154812 | 0.106695 | 0.079498 | 0.079498 | 0.079498 | 0 | 0.002636 | 0.276695 | 4,720 | 133 | 116 | 35.488722 | 0.837434 | 0.004237 | 0 | 0.105263 | 0 | 0 | 0.116201 | 0 | 0 | 0 | 0 | 0 | 0.013158 | 0 | null | null | 0.013158 | 0.065789 | null | null | 0.131579 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1379138cdd6c153ab5075c9fd6e443c52181da72 | 4,618 | py | Python | BridgeOptimizer/scriptBuilder/ScriptBuilderBoundaryConditions.py | manuel1618/bridgeOptimizer | 273bbf27b2c6273e4aaca55debbd9a10bebf7042 | [
"MIT"
] | 1 | 2022-01-20T16:30:04.000Z | 2022-01-20T16:30:04.000Z | BridgeOptimizer/scriptBuilder/ScriptBuilderBoundaryConditions.py | manuel1618/bridgeOptimizer | 273bbf27b2c6273e4aaca55debbd9a10bebf7042 | [
"MIT"
] | 13 | 2022-01-07T14:07:15.000Z | 2022-01-29T19:42:48.000Z | BridgeOptimizer/scriptBuilder/ScriptBuilderBoundaryConditions.py | manuel1618/bridgeOptimizer | 273bbf27b2c6273e4aaca55debbd9a10bebf7042 | [
"MIT"
] | null | null | null | import os
from typing import List, Tuple
from BridgeOptimizer.datastructure.hypermesh.LoadCollector import LoadCollector
from BridgeOptimizer.datastructure.hypermesh.LoadStep import LoadStep
from BridgeOptimizer.datastructure.hypermesh.Force import Force
from BridgeOptimizer.datastructure.hypermesh.SPC import SPC
class ScriptBuilderBoundaryConditions:
"""
Extra class for generating Loadstep, Loadcollectors, Forces and Constraints
Parameters:
---------
None
"""
def __init__(self) -> None:
pass
def write_tcl_commands_loadCollectors(self, tcl_commands: List) -> None:
"""
Creates all the load collectors (has to be done before creating loadsteps, as the loadcollectors are referenced)
"""
load_collector: LoadCollector = None
# create all load collectors and loads first
for load_collector in LoadCollector.instances:
load_collector_type = load_collector.get_load_collector_type()
load_collector.name = f"{str(load_collector_type.__name__)}_{str(load_collector.get_id())}"
tcl_commands.append(
f"*createentity loadcols includeid=0 name=\"{load_collector.name}\"")
# create loads
for load in load_collector.loads:
if load_collector_type == Force:
force: Force = load
tcl_commands.append(
f"*createmark nodes 1 {' '.join([str(x) for x in force.nodeIds])}")
tcl_commands.append(
f"*loadcreateonentity_curve nodes 1 1 1 {force.x} {force.y} {force.z} 0 {force.x} {force.y} {force.z} 0 0 0 0")
elif load_collector_type == SPC:
spc: SPC = load
tcl_commands.append(
f"*createmark nodes 1 {' '.join([str(x) for x in spc.nodeIds])}")
tcl_commands.append(
f"*loadcreateonentity_curve nodes 1 3 1 {spc.dofs[0]} {spc.dofs[1]} {spc.dofs[2]} {spc.dofs[3]} {spc.dofs[4]} {spc.dofs[5]} 0 0 0 0 0")
tcl_commands.append("*createmark loads 0 1")
tcl_commands.append("*loadsupdatefixedvalue 0 0")
def write_tcl_commands_loadsteps(self, tcl_commands: List) -> None:
"""
Single method to write all tcl commands to the file
"""
self.write_tcl_commands_loadCollectors(tcl_commands)
# create the load step
load_step: LoadStep = None
for load_step in LoadStep.instances:
load_step_id = str(load_step.get_id())
# TODO: should be possible to just use a spc collector - not possible rn.
spc_loadCollector = load_step.spc_loadCollector
load_loadCollector = load_step.load_loadCollector
spc_loadCollector_id = str(spc_loadCollector.get_id())
load_loadCollector_id = str(load_loadCollector.get_id())
tcl_commands.append(
f"*createmark loadcols 1 \"{spc_loadCollector.name}\" \"{load_loadCollector.name}\"")
tcl_commands.append("*createmark outputblocks 1")
tcl_commands.append("*createmark groups 1")
tcl_commands.append(
f"*loadstepscreate \"loadstep_{load_step_id}\" 1")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 4143 1 1 0 1")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 4709 1 1 0 1")
tcl_commands.append(
f"*setvalue loadsteps id={load_step_id} STATUS=2 4059=1 4060=STATICS")
tcl_commands.append(
f"*attributeupdateentity loadsteps {load_step_id} 4145 1 1 0 loadcols {spc_loadCollector_id}")
tcl_commands.append(
f"*attributeupdateentity loadsteps {load_step_id} 4147 1 1 0 loadcols {load_loadCollector_id}")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 3800 1 1 0 0")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 707 1 1 0 0")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 2396 1 1 0 0")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 8134 1 1 0 0")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 2160 1 1 0 0")
tcl_commands.append(
f"*attributeupdateint loadsteps {load_step_id} 10212 1 1 0 0")
| 47.122449 | 159 | 0.622347 | 546 | 4,618 | 5.053114 | 0.20696 | 0.115622 | 0.135556 | 0.117434 | 0.39942 | 0.336716 | 0.30917 | 0.295397 | 0.286336 | 0.185212 | 0 | 0.036947 | 0.290819 | 4,618 | 97 | 160 | 47.608247 | 0.805496 | 0.090732 | 0 | 0.264706 | 0 | 0.029412 | 0.318204 | 0.055097 | 0 | 0 | 0 | 0.010309 | 0 | 1 | 0.044118 | false | 0.014706 | 0.088235 | 0 | 0.147059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
137f6361d1e175bc555153af22f77e79ad507096 | 369 | py | Python | dataset/dataset.py | TeamOfProfGuo/few_shot_baseline | f9ac87b9d309fc417589350d3ce61d3612e2be91 | [
"MIT"
] | null | null | null | dataset/dataset.py | TeamOfProfGuo/few_shot_baseline | f9ac87b9d309fc417589350d3ce61d3612e2be91 | [
"MIT"
] | null | null | null | dataset/dataset.py | TeamOfProfGuo/few_shot_baseline | f9ac87b9d309fc417589350d3ce61d3612e2be91 | [
"MIT"
] | null | null | null | import os
DEFAULT_ROOT = './materials'
datasets_dt = {}
def register(name):
def decorator(cls):
datasets_dt[name] = cls
return cls
return decorator
def make(name, **kwargs):
if kwargs.get('root_path') is None:
kwargs['root_path'] = os.path.join(DEFAULT_ROOT, name)
dataset = datasets_dt[name](**kwargs)
return dataset
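For illustration, a self-contained sketch of how the registry above is meant to be used; `ToyDataset` and its `split` keyword are hypothetical:

```python
import os

DEFAULT_ROOT = './materials'
datasets_dt = {}

def register(name):
    def decorator(cls):
        datasets_dt[name] = cls
        return cls
    return decorator

def make(name, **kwargs):
    if kwargs.get('root_path') is None:
        kwargs['root_path'] = os.path.join(DEFAULT_ROOT, name)
    return datasets_dt[name](**kwargs)

@register('toy')
class ToyDataset:
    def __init__(self, root_path=None, split='train'):
        self.root_path = root_path
        self.split = split

# make() fills in root_path from DEFAULT_ROOT when it is not given.
ds = make('toy', split='val')
```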
| 17.571429 | 62 | 0.642276 | 48 | 369 | 4.791667 | 0.458333 | 0.130435 | 0.121739 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.233062 | 369 | 20 | 63 | 18.45 | 0.812721 | 0 | 0 | 0 | 0 | 0 | 0.078804 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.230769 | false | 0 | 0.076923 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13836c2a14bcd63d7cbfba39a7fecc1ae843d691 | 7,087 | py | Python | backend/jenkins/pipelines/ansible/utils/testplan_gen.py | gbl1124/hfrd | 327d7c1e18704d2e31a2649b40ae1d90353ebe24 | [
"Apache-2.0"
] | 5 | 2019-08-02T20:53:57.000Z | 2021-06-25T05:16:46.000Z | backend/jenkins/pipelines/ansible/utils/testplan_gen.py | anandbanik/hfrd | 7bc1f13bfc9c7d902aec0363d27b089ef68c7eec | [
"Apache-2.0"
] | null | null | null | backend/jenkins/pipelines/ansible/utils/testplan_gen.py | anandbanik/hfrd | 7bc1f13bfc9c7d902aec0363d27b089ef68c7eec | [
"Apache-2.0"
] | 14 | 2019-07-01T01:40:50.000Z | 2020-03-24T06:14:32.000Z |
#!/usr/bin/python
import yaml
import os
import ast
import sys
from collections import OrderedDict
curr_dir = os.getcwd()
work_dir = sys.argv[1]
network_type = sys.argv[2]
testplan_dict = {}
testplan_dict["name"] = "System performance test"
testplan_dict["description"] = "This test is to create as much chaincode computation load as possible"
testplan_dict["runid"] = "RUNID_HERE"
if network_type == "ibp":
testplan_dict["networkid"] = sys.argv[3]
testplan_dict["collectFabricMetrics"] = False
testplan_dict["storageclass"] = "default"
testplan_dict["saveLog"] = False
testplan_dict["continueAfterFail"] = True
testplan_dict["tests"] = []
testplan_dict["peernodeAlias"] =[]
if os.path.exists(work_dir) != True:
print 'certs keyfiles directory does not exist'
exit(1)
# Load template file
with open(curr_dir + "/templates/testplan_template.yml", 'r') as stream:
template = yaml.load(stream)
channel_create = template["CHANNEL_CREATE"]
# channel_join = template["CHANNEL_JOIN"]
chaincode_install = template["CHAINCODE_INSTALL"]
chaincode_instantiate = template["CHAINCODE_INSTANTIATE"]
chaincode_invoke = template["CHAINCODE_INVOKE"]
execute_command = template["EXECUTE_COMMAND"]
connectionProfile = {}
org_list = []
org_list_lowercase = []
orderer_list = []
peer_list = []
org_peers_dict = {}
org_anchor_dict = {}
allAnchor_list = []
# Load connection profile
for orgName in os.listdir(work_dir + '/keyfiles'):
if os.path.isfile(work_dir + '/keyfiles/' + orgName + '/connection.yml'):
with open(work_dir + '/keyfiles/' + orgName + '/connection.yml', 'r') as stream:
connectionProfile = yaml.load(stream)
if connectionProfile["orderers"] is None:
continue
orderer_list = orderer_list + connectionProfile["orderers"].keys()
if (connectionProfile["organizations"][orgName.lower()]["peers"] != None):
org_list.append(orgName)
org_list_lowercase.append(orgName.lower())
org_peers_dict[orgName] = connectionProfile["organizations"][orgName.lower(
)]["peers"]
peer_list = peer_list + \
connectionProfile["organizations"][orgName.lower(
)]["peers"]
org_anchor_dict[orgName] = sorted(
connectionProfile["organizations"][orgName.lower(
)]["peers"])[0]
# When there is only peer or orderer, we skip tests.
if len(orderer_list) == 0 or len(peer_list) == 0:
outputfile = open(work_dir + '/testplan_example.yml', 'w')
outputfile.write("")
outputfile.close()
exit(0)
orderer_list = list(OrderedDict.fromkeys(orderer_list))
peer_list = list(OrderedDict.fromkeys(peer_list))
for orgName in org_list:
tempOrgAnchorObj = {}
tempOrgAnchorObj[orgName+"Anchor"] = org_anchor_dict[orgName]
testplan_dict["peernodeAlias"].append(tempOrgAnchorObj)
tempOrgPeersObj = {}
tempOrgPeersObj[orgName+"Peers"] = ','.join(org_peers_dict[orgName])
testplan_dict["peernodeAlias"].append(tempOrgPeersObj)
allAnchor_list.append(org_anchor_dict[orgName])
testplan_dict["peernodeAlias"].append({"allAnchors":','.join(allAnchor_list)})
testplan_dict["peernodeAlias"].append({"allPeers":','.join(peer_list)})
print 'org list: '
print org_list_lowercase
print 'orderer_list: '
print orderer_list
print 'peer_list: '
print peer_list
print 'allAnchor_list'
print allAnchor_list
# CREATE_CHANNEL
channel_create["parameters"]["connectionProfile"] = org_list[0]
if network_type == 'cello':
channel_create["parameters"]["channelConsortium"] = 'FabricConsortium'
else:
channel_create["parameters"]["channelConsortium"] = 'SampleConsortium'
channel_create["parameters"]["channelOrgs"] = ','.join(org_list_lowercase)
channel_create["parameters"]["ordererName"] = orderer_list[0]
testplan_dict["tests"].append(channel_create)
# JOIN_CHANNEL and INSTALL_CHAINCODE
join_list = []
install_list = []
for org in org_list:
channel_join = template["CHANNEL_JOIN"]
channel_join["parameters"]["connectionProfile"] = org
channel_join["parameters"]["peers"] = ','.join(org_peers_dict[org])
channel_join["parameters"]["ordererName"] = orderer_list[0]
join_list.append(str(channel_join))
# CHAINCODE_INSTALL
chaincode_install["parameters"]["connectionProfile"] = org
chaincode_install["parameters"]["peers"] = ','.join(org_peers_dict[org])
install_list.append(str(chaincode_install))
for join_org in join_list:
join_item = ast.literal_eval(join_org)
testplan_dict["tests"].append(join_item)
for install_org in install_list:
install_item = ast.literal_eval(install_org)
testplan_dict["tests"].append(install_item)
# CHAINCODE_INSTANTIATE
chaincode_instantiate["parameters"]["connectionProfile"] = org_list[0]
chaincode_instantiate["parameters"]["peers"] = ','.join(peer_list)
# CHAINCODE_INVOKE
# Invoke with fixed transaction count : 100
chaincode_invoke["iterationCount"] = '100'
chaincode_invoke["parameters"]["connectionProfile"] = org_list[0]
chaincode_invoke["parameters"]["peers"] = ','.join(peer_list)
chaincoode_invoke_count = str(chaincode_invoke)
# Invoke with fixed running duration : 0 hour 10 minutes 0 second.
# And enable running tests parallel by setting waitUntilFinish to true
chaincode_invoke["iterationCount"] = '0h10m0s'
chaincode_invoke["waitUntilFinish"] = False
chaincoode_invoke_time = str(chaincode_invoke)
# Invoke with fixed running duration : 0 hour 10 minutes 0 second
chaincode_invoke["iterationCount"] = '0h10m0s'
chaincode_invoke["parameters"]["peers"] = peer_list[0]
chaincoode_invoke_parallel = str(chaincode_invoke)
testplan_dict["tests"].append(chaincode_instantiate)
testplan_dict["tests"].append(ast.literal_eval(chaincoode_invoke_count))
testplan_dict["tests"].append(ast.literal_eval(chaincoode_invoke_time))
testplan_dict["tests"].append(ast.literal_eval(chaincoode_invoke_parallel))
# Execute command with default images
testplan_dict["tests"].append(ast.literal_eval(str(execute_command)))
# Execute command with customized image
execute_command["name"] = "execute-command-with-customized-image"
execute_command["container"] = "user/ownimage"
testplan_dict["tests"].append(ast.literal_eval(str(execute_command)))
connYamlStr = yaml.dump(testplan_dict, default_flow_style=False)
tempstr = connYamlStr
for orgName in org_list:
tempstr = tempstr.replace(orgName+"Anchor:",orgName+"Anchor: &"+orgName+"Anchor")
tempstr = tempstr.replace(orgName+"Peers:",orgName+"Peers: &"+orgName+"Peers")
tempstr = tempstr.replace("allAnchors:","allAnchors: &allAnchors")
tempstr = tempstr.replace("allPeers:","allPeers: &allPeers")
tempstr = tempstr.replace("runid:","runid: &runid")
if network_type == "ibp":
tempstr = tempstr.replace("networkid:","networkid: &networkid")
# Dump testplan file
outputfile = open(work_dir + '/testplan_example.yml', 'w')
outputfile.write(tempstr)
outputfile.close()
| 39.372222 | 102 | 0.719769 | 813 | 7,087 | 6.04305 | 0.200492 | 0.061062 | 0.034602 | 0.042133 | 0.337065 | 0.220029 | 0.158559 | 0.125585 | 0.104824 | 0.072461 | 0 | 0.005958 | 0.147453 | 7,087 | 179 | 103 | 39.592179 | 0.807183 | 0.083251 | 0 | 0.115108 | 0 | 0 | 0.217935 | 0.020374 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.035971 | null | null | 0.064748 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
1383ec6b114d686bf9cab5e588bcd0ec41143a37 | 1,033 | py | Python | dblib/test_lib.py | cyber-fighters/dblib | 9743122a55bc265f7551dd9283f381678b2703e4 | [
"MIT"
] | null | null | null | dblib/test_lib.py | cyber-fighters/dblib | 9743122a55bc265f7551dd9283f381678b2703e4 | [
"MIT"
] | 1 | 2019-02-25T09:52:31.000Z | 2019-02-25T09:52:31.000Z | dblib/test_lib.py | cyber-fighters/dblib | 9743122a55bc265f7551dd9283f381678b2703e4 | [
"MIT"
] | null | null | null | """Collection of tests."""
import pytest
import dblib.lib
f0 = dblib.lib.Finding('CD spook', 'my_PC', 'The CD drive is missing.')
f1 = dblib.lib.Finding('Unplugged', 'my_PC', 'The power cord is unplugged.')
f2 = dblib.lib.Finding('Monitor switched off', 'my_PC', 'The monitor is switched off.')
def test_add_remove():
"""Test function."""
db = dblib.lib.BackyardDB()
# regular cases
db.add(f0)
assert f0 in db.findings
assert len(db.findings) == 1
db.add(f1)
assert f1 in db.findings
assert len(db.findings) == 2
db.add(f2)
assert f2 in db.findings
assert len(db.findings) == 3
db.add(None)
assert len(db.findings) == 3
db.remove(f1)
assert f1 not in db.findings
assert len(db.findings) == 2
# test exceptions
with pytest.raises(TypeError):
db.add(1)
def test_update():
"""Test function."""
db = dblib.lib.BackyardDB()
db.add(f0)
db.add(f1)
db.update(f1, f2)
assert f2 in db.findings
assert len(db.findings) == 2
| 23.477273 | 87 | 0.629235 | 158 | 1,033 | 4.075949 | 0.303797 | 0.170807 | 0.102484 | 0.177019 | 0.414596 | 0.414596 | 0.276398 | 0.228261 | 0.127329 | 0.127329 | 0 | 0.028786 | 0.226525 | 1,033 | 43 | 88 | 24.023256 | 0.777222 | 0.078412 | 0 | 0.433333 | 0 | 0 | 0.141176 | 0 | 0 | 0 | 0 | 0 | 0.366667 | 1 | 0.066667 | false | 0 | 0.066667 | 0 | 0.133333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
138922f3a893ab484911754fbdc916b94b521606 | 1,341 | py | Python | tests/input_files/full_sm_UFO/function_library.py | valassi/mg5amc_test | 2e04f23353051f64e1604b23105fe3faabd32869 | [
"NCSA"
] | 1 | 2016-07-09T00:05:56.000Z | 2016-07-09T00:05:56.000Z | tests/input_files/full_sm_UFO/function_library.py | valassi/mg5amc_test | 2e04f23353051f64e1604b23105fe3faabd32869 | [
"NCSA"
] | 4 | 2022-03-10T09:13:31.000Z | 2022-03-30T16:15:01.000Z | tests/input_files/full_sm_UFO/function_library.py | valassi/mg5amc_test | 2e04f23353051f64e1604b23105fe3faabd32869 | [
"NCSA"
] | 1 | 2016-07-09T00:06:15.000Z | 2016-07-09T00:06:15.000Z | # This file is part of the UFO.
#
# This file contains definitions for functions that
# are extensions of the cmath library, and correspond
# either to functions that are in cmath, but inconvenient
# to access from there (e.g. z.conjugate()),
# or functions that are simply not defined.
#
#
from __future__ import absolute_import
__date__ = "22 July 2010"
__author__ = "claude.duhr@durham.ac.uk"
import cmath
from .object_library import all_functions, Function
#
# shortcuts for functions from cmath
#
complexconjugate = Function(name = 'complexconjugate',
arguments = ('z',),
expression = 'z.conjugate()')
re = Function(name = 're',
arguments = ('z',),
expression = 'z.real')
im = Function(name = 'im',
arguments = ('z',),
expression = 'z.imag')
# New functions (trigonometric)
sec = Function(name = 'sec',
arguments = ('z',),
expression = '1./cmath.cos(z)')
asec = Function(name = 'asec',
arguments = ('z',),
expression = 'cmath.acos(1./z)')
csc = Function(name = 'csc',
arguments = ('z',),
expression = '1./cmath.sin(z)')
acsc = Function(name = 'acsc',
arguments = ('z',),
expression = 'cmath.asin(1./z)')
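As a quick numerical sanity check of the shortcut definitions above, the expression strings can be evaluated directly with cmath; the test points below are arbitrary:

```python
import cmath

z = 0.3 + 0.4j          # arbitrary complex test point
w = 2.5 + 0.1j          # test point for the inverse functions

# complexconjugate / re / im correspond to the expressions above
assert (z.conjugate(), z.real, z.imag) == (0.3 - 0.4j, 0.3, 0.4)

# asec and acsc invert sec and csc: sec(asec(w)) == w
sec_of_asec = 1. / cmath.cos(cmath.acos(1. / w))
csc_of_acsc = 1. / cmath.sin(cmath.asin(1. / w))
assert abs(sec_of_asec - w) < 1e-9
assert abs(csc_of_acsc - w) < 1e-9
```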
| 23.946429 | 57 | 0.57047 | 149 | 1,341 | 5.033557 | 0.469799 | 0.112 | 0.186667 | 0.084 | 0.069333 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010549 | 0.293065 | 1,341 | 55 | 58 | 24.381818 | 0.780591 | 0.251305 | 0 | 0.269231 | 0 | 0 | 0.166329 | 0.024341 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.115385 | 0 | 0.115385 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
138ad53bc75698fb0a04af0266ae508da388a981 | 6,057 | py | Python | nevergrad/parametrization/utils.py | mehrdad-shokri/nevergrad | 7b68b00c158bf60544bc45997560edf733fb5812 | [
"MIT"
] | 2 | 2021-04-13T12:14:46.000Z | 2021-07-07T14:37:50.000Z | nevergrad/parametrization/utils.py | OregonWebSells/nevergrad | c2b2a0efdca29830ccc9182d8a7ba4d8695f698d | [
"MIT"
] | 1 | 2020-09-25T10:45:06.000Z | 2020-09-25T11:51:13.000Z | nevergrad/parametrization/utils.py | OregonWebSells/nevergrad | c2b2a0efdca29830ccc9182d8a7ba4d8695f698d | [
"MIT"
] | 1 | 2021-04-07T10:34:20.000Z | 2021-04-07T10:34:20.000Z | # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved.
#
# This source code is licensed under the MIT license found in the
# LICENSE file in the root directory of this source tree.
import os
import sys
import shutil
import tempfile
import subprocess
import typing as tp
from pathlib import Path
from nevergrad.common import tools as ngtools
class Descriptors:
"""Provides access to a set of descriptors for the parametrization
This can be used within optimizers.
""" # TODO add repr
# pylint: disable=too-many-arguments
def __init__(
self,
deterministic: bool = True,
deterministic_function: bool = True,
monoobjective: bool = True,
not_manyobjective: bool = True,
continuous: bool = True,
metrizable: bool = True,
ordered: bool = True,
) -> None:
self.deterministic = deterministic
self.deterministic_function = deterministic_function
self.continuous = continuous
self.metrizable = metrizable
self.ordered = ordered
self.monoobjective = monoobjective
self.not_manyobjective = not_manyobjective
def __and__(self, other: "Descriptors") -> "Descriptors":
values = {field: getattr(self, field) & getattr(other, field) for field in self.__dict__}
return Descriptors(**values)
def __repr__(self) -> str:
diff = ",".join(f"{x}={y}" for x, y in sorted(ngtools.different_from_defaults(instance=self, check_mismatches=True).items()))
return f"{self.__class__.__name__}({diff})"
class NotSupportedError(RuntimeError):
"""This type of operation is not supported by the parameter.
"""
class TemporaryDirectoryCopy(tempfile.TemporaryDirectory): # type: ignore
"""Creates a full copy of a directory inside a temporary directory
This class can be used as TemporaryDirectory but:
- the created copy path is available through the copyname attribute
- the contextmanager returns the clean copy path
- the directory where the temporary directory will be created
can be controlled through the CLEAN_COPY_DIRECTORY environment
variable
"""
key = "CLEAN_COPY_DIRECTORY"
@classmethod
def set_clean_copy_environment_variable(cls, directory: tp.Union[Path, str]) -> None:
"""Sets the CLEAN_COPY_DIRECTORY environment variable in
order for subsequent calls to use this directory as base for the
copies.
"""
assert Path(directory).exists(), "Directory does not exist"
os.environ[cls.key] = str(directory)
# pylint: disable=redefined-builtin
def __init__(self, source: tp.Union[Path, str], dir: tp.Optional[tp.Union[Path, str]] = None) -> None:
if dir is None:
dir = os.environ.get(self.key, None)
super().__init__(prefix="tmp_clean_copy_", dir=dir)
self.copyname = Path(self.name) / Path(source).name
shutil.copytree(str(source), str(self.copyname))
def __enter__(self) -> Path:
super().__enter__()
return self.copyname
class FailedJobError(RuntimeError):
"""Job failed during processing
"""
class CommandFunction:
"""Wraps a command as a function in order to make sure it goes through the
pipeline and notify when it is finished.
The output is a string containing everything that has been sent to stdout
Parameters
----------
command: list
command to run, as a list
verbose: bool
prints the command and stdout at runtime
cwd: Path/str
path to the location where the command must run from
Returns
-------
str
Everything that has been sent to stdout
"""
def __init__(self, command: tp.List[str], verbose: bool = False, cwd: tp.Optional[tp.Union[str, Path]] = None,
env: tp.Optional[tp.Dict[str, str]] = None) -> None:
if not isinstance(command, list):
raise TypeError("The command must be provided as a list")
self.command = command
self.verbose = verbose
self.cwd = None if cwd is None else str(cwd)
self.env = env
def __call__(self, *args: tp.Any, **kwargs: tp.Any) -> str:
"""Call the cammand line with addidional arguments
The keyword arguments will be sent as --{key}={val}
The logs are bufferized. They will be printed if the job fails, or sent as output of the function
Errors are provided with the internal stderr
"""
# TODO make the following command more robust (probably fails in multiple cases)
full_command = self.command + [str(x) for x in args] + ["--{}={}".format(x, y) for x, y in kwargs.items()]
if self.verbose:
print(f"The following command is sent: {full_command}")
outlines: tp.List[str] = []
with subprocess.Popen(full_command, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
shell=False, cwd=self.cwd, env=self.env) as process:
try:
assert process.stdout is not None
for line in iter(process.stdout.readline, b''):
if not line:
break
outlines.append(line.decode().strip())
if self.verbose:
print(outlines[-1], flush=True)
except Exception: # pylint: disable=broad-except
process.kill()
process.wait()
raise FailedJobError("Job got killed for an unknown reason.")
stderr = process.communicate()[1] # we already got stdout
stdout = "\n".join(outlines)
retcode = process.poll()
if stderr and (retcode or self.verbose):
print(stderr.decode(), file=sys.stderr)
if retcode:
subprocess_error = subprocess.CalledProcessError(retcode, process.args, output=stdout, stderr=stderr)
raise FailedJobError(stderr.decode()) from subprocess_error
return stdout
| 38.826923 | 133 | 0.639591 | 744 | 6,057 | 5.106183 | 0.34543 | 0.014741 | 0.008687 | 0.011056 | 0.052645 | 0.043169 | 0.017373 | 0 | 0 | 0 | 0 | 0.000453 | 0.270596 | 6,057 | 155 | 134 | 39.077419 | 0.859439 | 0.297177 | 0 | 0.023256 | 0 | 0 | 0.061914 | 0.00814 | 0 | 0 | 0 | 0.012903 | 0.023256 | 1 | 0.093023 | false | 0 | 0.093023 | 0 | 0.302326 | 0.034884 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
138b01aa9774bbead45a8dac1264c5149cf9f912 | 568 | py | Python | Section 20/2.Document-transfer_files.py | airbornum/-Complete-Python-Scripting-for-Automation | bc053444f8786259086269ca1713bdb10144dd74 | [
"MIT"
] | 18 | 2020-04-13T03:14:06.000Z | 2022-03-09T18:54:41.000Z | Section 20/2.Document-transfer_files.py | airbornum/-Complete-Python-Scripting-for-Automation | bc053444f8786259086269ca1713bdb10144dd74 | [
"MIT"
] | null | null | null | Section 20/2.Document-transfer_files.py | airbornum/-Complete-Python-Scripting-for-Automation | bc053444f8786259086269ca1713bdb10144dd74 | [
"MIT"
] | 22 | 2020-04-29T21:12:42.000Z | 2022-03-17T18:19:54.000Z | import paramiko
ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect(hostname='54.165.97.91',username='ec2-user',password='paramiko123',port=22)
sftp_client=ssh.open_sftp()
#sftp_client.get('/home/ec2-user/paramiko_download.txt','paramiko_downloaded_file.txt')
#sftp_client.chdir("/home/ec2-user")
#print(sftp_client.getcwd())
#sftp_client.get('demo.txt','C:\\Users\\Automation\\Desktop\\download_file.txt')
sftp_client.put("transfer_files.py",'/home/ec2-user/transfer_files.py')
sftp_client.close()
ssh.close() | 43.692308 | 88 | 0.769366 | 84 | 568 | 4.988095 | 0.535714 | 0.167064 | 0.078759 | 0.081146 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.033272 | 0.047535 | 568 | 13 | 89 | 43.692308 | 0.74122 | 0.399648 | 0 | 0 | 0 | 0 | 0.245399 | 0.09816 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.125 | 0.125 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
139756e066bb02143bb1f17cfc6e0e0c48ac0c56 | 20,000 | py | Python | tests/hwsim/test_ap_open.py | waittrue/wireless | 3c64f015dc62aec4da0b696f45cc4bcf41594c5d | [
"Unlicense"
] | 1 | 2016-04-22T19:32:57.000Z | 2016-04-22T19:32:57.000Z | tests/hwsim/test_ap_open.py | Acidburn0zzz/third_party-hostap | 0542463c4de76fde6e8164f75b3a52ce0ddd8087 | [
"Unlicense"
] | null | null | null | tests/hwsim/test_ap_open.py | Acidburn0zzz/third_party-hostap | 0542463c4de76fde6e8164f75b3a52ce0ddd8087 | [
"Unlicense"
] | null | null | null | # Open mode AP tests
# Copyright (c) 2014, Qualcomm Atheros, Inc.
#
# This software may be distributed under the terms of the BSD license.
# See README for more details.
import logging
logger = logging.getLogger()
import struct
import subprocess
import time
import os
import hostapd
import hwsim_utils
from tshark import run_tshark
from utils import alloc_fail
from wpasupplicant import WpaSupplicant
def test_ap_open(dev, apdev):
"""AP with open mode (no security) configuration"""
hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
bg_scan_period="0")
    ev = hapd.wait_event([ "AP-STA-CONNECTED" ], timeout=5)
    if ev is None:
        raise Exception("No connection event received from hostapd")
    hwsim_utils.test_connectivity(dev[0], hapd)
    dev[0].request("DISCONNECT")
    ev = hapd.wait_event([ "AP-STA-DISCONNECTED" ], timeout=5)
    if ev is None:
        raise Exception("No disconnection event received from hostapd")

def test_ap_open_packet_loss(dev, apdev):
    """AP with open mode configuration and large packet loss"""
    params = { "ssid": "open",
               "ignore_probe_probability": "0.5",
               "ignore_auth_probability": "0.5",
               "ignore_assoc_probability": "0.5",
               "ignore_reassoc_probability": "0.5" }
    hapd = hostapd.add_ap(apdev[0]['ifname'], params)
    for i in range(0, 3):
        dev[i].connect("open", key_mgmt="NONE", scan_freq="2412",
                       wait_connect=False)
    for i in range(0, 3):
        dev[i].wait_connected(timeout=20)

def test_ap_open_unknown_action(dev, apdev):
    """AP with open mode configuration and unknown Action frame"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")
    bssid = apdev[0]['bssid']
    cmd = "MGMT_TX {} {} freq=2412 action=765432".format(bssid, bssid)
    if "FAIL" in dev[0].request(cmd):
        raise Exception("Could not send test Action frame")
    ev = dev[0].wait_event(["MGMT-TX-STATUS"], timeout=10)
    if ev is None:
        raise Exception("Timeout on MGMT-TX-STATUS")
    if "result=SUCCESS" not in ev:
        raise Exception("AP did not ack Action frame")

def test_ap_open_invalid_wmm_action(dev, apdev):
    """AP with open mode configuration and invalid WMM Action frame"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")
    bssid = apdev[0]['bssid']
    cmd = "MGMT_TX {} {} freq=2412 action=1100".format(bssid, bssid)
    if "FAIL" in dev[0].request(cmd):
        raise Exception("Could not send test Action frame")
    ev = dev[0].wait_event(["MGMT-TX-STATUS"], timeout=10)
    if ev is None or "result=SUCCESS" not in ev:
        raise Exception("AP did not ack Action frame")

def test_ap_open_reconnect_on_inactivity_disconnect(dev, apdev):
    """Reconnect to open mode AP after inactivity related disconnection"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")
    hapd.request("DEAUTHENTICATE " + dev[0].p2p_interface_addr() + " reason=4")
    dev[0].wait_disconnected(timeout=5)
    dev[0].wait_connected(timeout=2, error="Timeout on reconnection")

def test_ap_open_assoc_timeout(dev, apdev):
    """AP timing out association"""
    ssid = "test"
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].scan(freq="2412")
    hapd.set("ext_mgmt_frame_handling", "1")
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
                   wait_connect=False)
    for i in range(0, 10):
        req = hapd.mgmt_rx()
        if req is None:
            raise Exception("MGMT RX wait timed out")
        if req['subtype'] == 11:
            break
        req = None
    if not req:
        raise Exception("Authentication frame not received")
    resp = {}
    resp['fc'] = req['fc']
    resp['da'] = req['sa']
    resp['sa'] = req['da']
    resp['bssid'] = req['bssid']
    resp['payload'] = struct.pack('<HHH', 0, 2, 0)
    hapd.mgmt_tx(resp)
    assoc = 0
    for i in range(0, 10):
        req = hapd.mgmt_rx()
        if req is None:
            raise Exception("MGMT RX wait timed out")
        if req['subtype'] == 0:
            assoc += 1
            if assoc == 3:
                break
    if assoc != 3:
        raise Exception("Association Request frames not received: assoc=%d" % assoc)
    hapd.set("ext_mgmt_frame_handling", "0")
    dev[0].wait_connected(timeout=15)

def test_ap_open_id_str(dev, apdev):
    """AP with open mode and id_str"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412", id_str="foo",
                   wait_connect=False)
    ev = dev[0].wait_connected(timeout=10)
    if "id_str=foo" not in ev:
        raise Exception("CTRL-EVENT-CONNECT did not have matching id_str: " + ev)
    if dev[0].get_status_field("id_str") != "foo":
        raise Exception("id_str mismatch")

def test_ap_open_select_any(dev, apdev):
    """AP with open mode and select any network"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    id = dev[0].connect("unknown", key_mgmt="NONE", scan_freq="2412",
                        only_add_network=True)
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
                   only_add_network=True)
    dev[0].select_network(id)
    ev = dev[0].wait_event(["CTRL-EVENT-NETWORK-NOT-FOUND",
                            "CTRL-EVENT-CONNECTED"], timeout=10)
    if ev is None:
        raise Exception("No result reported")
    if "CTRL-EVENT-CONNECTED" in ev:
        raise Exception("Unexpected connection")
    dev[0].select_network("any")
    dev[0].wait_connected(timeout=10)

def test_ap_open_unexpected_assoc_event(dev, apdev):
    """AP with open mode and unexpected association event"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected(timeout=15)
    dev[0].dump_monitor()
    # This will be accepted due to matching network
    subprocess.call(['iw', 'dev', dev[0].ifname, 'connect', 'open', "2412",
                     apdev[0]['bssid']])
    dev[0].wait_connected(timeout=15)
    dev[0].dump_monitor()
    dev[0].request("REMOVE_NETWORK all")
    dev[0].wait_disconnected(timeout=5)
    dev[0].dump_monitor()
    # This will result in disconnection due to no matching network
    subprocess.call(['iw', 'dev', dev[0].ifname, 'connect', 'open', "2412",
                     apdev[0]['bssid']])
    dev[0].wait_disconnected(timeout=15)

def test_ap_bss_load(dev, apdev):
    """AP with open mode (no security) configuration"""
    hapd = hostapd.add_ap(apdev[0]['ifname'],
                          { "ssid": "open",
                            "bss_load_update_period": "10" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")
    # this does not really get much useful output with mac80211_hwsim currently,
    # but run through the channel survey update couple of times
    for i in range(0, 10):
        hwsim_utils.test_connectivity(dev[0], hapd)
        hwsim_utils.test_connectivity(dev[0], hapd)
        hwsim_utils.test_connectivity(dev[0], hapd)
        time.sleep(0.15)
def hapd_out_of_mem(hapd, apdev, count, func):
    with alloc_fail(hapd, count, func):
        started = False
        try:
            hostapd.add_ap(apdev['ifname'], { "ssid": "open" })
            started = True
        except:
            pass
        if started:
            raise Exception("hostapd interface started even with memory allocation failure: " + func)
def test_ap_open_out_of_memory(dev, apdev):
    """hostapd failing to setup interface due to allocation failure"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    hapd_out_of_mem(hapd, apdev[1], 1, "hostapd_alloc_bss_data")
    for i in range(1, 3):
        hapd_out_of_mem(hapd, apdev[1], i, "hostapd_iface_alloc")
    for i in range(1, 5):
        hapd_out_of_mem(hapd, apdev[1], i, "hostapd_config_defaults;hostapd_config_alloc")
    hapd_out_of_mem(hapd, apdev[1], 1, "hostapd_config_alloc")
    hapd_out_of_mem(hapd, apdev[1], 1, "hostapd_driver_init")
    for i in range(1, 4):
        hapd_out_of_mem(hapd, apdev[1], i, "=wpa_driver_nl80211_drv_init")
    # eloop_register_read_sock() call from i802_init()
    hapd_out_of_mem(hapd, apdev[1], 1, "eloop_sock_table_add_sock;eloop_register_sock;?eloop_register_read_sock;=i802_init")
    # verify that a new interface can still be added when memory allocation does
    # not fail
    hostapd.add_ap(apdev[1]['ifname'], { "ssid": "open" })

def test_bssid_black_white_list(dev, apdev):
    """BSSID black/white list"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    hapd2 = hostapd.add_ap(apdev[1]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bssid_whitelist=apdev[1]['bssid'])
    dev[1].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bssid_blacklist=apdev[1]['bssid'])
    dev[2].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bssid_whitelist="00:00:00:00:00:00/00:00:00:00:00:00",
                   bssid_blacklist=apdev[1]['bssid'])
    if dev[0].get_status_field('bssid') != apdev[1]['bssid']:
        raise Exception("dev[0] connected to unexpected AP")
    if dev[1].get_status_field('bssid') != apdev[0]['bssid']:
        raise Exception("dev[1] connected to unexpected AP")
    if dev[2].get_status_field('bssid') != apdev[0]['bssid']:
        raise Exception("dev[2] connected to unexpected AP")
    dev[0].request("REMOVE_NETWORK all")
    dev[1].request("REMOVE_NETWORK all")
    dev[2].request("REMOVE_NETWORK all")
    dev[2].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bssid_whitelist="00:00:00:00:00:00", wait_connect=False)
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bssid_whitelist="11:22:33:44:55:66/ff:00:00:00:00:00 " + apdev[1]['bssid'] + " aa:bb:cc:dd:ee:ff")
    dev[1].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bssid_blacklist="11:22:33:44:55:66/ff:00:00:00:00:00 " + apdev[1]['bssid'] + " aa:bb:cc:dd:ee:ff")
    if dev[0].get_status_field('bssid') != apdev[1]['bssid']:
        raise Exception("dev[0] connected to unexpected AP")
    if dev[1].get_status_field('bssid') != apdev[0]['bssid']:
        raise Exception("dev[1] connected to unexpected AP")
    dev[0].request("REMOVE_NETWORK all")
    dev[1].request("REMOVE_NETWORK all")
    ev = dev[2].wait_event(["CTRL-EVENT-CONNECTED"], timeout=0.1)
    if ev is not None:
        raise Exception("Unexpected dev[2] connection")
    dev[2].request("REMOVE_NETWORK all")

def test_ap_open_wpas_in_bridge(dev, apdev):
    """Open mode AP and wpas interface in a bridge"""
    br_ifname='sta-br0'
    ifname='wlan5'
    try:
        _test_ap_open_wpas_in_bridge(dev, apdev)
    finally:
        subprocess.call(['ip', 'link', 'set', 'dev', br_ifname, 'down'])
        subprocess.call(['brctl', 'delif', br_ifname, ifname])
        subprocess.call(['brctl', 'delbr', br_ifname])
        subprocess.call(['iw', ifname, 'set', '4addr', 'off'])

def _test_ap_open_wpas_in_bridge(dev, apdev):
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    br_ifname='sta-br0'
    ifname='wlan5'
    wpas = WpaSupplicant(global_iface='/tmp/wpas-wlan5')
    # First, try a failure case of adding an interface
    try:
        wpas.interface_add(ifname, br_ifname=br_ifname)
        raise Exception("Interface addition succeeded unexpectedly")
    except Exception, e:
        if "Failed to add" in str(e):
            logger.info("Ignore expected interface_add failure due to missing bridge interface: " + str(e))
        else:
            raise
    # Next, add the bridge interface and add the interface again
    subprocess.call(['brctl', 'addbr', br_ifname])
    subprocess.call(['brctl', 'setfd', br_ifname, '0'])
    subprocess.call(['ip', 'link', 'set', 'dev', br_ifname, 'up'])
    subprocess.call(['iw', ifname, 'set', '4addr', 'on'])
    subprocess.check_call(['brctl', 'addif', br_ifname, ifname])
    wpas.interface_add(ifname, br_ifname=br_ifname)
    wpas.connect("open", key_mgmt="NONE", scan_freq="2412")

def test_ap_open_start_disabled(dev, apdev):
    """AP with open mode and beaconing disabled"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open",
                                                "start_disabled": "1" })
    bssid = apdev[0]['bssid']
    dev[0].flush_scan_cache()
    dev[0].scan(freq=2412, only_new=True)
    if dev[0].get_bss(bssid) is not None:
        raise Exception("AP was seen beaconing")
    if "OK" not in hapd.request("RELOAD"):
        raise Exception("RELOAD failed")
    dev[0].scan_for_bss(bssid, freq=2412)
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")

def test_ap_open_start_disabled2(dev, apdev):
    """AP with open mode and beaconing disabled (2)"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open",
                                                "start_disabled": "1" })
    bssid = apdev[0]['bssid']
    dev[0].flush_scan_cache()
    dev[0].scan(freq=2412, only_new=True)
    if dev[0].get_bss(bssid) is not None:
        raise Exception("AP was seen beaconing")
    if "OK" not in hapd.request("UPDATE_BEACON"):
        raise Exception("UPDATE_BEACON failed")
    dev[0].scan_for_bss(bssid, freq=2412)
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")
    if "OK" not in hapd.request("UPDATE_BEACON"):
        raise Exception("UPDATE_BEACON failed")
    dev[0].request("DISCONNECT")
    dev[0].wait_disconnected()
    dev[0].request("RECONNECT")
    dev[0].wait_connected()

def test_ap_open_ifdown(dev, apdev):
    """AP with open mode and external ifconfig down"""
    params = { "ssid": "open",
               "ap_max_inactivity": "1" }
    hapd = hostapd.add_ap(apdev[0]['ifname'], params)
    bssid = apdev[0]['bssid']
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412")
    dev[1].connect("open", key_mgmt="NONE", scan_freq="2412")
    subprocess.call(['ip', 'link', 'set', 'dev', apdev[0]['ifname'], 'down'])
    ev = hapd.wait_event(["AP-STA-DISCONNECTED"], timeout=10)
    if ev is None:
        raise Exception("Timeout on AP-STA-DISCONNECTED (1)")
    ev = hapd.wait_event(["AP-STA-DISCONNECTED"], timeout=5)
    if ev is None:
        raise Exception("Timeout on AP-STA-DISCONNECTED (2)")
    ev = hapd.wait_event(["INTERFACE-DISABLED"], timeout=5)
    if ev is None:
        raise Exception("No INTERFACE-DISABLED event")
    # The following wait tests beacon loss detection in mac80211 on dev0.
    # dev1 is used to test stopping of AP side functionality on client polling.
    dev[1].request("REMOVE_NETWORK all")
    subprocess.call(['ip', 'link', 'set', 'dev', apdev[0]['ifname'], 'up'])
    dev[0].wait_disconnected()
    dev[1].wait_disconnected()
    ev = hapd.wait_event(["INTERFACE-ENABLED"], timeout=10)
    if ev is None:
        raise Exception("No INTERFACE-ENABLED event")
    dev[0].wait_connected()
    hwsim_utils.test_connectivity(dev[0], hapd)

def test_ap_open_disconnect_in_ps(dev, apdev, params):
    """Disconnect with the client in PS to regression-test a kernel bug"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bg_scan_period="0")
    ev = hapd.wait_event([ "AP-STA-CONNECTED" ], timeout=5)
    if ev is None:
        raise Exception("No connection event received from hostapd")
    time.sleep(0.2)
    hwsim_utils.set_powersave(dev[0], hwsim_utils.PS_MANUAL_POLL)
    try:
        # inject some traffic
        sa = hapd.own_addr()
        da = dev[0].own_addr()
        hapd.request('DATA_TEST_CONFIG 1')
        hapd.request('DATA_TEST_TX {} {} 0'.format(da, sa))
        hapd.request('DATA_TEST_CONFIG 0')
        # let the AP send couple of Beacon frames
        time.sleep(0.3)
        # disconnect - with traffic pending - shouldn't cause kernel warnings
        dev[0].request("DISCONNECT")
    finally:
        hwsim_utils.set_powersave(dev[0], hwsim_utils.PS_DISABLED)
    time.sleep(0.2)
    out = run_tshark(os.path.join(params['logdir'], "hwsim0.pcapng"),
                     "wlan_mgt.tim.partial_virtual_bitmap",
                     ["wlan_mgt.tim.partial_virtual_bitmap"])
    if out is not None:
        state = 0
        for l in out.splitlines():
            pvb = int(l, 16)
            if pvb > 0 and state == 0:
                state = 1
            elif pvb == 0 and state == 1:
                state = 2
        if state != 2:
            raise Exception("Didn't observe TIM bit getting set and unset (state=%d)" % state)

def test_ap_open_select_network(dev, apdev):
    """Open mode connection and SELECT_NETWORK to change network"""
    hapd1 = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    bssid1 = apdev[0]['bssid']
    hapd2 = hostapd.add_ap(apdev[1]['ifname'], { "ssid": "open2" })
    bssid2 = apdev[1]['bssid']
    id1 = dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
                         only_add_network=True)
    id2 = dev[0].connect("open2", key_mgmt="NONE", scan_freq="2412")
    hwsim_utils.test_connectivity(dev[0], hapd2)
    dev[0].select_network(id1)
    dev[0].wait_connected()
    res = dev[0].request("BLACKLIST")
    if bssid1 in res or bssid2 in res:
        raise Exception("Unexpected blacklist entry")
    hwsim_utils.test_connectivity(dev[0], hapd1)
    dev[0].select_network(id2)
    dev[0].wait_connected()
    hwsim_utils.test_connectivity(dev[0], hapd2)
    res = dev[0].request("BLACKLIST")
    if bssid1 in res or bssid2 in res:
        raise Exception("Unexpected blacklist entry(2)")

def test_ap_open_disable_enable(dev, apdev):
    """AP with open mode getting disabled and re-enabled"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    dev[0].connect("open", key_mgmt="NONE", scan_freq="2412",
                   bg_scan_period="0")
    for i in range(2):
        hapd.request("DISABLE")
        dev[0].wait_disconnected()
        hapd.request("ENABLE")
        dev[0].wait_connected()
        hwsim_utils.test_connectivity(dev[0], hapd)

def sta_enable_disable(dev, bssid):
    dev.scan_for_bss(bssid, freq=2412)
    work_id = dev.request("RADIO_WORK add block-work")
    ev = dev.wait_event(["EXT-RADIO-WORK-START"])
    if ev is None:
        raise Exception("Timeout while waiting radio work to start")
    id = dev.connect("open", key_mgmt="NONE", scan_freq="2412",
                     only_add_network=True)
    dev.request("ENABLE_NETWORK %d" % id)
    if "connect@" not in dev.request("RADIO_WORK show"):
        raise Exception("connect radio work missing")
    dev.request("DISABLE_NETWORK %d" % id)
    dev.request("RADIO_WORK done " + work_id)
    ok = False
    for i in range(30):
        if "connect@" not in dev.request("RADIO_WORK show"):
            ok = True
            break
        time.sleep(0.1)
    if not ok:
        raise Exception("connect radio work not completed")
    ev = dev.wait_event(["CTRL-EVENT-CONNECTED"], timeout=0.1)
    if ev is not None:
        raise Exception("Unexpected connection")
    dev.request("DISCONNECT")

def test_ap_open_sta_enable_disable(dev, apdev):
    """AP with open mode and wpa_supplicant ENABLE/DISABLE_NETWORK"""
    hapd = hostapd.add_ap(apdev[0]['ifname'], { "ssid": "open" })
    bssid = apdev[0]['bssid']
    sta_enable_disable(dev[0], bssid)
    wpas = WpaSupplicant(global_iface='/tmp/wpas-wlan5')
    wpas.interface_add("wlan5", drv_params="force_connect_cmd=1")
    sta_enable_disable(wpas, bssid)
#!/usr/bin/env python
# -*- coding: utf-8 -*-
from src import app
import os
import shutil
from flask import Flask, render_template, session, request, flash, url_for, redirect
from Forms import ContactForm, LoginForm, editForm, ReportForm, CommentForm, searchForm, AddPlaylist
from flask.ext.mail import Message, Mail
from werkzeug import secure_filename
from werkzeug import SharedDataMiddleware
from api import API
from songs import SONG
from playlist import playlist
from admin import admin
from artist import artist
import pymysql
import hashlib
from flask import g
mail = Mail()
mail.init_app(app)
#For the collector script.
app.register_blueprint(API);
#For the songs
app.register_blueprint(SONG);
#For the playlist
app.register_blueprint(playlist);
#for the admin pages
app.register_blueprint(admin);
#for the artist pages
app.register_blueprint(artist);
UPLOAD_FOLDER = "img/ProfilePic/"
ALLOWED_EXTENSIONS = set(['png', 'jpg', 'jpeg', 'gif'])
app.config['UPLOAD_FOLDER'] = 'src/static/' + UPLOAD_FOLDER
@app.route('/')
def index():
    session["login"] = False
    session["signup"] = False
    session["logged_in"] = False
    return render_template('homepage/index.html', form1=LoginForm(prefix='form1'), form2=ContactForm(prefix='form2'))

#For database connections.
@app.before_request
def before_request():
    g.conn = pymysql.connect(host='127.0.0.1', port=3306, user='root', passwd='crimson', db='MuShMe', charset='utf8')
    g.database = g.conn.cursor()

@app.teardown_request
def teardown_request(exception):
    g.conn.close()

@app.route('/login', methods=['POST'])
def login():
    session["login"] = True
    session["signup"] = False
    if request.method == 'POST':
        loginform = LoginForm(request.form, prefix='form1')
        if loginform.validate_on_submit():
            check_login = g.database.execute("""SELECT User_id from MuShMe.entries WHERE Email_id="%s" AND Pwdhash="%s" """ %
                                             (loginform.email.data, hashlib.sha1(loginform.password.data).hexdigest()))
            if check_login:
                userid = g.database.fetchone()
                g.database.execute("""UPDATE MuShMe.entries SET Last_Login=CURRENT_TIMESTAMP() WHERE User_id="%s" """ % (userid))
                g.conn.commit()
                for uid in userid:
                    session['userid'] = uid
                    g.database.execute("""SELECT Username from MuShMe.entries WHERE User_id="%s" """ % uid)
                    session['UserName'] = g.database.fetchone()[0]
                    g.database.execute("""SELECT Privilege FROM MuShMe.entries WHERE User_id="%s" """ % uid)
                    session['privilege'] = g.database.fetchone()[0]
                    g.database.execute("""SELECT Profile_pic FROM MuShMe.entries WHERE User_id="%s" """ % uid)
                    session['profilepic'] = g.database.fetchone()[0]
                    g.database.execute("""SELECT Name from MuShMe.entries WHERE User_id="%s" """ % uid)
                    session["Name"] = g.database.fetchone()
                    g.database.execute("""SELECT DOB from MuShMe.entries WHERE User_id="%s" """ % uid)
                    session["dob"] = str(g.database.fetchone())
                    session['logged_in'] = True
                    #print uid
                    #print userid
                    return redirect(url_for('userProfile', userid=uid))
            else:
                flash("Incorrect Email-Id or Password")
        else:
            flash("Incorrect Email-Id or Password")
        return render_template('homepage/index.html', form1=loginform, form2=ContactForm(prefix='form2'))
    else:
        return redirect(url_for(('index')))

def flash_errors(form):
    for field, errors in form.errors.items():
        for error in errors:
            flash(u"Error in the %s field - %s" % (
                getattr(form, field).label.text,
                error
            ))

@app.route('/signup', methods=['POST'])
def signup():
    session["signup"] = True
    session["login"] = False
    contactform = ContactForm(request.form, prefix='form2')
    if contactform.validate_on_submit():
        if validate(contactform.email.data, contactform.username.data):
            check_signup = g.database.execute("""INSERT into MuShMe.entries (Username,Email_id,Pwdhash,Name) VALUES ("%s","%s","%s","%s")""" %
                                              (contactform.username.data,
                                               contactform.email.data,
                                               hashlib.sha1(contactform.password.data).hexdigest(),
                                               contactform.name.data))
            if check_signup:
                g.conn.commit()
                g.database.execute("""SELECT User_id from MuShMe.entries WHERE Email_id="%s" AND Pwdhash="%s" """ %
                                   (contactform.email.data, hashlib.sha1(contactform.password.data).hexdigest()))
                user_id = g.database.fetchone()
                for uid in user_id:
                    session['userid'] = uid
                    g.database.execute("""SELECT Username from MuShMe.entries WHERE User_id="%s" """ % uid)
                    session['UserName'] = g.database.fetchone()[0]
                    g.database.execute("""SELECT Privilege FROM MuShMe.entries WHERE User_id="%s" """ % uid)
                    session['privilege'] = g.database.fetchone()[0]
                    g.database.execute("""SELECT Profile_Pic FROM MuShMe.entries WHERE User_id="%s" """ % uid)
                    session['profilepic'] = g.database.fetchone()[0]
                    session['logged_in'] = True
                    g.database.execute("""SELECT Name from MuShMe.entries WHERE User_id="%s" """ % uid)
                    session["Name"] = g.database.fetchone()
                    g.database.execute("""SELECT DOB from MuShMe.entries WHERE User_id="%s" """ % uid)
                    session["dob"] = str(g.database.fetchone())
                    newPlaylist = session['UserName'] + ' default collection'
                    g.database.execute("""INSERT INTO MuShMe.playlists (Playlist_name, User_id) VALUES ("%s","%s")""" % (newPlaylist, uid))
                    g.conn.commit()
                    return redirect(url_for('userProfile', userid=uid))
            else:
                flash("Please enter valid data !")
        else:
            flash("Username or Email has been taken")
    else:
        flash_errors(contactform)
    return render_template('homepage/index.html', form1=LoginForm(prefix='form1'), form2=contactform)

def validate(email, username):
    email = g.database.execute(""" SELECT * from MuShMe.entries where Email_id="%s" """ % email)
    name = g.database.execute(""" SELECT * from MuShMe.entries where Username="%s" """ % username)
    if email or name:
        return False
    else:
        return True
@app.route('/user/<userid>', methods=['GET'])
def userProfile(userid):
    if session['logged_in'] == False:
        return render_template('error.html'), 404
    else:
        if request.method == 'GET':
            User = getUserData(userid)
            return render_template('userprofile/index.html', userid=userid,
                                   form4=CommentForm(prefix='form4'), form3=editForm(prefix='form3'),
                                   form6=searchForm(prefix='form6'), form5=ReportForm(prefix='form5'), form7=AddPlaylist(prefix='form7'),
                                   friend=getFriend(userid), playlist=getPlaylist(userid), User=getUserData(userid), Comments=getComments(userid),
                                   songs=getSong(userid), Recommends=getRecommend(userid), Requests=getRequest(userid), frnd=checkFriend(userid, User),
                                   AllComments=getAllComments(userid), AllRecommends=getAllRecommend(userid))

def checkFriend(userid, User):
    friendName = []
    g.database.execute("""SELECT User_id2 from friends WHERE User_id1="%s" """ % (userid))
    for user in g.database.fetchall():
        data = {}
        g.database.execute("""SELECT Username, User_id from MuShMe.entries WHERE User_id="%s" """ % user[0])
        for a in g.database.fetchall():
            data['friendname'] = a[0]
            data['friendid'] = a[1]
        friendName.append(data)
    for f in friendName:
        a = g.database.execute("""SELECT User_id2 from friends WHERE User_id1="%s" and User_id2="%s" """ % (userid, f['friendid']))
        b = g.database.execute("""SELECT User_id2 from friends WHERE User_id2="%s" and User_id1="%s" """ % (userid, f['friendid']))
        if a or b:
            return True
        elif userid == f['friendid']:
            return True
        else:
            return False
    g.database.execute("""SELECT User_id1 from friends WHERE User_id2="%s" """ % userid)
    for user in g.database.fetchall():
        data = {}
        g.database.execute("""SELECT Username, User_id from MuShMe.entries WHERE User_id="%s" """ % user[0])
        for a in g.database.fetchall():
            data['friendname'] = a[0]
            data['friendid'] = a[1]
        friendName.append(data)
    for f in friendName:
        a = g.database.execute("""SELECT User_id2 from friends WHERE User_id2="%s" and User_id1="%s" """ % (userid, f['friendid']))
        b = g.database.execute("""SELECT User_id2 from friends WHERE User_id1="%s" and User_id2="%s" """ % (userid, f['friendid']))
        if a or b:
            return True
        elif userid == f['friendid']:
            return True
        else:
            return False
def getAllComments(userid):
    g.database.execute("SELECT Comment_id FROM user_comments WHERE User_id=%s ORDER BY Comment_id DESC" % (userid))
    commentids = g.database.fetchall()
    retval = []
    for commentid in commentids:
        g.database.execute("SELECT Comment, User_id FROM comments WHERE Comment_id=%s", (commentid[0]))
        commentdata = g.database.fetchone()
        data = {}
        data['comment'] = commentdata[0]
        data['userid'] = commentdata[1]
        data['commentid'] = commentid[0]
        g.database.execute("SELECT Username FROM entries WHERE User_id=%s", (data['userid']))
        data['username'] = g.database.fetchone()[0]
        retval.append(data)
    return retval

def getComments(userid):
    g.database.execute("SELECT Comment_id FROM user_comments WHERE User_id=%s ORDER BY Comment_id DESC LIMIT 5" % (userid))
    commentids = g.database.fetchall()
    retval = []
    for commentid in commentids:
        g.database.execute("SELECT Comment, User_id FROM comments WHERE Comment_id=%s", (commentid[0]))
        commentdata = g.database.fetchone()
        data = {}
        data['comment'] = commentdata[0]
        data['userid'] = commentdata[1]
        data['commentid'] = commentid[0]
        g.database.execute("SELECT Username FROM entries WHERE User_id=%s", (data['userid']))
        data['username'] = g.database.fetchone()[0]
        retval.append(data)
    return retval

def getFriend(userid):
    friendName = []
    g.database.execute("""SELECT User_id2 from friends WHERE User_id1="%s" """ % userid)
    for user in g.database.fetchall():
        data = {}
        g.database.execute("""SELECT Username, User_id, Profile_pic from MuShMe.entries WHERE User_id="%s" """ % user[0])
        for a in g.database.fetchall():
            data['friendname'] = a[0]
            data['friendid'] = a[1]
            data['friendpic'] = a[2]
        friendName.append(data)
    g.database.execute("""SELECT User_id1 from friends WHERE User_id2="%s" """ % userid)
    for user in g.database.fetchall():
        data = {}
        g.database.execute("""SELECT Username, User_id, Profile_pic from MuShMe.entries WHERE User_id="%s" """ % user[0])
        for a in g.database.fetchall():
            data['friendname'] = a[0]
            data['friendid'] = a[1]
            data['friendpic'] = a[2]
        friendName.append(data)
    print friendName
    return friendName

def getPlaylist(userid):
    playlist = []
    g.database.execute("""SELECT Playlist_name,Playlist_id from MuShMe.playlists WHERE User_id="%s" """ % userid)
    for p in g.database.fetchall():
        data = {}
        data['pname'] = p[0]
        data['pid'] = p[1]
        playlist.append(data)
    return playlist

def getSong(userid):
    songName = []
    g.database.execute("""SELECT Song_id from MuShMe.user_song WHERE User_id=%s LIMIT 5""" % userid)
    for song in g.database.fetchall():
        data = {}
        g.database.execute("""SELECT Song_title,Song_id,Song_Album from MuShMe.songs WHERE Song_id="%s" """ % song)
        for a in g.database.fetchall():
            data['songname'] = a[0]
            data['songid'] = a[1]
            g.database.execute("SELECT Album_pic FROM albums WHERE Album_id=%s " % (a[2]))
            g.conn.commit()
            data['art'] = g.database.fetchone()[0]
        songName.append(data)
    return songName

def getUserData(userid):
    User = []
    g.database.execute(""" SELECT Username,User_id,Profile_pic,Privilege,Email_id,Name,DOB from entries where User_id="%s" """ % userid)
    for a in g.database.fetchall():
        data = {}
        data['username'] = a[0]
        data['userid'] = a[1]
        data['profilepic'] = a[2]
        data['privilege'] = a[3]
        data['email'] = a[4]
        data['name'] = a[5]
        data['dob'] = str(a[6])
        User.append(data)
    return User
def getAllRecommend(userid):
    recommend = []
    g.database.execute(""" SELECT Recommend_id,User_id_from,User_id_to from recommend where User_id_to="%s" """ % userid)
    for a in g.database.fetchall():
        data = {}
        data['rid'] = a[0]
        data['userfrom'] = a[1]
        data['userto'] = a[2]
        g.database.execute(""" SELECT Username from entries where User_id='%s' """ % a[1])
        data['userfromname'] = g.database.fetchone()[0]
        check_song = g.database.execute(""" SELECT Song_id from recommend_songs where Recommend_id="%s" """ % a[0])
        if check_song:
            songid = g.database.fetchone()[0]
            data['song'] = []
            g.database.execute(""" SELECT Song_title,Song_Album,Genre,Publisher from songs where Song_id="%s" """ % songid)
            for song in g.database.fetchall():
                d = {}
                d['title'] = song[0]
                d['album'] = song[1]
                d['genre'] = song[2]
                d['publisher'] = song[3]
                d['songid'] = songid
                data['song'].append(d)
        check_playlist = g.database.execute(""" SELECT Playlist_id from recommend_playlists where Recommend_id="%s" """ % a[0])
        if check_playlist:
            playlistid = g.database.fetchone()[0]
            data['playlist'] = []
            g.database.execute(""" SELECT Playlist_name,Playlist_id,User_id from playlists where Playlist_id="%s" """ % playlistid)
            for p in g.database.fetchall():
                d = {}
                d['pname'] = p[0]
                d['pid'] = p[1]
                g.database.execute(""" SELECT Username, Name,User_id from MuShMe.entries WHERE User_id="%s" """ % p[2])
                for k in g.database.fetchall():
                    d['username'] = k[0]
                    d['uname'] = k[1]
                    d['userid'] = k[2]
                data['playlist'].append(d)
        recommend.append(data)
    return recommend

def getRecommend(userid):
    recommend = []
    g.database.execute(""" SELECT Recommend_id,User_id_from,User_id_to from recommend where User_id_to="%s" LIMIT 5 """ % userid)
    for a in g.database.fetchall():
        data = {}
        data['rid'] = a[0]
        data['userfrom'] = a[1]
        data['userto'] = a[2]
        g.database.execute(""" SELECT Username from entries where User_id='%s' """ % a[1])
        data['userfromname'] = g.database.fetchone()[0]
        print data['userfromname']
        check_song = g.database.execute(""" SELECT Song_id from recommend_songs where Recommend_id="%s" """ % a[0])
        if check_song:
            songid = g.database.fetchone()[0]
            data['song'] = []
            g.database.execute(""" SELECT Song_title,Song_Album,Genre,Publisher from songs where Song_id="%s" """ % songid)
            for song in g.database.fetchall():
                d = {}
                d['title'] = song[0]
                d['album'] = song[1]
                d['genre'] = song[2]
                d['publisher'] = song[3]
                d['songid'] = songid
                d['songart'] = getSongArt(songid)
                data['song'].append(d)
        check_playlist = g.database.execute(""" SELECT Playlist_id from recommend_playlists where Recommend_id="%s" """ % a[0])
        if check_playlist:
            playlistid = g.database.fetchone()[0]
            data['playlist'] = []
            g.database.execute(""" SELECT Playlist_name,Playlist_id,User_id from playlists where Playlist_id="%s" """ % playlistid)
            for p in g.database.fetchall():
                d = {}
                d['pname'] = p[0]
                d['pid'] = p[1]
                g.database.execute(""" SELECT Username, Name,User_id from MuShMe.entries WHERE User_id="%s" """ % p[2])
                for k in g.database.fetchall():
                    d['username'] = k[0]
                    d['uname'] = k[1]
                    d['userid'] = k[2]
                data['playlist'].append(d)
        recommend.append(data)
    return recommend

def getRequest(userid):
    request = []
    g.database.execute(""" SELECT Request_id,Request_from,Request_to,Status from requests where Request_to="%s" """ % userid)
    for a in g.database.fetchall():
        data = {}
        data['reqid'] = a[0]
        data['reqfrom'] = a[1]
        data['reqto'] = a[2]
        data['status'] = a[3]
        data['reqfromuser'] = []
        g.database.execute(""" SELECT User_id,Username,Name from entries where User_id='%s' """ % a[1])
        for i in g.database.fetchall():
            d = {}
            d['userid'] = i[0]
            d['username'] = i[1]
            d['name'] = i[2]
            data['reqfromuser'].append(d)
        print data
        request.append(data)
    return request

def getSongArt(songid):
    g.database.execute("SELECT Song_Album FROM songs WHERE song_id=%s", (songid))
    albumname = g.database.fetchone()[0]
    g.database.execute("SELECT Album_pic FROM albums WHERE Album_id=%s", (albumname))
    return g.database.fetchone()[0]
@app.route('/user/<userid>/edit',methods=['POST','GET'])
def editName(userid):
if request.method == 'POST':
uid = userid
print request.form
if request.form['editname'] != '':
g.database.execute("""UPDATE MuShMe.entries SET Name=%s WHERE User_id=%s """, ([request.form['editname']], userid))
g.conn.commit()
if request.form['birthday_year'] != '0' and request.form['birthday_month'] != '0' and request.form['birthday_day'] != '0':
g.database.execute("""UPDATE MuShMe.entries SET DOB="%s-%s-%s" WHERE User_id="%s" """ % (request.form['birthday_year'],request.form['birthday_month'],request.form['birthday_day'], userid))
g.conn.commit()
return redirect(url_for('userProfile',userid=userid))
else:
return redirect(url_for('userProfile', userid=userid))
def allowed_file(filename):
return '.' in filename and \
filename.rsplit('.', 1)[1] in ALLOWED_EXTENSIONS
@app.route('/user/<userid>/file', methods=['GET', 'POST'])
def upload_file(userid):
if request.method == 'POST':
file = request.files['file']
if file and allowed_file(file.filename):
filename = secure_filename(file.filename)
file.save(os.path.join(app.config['UPLOAD_FOLDER'], filename))
filepath = UPLOAD_FOLDER + filename
session['profilepic'] = filepath
g.database.execute("""UPDATE MuShMe.entries SET Profile_pic="%s" WHERE User_id="%s" """ % (filepath, userid))
g.conn.commit()
return redirect(url_for('userProfile', userid=userid))
app.add_url_rule('/user/uploads/<filename>', 'uploaded_file',build_only=True)
app.wsgi_app = SharedDataMiddleware(app.wsgi_app, {'/user/uploads': 'src/static' + app.config['UPLOAD_FOLDER'] })
@app.route('/user/<rcvrid>.<senderid>/comment', methods=['POST', 'GET'])
def comment(rcvrid, senderid):
    if request.method == 'POST':
        commentform = CommentForm(request.form, prefix='form4')
        if commentform.comment.data:
            g.database.execute("""INSERT INTO MuShMe.comments (comment_type, Comment, User_id)
                                  VALUES (%s, %s, %s)""",
                               ('U', commentform.comment.data, senderid))
            g.conn.commit()
            # lastrowid is race-free, unlike re-selecting the comment by its text.
            comment_id = g.database.lastrowid
            enter_comment = g.database.execute(
                """INSERT INTO MuShMe.user_comments (Comment_id, User_id) VALUES (%s, %s)""",
                (comment_id, rcvrid))
            if enter_comment:
                g.conn.commit()
    return redirect(url_for('userProfile', userid=rcvrid))
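The handlers in this file originally built SQL by `%`-interpolating user input straight into the query string, which is an injection vector. The safe pattern is to let the driver bind the values. A minimal, self-contained sketch follows; it uses the standard-library `sqlite3` only so it runs without a MySQL server (MySQLdb/pymysql work the same way, with `%s` placeholders instead of `?`):

```python
import sqlite3

# Illustrative only: sqlite3 stands in for the MySQL driver used above.
conn = sqlite3.connect(':memory:')
cur = conn.cursor()
cur.execute('CREATE TABLE comments (comment_type TEXT, comment TEXT, user_id TEXT)')

# A comment body that would break (or hijack) a string-formatted query:
malicious = 'nice song"); DROP TABLE comments; --'

# The driver quotes the bound value itself, so the payload is stored as
# data rather than executed as SQL.
cur.execute('INSERT INTO comments VALUES (?, ?, ?)', ('U', malicious, '42'))
conn.commit()
comment_id = cur.lastrowid  # race-free id of the row just inserted

cur.execute('SELECT comment FROM comments WHERE user_id = ?', ('42',))
stored = cur.fetchone()[0]
```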
@app.route('/user/<userid>/<commentid>/report', methods=['POST', 'GET'])
def report(userid, commentid):
    if request.method == 'POST':
        reportform = ReportForm(request.form, prefix='form5')
        check_report = g.database.execute(
            """INSERT INTO MuShMe.complaints (Complain_type, Complain_description, Comment_id, reported_by)
               VALUES (%s, %s, %s, %s)""",
            (reportform.report.data, reportform.other.data, commentid, session['userid']))
        if check_report:
            g.conn.commit()
    return redirect(url_for('userProfile', userid=userid))
@app.route('/user/<uidto>.<uidfrom>/request', methods=['POST'])
def sendrequest(uidto, uidfrom):
    # POST-only route, so no request.method check is needed.
    if requestvalidate(uidfrom, uidto):
        g.database.execute("""INSERT INTO requests (Request_from, Request_to, Status) VALUES (%s, %s, %s)""",
                           (uidfrom, uidto, 1))
        g.conn.commit()
    return redirect(url_for('userProfile', userid=uidto))
@app.route('/user/<userto>.<userfrom>/accept', methods=['POST'])
def acceptrequest(userto, userfrom):
    g.database.execute("""UPDATE requests SET Status=%s WHERE Request_from=%s AND Request_to=%s""",
                       (0, userfrom, userto))
    g.conn.commit()
    g.database.execute("""INSERT INTO friends VALUES (%s, %s)""", (userfrom, userto))
    g.conn.commit()
    return redirect(url_for('userProfile', userid=userto))
@app.route('/user/<userto>.<userfrom>/reject', methods=['POST'])
def rejectrequest(userto, userfrom):
    g.database.execute("""UPDATE requests SET Status=%s WHERE Request_from=%s AND Request_to=%s""",
                       (-1, userfrom, userto))
    g.conn.commit()
    return redirect(url_for('userProfile', userid=userto))
def requestvalidate(userfrom, userto):
    check = g.database.execute("""SELECT Status FROM requests WHERE Request_to=%s AND Request_from=%s""",
                               (userfrom, userto))
    if check and g.database.fetchone()[0] == '-1' and userfrom != userto:
        return False
    return True
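Reading the handlers above together, the `Status` column acts as a tiny state machine: 1 on send, 0 on accept, -1 on reject. A sketch making those magic numbers explicit follows; the enum names are my reading of the code, not part of the original schema:

```python
from enum import IntEnum

# Names are assumptions inferred from sendrequest/acceptrequest/rejectrequest.
class RequestStatus(IntEnum):
    PENDING = 1
    ACCEPTED = 0
    REJECTED = -1

def can_send_again(status, userfrom, userto):
    # Mirrors requestvalidate(): a rejected request between two distinct
    # users blocks a resend; anything else is allowed.
    if status == RequestStatus.REJECTED and userfrom != userto:
        return False
    return True
```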
@app.route('/search', methods=['POST', 'GET'])
def search():
    if request.method == 'POST':
        searchform = searchForm(prefix='form6')
        value = searchform.entry.data + '%'
        search_song = []
        search_friend = []
        search_playlist = []
        search_artist = []

        g.database.execute("""SELECT Song_title, Song_Album, Genre, Publisher, Song_id
                              FROM MuShMe.songs WHERE Song_title LIKE %s""", (value,))
        for a in g.database.fetchall():
            search_song.append({'title': a[0], 'album': a[1], 'genre': a[2],
                                'publisher': a[3], 'songid': a[4], 'art': getSongArt(a[4])})

        g.database.execute("""SELECT Artist_name, Artist_id
                              FROM MuShMe.artists WHERE Artist_name LIKE %s""", (value,))
        for a in g.database.fetchall():
            search_artist.append({'artistname': a[0], 'artistid': a[1]})

        g.database.execute("""SELECT Username, Name, Profile_pic, User_id
                              FROM MuShMe.entries WHERE Username LIKE %s OR Name LIKE %s""", (value, value))
        for a in g.database.fetchall():
            search_friend.append({'username': a[0], 'name': a[1],
                                  'profilepic': a[2], 'userid': a[3]})

        g.database.execute("""SELECT Playlist_name, User_id, Playlist_id
                              FROM MuShMe.playlists WHERE Playlist_name LIKE %s""", (value,))
        for a in g.database.fetchall():
            data = {'pname': a[0], 'pid': a[2]}
            g.database.execute("""SELECT Username, Name FROM MuShMe.entries WHERE User_id=%s""", (a[1],))
            for k in g.database.fetchall():
                data['username'] = k[0]
                data['uname'] = k[1]
            search_playlist.append(data)

        length = len(search_playlist) + len(search_song) + len(search_friend) + len(search_artist)
        return render_template('searchpage/search.html', entry=searchform.entry.data,
                               form6=searchForm(prefix='form6'),
                               search_song=search_song, search_artist=search_artist,
                               friends=search_friend, search_playlist=search_playlist,
                               length=length)
    else:
        return render_template('searchpage/search.html', form6=searchForm(prefix='form6'))
@app.route('/user/<userid>/addplaylist', methods=['POST'])
def addplaylist(userid):
    addplaylistform = AddPlaylist(prefix='form7')
    g.database.execute("""INSERT INTO MuShMe.playlists (Playlist_name, User_id) VALUES (%s, %s)""",
                       (addplaylistform.add.data, userid))
    g.conn.commit()
    return redirect(url_for('userProfile', userid=userid))
@app.route("/playlist/<userid>/deleteplaylist", methods=["POST"])
def deleteplaylist(userid):
    for playlistid in request.form.getlist('playlistselect'):
        g.database.execute("""DELETE FROM playlists WHERE Playlist_id=%s AND User_id=%s""",
                           (playlistid, userid))
        g.conn.commit()
    return redirect(url_for('userProfile', userid=userid))
# All your profile are belong to us.
@app.route('/artist/<artistid>')
def artistProfile(artistid):
    return render_template('artistpage/index.html', form6=searchForm(prefix='form6'))

# To handle 404 Not Found errors
@app.errorhandler(404)
def page_not_found_error(error):
    return render_template('error.html'), 404

@app.route('/termsofservices')
def tos():
    return render_template('tos.html')

@app.route('/about')
def about():
    return render_template('about.html')

@app.route('/changepwd')
def changepwd():
    return render_template('changepwd.html')

@app.route('/logout')
def logout():
    if 'email' not in session:
        return render_template('error.html')
    session['logged_in'] = False
    return render_template('login.html')
if not app.debug:
    import logging
    from logging.handlers import SMTPHandler
    from logging import FileHandler, Formatter

    mail_handler = SMTPHandler('127.0.0.1', 'server-error@example.com',
                               app.config['DEFAULT_MAIL_SENDER'], 'YourApplication Failed')
    mail_handler.setLevel(logging.ERROR)
    app.logger.addHandler(mail_handler)

    file_handler = FileHandler('log.txt')
    file_handler.setLevel(logging.WARNING)
    app.logger.addHandler(file_handler)

    mail_handler.setFormatter(Formatter('''
Message type: %(levelname)s
Location: %(pathname)s:%(lineno)d
Module: %(module)s
Function: %(funcName)s
Time: %(asctime)s
Message:
%(message)s
'''))
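The handler wiring above needs a live SMTP server to see output, but the `Formatter` part can be exercised on its own. A standalone sketch: format a single `LogRecord` by hand to inspect the text the error mail would contain (the record fields here are made up for illustration):

```python
import logging

# A trimmed-down version of the Formatter pattern used above; the record
# values ('app', 'database unreachable', ...) are illustrative only.
fmt = logging.Formatter('Message type: %(levelname)s\n'
                        'Module: %(module)s\n'
                        'Message:\n%(message)s')
record = logging.LogRecord(name='app', level=logging.ERROR, pathname='app.py',
                           lineno=1, msg='database unreachable',
                           args=None, exc_info=None)
text = fmt.format(record)
```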
if __name__ == '__main__':
    # To allow Aptana to receive errors, set use_debugger=False
    app = create_app(config='config.yaml')
    use_debugger = app.debug
    try:
        # Disable Flask's debugger if an external debugger is requested
        use_debugger = not app.config.get('DEBUG_WITH_APTANA')
    except KeyError:
        pass
    app.run(use_debugger=use_debugger, use_reloader=use_debugger,
            threaded=True, port=8080)
| 42.181422 | 251 | 0.600282 | 3,498 | 29,063 | 4.89251 | 0.104917 | 0.07152 | 0.072923 | 0.080986 | 0.569242 | 0.53354 | 0.5 | 0.45156 | 0.43853 | 0.412703 | 0 | 0.009727 | 0.250077 | 29,063 | 688 | 252 | 42.242733 | 0.775499 | 0.015036 | 0 | 0.456747 | 0 | 0.00692 | 0.272136 | 0.035271 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.012111 | 0.034602 | null | null | 0.019031 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
139af14f3890b6a5fdebd9bc833f815258ac26c3 | 1,433 | py | Python | tests/adv/test_pop_sfrd.py | jlashner/ares | 6df2b676ded6bd59082a531641cb1dadd475c8a8 | [
"MIT"
] | 10 | 2020-03-26T01:08:10.000Z | 2021-12-04T13:02:10.000Z | tests/adv/test_pop_sfrd.py | jlashner/ares | 6df2b676ded6bd59082a531641cb1dadd475c8a8 | [
"MIT"
] | 25 | 2020-06-08T14:52:28.000Z | 2022-03-08T02:30:54.000Z | tests/adv/test_pop_sfrd.py | jlashner/ares | 6df2b676ded6bd59082a531641cb1dadd475c8a8 | [
"MIT"
] | 8 | 2020-03-24T14:11:25.000Z | 2021-11-06T06:32:59.000Z | """
test_pop_models.py
Author: Jordan Mirocha
Affiliation: UCLA
Created on: Fri Jul 15 15:23:11 PDT 2016
Description:
"""
import ares
import matplotlib.pyplot as pl
PB = ares.util.ParameterBundle
def test():
    # Create a simple population
    pars_1 = PB('pop:fcoll') + PB('sed:bpass')
    pop_fcoll = ares.populations.GalaxyPopulation(**pars_1)
    #pop_fcoll_XR = ares.populations.GalaxyPopulation(**pars_1)

    # Mimic the above population to check our different SFRD/SED techniques
    sfrd_pars = {'pop_sfr_model': 'sfrd-func'}
    sfrd_pars['pop_sfrd'] = pop_fcoll.SFRD
    sfrd_pars['pop_sfrd_units'] = 'internal'

    sed = PB('sed:toy')
    sed['pop_Nion'] = pop_fcoll.src.Nion
    sed['pop_Nlw'] = pop_fcoll.src.Nlw
    # pop_Ex?
    sed['pop_ion_src_igm'] = False
    sed['pop_heat_src_igm'] = False

    pars_2 = sed + sfrd_pars
    pop_sfrd = ares.populations.GalaxyPopulation(**pars_2)

    assert pop_fcoll.SFRD(20.) == pop_sfrd.SFRD(20.), "Error in SFRD."

    # Check the emissivities too
    #print(pop_fcoll.PhotonLuminosityDensity(20., Emin=10.2, Emax=13.6))
    #print(pop_sfrd.PhotonLuminosityDensity(20., Emin=10.2, Emax=13.6))
    #assert pop_fcoll.PhotonLuminosityDensity(20., Emin=10.2, Emax=13.6) \
    #    == pop_sfrd.PhotonLuminosityDensity(20., Emin=10.2, Emax=13.6), \
    #    "Error in photon luminosity density."

if __name__ == '__main__':
    test()
| 25.589286 | 75 | 0.669923 | 203 | 1,433 | 4.502463 | 0.413793 | 0.078775 | 0.04814 | 0.135667 | 0.282276 | 0.203501 | 0.203501 | 0.203501 | 0.203501 | 0.203501 | 0 | 0.046007 | 0.196092 | 1,433 | 55 | 76 | 26.054545 | 0.747396 | 0.432659 | 0 | 0 | 0 | 0 | 0.183081 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 1 | 0.052632 | false | 0.052632 | 0.105263 | 0 | 0.157895 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
139b92054f917712ecbfacdc663b9fc7eea6103f | 6,010 | py | Python | core/self6dpp/tools/ycbv/ycbv_pbr_so_mlBCE_Double_3_merge_train_real_uw_init_results_with_refined_poses_to_json.py | THU-DA-6D-Pose-Group/self6dpp | c267cfa55e440e212136a5e9940598720fa21d16 | [
"Apache-2.0"
] | 33 | 2021-12-15T07:11:47.000Z | 2022-03-29T08:58:32.000Z | core/self6dpp/tools/ycbv/ycbv_pbr_so_mlBCE_Double_3_merge_train_real_uw_init_results_with_refined_poses_to_json.py | THU-DA-6D-Pose-Group/self6dpp | c267cfa55e440e212136a5e9940598720fa21d16 | [
"Apache-2.0"
] | 3 | 2021-12-15T11:39:54.000Z | 2022-03-29T07:24:23.000Z | core/self6dpp/tools/ycbv/ycbv_pbr_so_mlBCE_Double_3_merge_train_real_uw_init_results_with_refined_poses_to_json.py | THU-DA-6D-Pose-Group/self6dpp | c267cfa55e440e212136a5e9940598720fa21d16 | [
"Apache-2.0"
] | null | null | null | import os.path as osp
import sys
import numpy as np
import mmcv
from tqdm import tqdm
from functools import cmp_to_key
cur_dir = osp.dirname(osp.abspath(__file__))
PROJ_ROOT = osp.normpath(osp.join(cur_dir, "../../../../"))
sys.path.insert(0, PROJ_ROOT)
from lib.pysixd import inout, misc
from lib.utils.bbox_utils import xyxy_to_xywh
from lib.utils.utils import iprint, wprint
id2obj = {
1: "002_master_chef_can", # [1.3360, -0.5000, 3.5105]
2: "003_cracker_box", # [0.5575, 1.7005, 4.8050]
3: "004_sugar_box", # [-0.9520, 1.4670, 4.3645]
4: "005_tomato_soup_can", # [-0.0240, -1.5270, 8.4035]
5: "006_mustard_bottle", # [1.2995, 2.4870, -11.8290]
6: "007_tuna_fish_can", # [-0.1565, 0.1150, 4.2625]
7: "008_pudding_box", # [1.1645, -4.2015, 3.1190]
8: "009_gelatin_box", # [1.4460, -0.5915, 3.6085]
9: "010_potted_meat_can", # [2.4195, 0.3075, 8.0715]
10: "011_banana", # [-18.6730, 12.1915, -1.4635]
11: "019_pitcher_base", # [5.3370, 5.8855, 25.6115]
12: "021_bleach_cleanser", # [4.9290, -2.4800, -13.2920]
13: "024_bowl", # [-0.2270, 0.7950, -2.9675]
14: "025_mug", # [-8.4675, -0.6995, -1.6145]
15: "035_power_drill", # [9.0710, 20.9360, -2.1190]
16: "036_wood_block", # [1.4265, -2.5305, 17.1890]
17: "037_scissors", # [7.0535, -28.1320, 0.0420]
18: "040_large_marker", # [0.0460, -2.1040, 0.3500]
19: "051_large_clamp", # [10.5180, -1.9640, -0.4745]
20: "052_extra_large_clamp", # [-0.3950, -10.4130, 0.1620]
21: "061_foam_brick", # [-0.0805, 0.0805, -8.2435]
}
obj_num = len(id2obj)
obj2id = {_name: _id for _id, _name in id2obj.items()}
if __name__ == "__main__":
    new_res_path = osp.join(
        PROJ_ROOT,
        "datasets/BOP_DATASETS/ycbv/test/init_poses/",
        "resnest50d_online_AugCosyAAEGray_mlBCE_DoubleMask_ycbv_pbr_100e_so_GdrnPbrPose_withYolov4PbrBbox_wDeepimPbrPose_ycbv_train_real_uw.json",
    )
    if osp.exists(new_res_path):
        wprint("{} already exists! overriding!".format(new_res_path))
    res_root = "output/deepim/ycbvPbrSO/FlowNet512_1.5AugCosyAAEGray_AggressiveR_ClipGrad_fxfy1_Dtw01_LogDz_PM10_Flat_ycbvPbr_SO/"
    iter_num_test = 4
    pkl_paths = [
"01_02MasterChefCan/inference_model_final_wo_optim-2de2b4e3/ycbv_002_master_chef_can_train_real_uw/results.pkl",
"02_03CrackerBox/inference_model_final_wo_optim-41082f8a/ycbv_003_cracker_box_train_real_uw/results.pkl",
"03_04SugarBox/inference_model_final_wo_optim-e09dec3e/ycbv_004_sugar_box_train_real_uw/results.pkl",
"04_05TomatoSoupCan/inference_model_final_wo_optim-5641f5d3/ycbv_005_tomato_soup_can_train_real_uw/results.pkl",
"05_06MustardBottle/inference_model_final_wo_optim-6ce23e94/ycbv_006_mustard_bottle_train_real_uw/results.pkl",
"06_07TunaFishCan/inference_model_final_wo_optim-0a768962/ycbv_007_tuna_fish_can_train_real_uw/results.pkl",
"07_08PuddingBox/inference_model_final_wo_optim-f2f2cf73/ycbv_008_pudding_box_train_real_uw/results.pkl",
"08_09GelatinBox/inference_model_final_wo_optim-a303aa1e/ycbv_009_gelatin_box_train_real_uw/results.pkl",
"09_10PottedMeatCan/inference_model_final_wo_optim-84a56ffd/ycbv_010_potted_meat_can_train_real_uw/results.pkl",
"10_11Banana/inference_model_final_wo_optim-83947126/ycbv_011_banana_train_real_uw/results.pkl",
"11_19PitcherBase/inference_model_final_wo_optim-af1c7e62/ycbv_019_pitcher_base_train_real_uw/results.pkl",
"12_21BleachCleanser/inference_model_final_wo_optim-5d740a46/ycbv_021_bleach_cleanser_train_real_uw/results.pkl",
"13_24Bowl/inference_model_final_wo_optim-f11815d3/ycbv_024_bowl_train_real_uw/results.pkl",
"14_25Mug/inference_model_final_wo_optim-e4824065/ycbv_025_mug_train_real_uw/results.pkl",
"15_35PowerDrill/inference_model_final_wo_optim-30d7d1da/ycbv_035_power_drill_train_real_uw/results.pkl",
"16_36WoodBlock/inference_model_final_wo_optim-fbb38751/ycbv_036_wood_block_train_real_uw/results.pkl",
"17_37Scissors/inference_model_final_wo_optim-5068c6bb/ycbv_037_scissors_train_real_uw/results.pkl",
"18_40LargeMarker/inference_model_final_wo_optim-e8d5867c/ycbv_040_large_marker_train_real_uw/results.pkl",
"19_51LargeClamp/inference_model_final_wo_optim-1ea79b34/ycbv_051_large_clamp_train_real_uw/results.pkl",
"20_52ExtraLargeClamp/inference_model_final_wo_optim-cb595297/ycbv_052_extra_large_clamp_train_real_uw/results.pkl",
"21_61FoamBrick/inference_model_final_wo_optim-d3757ca1/ycbv_061_foam_brick_train_real_uw/results.pkl",
]
    obj_names = [obj for obj in obj2id]
    new_res_dict = {}
    for obj_name, pred_name in zip(obj_names, pkl_paths):
        assert obj_name in pred_name, "{} not in {}".format(obj_name, pred_name)
        pred_path = osp.join(res_root, pred_name)
        assert osp.exists(pred_path), pred_path
        iprint(obj_name, pred_path)
        # pkl: scene_im_id key, list of preds
        preds = mmcv.load(pred_path)
        for scene_im_id, pred_list in preds.items():
            for pred in pred_list:
                obj_id = pred["obj_id"]
                score = pred["score"]
                bbox_est = pred["bbox_det_xyxy"]  # xyxy
                bbox_est_xywh = xyxy_to_xywh(bbox_est)
                refined_pose = pred["pose_{}".format(iter_num_test)]
                pose_est = pred["pose_0"]
                cur_new_res = {
                    "obj_id": obj_id,
                    "score": float(score),
                    "bbox_est": bbox_est_xywh.tolist(),
                    "pose_est": pose_est.tolist(),
                    "pose_refine": refined_pose.tolist(),
                }
                if scene_im_id not in new_res_dict:
                    new_res_dict[scene_im_id] = []
                new_res_dict[scene_im_id].append(cur_new_res)
    inout.save_json(new_res_path, new_res_dict)
    iprint()
    iprint("new result path: {}".format(new_res_path))
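The script converts detections from corner format to `xywh` via `xyxy_to_xywh` before writing them out. A sketch of what that conversion is assumed to do follows; the library's own implementation may differ in details such as +1 pixel conventions:

```python
# Assumed behavior of lib.utils.bbox_utils.xyxy_to_xywh: map a
# (x1, y1, x2, y2) corner box to (x, y, width, height).
def xyxy_to_xywh(box):
    x1, y1, x2, y2 = box
    return [x1, y1, x2 - x1, y2 - y1]
```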
| 52.719298 | 146 | 0.708985 | 906 | 6,010 | 4.248344 | 0.342163 | 0.051442 | 0.062873 | 0.114575 | 0.277734 | 0.075864 | 0.016108 | 0 | 0 | 0 | 0 | 0.141502 | 0.175707 | 6,010 | 113 | 147 | 53.185841 | 0.635446 | 0.100166 | 0 | 0 | 0 | 0 | 0.540305 | 0.456352 | 0 | 0 | 0 | 0 | 0.020202 | 1 | 0 | false | 0 | 0.090909 | 0 | 0.090909 | 0.050505 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
139bcb633d3c2b224334dad0ddfc97013f3a8ff8 | 918 | py | Python | tests/test_app/rest_app/rest_app/services/account_service.py | jadbin/guniflask | 36253a962c056abf34884263c6919b02b921ad9c | [
"MIT"
] | 12 | 2018-09-06T06:14:59.000Z | 2021-04-18T06:30:44.000Z | tests/test_app/rest_app/rest_app/services/account_service.py | jadbin/guniflask | 36253a962c056abf34884263c6919b02b921ad9c | [
"MIT"
] | null | null | null | tests/test_app/rest_app/rest_app/services/account_service.py | jadbin/guniflask | 36253a962c056abf34884263c6919b02b921ad9c | [
"MIT"
] | 2 | 2019-09-08T22:01:26.000Z | 2020-08-03T07:23:29.000Z | from flask import abort
from guniflask.context import service
from ..config.jwt_config import jwt_manager
@service
class AccountService:
    accounts = {
        'root': {
            'authorities': ['role_admin'],
            'password': '123456',
        }
    }

    def login(self, username: str, password: str):
        if username not in self.accounts or self.accounts[username]['password'] != password:
            return abort(403)
        account = self.accounts[username]
        token = jwt_manager.create_access_token(authorities=account['authorities'], username=username)
        return {
            'username': username,
            'access_token': token,
        }

    def get(self, username: str):
        if username not in self.accounts:
            return abort(404)
        return {
            'username': username,
            'authorities': self.accounts[username]['authorities']
        }
| 27.818182 | 102 | 0.59695 | 91 | 918 | 5.945055 | 0.395604 | 0.110906 | 0.110906 | 0.05915 | 0.110906 | 0.110906 | 0.110906 | 0 | 0 | 0 | 0 | 0.018576 | 0.296296 | 918 | 32 | 103 | 28.6875 | 0.818885 | 0 | 0 | 0.148148 | 0 | 0 | 0.117647 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0.111111 | 0.111111 | 0 | 0.407407 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
139ed5391c8324e35fd54e409887ff876db4d1d0 | 239 | py | Python | noo/impl/utils/__init__.py | nooproject/noo | 238711c55faeb1226a4e5339cd587a312c4babac | [
"MIT"
] | 2 | 2022-02-03T07:35:46.000Z | 2022-02-03T16:12:25.000Z | noo/impl/utils/__init__.py | nooproject/noo | 238711c55faeb1226a4e5339cd587a312c4babac | [
"MIT"
] | 2 | 2022-03-05T02:31:38.000Z | 2022-03-05T21:26:42.000Z | noo/impl/utils/__init__.py | nooproject/noo | 238711c55faeb1226a4e5339cd587a312c4babac | [
"MIT"
] | 1 | 2022-03-05T01:40:29.000Z | 2022-03-05T01:40:29.000Z | from .echo import echo, set_quiet
from .errors import NooException, cancel
from .store import STORE, FileStore, Store
__all__ = (
    "FileStore",
    "NooException",
    "Store",
    "STORE",
    "cancel",
    "echo",
    "set_quiet",
)
| 17.071429 | 42 | 0.635983 | 26 | 239 | 5.615385 | 0.423077 | 0.09589 | 0.164384 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.23431 | 239 | 13 | 43 | 18.384615 | 0.797814 | 0 | 0 | 0 | 0 | 0 | 0.209205 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
139f98ba0220830de5e89cabcde17bead64e5fb5 | 625 | py | Python | setup.py | ooreilly/mydocstring | 077cebfb86575914d343bd3291b9e6c5e8beef94 | [
"MIT"
] | 13 | 2018-12-11T00:34:09.000Z | 2022-03-22T20:41:04.000Z | setup.py | ooreilly/mydocstring | 077cebfb86575914d343bd3291b9e6c5e8beef94 | [
"MIT"
] | 13 | 2018-06-15T19:42:06.000Z | 2020-12-18T22:20:02.000Z | setup.py | ooreilly/mydocstring | 077cebfb86575914d343bd3291b9e6c5e8beef94 | [
"MIT"
] | 5 | 2018-06-16T07:45:49.000Z | 2020-12-12T07:12:00.000Z | from setuptools import setup
setup(name='mydocstring',
      version='0.2.7',
      description="""A tool for extracting and converting Google-style docstrings to
plain-text, markdown, and JSON.""",
      url='http://github.com/ooreilly/mydocstring',
      author="Ossian O'Reilly",
      license='MIT',
      packages=['mydocstring'],
      install_requires=['mako', 'docopt'],
      entry_points={
          'console_scripts': [
              'mydocstring=mydocstring.docstring:main',
          ],
      },
      package_data={'mydocstring': ['templates/google_docstring.md']},
      zip_safe=False)
| 32.894737 | 84 | 0.6048 | 63 | 625 | 5.904762 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006424 | 0.2528 | 625 | 18 | 85 | 34.722222 | 0.79015 | 0 | 0 | 0 | 0 | 0 | 0.4592 | 0.1072 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.0625 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13a92427a8cdec440aec42402a7483f2303b73a6 | 10,075 | py | Python | json_to_relation/mysqldb.py | paepcke/json_to_relation | acfa58d540f8f51d1d913d0c173ee3ded1b6c2a9 | [
"BSD-3-Clause"
] | 4 | 2015-10-10T19:09:49.000Z | 2021-09-02T00:58:06.000Z | json_to_relation/mysqldb.py | paepcke/json_to_relation | acfa58d540f8f51d1d913d0c173ee3ded1b6c2a9 | [
"BSD-3-Clause"
] | null | null | null | json_to_relation/mysqldb.py | paepcke/json_to_relation | acfa58d540f8f51d1d913d0c173ee3ded1b6c2a9 | [
"BSD-3-Clause"
] | 8 | 2015-05-16T14:33:33.000Z | 2019-10-24T08:56:25.000Z | # Copyright (c) 2014, Stanford University
# All rights reserved.
#
# Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met:
# 1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer.
# 2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution.
# 3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
'''
Created on Sep 24, 2013
@author: paepcke
Modifications:
- Dec 30, 2013: Added closing of connection to close() method
'''
import re
import subprocess
import tempfile
import pymysql
#import MySQLdb
class MySQLDB(object):
    '''
    Shallow interface to MySQL databases. Some niceties nonetheless.
    The query() method is an iterator. So::

        for result in mySqlObj.query('SELECT * FROM foo'):
            print result
    '''
    def __init__(self, host='127.0.0.1', port=3306, user='root', passwd='', db='mysql'):
        '''
        :param host: MySQL host
        :type host: string
        :param port: MySQL host's port
        :type port: int
        :param user: user to log in as
        :type user: string
        :param passwd: password to use for given user
        :type passwd: string
        :param db: database to connect to within server
        :type db: string
        '''
        # If all arguments are set to None, we are unittesting:
        if all(arg is None for arg in (host, port, user, passwd, db)):
            return
        self.user = user
        self.pwd = passwd
        self.db = db
        self.cursors = []
        try:
            self.connection = pymysql.connect(host=host, port=port, user=user, passwd=passwd, db=db)
            #self.connection = MySQLdb.connect(host=host, port=port, user=user, passwd=passwd, db=db, local_infile=1)
        #except MySQLdb.OperationalError:
        except pymysql.OperationalError:
            pwd = '...............' if len(passwd) > 0 else '<no password>'
            raise ValueError('Cannot reach MySQL server with host:%s, port:%s, user:%s, pwd:%s, db:%s' %
                             (host, port, user, pwd, db))
    def close(self):
        '''
        Close all cursors that are currently still open.
        '''
        for cursor in self.cursors:
            try:
                cursor.close()
            except:
                pass
        try:
            self.connection.close()
        except:
            pass
    def createTable(self, tableName, schema):
        '''
        Create new table, given its name, and schema.
        The schema is a dict mapping column names to
        column types. Example: {'col1' : 'INT', 'col2' : 'TEXT'}

        :param tableName: name of new table
        :type tableName: String
        :param schema: dictionary mapping column names to column types
        :type schema: Dict<String,String>
        '''
        colSpec = ''
        for colName, colVal in schema.items():
            colSpec += str(colName) + ' ' + str(colVal) + ','
        cmd = 'CREATE TABLE IF NOT EXISTS %s (%s) ' % (tableName, colSpec[:-1])
        cursor = self.connection.cursor()
        try:
            cursor.execute(cmd)
            self.connection.commit()
        finally:
            cursor.close()
    def dropTable(self, tableName):
        '''
        Delete table safely. No errors.

        :param tableName: name of table
        :type tableName: String
        '''
        cursor = self.connection.cursor()
        try:
            cursor.execute('DROP TABLE IF EXISTS %s' % tableName)
            self.connection.commit()
        finally:
            cursor.close()

    def truncateTable(self, tableName):
        '''
        Delete all table rows. No errors.

        :param tableName: name of table
        :type tableName: String
        '''
        cursor = self.connection.cursor()
        try:
            cursor.execute('TRUNCATE TABLE %s' % tableName)
            self.connection.commit()
        finally:
            cursor.close()
    def insert(self, tblName, colnameValueDict):
        '''
        Given a dictionary mapping column names to column values,
        insert the data into a specified table.

        :param tblName: name of table to insert into
        :type tblName: String
        :param colnameValueDict: mapping of column name to column value
        :type colnameValueDict: Dict<String,Any>
        '''
        colNames, colValues = zip(*colnameValueDict.items())
        cursor = self.connection.cursor()
        try:
            cmd = 'INSERT INTO %s (%s) VALUES (%s)' % (str(tblName), ','.join(colNames), self.ensureSQLTyping(colValues))
            cursor.execute(cmd)
            self.connection.commit()
        finally:
            cursor.close()
    def bulkInsert(self, tblName, colNameTuple, valueTupleArray):
        '''
        Inserts large number of rows into given table. Strategy: write
        the values to a temp file, then generate a LOAD DATA LOCAL INFILE
        MySQL command. Execute that command via subprocess.call().
        Using a cursor.execute() fails with error 'LOAD DATA LOCAL
        is not supported in this MySQL version...' even though MySQL
        is set up to allow the op (load-infile=1 for both mysql and
        mysqld in my.cnf).

        :param tblName: table into which to insert
        :type tblName: string
        :param colNameTuple: tuple containing column names in proper order, i.e. \
        corresponding to valueTupleArray orders.
        :type colNameTuple: (str[,str[...]])
        :param valueTupleArray: array of n-tuples, which hold the values. Order of\
        values must correspond to order of column names in colNameTuple.
        :type valueTupleArray: [(<anyMySQLCompatibleTypes>[<anyMySQLCompatibleTypes,...]])
        '''
        tmpCSVFile = tempfile.NamedTemporaryFile(dir='/tmp', prefix='userCountryTmp', suffix='.csv')
        for valueTuple in valueTupleArray:
            tmpCSVFile.write(','.join(valueTuple) + '\n')
        tmpCSVFile.flush()  # ensure buffered rows reach disk before mysql reads the file
        try:
            # Remove quotes from the values inside the colNameTuple's:
            mySQLColNameList = re.sub("'", "", str(colNameTuple))
            mySQLCmd = "USE %s; LOAD DATA LOCAL INFILE '%s' INTO TABLE %s FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' LINES TERMINATED BY '\\n' %s" %\
                       (self.db, tmpCSVFile.name, tblName, mySQLColNameList)
            subprocess.call(['mysql', '-u', self.user, '-p%s' % self.pwd, '-e', mySQLCmd])
        finally:
            tmpCSVFile.close()
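When the `LOAD DATA LOCAL` shell-out is not required, the DB-API itself can batch inserts with `cursor.executemany()`. The sketch below shows the shape of that alternative; it uses the standard-library `sqlite3` so it runs without a MySQL server (with pymysql the placeholders would be `%s` instead of `?`):

```python
import sqlite3

# Driver-level alternative to the temp-file + mysql-subprocess approach:
# executemany() sends all rows through one prepared statement.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE users (name TEXT, country TEXT)')

rows = [('alice', 'US'), ('bob', 'DE'), ('carol', 'JP')]
conn.executemany('INSERT INTO users VALUES (?, ?)', rows)
conn.commit()

count = conn.execute('SELECT COUNT(*) FROM users').fetchone()[0]
```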
    def update(self, tblName, colName, newVal, fromCondition=None):
        '''
        Update one column with a new value.

        :param tblName: name of table in which update is to occur
        :type tblName: String
        :param colName: column whose value is to be changed
        :type colName: String
        :param newVal: value acceptable to MySQL for the given column
        :type newVal: type acceptable to MySQL for the given column
        :param fromCondition: optional condition that selects which rows to update.\
        If None, the named column in all rows is updated to\
        the given value. Syntax must conform to what may be in\
        a MySQL FROM clause (don't include the 'FROM' keyword).
        :type fromCondition: String
        '''
        cursor = self.connection.cursor()
        try:
            if fromCondition is None:
                cmd = "UPDATE %s SET %s = '%s';" % (tblName, colName, newVal)
            else:
                cmd = "UPDATE %s SET %s = '%s' WHERE %s;" % (tblName, colName, newVal, fromCondition)
            cursor.execute(cmd)
            self.connection.commit()
        finally:
            cursor.close()
    def ensureSQLTyping(self, colVals):
        '''
        Given a list of items, return a string that preserves
        MySQL typing. Example: (10, 'My Poem') ---> '10, "My Poem"'
        Note that ','.join(map(str,myList)) won't work:
        (10, 'My Poem') ---> '10, My Poem'

        :param colVals: list of column values destined for a MySQL table
        :type colVals: <any>
        '''
        resList = []
        for el in colVals:
            if isinstance(el, basestring):
                resList.append('"%s"' % el)
            else:
                resList.append(el)
        return ','.join(map(str, resList))
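The method above relies on Python 2's `basestring`. A Python 3 port of the same logic, runnable standalone, might look like this (names snake_cased; otherwise behavior-identical):

```python
def ensure_sql_typing(col_vals):
    # Python 3 port of ensureSQLTyping(): strings are double-quoted so
    # MySQL sees them as string literals; everything else stays bare.
    res = []
    for el in col_vals:
        if isinstance(el, str):
            res.append('"%s"' % el)
        else:
            res.append(el)
    return ','.join(map(str, res))
```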
    def query(self, queryStr):
        '''
        Query iterator. Given a query, return one result for each
        subsequent call.

        :param queryStr: query
        :type queryStr: String
        '''
        cursor = self.connection.cursor()
        # For if caller never exhausts the results by repeated calls:
        self.cursors.append(cursor)
        cursor.execute(queryStr)
        while True:
            nextRes = cursor.fetchone()
            if nextRes is None:
                cursor.close()
                return
            yield nextRes
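The generator shape of `query()` can be exercised without a MySQL server. Below is a minimal stand-in backed by the standard-library `sqlite3` that follows the same fetch-until-None pattern:

```python
import sqlite3

# Illustrative stand-in for MySQLDB.query(): same generator structure,
# but driven by sqlite3 so the sketch runs anywhere.
def query(connection, query_str):
    cursor = connection.cursor()
    cursor.execute(query_str)
    while True:
        row = cursor.fetchone()
        if row is None:
            cursor.close()
            return
        yield row

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE foo (n INTEGER)')
conn.executemany('INSERT INTO foo VALUES (?)', [(1,), (2,), (3,)])
rows = [r[0] for r in query(conn, 'SELECT n FROM foo ORDER BY n')]
```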
| 40.461847 | 757 | 0.60794 | 1,195 | 10,075 | 5.121339 | 0.314644 | 0.032026 | 0.019608 | 0.02549 | 0.186438 | 0.156373 | 0.128758 | 0.112909 | 0.112909 | 0.096242 | 0 | 0.006105 | 0.300943 | 10,075 | 248 | 758 | 40.625 | 0.862843 | 0.504218 | 0 | 0.431373 | 0 | 0.019608 | 0.104526 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098039 | false | 0.068627 | 0.039216 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
13a95b957fe3881893fa91e63fed84b8224215f9 | 611 | py | Python | tools/xkeydump.py | treys/crypto-key-derivation | 789900bd73160db9a0d406c7c7f00f5f299aff73 | [
"MIT"
] | 29 | 2017-11-12T08:54:03.000Z | 2022-03-04T21:12:00.000Z | tools/xkeydump.py | treys/crypto-key-derivation | 789900bd73160db9a0d406c7c7f00f5f299aff73 | [
"MIT"
] | 2 | 2019-03-01T05:56:52.000Z | 2021-05-17T00:18:01.000Z | tools/xkeydump.py | treys/crypto-key-derivation | 789900bd73160db9a0d406c7c7f00f5f299aff73 | [
"MIT"
] | 9 | 2018-04-10T08:40:25.000Z | 2021-12-29T16:04:48.000Z | #!./venv/bin/python
from lib.mbp32 import XKey
from lib.utils import one_line_from_stdin
xkey = XKey.from_xkey(one_line_from_stdin())
print(xkey)
print("Version:", xkey.version)
print("Depth:", xkey.depth)
print("Parent FP:", xkey.parent_fp.hex())
print("Child number:", xkey.child_number_with_tick())
print("Chain code:", xkey.chain_code.hex())
print("Key:", xkey.key)
if xkey.key.get_private_bytes():
    print("Private bytes:", xkey.key.get_private_bytes().hex())
print("Public bytes:", xkey.key.get_public_bytes().hex())
print("Key ID:", xkey.keyid().hex())
print("XKey:", xkey.to_xkey().decode('ascii'))
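The script above relies on the repo's `lib.mbp32.XKey` to parse the fields it prints (version, depth, parent fingerprint, chain code, key). As a minimal standalone sketch of the underlying BIP32 serialization (78-byte payload plus 4-byte double-SHA256 checksum, Base58-encoded), and not the repo's actual API, the fields can be sliced out by offset after decoding. The payload bytes below are synthetic.

```python
import hashlib

# Bitcoin's Base58 alphabet.
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58check_encode(payload: bytes) -> str:
    checksum = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    data = payload + checksum
    num = int.from_bytes(data, "big")
    out = ""
    while num > 0:
        num, rem = divmod(num, 58)
        out = ALPHABET[rem] + out
    # Each leading zero byte is encoded as a leading '1'.
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def decode_xkey_fields(xkey: str) -> dict:
    num = 0
    for ch in xkey:
        num = num * 58 + ALPHABET.index(ch)
    data = num.to_bytes(82, "big")  # 78-byte payload + 4-byte checksum
    payload, checksum = data[:78], data[78:]
    expected = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    assert checksum == expected, "bad checksum"
    return {
        "version": payload[0:4],        # e.g. 0x0488B21E for mainnet xpub
        "depth": payload[4],
        "parent_fp": payload[5:9],
        "child_number": int.from_bytes(payload[9:13], "big"),
        "chain_code": payload[13:45],
        "key": payload[45:78],
    }

# Round-trip a synthetic 78-byte payload (fake fingerprint/chain code/key).
payload = (bytes([0x04, 0x88, 0xB2, 0x1E]) + bytes([0]) + bytes(4) +
           bytes(4) + bytes(range(32)) + bytes(33))
fields = decode_xkey_fields(base58check_encode(payload))
```

The round trip verifies the checksum on decode, which is the same integrity check `XKey.from_xkey` must perform before the fields are trustworthy.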
# ---- examples/compute_angular_resolution.py (meder411/Tangent-Images, BSD-3-Clause) ----
from spherical_distortion.util import *
sample_order = 9  # Input resolution to examine

def ang_fov(s):
    print('Spherical Resolution:', s)
    for b in range(s):
        dim = tangent_image_dim(b, s)  # Pixel dimension of tangent image
        corners = tangent_image_corners(b, s)  # Corners of each tangent image
        fov_x, fov_y = compute_tangent_image_angular_resolution(corners)
        print('  At base level', b)
        print('    FOV (x) =', fov_x)
        print('    FOV (y) =', fov_y)
        print('    deg/pix (x) =', fov_x / dim)
        print('    deg/pix (y) =', fov_y / dim)

ang_fov(sample_order)
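The degrees-per-pixel figure the script prints for tangent images has a simple reference point that needs no library: an equirectangular panorama spans 360° horizontally and 180° vertically, so its angular resolution is just span over pixel count. This sketch is independent of the tangent-image functions above.

```python
# Degrees per pixel for an equirectangular panorama: the image spans
# 360 degrees of longitude and 180 degrees of latitude uniformly.
def equirect_deg_per_pix(width, height):
    return 360.0 / width, 180.0 / height

# A 4096x2048 panorama has square angular pixels.
dx, dy = equirect_deg_per_pix(4096, 2048)
# dx == dy == 0.087890625 degrees per pixel
```

Tangent images deviate from this uniform figure because gnomonic projection stretches toward the patch edges, which is exactly what `compute_tangent_image_angular_resolution` quantifies per base level.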
# ---- polymath/srdfg/base.py (he-actlab/polymath, Apache-2.0) ----
from polymath import UNSET_SHAPE, DEFAULT_SHAPES
import builtins
import operator
from collections import OrderedDict, deque
from collections.abc import Mapping, Sequence
import functools
from numbers import Integral, Rational, Real
import contextlib
import traceback
import uuid
import numpy as np
import importlib
from .graph import Graph
from .domain import Domain
from .util import _noop_callback, _flatten_iterable, node_hash, \
    _is_node_type_instance, is_iterable
class Node(object):
    """
    Base class for nodes.

    Parameters
    ----------
    args : tuple
        Positional arguments passed to the `_evaluate` method.
    name : str or None
        Name of the node or `None` to use a random, unique identifier.
    shape : tuple or None
        Shape of the output for a node. This can be a tuple of integers or parameter node names.
    graph : Node or None
        Parent graph of this node. If graph is `None`, this is the top-level graph.
    op_name : str
        Operation name which describes the node functionality.
    value : Any or None
        If a node has a default value to use for execution, it can be set using `value`.
    kwargs : dict
        Keyword arguments passed to the `_evaluate` method.
    """
    _graph_stack = deque([None])
    _eval_stack = []
    stack_size = 5
    evaluated_nodes = 0
    def __init__(self, *args,
                 name=None,
                 shape=None,
                 graph=None,
                 dependencies=None,
                 op_name=None,
                 value=None,
                 **kwargs):
        self.nodes = Graph()
        self.value = value
        self.dependencies = []
        self._args = []
        self._predecessors = []
        self._successors = []
        self.args = args
        if "name" in kwargs:
            kwargs.pop("name")
        self.added_attrs = []
        # TODO: Change this to an underscore-prefixed private variable
        self.kwargs = kwargs
        self.graph = graph
        self._shape = OrderedDict()
        self.shape = shape or tuple([])
        # Get a list of all dependencies relevant to this node
        self.dependencies = [] if dependencies is None else dependencies
        if self.graph:
            self.dependencies.extend(self.graph.dependencies)
        # Choose a name for the node and add the node to the graph
        self._name = None
        self.name = name or uuid.uuid4().hex
        self._op_name = None
        self.op_name = op_name
        # Get the stack context so we can report where the node was defined
        self._stack = traceback.extract_stack(limit=1)
    @property
    def graph(self):
        """
        polymath.srdfg.graph.Graph : Parent graph of this node. If graph is `None`, this is the top-level graph.
        """
        return self._graph

    def preds(self):
        return self._predecessors

    def succs(self):
        return self._successors

    def add_predecessor(self, pred):
        if isinstance(pred, Node):
            self._predecessors.append(pred.gname)
        else:
            self._predecessors.append(pred)

    def add_successor(self, succ):
        if isinstance(succ, Node):
            self._successors.append(succ.gname)
        else:
            self._successors.append(succ)

    def set_edges(self):
        for e in self.args:
            self.add_predecessor(e)
            if isinstance(e, Node):
                e.add_successor(self)
    @property
    def domain(self):
        return Domain(tuple([]))

    @property
    def args(self):
        """
        tuple : Positional arguments which are used for executing this node.
        """
        return tuple(self._args)

    @property
    def argnames(self):
        return [a.name if isinstance(a, Node) else a for a in self.args]

    @property
    def shape(self):
        """
        tuple : Shape of the output for a node. This can be a tuple of integers or parameter node names.
        """
        return self._shape

    @property
    def var(self):
        return self

    @property
    def name(self):
        """str : Unique name of the node"""
        return self._name

    @property
    def op_name(self):
        """
        str : Operation name which describes the node functionality.
        """
        return self._op_name

    @op_name.setter
    def op_name(self, op_name):
        if op_name:
            self._op_name = op_name
        elif self.__class__.__name__ == "Node":
            self._op_name = self.name
        else:
            self._op_name = self.__class__.__name__
    @name.setter
    def name(self, name):
        self.set_name(name)

    @args.setter
    def args(self, args):
        new_args = []
        for arg in args:
            if isinstance(arg, Node):
                if self.__class__.__name__ == "Node":
                    self.nodes[arg.name] = self.graph[arg.name]
            new_args.append(arg)
        self._args = tuple(new_args)

    @shape.setter
    def shape(self, shape):
        self.set_shape(shape, init=True)

    @graph.setter
    def graph(self, graph):
        self._graph = Node.get_active_graph(graph)

    @property
    def gname(self):
        scope_names = [self.name]
        cgraph = self.graph
        while cgraph:
            scope_names.append(cgraph.name)
            cgraph = cgraph.graph
        return "/".join(list(reversed(scope_names)))
    def __enter__(self):
        Node._graph_stack.append(self)
        return self

    def __exit__(self, *args):
        assert self == Node._graph_stack.pop()

    def __repr__(self):
        return "<node '%s'>" % self.name

    def add_attribute(self, key, value):
        self.added_attrs.append(key)
        self.kwargs[key] = value

    def is_shape_finalized(self):
        if self.shape == UNSET_SHAPE:
            return False
        for s in self.shape:
            if not isinstance(s, Integral):
                return False
        return True
    def set_shape(self, shape=None, init=False):
        if isinstance(shape, float):
            self._shape = tuple([int(shape)])
        elif isinstance(shape, Integral):
            self._shape = tuple([shape])
        elif isinstance(shape, Node):
            self._shape = tuple([shape])
        elif not shape or len(shape) == 0:
            # TODO: Change in order to enable "is shape finalized" to work
            self._shape = UNSET_SHAPE
        else:
            shapes = []
            for dim in shape:
                if isinstance(dim, (Node, Integral)):
                    shapes.append(dim)
                elif isinstance(dim, float):
                    shapes.append(int(dim))
                else:
                    raise TypeError(f"Shape value must be placeholder or integer value for {self.name}\n"
                                    f"\tDim: {dim}"
                                    f"\n\t{self.kwargs} ")
            self._shape = tuple(shapes)
    @staticmethod
    def get_active_graph(graph=None):
        """
        Obtain the currently active graph instance by returning the explicitly given graph or using
        the default graph.

        Parameters
        ----------
        graph : Node or None
            Graph to return or `None` to use the default graph.

        Raises
        ------
        ValueError
            If no `Graph` instance can be obtained.
        """
        graph = graph or Node._graph_stack[-1]
        return graph
    def instantiate_node(self, node):  # pylint:disable=W0621
        """
        Instantiate nodes by retrieving the node object associated with the node name.

        Parameters
        ----------
        node : Node or str
            Node instance or name of a node.

        Returns
        -------
        instantiated_node : Node
            Node instance.

        Raises
        ------
        ValueError
            If `node` is not a `Node` instance or a node name.
        RuntimeError
            If `node` is a `Node` instance but does not belong to this graph.
        """
        if isinstance(node, str):
            return self.nodes[node]
        if isinstance(node, Node):
            if node.name not in self.nodes and (node.graph != self):
                raise RuntimeError(f"node '{node}' does not belong to {self} graph, instead belongs to"
                                   f" {node.graph}")
            return node
        raise ValueError(f"'{node}' is not a `Node` instance or node name")
    def instantiate_graph(self, context, **kwargs):
        """
        Instantiate a graph by replacing all node names with node instances.

        .. note::
            This function modifies the context in place. Use :code:`context=context.copy()` to avoid
            the context being modified.

        Parameters
        ----------
        context : dict[Node or str, object]
            Context whose keys are node instances or names.
        kwargs : dict[str, object]
            Additional context information keyed by variable name.

        Returns
        -------
        normalized_context : dict[Node, object]
            Normalized context whose keys are node instances.

        Raises
        ------
        ValueError
            If the context specifies more than one value for any node.
        ValueError
            If `context` is not a mapping.
        """
        if context is None:
            context = {}
        elif not isinstance(context, Mapping):
            raise ValueError("`context` must be a mapping.")
        nodes = list(context)
        # Add the keyword arguments
        for node in nodes:  # pylint:disable=W0621
            value = context.pop(node)
            node = self.instantiate_node(node)
            if node in context:
                raise ValueError(f"duplicate unequal value for node '{node}'")
            context[node] = value
            if node.op_name in ["placeholder", "state", "input", "output", "temp"] and not node.is_shape_finalized():
                context[node] = node.evaluate(context)
        for name, value in kwargs.items():
            node = self.nodes[name]
            if node in context:
                raise ValueError(f"duplicate value for node '{node}'")
            context[node] = value
            if node.op_name in ["placeholder", "state", "input", "output", "temp"] and not node.is_shape_finalized():
                context[node] = node.evaluate(context)
        return context
    def run(self, fetches, context=None, *, callback=None, **kwargs):
        """
        Evaluate one or more nodes given a dictionary of node names with their values.

        .. note::
            This function modifies the context in place. Use :code:`context=context.copy()` to avoid
            the context being modified.

        Parameters
        ----------
        fetches : list[str or Node] or str or Node
            One or more `Node` instances or names to evaluate.
        context : dict or None
            Context in which to evaluate the nodes.
        callback : callable or None
            Callback to be evaluated when a node is evaluated.
        kwargs : dict
            Additional context information keyed by variable name.

        Returns
        -------
        values : Node or tuple[object]
            Output of the nodes given the context.

        Raises
        ------
        ValueError
            If `fetches` is not a `Node` instance, node name, or a sequence thereof.
        """
        if isinstance(fetches, (str, Node)):
            fetches = [fetches]
            single = True
        elif isinstance(fetches, Sequence):
            single = False
        else:
            raise ValueError("`fetches` must be a `Node` instance, node name, or a "
                             "sequence thereof.")
        fetches = [self.instantiate_node(node) for node in fetches]
        context = self.instantiate_graph(context, **kwargs)
        for c in context:
            if c in fetches and c.op_name in ["output", "state", "temp"]:
                write_name = "/".join([f"{i}{c.write_count-1}" for i in c.name.split("/")]) if c.write_count > 0 else c.name
                fetches[fetches.index(c)] = c.graph.nodes[write_name]
        values = [fetch.evaluate_node(fetch, context, callback=callback) for fetch in fetches]
        return values[0] if single else tuple(values)
    def __getstate__(self):
        return self.__dict__

    def __setstate__(self, data):
        self.__dict__.update(data)

    def set_name(self, name):
        """
        Set the name of the node and update the graph.

        Parameters
        ----------
        value : str
            Unique name of the node.

        Returns
        -------
        self : Node
            This node.

        Raises
        ------
        ValueError
            If a node with `value` already exists in the associated graph.
        KeyError
            If the current name of the node cannot be found in the associated graph.
        """
        name = name or uuid.uuid4().hex
        # TODO: Need a way to check if the existing node is not equal to the current node as well
        if self.graph and name in self.graph.nodes:
            raise ValueError(f"duplicate name '{name}' in {self.graph.name}:\n\t"
                             f"Existing: {self.graph.nodes[name].args}\n\t"
                             f"New: {self.args}")
        if self.graph:
            graph = self.graph
            if self._name and self._name in graph.nodes:
                graph.update_graph_key(self._name, name)
            else:
                graph.nodes[name] = self
        self._name = name
        return self
    def evaluate_dependencies(self, context, callback=None):
        """
        Evaluate the dependencies of this node and discard the values.

        Parameters
        ----------
        context : dict
            Normalised context in which to evaluate the node.
        callback : callable or None
            Callback to be evaluated when a node is evaluated.
        """
        for node in self.dependencies:
            node.evaluate(context, callback)

    def evaluate(self, context, callback=None):
        """
        Evaluate the node given a context.

        Parameters
        ----------
        context : dict
            Normalised context in which to evaluate the node.
        callback : callable or None
            Callback to be evaluated when a node is evaluated.

        Returns
        -------
        value : object
            Output of the node given the context.
        """
        # Evaluate all explicit dependencies first
        self.evaluate_dependencies(context, callback)
        if self in context:
            return context[self]
        # Evaluate the parents
        partial = functools.partial(self.evaluate_node, context=context, callback=callback)
        args = [partial(arg) for arg in self.args]
        kwargs = {key: partial(value) for key, value in self.kwargs.items() if key not in self.added_attrs}
        # Evaluate the node
        callback = callback or _noop_callback
        with callback(self, context):
            if self.__class__.__name__ == "Node":
                context[self] = self.value = self._evaluate(*args, context=context, **kwargs)
            else:
                context[self] = self.value = self._evaluate(*args, **kwargs)
        return self.value

    def _evaluate(self, *args, context=None, **kwargs):
        """
        Inheriting nodes should implement this function to evaluate the node.
        """
        return self(*args, context, **kwargs)
    @classmethod
    def evaluate_node(cls, node, context, **kwargs):
        """
        Evaluate a node or constant given a context.
        """
        Node.evaluated_nodes += 1
        try:
            if isinstance(node, Node):
                Node._eval_stack.append(node.name)
                return node.evaluate(context, **kwargs)
            partial = functools.partial(cls.evaluate_node, context=context, **kwargs)
            if isinstance(node, tuple):
                return tuple(partial(element) for element in node)
            if isinstance(node, list):
                return [partial(element) for element in node]
            if isinstance(node, dict):
                return {partial(key): partial(value) for key, value in node.items()}
            if isinstance(node, slice):
                return slice(*[partial(getattr(node, attr))
                               for attr in ['start', 'stop', 'step']])
            return node
        except Exception as ex:  # pragma: no cover
            messages = []
            interactive = False
            if isinstance(node, Node) or not is_iterable(node):
                node = [node]
            for n in node:
                stack = []
                if isinstance(n, Node):
                    for frame in reversed(n._stack):  # pylint: disable=protected-access
                        # Do not capture any internal stack traces
                        fname = frame.filename
                        if 'polymath' in fname:
                            continue
                        # Stop tracing at the last interactive cell
                        if interactive and not fname.startswith('<'):
                            break  # pragma: no cover
                        interactive = fname.startswith('<')
                        stack.append(frame)
                stack = "".join(traceback.format_list(reversed(stack)))
                message = "Failed to evaluate node `%s` defined at:\n\n%s" % (n, stack)
                messages.append(message)
            raise ex from EvaluationError("".join(messages))
    @classmethod
    def init_from_args(cls, *args,
                       name=None,
                       shape=None,
                       graph=None,
                       dependencies=None,
                       op_name=None,
                       value=None,
                       **kwargs):
        if len(args) == 0:
            n = cls(name=name,
                    shape=shape,
                    graph=graph,
                    op_name=op_name,
                    dependencies=dependencies,
                    value=value,
                    **kwargs)
        else:
            n = cls(*args,
                    name=name,
                    shape=shape,
                    graph=graph,
                    op_name=op_name,
                    dependencies=dependencies,
                    value=value,
                    **kwargs)
        return n

    def __bool__(self):
        return True

    def __hash__(self):
        return id(self)
    def func_hash(self):
        """
        Return the functional hash of a particular node. The default hash returns an object id,
        whereas this function returns a hash of all attributes and subgraphs of a node.
        """
        return node_hash(self)

    def find_node(self, name):
        g = self.graph
        while g is not None and name not in g.nodes:
            g = g.graph
        if g is not None and name in g.nodes:
            return g.nodes[name]
        raise RuntimeError(f"Cannot find {name} in graph nodes. Graph: {self.graph}")

    def __len__(self):
        # TODO: Update this to check for finalized shape
        if self.shape == UNSET_SHAPE:
            raise TypeError(f'`shape` must be specified explicitly for nodes {self}')
        return self.shape[0]

    def __iter__(self):
        num = len(self)
        for i in range(num):
            yield self[i]

    def __eq__(self, other):
        return hash(self) == hash(other)

    def __getattr__(self, name):
        return getattr_(self, name, graph=self.graph)

    def __getitem__(self, key):
        if self.__class__.__name__ != "Node":
            if isinstance(key, (slice, Integral)):
                return getitem(self, key, graph=self.graph)
            else:
                if isinstance(key, list):
                    return var_index(self, key, graph=self)
                elif isinstance(key, tuple):
                    return var_index(self, list(key), graph=self)
                else:
                    return var_index(self, [key], graph=self)
        else:
            return self.nodes[key]
    def __add__(self, other):
        return add(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__radd__(self)

    def __radd__(self, other):
        return add(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__add__(self)

    def __sub__(self, other):
        return sub(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rsub__(self)

    def __rsub__(self, other):
        return sub(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__sub__(self)

    def __pow__(self, other):
        return pow_(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rpow__(self)

    def __rpow__(self, other):
        return pow_(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__pow__(self)

    def __matmul__(self, other):
        return matmul(self, other, graph=self.graph)

    def __rmatmul__(self, other):
        return matmul(other, self, graph=self.graph)

    def __mul__(self, other):
        return mul(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rmul__(self)

    def __rmul__(self, other):
        return mul(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__mul__(self)

    def __truediv__(self, other):
        return truediv(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rtruediv__(self)

    def __rtruediv__(self, other):
        return truediv(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__truediv__(self)

    def __floordiv__(self, other):
        return floordiv(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rfloordiv__(self)

    def __rfloordiv__(self, other):
        return floordiv(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__floordiv__(self)

    def __mod__(self, other):
        return mod(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rmod__(self)

    def __rmod__(self, other):
        return mod(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__mod__(self)

    def __lshift__(self, other):
        return lshift(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rlshift__(self)

    def __rlshift__(self, other):
        return lshift(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__lshift__(self)

    def __rshift__(self, other):
        return rshift(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rrshift__(self)

    def __rrshift__(self, other):
        return rshift(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rshift__(self)

    def __and__(self, other):
        return and_(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rand__(self)

    def __rand__(self, other):
        return and_(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__and__(self)

    def __or__(self, other):
        return or_(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__ror__(self)

    def __ror__(self, other):
        return or_(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__or__(self)

    def __xor__(self, other):
        return xor(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__rxor__(self)

    def __rxor__(self, other):
        return xor(other, self, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__xor__(self)

    def __lt__(self, other):
        return lt(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__gt__(self)

    def __le__(self, other):
        return le(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__ge__(self)

    def __ne__(self, other):
        return ne(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__ne__(self)

    def __gt__(self, other):
        return gt(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__lt__(self)

    def __ge__(self, other):
        return ge(self, other, graph=self.graph) if not _is_node_type_instance(other, ("slice_op", "var_index", "index")) else other.__le__(self)

    def __invert__(self):
        return inv(self, graph=self.graph)

    def __neg__(self):
        return neg(self, graph=self.graph)

    def __abs__(self):
        return abs_(self, graph=self.graph)

    def __pos__(self):
        return pos(self, graph=self.graph)

    def __reversed__(self):
        return reversed_(self, graph=self.graph)
    def update_graph_key(self, old_key, new_key):
        n = list(map(lambda k: (new_key, self.nodes[k]) if k == old_key else (k, self.nodes[k]), self.nodes.keys()))
        self.nodes = Graph(n)

    def insert_node(self, node, idx):
        node_list = list(self.nodes.items())
        node_list.insert(idx, (node.name, node))
        self.nodes = Graph(node_list)

    def __call__(self, *args, **kwargs):
        return self.run(*args, **kwargs)
class EvaluationError(RuntimeError):
    """
    Failed to evaluate a node.
    """
class var_index(Node):  # pylint: disable=C0103,W0223
    """
    Node representing values of a variable corresponding to input index values.

    Parameters
    ----------
    var : Node
        The multi-dimensional variable used for indexing into.
    idx : tuple
        Tuple of either integer values or index/index_op nodes.
    """
    def __init__(self, var, idx, name=None, **kwargs):  # pylint: disable=W0235
        if "domain" in kwargs:
            domain = tuple(kwargs.pop("domain")) if isinstance(kwargs["domain"], list) else kwargs.pop("domain")
        else:
            domain = Domain(idx)
        super(var_index, self).__init__(var, idx, name=name, domain=domain, **kwargs)

    @property
    def domain(self):
        return self.kwargs["domain"]

    @property
    def var(self):
        var, index_list = self.args
        return var
    def set_name(self, name):
        """
        Set the name for a variable index, making sure to replicate the new name with
        a unique string which corresponds to the variable/index combination.

        Parameters
        ----------
        value : str
            Unique name of the node.

        Returns
        -------
        self : Node
            This node.

        Raises
        ------
        ValueError
            If a node with `value` already exists in the associated graph.
        KeyError
            If the current name of the node cannot be found in the associated graph.
        """
        # TODO: Need a way to check if the existing node is not equal to the current node as well
        if self.graph and name in self.graph.nodes:
            raise ValueError(f"duplicate name '{name}' in {self.graph.name}:"
                             f"Existing: {self.graph.nodes[name].args}\n"
                             f"New: {self.args}")
        if self.graph:
            graph = self.graph
            if self._name is not None and self._name in graph.nodes:
                graph.update_graph_key(self._name, name)
            else:
                graph.nodes[name] = self
        self._name = name
        return self
    def __getitem__(self, key):
        if self.is_shape_finalized() and len(self.nodes) >= np.prod(self.shape):
            if isinstance(key, Integral):
                key = tuple([key])
            idx = np.ravel_multi_index(key, dims=self.shape, order='C')
            ret = self.nodes.item_by_index(idx)
            return ret
        else:
            if isinstance(key, list):
                ret = var_index(self.var, tuple(key), graph=self)
            elif isinstance(key, tuple):
                ret = var_index(self.var, key, graph=self)
            else:
                ret = var_index(self.var, tuple([key]), graph=self)
            return ret

    def is_scalar(self, val=None):
        if val is not None and (not isinstance(val, np.ndarray) or (len(val.shape) == 1 and val.shape[0] == 1)):
            if self.var.shape != DEFAULT_SHAPES[0] and (len(self.var.shape) == 1 and not isinstance(self.var.shape[0], Node)):
                raise ValueError(f"Invalid shape var for var index {self} with variable shape {self.var.shape}")
            return True
        else:
            return self.var.shape == DEFAULT_SHAPES[0]
    def _evaluate(self, var, indices, **kwargs):
        if self.is_scalar(var):
            out_shape = (1,)
            indices = (0,)
            single = True
        else:
            out_shape = self.domain.shape_from_indices(indices)
            indices = self.domain.compute_pairs()
            single = False
        if isinstance(var, (Integral, Real, str)):
            var = np.asarray([var])
        elif not isinstance(var, (np.ndarray, list)):
            raise TypeError(f"Variable {var} with type {type(var)} is not a list or numpy array, and cannot be sliced for {self.name}")
        elif isinstance(var, list):
            var = np.asarray(var)
        if len(var.shape) != len(out_shape) and np.prod(var.shape) == np.prod(out_shape):
            if len(out_shape) > len(var.shape):
                for i in range(len(out_shape)):
                    if out_shape[i] == 1:
                        var = np.expand_dims(var, axis=i)
            else:
                var = np.squeeze(var)
        if len(var.shape) != len(out_shape) and np.prod(var.shape) != np.prod(out_shape):
            raise ValueError(f"Index list does not match {var.shape} in {self.var.name} - {self.var.op_name} "
                             f"dimensions for slice {self.args[0].name} with {out_shape}.\n"
                             f"Domain: {self.domain}\n"
                             f"Eval Stack: {Node._eval_stack}")
        if not single and not all([(idx_val - 1) >= indices[-1][idx] for idx, idx_val in enumerate(var.shape)]):
            raise ValueError(f"var_index {self.name} has indices which are greater than the variable shape:\n"
                             f"\tArgs: {self.args}\n"
                             f"\tVar shape: {var.shape}\n"
                             f"\tNode shape: {self.var.shape}\n"
                             f"\tIndex Upper bounds: {indices[-1]}")
        indices = list(map(lambda x: x.tolist() if isinstance(x, np.ndarray) else x, indices))
        res = var[indices] if single else np.asarray([var[idx] for idx in indices]).reshape(out_shape)
        if out_shape == (1,) and len(indices) == 1:
            res = res[0]
        self.domain.set_computed(out_shape, indices)
        return res
    def __add__(self, other):
        return slice_op(operator.add, self, other, graph=self.graph)

    def __radd__(self, other):
        return slice_op(operator.add, other, self, graph=self.graph)

    def __sub__(self, other):
        return slice_op(operator.sub, self, other, graph=self.graph)

    def __rsub__(self, other):
        return slice_op(operator.sub, other, self, graph=self.graph)

    def __pow__(self, other):
        return slice_op(builtins.pow, self, other, graph=self.graph)

    def __rpow__(self, other):
        return slice_op(builtins.pow, other, self, graph=self.graph)

    def __mul__(self, other):
        return slice_op(operator.mul, self, other, graph=self.graph)

    def __rmul__(self, other):
        return slice_op(operator.mul, other, self, graph=self.graph)

    def __truediv__(self, other):
        return slice_op(operator.truediv, self, other, graph=self.graph)

    def __rtruediv__(self, other):
        return slice_op(operator.truediv, other, self, graph=self.graph)

    def __floordiv__(self, other):
        return slice_op(operator.floordiv, self, other, graph=self.graph)

    def __rfloordiv__(self, other):
        return slice_op(operator.floordiv, other, self, graph=self.graph)

    def __mod__(self, other):
        return slice_op(operator.mod, self, other, graph=self.graph)

    def __rmod__(self, other):
        return slice_op(operator.mod, other, self, graph=self.graph)

    def __lshift__(self, other):
        return slice_op(operator.lshift, self, other, graph=self.graph)

    def __rlshift__(self, other):
        return slice_op(operator.lshift, other, self, graph=self.graph)

    def __rshift__(self, other):
        return slice_op(operator.rshift, self, other, graph=self.graph)

    def __rrshift__(self, other):
        return slice_op(operator.rshift, other, self, graph=self.graph)

    def __and__(self, other):
        return slice_op(operator.and_, self, other, graph=self.graph)

    def __rand__(self, other):
        return slice_op(operator.and_, other, self, graph=self.graph)

    def __or__(self, other):
        return slice_op(operator.or_, self, other, graph=self.graph)

    def __ror__(self, other):
        return slice_op(operator.or_, other, self, graph=self.graph)

    def __xor__(self, other):
        return slice_op(operator.xor, self, other, graph=self.graph)

    def __rxor__(self, other):
        return slice_op(operator.xor, other, self, graph=self.graph)

    def __lt__(self, other):
        return slice_op(operator.lt, self, other, graph=self.graph)

    def __le__(self, other):
        return slice_op(operator.le, self, other, graph=self.graph)

    def __ne__(self, other):
        return slice_op(operator.ne, self, other, graph=self.graph)

    def __gt__(self, other):
        return slice_op(operator.gt, self, other, graph=self.graph)

    def __ge__(self, other):
        return slice_op(operator.ge, self, other, graph=self.graph)

    def __repr__(self):
        return "<var_index name=%s, index=%s>" % (self.name, self.args)
class slice_op(Node):
    """
    Node representing multi-dimensional operations performed on a node.

    Parameters
    ----------
    target : callable
        The operation applied elementwise across the node's domain.
    idx : tuple
        Tuple of either integer values or index/index_op nodes.
    """
    def __init__(self, target, *args, **kwargs):
        if "domain" in kwargs:
            domain = tuple(kwargs.pop("domain")) if isinstance(kwargs["domain"], list) else kwargs.pop("domain")
        else:
            all_args = _flatten_iterable(args)
            slice1_var, slice1_idx, slice2_var, slice2_idx = self.get_index_nodes(all_args[0], all_args[1])
            domain = slice1_idx.combine_set_domains(slice2_idx)
        if "op_name" in kwargs:
            kwargs.pop("op_name")
        target_name = f"{target.__module__}.{target.__name__}"
        super(slice_op, self).__init__(*args, target=target_name, domain=domain, op_name=f"slice_{target.__name__}", **kwargs)
        self.target = target
    @property
    def domain(self):
        return self.kwargs["domain"]

    def __getitem__(self, key):
        if isinstance(key, (tuple, list, np.ndarray)) and len(key) == 0:
            return self
        elif self.is_shape_finalized() and len(self.nodes) > 0:
            if isinstance(key, (int, Node)):
                key = tuple([key])
            if len(key) != len(self.shape):
                raise KeyError(f"Invalid key shape for {self.name}:\n"
                               f"Shape: {self.shape}\n"
                               f"Key: {key}")
            name = f"{self.name}{key}"
            if name not in self.nodes.keys():
                raise KeyError(f"{name} not in {self.name} keys:\n"
                               f"Node keys: {list(self.nodes.keys())}")
            ret = self.nodes[name]
            return ret
        else:
            name = []
            if isinstance(key, Node):
                name.append(key.name)
            elif hasattr(key, "__len__") and not isinstance(key, str):
                for k in key:
                    if isinstance(k, Node):
                        name.append(k.name)
                    else:
                        name.append(str(k))
            else:
                name.append(str(key))
            name = self.var.name + "[" + "][".join(name) + "]"
            if name in self.graph.nodes:
                return self.graph.nodes[name]
            elif isinstance(key, list):
                return var_index(self, key, name=name, graph=self.graph)
            elif isinstance(key, tuple):
                return var_index(self, list(key), name=name, graph=self.graph)
            else:
                return var_index(self, [key], name=name, graph=self.graph)
def set_shape(self, shape=None, init=False):
s = []
assert isinstance(shape, (tuple, list))
if all([isinstance(sv, Integral) for sv in shape]) and len(self.domain) == np.prod(shape) and len(shape) > 0:
self._shape = shape if isinstance(shape, tuple) else tuple(shape)
else:
for idx, d in enumerate(self.domain.dom_set):
if shape and isinstance(shape[idx], (func_op, Integral)):
s.append(shape[idx])
elif shape and isinstance(shape[idx], float):
s.append(int(shape[idx]))
elif isinstance(d, float):
s.append(int(d))
elif isinstance(d, var_index):
s.append(d.domain)
else:
s.append(d)
self._shape = tuple(s)
def is_scalar(self, val):
return not isinstance(val, np.ndarray) or (len(val.shape) == 1 and val.shape[0] == 1)
def _evaluate(self, op1, op2, context=None, **kwargs):
if self.is_scalar(op1) or self.is_scalar(op2):
value = self.target(op1, op2)
else:
arg0_dom = self.args[0].domain
arg1_dom = self.args[1].domain
op1_idx = self.domain.map_sub_domain(arg0_dom) if isinstance(self.args[0], Node) else tuple([])
op2_idx = self.domain.map_sub_domain(arg1_dom) if isinstance(self.args[1], Node) else tuple([])
op1 = np.asarray(list(map(lambda x: op1[x], op1_idx))).reshape(self.domain.computed_shape)
op2 = np.asarray(list(map(lambda x: op2[x], op2_idx))).reshape(self.domain.computed_shape)
value = self.target(op1, op2)
return value
def get_index_nodes(self, slice1_var=None, slice2_var=None):
if slice1_var is None and slice2_var is None:
slice1_var, slice2_var = self.args
if isinstance(slice1_var, (slice_op, var_index)) or _is_node_type_instance(slice1_var, "GroupNode"):
slice1_idx = slice1_var.domain
elif _is_node_type_instance(slice1_var, "index"):
slice1_idx = slice1_var.domain
else:
slice1_idx = Domain(tuple([]))
if isinstance(slice2_var, (slice_op, var_index)) or _is_node_type_instance(slice2_var, "GroupNode"):
slice2_idx = slice2_var.domain
elif _is_node_type_instance(slice2_var, "index"):
slice2_idx = slice2_var.domain
else:
slice2_idx = Domain(tuple([]))
return slice1_var, slice1_idx, slice2_var, slice2_idx
def __add__(self, other):
return slice_op(operator.add, self, other, graph=self.graph)
def __radd__(self, other):
return slice_op(operator.add, other, self, graph=self.graph)
def __sub__(self, other):
return slice_op(operator.sub, self, other, graph=self.graph)
def __rsub__(self, other):
return slice_op(operator.sub, other, self, graph=self.graph)
def __pow__(self, other):
return slice_op(builtins.pow, self, other, graph=self.graph)
def __rpow__(self, other):
return slice_op(builtins.pow, other, self, graph=self.graph)
def __mul__(self, other):
return slice_op(operator.mul, self, other, graph=self.graph)
def __rmul__(self, other):
return slice_op(operator.mul, other, self, graph=self.graph)
def __truediv__(self, other):
return slice_op(operator.truediv, self, other, graph=self.graph)
def __rtruediv__(self, other):
return slice_op(operator.truediv, other, self, graph=self.graph)
def __floordiv__(self, other):
return slice_op(operator.floordiv, self, other, graph=self.graph)
def __rfloordiv__(self, other):
return slice_op(operator.floordiv, other, self, graph=self.graph)
def __mod__(self, other):
return slice_op(operator.mod, self, other, graph=self.graph)
def __rmod__(self, other):
return slice_op(operator.mod, other, self, graph=self.graph)
def __lshift__(self, other):
return slice_op(operator.lshift, self, other, graph=self.graph)
def __rlshift__(self, other):
return slice_op(operator.lshift, other, self, graph=self.graph)
def __rshift__(self, other):
return slice_op(operator.rshift, self, other, graph=self.graph)
def __rrshift__(self, other):
return slice_op(operator.rshift, other, self, graph=self.graph)
def __and__(self, other):
return slice_op(operator.and_, self, other, graph=self.graph)
def __rand__(self, other):
return slice_op(operator.and_, other, self, graph=self.graph)
def __or__(self, other):
return slice_op(operator.or_, self, other, graph=self.graph)
def __ror__(self, other):
return slice_op(operator.or_, other, self, graph=self.graph)
def __xor__(self, other):
return slice_op(operator.xor, self, other, graph=self.graph)
def __rxor__(self, other):
return slice_op(operator.xor, other, self, graph=self.graph)
def __lt__(self, other):
return slice_op(operator.lt, self, other, graph=self.graph)
def __le__(self, other):
return slice_op(operator.le, self, other, graph=self.graph)
def __ne__(self, other):
return slice_op(operator.ne, self, other, graph=self.graph)
def __gt__(self, other):
return slice_op(operator.gt, self, other, graph=self.graph)
def __ge__(self, other):
return slice_op(operator.ge, self, other, graph=self.graph)
def __repr__(self):
return "<slice_%s '%s'>" % (self.target.__name__, self.name)
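Every arithmetic dunder above defers evaluation by returning a new `slice_op` node instead of computing a value. The same lazy-expression-graph idiom can be shown with a minimal, standalone sketch (the `Expr` class and its names are illustrative, not part of this library):

```python
import operator

class Expr:
    """Minimal stand-in: dunder methods record the operation as a tree
    node instead of computing it, mirroring how slice_op defers work."""
    def __init__(self, op=None, left=None, right=None, value=None):
        self.op, self.left, self.right, self.value = op, left, right, value

    @staticmethod
    def _wrap(v):
        return v if isinstance(v, Expr) else Expr(value=v)

    def __add__(self, other):
        return Expr(operator.add, self, self._wrap(other))

    def __radd__(self, other):
        return Expr(operator.add, self._wrap(other), self)

    def __mul__(self, other):
        return Expr(operator.mul, self, self._wrap(other))

    def evaluate(self):
        if self.op is None:
            return self.value
        return self.op(self.left.evaluate(), self.right.evaluate())

x = Expr(value=3)
y = Expr(value=4)
tree = x + y * 2          # records (3 + (4 * 2)) without computing it
print(tree.evaluate())    # -> 11
```

The reflected variants (`__radd__` etc.) are what make `2 + x` work when the left operand is a plain number, which is why `slice_op` defines both forms for every operator.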
class func_op(Node): # pylint: disable=C0103,R0903
"""
Node wrapper for stateless functions.
Parameters
----------
target : callable
function to evaluate the node
args : tuple
positional arguments passed to the target
kwargs : dict
keyword arguments passed to the target
"""
def __init__(self, target, *args, **kwargs):
kwargs.setdefault("op_name", target.__name__)
if "domain" in kwargs:
domain = kwargs.pop("domain")
if isinstance(domain, list):
domain = tuple(domain)
elif len(args) == 2:
all_args = _flatten_iterable(args)
slice1_var, slice1_idx, slice2_var, slice2_idx = self.get_index_nodes(all_args[0], all_args[1])
domain = slice1_idx.combine_set_domains(slice2_idx)
else:
domain = Domain(tuple([]))
self._target = None
super(func_op, self).__init__(*args, target=f"{target.__module__}.{target.__name__}", domain=domain, **kwargs)
self.target = target
self.added_attrs += ["domain", "target"]
@property
def target(self):
return self._target
@target.setter
def target(self, fnc):
self._target = fnc
self.op_name = f"{fnc.__name__}"
self.kwargs["target"] = f"{fnc.__module__}.{fnc.__name__}"
def __getitem__(self, key):
return self
@property
def domain(self):
return self.kwargs["domain"]
def get_index_nodes(self, slice1_var=None, slice2_var=None):
if slice1_var is None and slice2_var is None:
slice1_var, slice2_var = self.args
if isinstance(slice1_var, (slice_op, var_index)) or _is_node_type_instance(slice1_var, "GroupNode"):
slice1_idx = slice1_var.domain
else:
slice1_idx = Domain(tuple([]))
if isinstance(slice2_var, (slice_op, var_index)) or _is_node_type_instance(slice2_var, "GroupNode"):
slice2_idx = slice2_var.domain
else:
slice2_idx = Domain(tuple([]))
return slice1_var, slice1_idx, slice2_var, slice2_idx
def _evaluate(self, *args, **kwargs):
for aa in list(kwargs.keys()):
if aa in self.added_attrs:
kwargs.pop(aa)
return self.target(*args, **kwargs)
def __call__(self, *args, **kwargs):
return call(self, *args, **kwargs)
def __repr__(self):
return "<func_op '%s' target=%s args=<%d items>>" % \
(self.name, self.kwargs["target"], len(self.args))
def nodeop(target=None, **kwargs):
"""
Decorator for creating nodes from functions.
"""
# This is called when the decorator is used with arguments
if target is None:
return functools.partial(nodeop, **kwargs)
# This is called when the decorator is used without arguments
@functools.wraps(target)
def _wrapper(*args, **kwargs_inner):
return func_op(target, *args, **kwargs_inner, **kwargs)
return _wrapper
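`nodeop` supports both the bare `@nodeop` form and the parameterized `@nodeop(**kwargs)` form through the `functools.partial` trick: when called with keyword arguments only, `target` is `None` and a pre-loaded decorator is returned. A self-contained sketch of the same idiom (the `traced` decorator and its `tag` option are hypothetical):

```python
import functools

def traced(target=None, **options):
    # Called as @traced(tag=...): target is None, so return a decorator
    # with the options pre-bound, exactly like nodeop above.
    if target is None:
        return functools.partial(traced, **options)
    # Called as a bare @traced: wrap the function directly.
    @functools.wraps(target)
    def wrapper(*args, **kwargs):
        wrapper.calls.append(options.get("tag", target.__name__))
        return target(*args, **kwargs)
    wrapper.calls = []
    return wrapper

@traced
def double(x):
    return 2 * x

@traced(tag="tripler")
def triple(x):
    return 3 * x

print(double(5), triple(5))   # -> 10 15
```

`functools.wraps` preserves the wrapped function's name and docstring, which is why `double.__name__` is still `"double"` after decoration.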
@nodeop
def call(func, *args, **kwargs):
"""
Call `func` with positional arguments `args` and keyword arguments `kwargs`.
Parameters
----------
func : callable
Function to call when the node is executed.
args : list
Sequence of positional arguments passed to `func`.
kwargs : dict
Mapping of keyword arguments passed to `func`.
"""
return func(*args, **kwargs)
@contextlib.contextmanager
def control_dependencies(dependencies, graph=None):
"""
Ensure that all `dependencies` are executed before any nodes in this scope.
Parameters
----------
dependencies : list
Sequence of nodes to be evaluated before evaluating any nodes defined in this
scope.
"""
# Add dependencies to the graph
graph = Node.get_active_graph(graph)
graph.dependencies.extend(dependencies)
yield
# Remove dependencies from the graph
del graph.dependencies[-len(dependencies):]
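`control_dependencies` follows the push-on-enter, pop-on-exit context-manager pattern: extend a shared list before the block runs, then delete exactly the entries that were added. A minimal sketch of the same pattern (names are illustrative; unlike the version above, this sketch also unwinds on exceptions via `try`/`finally`):

```python
import contextlib

active_deps = []

@contextlib.contextmanager
def dependencies(deps):
    # Push the dependencies for the duration of the block, then strip
    # exactly the entries we added, mirroring control_dependencies.
    active_deps.extend(deps)
    try:
        yield
    finally:
        del active_deps[-len(deps):]

with dependencies(["a", "b"]):
    inside = list(active_deps)   # ['a', 'b'] while the block is active
after = list(active_deps)        # [] once the block exits
```

Note the empty-list edge case: `del lst[-0:]` clears the whole list, so callers should pass a non-empty sequence (the same caveat applies to the original).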
#pylint: disable=C0103
abs_ = nodeop(builtins.abs)
dict_ = nodeop(builtins.dict)
help_ = nodeop(builtins.help)
min_ = nodeop(builtins.min)
setattr_ = nodeop(builtins.setattr)
all_ = nodeop(builtins.all)
dir_ = nodeop(builtins.dir)
hex_ = nodeop(builtins.hex)
next_ = nodeop(builtins.next)
slice_ = nodeop(builtins.slice)
any_ = nodeop(builtins.any)
divmod_ = nodeop(builtins.divmod)
id_ = nodeop(builtins.id)
object_ = nodeop(builtins.object)
sorted_ = nodeop(builtins.sorted)
ascii_ = nodeop(builtins.ascii)
enumerate_ = nodeop(builtins.enumerate)
input_ = nodeop(builtins.input)
oct_ = nodeop(builtins.oct)
staticmethod_ = nodeop(builtins.staticmethod)
bin_ = nodeop(builtins.bin)
eval_ = nodeop(builtins.eval)
int_ = nodeop(builtins.int)
open_ = nodeop(builtins.open)
str_ = nodeop(builtins.str)
bool_ = nodeop(builtins.bool)
exec_ = nodeop(builtins.exec)
isinstance_ = nodeop(builtins.isinstance)
ord_ = nodeop(builtins.ord)
sum_ = nodeop(builtins.sum)
bytearray_ = nodeop(builtins.bytearray)
filter_ = nodeop(builtins.filter)
issubclass_ = nodeop(builtins.issubclass)
pow_ = nodeop(builtins.pow)
super_ = nodeop(builtins.super)
bytes_ = nodeop(builtins.bytes)
float_ = nodeop(builtins.float)
iter_ = nodeop(builtins.iter)
print_ = nodeop(builtins.print)
tuple_ = nodeop(builtins.tuple)
callable_ = nodeop(builtins.callable)
format_ = nodeop(builtins.format)
len_ = nodeop(builtins.len)
property_ = nodeop(builtins.property)
type_ = nodeop(builtins.type)
chr_ = nodeop(builtins.chr)
frozenset_ = nodeop(builtins.frozenset)
list_ = nodeop(builtins.list)
range_ = nodeop(builtins.range)
vars_ = nodeop(builtins.vars)
classmethod_ = nodeop(builtins.classmethod)
getattr_ = nodeop(builtins.getattr)
locals_ = nodeop(builtins.locals)
repr_ = nodeop(builtins.repr)
zip_ = nodeop(builtins.zip)
compile_ = nodeop(builtins.compile)
globals_ = nodeop(builtins.globals)
map_ = nodeop(builtins.map)
reversed_ = nodeop(builtins.reversed)
complex_ = nodeop(builtins.complex)
hasattr_ = nodeop(builtins.hasattr)
max_ = nodeop(builtins.max)
round_ = nodeop(builtins.round)
delattr_ = nodeop(builtins.delattr)
hash_ = nodeop(builtins.hash)
memoryview_ = nodeop(builtins.memoryview)
set_ = nodeop(builtins.set)
add = nodeop(operator.add)
and_ = nodeop(operator.and_)
attrgetter = nodeop(operator.attrgetter)
concat = nodeop(operator.concat)
contains = nodeop(operator.contains)
countOf = nodeop(operator.countOf)
delitem = nodeop(operator.delitem)
eq = nodeop(operator.eq)
floordiv = nodeop(operator.floordiv)
ge = nodeop(operator.ge)
getitem = nodeop(operator.getitem)
gt = nodeop(operator.gt)
index = nodeop(operator.index)
indexOf = nodeop(operator.indexOf)
inv = nodeop(operator.inv)
invert = nodeop(operator.invert)
ior = nodeop(operator.ior)
ipow = nodeop(operator.ipow)
irshift = nodeop(operator.irshift)
is_ = nodeop(operator.is_)
is_not = nodeop(operator.is_not)
itemgetter = nodeop(operator.itemgetter)
le = nodeop(operator.le)
length_hint = nodeop(operator.length_hint)
lshift = nodeop(operator.lshift)
lt = nodeop(operator.lt)
matmul = nodeop(operator.matmul)
methodcaller = nodeop(operator.methodcaller)
mod = nodeop(operator.mod)
mul = nodeop(operator.mul)
ne = nodeop(operator.ne)
neg = nodeop(operator.neg)
not_ = nodeop(operator.not_)
or_ = nodeop(operator.or_)
pos = nodeop(operator.pos)
rshift = nodeop(operator.rshift)
setitem = nodeop(operator.setitem)
sub = nodeop(operator.sub)
truediv = nodeop(operator.truediv)
truth = nodeop(operator.truth)
xor = nodeop(operator.xor)
import_ = nodeop(importlib.import_module)
# ---- File: BaseTools/Source/Python/GenFds/CapsuleData.py (repo: James992927108/uEFI_Edk2_Practice, license: BSD-2-Clause) ----
## @file
# generate capsule
#
# Copyright (c) 2007-2017, Intel Corporation. All rights reserved.<BR>
#
# This program and the accompanying materials
# are licensed and made available under the terms and conditions of the BSD License
# which accompanies this distribution. The full text of the license may be found at
# http://opensource.org/licenses/bsd-license.php
#
# THE PROGRAM IS DISTRIBUTED UNDER THE BSD LICENSE ON AN "AS IS" BASIS,
# WITHOUT WARRANTIES OR REPRESENTATIONS OF ANY KIND, EITHER EXPRESS OR IMPLIED.
#
##
# Import Modules
#
import Ffs
from GenFdsGlobalVariable import GenFdsGlobalVariable
import StringIO
from struct import pack
import os
from Common.Misc import SaveFileOnChange
import uuid
## base class for capsule data
#
#
class CapsuleData:
## The constructor
#
# @param self The object pointer
def __init__(self):
pass
## generate capsule data
#
# @param self The object pointer
def GenCapsuleSubItem(self):
pass
## FFS class for capsule data
#
#
class CapsuleFfs (CapsuleData):
## The constructor
#
# @param self The object pointer
#
def __init__(self) :
self.Ffs = None
self.FvName = None
## generate FFS capsule data
#
# @param self The object pointer
# @retval string Generated file name
#
def GenCapsuleSubItem(self):
FfsFile = self.Ffs.GenFfs()
return FfsFile
## FV class for capsule data
#
#
class CapsuleFv (CapsuleData):
## The constructor
#
# @param self The object pointer
#
def __init__(self) :
self.Ffs = None
self.FvName = None
self.CapsuleName = None
## generate FV capsule data
#
# @param self The object pointer
# @retval string Generated file name
#
def GenCapsuleSubItem(self):
if self.FvName.find('.fv') == -1:
if self.FvName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FvDict.keys():
FvObj = GenFdsGlobalVariable.FdfParser.Profile.FvDict.get(self.FvName.upper())
FdBuffer = StringIO.StringIO('')
FvObj.CapsuleName = self.CapsuleName
FvFile = FvObj.AddToBuffer(FdBuffer)
FvObj.CapsuleName = None
FdBuffer.close()
return FvFile
else:
FvFile = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FvName)
return FvFile
## FD class for capsule data
#
#
class CapsuleFd (CapsuleData):
## The constructor
#
# @param self The object pointer
#
def __init__(self) :
self.Ffs = None
self.FdName = None
self.CapsuleName = None
## generate FD capsule data
#
# @param self The object pointer
# @retval string Generated file name
#
def GenCapsuleSubItem(self):
if self.FdName.find('.fd') == -1:
if self.FdName.upper() in GenFdsGlobalVariable.FdfParser.Profile.FdDict.keys():
FdObj = GenFdsGlobalVariable.FdfParser.Profile.FdDict.get(self.FdName.upper())
FdFile = FdObj.GenFd()
return FdFile
else:
FdFile = GenFdsGlobalVariable.ReplaceWorkspaceMacro(self.FdName)
return FdFile
## AnyFile class for capsule data
#
#
class CapsuleAnyFile (CapsuleData):
## The constructor
#
# @param self The object pointer
#
def __init__(self) :
self.Ffs = None
self.FileName = None
## generate AnyFile capsule data
#
# @param self The object pointer
# @retval string Generated file name
#
def GenCapsuleSubItem(self):
return self.FileName
## Afile class for capsule data
#
#
class CapsuleAfile (CapsuleData):
## The constructor
#
# @param self The object pointer
#
def __init__(self) :
self.Ffs = None
self.FileName = None
## generate Afile capsule data
#
# @param self The object pointer
# @retval string Generated file name
#
def GenCapsuleSubItem(self):
return self.FileName
class CapsulePayload(CapsuleData):
'''Generate payload file, the header is defined below:
#pragma pack(1)
typedef struct {
UINT32 Version;
EFI_GUID UpdateImageTypeId;
UINT8 UpdateImageIndex;
UINT8 reserved_bytes[3];
UINT32 UpdateImageSize;
UINT32 UpdateVendorCodeSize;
UINT64 UpdateHardwareInstance; //Introduced in v2
} EFI_FIRMWARE_MANAGEMENT_CAPSULE_IMAGE_HEADER;
'''
def __init__(self):
self.UiName = None
self.Version = None
self.ImageTypeId = None
self.ImageIndex = None
self.HardwareInstance = None
self.ImageFile = []
self.VendorCodeFile = []
self.Certificate_Guid = None
self.MonotonicCount = None
self.Existed = False
self.Buffer = None
def GenCapsuleSubItem(self, AuthData=[]):
if not self.Version:
self.Version = '0x00000002'
if not self.ImageIndex:
self.ImageIndex = '0x1'
if not self.HardwareInstance:
self.HardwareInstance = '0x0'
ImageFileSize = os.path.getsize(self.ImageFile)
if AuthData:
# ImageFileSize needs to include the full authentication info size, from the
# first byte of MonotonicCount to the last byte of the certificate.
# The extra 32 bytes cover MonotonicCount, dwLength, wRevision, wCertificateType and CertType.
ImageFileSize += 32
VendorFileSize = 0
if self.VendorCodeFile:
VendorFileSize = os.path.getsize(self.VendorCodeFile)
#
# Fill structure
#
Guid = self.ImageTypeId.split('-')
Buffer = pack('=ILHHBBBBBBBBBBBBIIQ',
int(self.Version,16),
int(Guid[0], 16),
int(Guid[1], 16),
int(Guid[2], 16),
int(Guid[3][-4:-2], 16),
int(Guid[3][-2:], 16),
int(Guid[4][-12:-10], 16),
int(Guid[4][-10:-8], 16),
int(Guid[4][-8:-6], 16),
int(Guid[4][-6:-4], 16),
int(Guid[4][-4:-2], 16),
int(Guid[4][-2:], 16),
int(self.ImageIndex, 16),
0,
0,
0,
ImageFileSize,
VendorFileSize,
int(self.HardwareInstance, 16)
)
if AuthData:
Buffer += pack('QIHH', AuthData[0], AuthData[1], AuthData[2], AuthData[3])
Buffer += uuid.UUID(AuthData[4]).bytes_le
#
# Append file content to the structure
#
ImageFile = open(self.ImageFile, 'rb')
Buffer += ImageFile.read()
ImageFile.close()
if self.VendorCodeFile:
VendorFile = open(self.VendorCodeFile, 'rb')
Buffer += VendorFile.read()
VendorFile.close()
self.Existed = True
return Buffer
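The header pack above splits the textual GUID into fields and emits the first three little-endian, then the last eight bytes verbatim. That is the same mixed-endian layout the standard `uuid` module exposes as `bytes_le`, so a hand-rolled pack can be cross-checked against it (the GUID value here is just an example):

```python
import struct
import uuid

guid_text = "12345678-9abc-def0-1122-334455667788"   # example value
parts = guid_text.split('-')
# '<IHH8B': time_low, time_mid, time_hi little-endian, then the final
# eight bytes (clock_seq_hi, clock_seq_low, node) in textual order.
packed = struct.pack('<IHH8B',
                     int(parts[0], 16),
                     int(parts[1], 16),
                     int(parts[2], 16),
                     int(parts[3][0:2], 16), int(parts[3][2:4], 16),
                     *[int(parts[4][i:i + 2], 16) for i in range(0, 12, 2)])
# uuid's bytes_le attribute uses the same EFI_GUID wire layout.
assert packed == uuid.UUID(guid_text).bytes_le
```

This is why the fix above uses `uuid.UUID(...).bytes_le` for the certificate-type GUID: it produces exactly the 16-byte layout the capsule header expects.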
# ---- File: esque_wire/protocol/structs/api/elect_preferred_leaders_response.py (repo: real-digital/esque-wire, license: MIT) ----
from typing import ClassVar, List, Optional
from ...constants import ApiKey, ErrorCode
from ..base import ResponseData
class PartitionResult:
partition_id: int
error_code: ErrorCode
error_message: Optional[str]
def __init__(self, partition_id: int, error_code: ErrorCode, error_message: Optional[str]):
"""
:param partition_id: The partition id
:type partition_id: int
:param error_code: The result error, or zero if there was no error.
:type error_code: ErrorCode
:param error_message: The result message, or null if there was no error.
:type error_message: Optional[str]
"""
self.partition_id = partition_id
self.error_code = error_code
self.error_message = error_message
class ReplicaElectionResult:
topic: str
partition_result: List[PartitionResult]
def __init__(self, topic: str, partition_result: List[PartitionResult]):
"""
:param topic: The topic name
:type topic: str
:param partition_result: The results for each partition
:type partition_result: List[PartitionResult]
"""
self.topic = topic
self.partition_result = partition_result
class ElectPreferredLeadersResponseData(ResponseData):
throttle_time_ms: int
replica_election_results: List[ReplicaElectionResult]
api_key: ClassVar[ApiKey] = ApiKey.ELECT_PREFERRED_LEADERS
def __init__(self, throttle_time_ms: int, replica_election_results: List[ReplicaElectionResult]):
"""
:param throttle_time_ms: The duration in milliseconds for which the request was throttled due to a quota
violation, or zero if the request did not violate any quota.
:type throttle_time_ms: int
:param replica_election_results: The election results, or an empty array if the requester did not have
permission and the request asks for all partitions.
:type replica_election_results: List[ReplicaElectionResult]
"""
self.throttle_time_ms = throttle_time_ms
self.replica_election_results = replica_election_results
| 36.55 | 112 | 0.690378 | 257 | 2,193 | 5.645914 | 0.29572 | 0.053067 | 0.057891 | 0.047553 | 0.290145 | 0.257753 | 0.199862 | 0.164025 | 0.164025 | 0.07581 | 0 | 0 | 0.248518 | 2,193 | 59 | 113 | 37.169492 | 0.880461 | 0.401277 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.125 | false | 0 | 0.125 | 0 | 0.708333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
13bc25bc6434cc017d92bbc47c055999ff8c038c | 3,181 | py | Python | tests/stack_test.py | arthurlogilab/py_zipkin | 8e733506c399967ea74c56b99a9a421e1bb1736a | [
"Apache-2.0"
] | 225 | 2016-09-16T17:57:51.000Z | 2022-02-12T22:15:32.000Z | tests/stack_test.py | arthurlogilab/py_zipkin | 8e733506c399967ea74c56b99a9a421e1bb1736a | [
"Apache-2.0"
] | 156 | 2016-09-17T03:50:04.000Z | 2021-03-17T23:19:40.000Z | tests/stack_test.py | arthurlogilab/py_zipkin | 8e733506c399967ea74c56b99a9a421e1bb1736a | [
"Apache-2.0"
] | 53 | 2016-09-20T18:34:08.000Z | 2021-08-31T06:14:03.000Z | import mock
import pytest
import py_zipkin.storage
@pytest.fixture(autouse=True, scope="module")
def create_zipkin_attrs():
# The following tests all expect _thread_local.zipkin_attrs to exist: if it
# doesn't, mock.patch will fail.
py_zipkin.storage.ThreadLocalStack().get()
def test_get_zipkin_attrs_returns_none_if_no_zipkin_attrs():
tracer = py_zipkin.storage.get_default_tracer()
with mock.patch.object(tracer._context_stack, "_storage", []):
assert not py_zipkin.storage.ThreadLocalStack().get()
assert not py_zipkin.storage.ThreadLocalStack().get()
def test_get_zipkin_attrs_with_context_returns_none_if_no_zipkin_attrs():
with mock.patch.object(py_zipkin.storage.log, "warning", autospec=True) as log:
assert not py_zipkin.storage.Stack([]).get()
assert log.call_count == 1
def test_storage_stack_still_works_if_you_dont_pass_in_storage():
# Let's make sure this still works if we don't pass in a custom storage.
assert not py_zipkin.storage.Stack().get()
def test_get_zipkin_attrs_returns_the_last_of_the_list():
tracer = py_zipkin.storage.get_default_tracer()
with mock.patch.object(tracer._context_stack, "_storage", ["foo"]):
assert "foo" == py_zipkin.storage.ThreadLocalStack().get()
def test_get_zipkin_attrs_with_context_returns_the_last_of_the_list():
assert "foo" == py_zipkin.storage.Stack(["bar", "foo"]).get()
def test_pop_zipkin_attrs_does_nothing_if_no_requests():
tracer = py_zipkin.storage.get_default_tracer()
with mock.patch.object(tracer._context_stack, "_storage", []):
assert not py_zipkin.storage.ThreadLocalStack().pop()
def test_pop_zipkin_attrs_with_context_does_nothing_if_no_requests():
assert not py_zipkin.storage.Stack([]).pop()
def test_pop_zipkin_attrs_removes_the_last_zipkin_attrs():
tracer = py_zipkin.storage.get_default_tracer()
with mock.patch.object(tracer._context_stack, "_storage", ["foo", "bar"]):
assert "bar" == py_zipkin.storage.ThreadLocalStack().pop()
assert "foo" == py_zipkin.storage.ThreadLocalStack().get()
def test_pop_zipkin_attrs_with_context_removes_the_last_zipkin_attrs():
context_stack = py_zipkin.storage.Stack(["foo", "bar"])
assert "bar" == context_stack.pop()
assert "foo" == context_stack.get()
def test_push_zipkin_attrs_adds_new_zipkin_attrs_to_list():
tracer = py_zipkin.storage.get_default_tracer()
with mock.patch.object(tracer._context_stack, "_storage", ["foo"]):
assert "foo" == py_zipkin.storage.ThreadLocalStack().get()
py_zipkin.storage.ThreadLocalStack().push("bar")
assert "bar" == py_zipkin.storage.ThreadLocalStack().get()
def test_push_zipkin_attrs_with_context_adds_new_zipkin_attrs_to_list():
stack = py_zipkin.storage.Stack(["foo"])
assert "foo" == stack.get()
stack.push("bar")
assert "bar" == stack.get()
def test_stack_copy():
stack = py_zipkin.storage.Stack()
stack.push("a")
stack.push("b")
the_copy = stack.copy()
the_copy.push("c")
stack.push("d")
assert ["a", "b", "c"] == the_copy._storage
assert ["a", "b", "d"] == stack._storage
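`test_stack_copy` pins down the key semantics: a copy gets its own backing list, so pushes to the copy never leak into the original. A minimal list-backed stand-in for the stack under test (hypothetical, not `py_zipkin.storage.Stack` itself) makes the contract explicit:

```python
class MiniStack:
    """Minimal list-backed stack mirroring the behaviour the tests
    above expect (illustrative stand-in, not the real Stack)."""
    def __init__(self, storage=None):
        self._storage = storage if storage is not None else []

    def push(self, item):
        self._storage.append(item)

    def pop(self):
        return self._storage.pop() if self._storage else None

    def get(self):
        return self._storage[-1] if self._storage else None

    def copy(self):
        # Shallow-copy the backing list: pushes to the copy must not
        # affect the original, exactly as test_stack_copy asserts.
        return MiniStack(list(self._storage))

stack = MiniStack()
stack.push("a")
stack.push("b")
clone = stack.copy()
clone.push("c")
stack.push("d")
assert clone._storage == ["a", "b", "c"]
assert stack._storage == ["a", "b", "d"]
```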
| 34.576087 | 83 | 0.727759 | 449 | 3,181 | 4.775056 | 0.184855 | 0.089552 | 0.16791 | 0.14459 | 0.704291 | 0.640392 | 0.508862 | 0.385728 | 0.385728 | 0.362407 | 0 | 0.000366 | 0.142094 | 3,181 | 91 | 84 | 34.956044 | 0.785269 | 0.055014 | 0 | 0.241379 | 0 | 0 | 0.041972 | 0 | 0 | 0 | 0 | 0 | 0.327586 | 1 | 0.224138 | false | 0.017241 | 0.051724 | 0 | 0.275862 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13c4fe2bf0cd10d5be8344221103967c7cea77fd | 12,883 | py | Python | windows/winobject/network.py | marpie/PythonForWindows | b253bc5873e7d97087ed22f2753b51fc6880ec18 | [
"BSD-3-Clause"
] | 1 | 2018-11-15T11:15:56.000Z | 2018-11-15T11:15:56.000Z | windows/winobject/network.py | killvxk/PythonForWindows | b253bc5873e7d97087ed22f2753b51fc6880ec18 | [
"BSD-3-Clause"
] | null | null | null | windows/winobject/network.py | killvxk/PythonForWindows | b253bc5873e7d97087ed22f2753b51fc6880ec18 | [
"BSD-3-Clause"
] | 1 | 2020-12-25T12:59:10.000Z | 2020-12-25T12:59:10.000Z | import windows
import ctypes
import socket
import struct
from windows import winproxy
import windows.generated_def as gdef
from windows.com import interfaces as cominterfaces
from windows.generated_def.winstructs import *
from windows.generated_def.windef import *
class TCP4Connection(MIB_TCPROW_OWNER_PID):
"""A TCP4 socket (connected or listening)"""
@property
def established(self):
"""``True`` if connection is established else it's a listening socket"""
return self.dwState == MIB_TCP_STATE_ESTAB
@property
def remote_port(self):
""":type: :class:`int`"""
if not self.established:
return None
return socket.ntohs(self.dwRemotePort)
@property
def local_port(self):
""":type: :class:`int`"""
return socket.ntohs(self.dwLocalPort)
@property
def local_addr(self):
"""Local address IP (x.x.x.x)
:type: :class:`str`"""
return socket.inet_ntoa(struct.pack("<I", self.dwLocalAddr))
@property
def remote_addr(self):
"""remote address IP (x.x.x.x)
:type: :class:`str`"""
if not self.established:
return None
return socket.inet_ntoa(struct.pack("<I", self.dwRemoteAddr))
@property
def remote_proto(self):
"""Identification of the protocol associated with the remote port.
Equals ``remote_port`` if no protocol is associated with it.
:type: :class:`str` or :class:`int`
"""
try:
return socket.getservbyport(self.remote_port, 'tcp')
except socket.error:
return self.remote_port
@property
def remote_host(self):
"""Identification of the remote hostname.
Equals ``remote_addr`` if the resolution fails
:type: :class:`str` or :class:`int`
"""
try:
return socket.gethostbyaddr(self.remote_addr)
except socket.error:
return self.remote_addr
def close(self):
"""Close the connection <require elevated process>"""
closing = MIB_TCPROW()
closing.dwState = MIB_TCP_STATE_DELETE_TCB
closing.dwLocalAddr = self.dwLocalAddr
closing.dwLocalPort = self.dwLocalPort
closing.dwRemoteAddr = self.dwRemoteAddr
closing.dwRemotePort = self.dwRemotePort
return winproxy.SetTcpEntry(ctypes.byref(closing))
def __repr__(self):
if not self.established:
return "<TCP IPV4 Listening socket on {0}:{1}>".format(self.local_addr, self.local_port)
return "<TCP IPV4 Connection {s.local_addr}:{s.local_port} -> {s.remote_addr}:{s.remote_port}>".format(s=self)
class TCP6Connection(MIB_TCP6ROW_OWNER_PID):
"""A TCP6 socket (connected or listening)"""
@staticmethod
def _str_ipv6_addr(addr):
return ":".join(c.encode('hex') for c in addr)
@property
def established(self):
"""``True`` if connection is established else it's a listening socket"""
return self.dwState == MIB_TCP_STATE_ESTAB
@property
def remote_port(self):
""":type: :class:`int`"""
if not self.established:
return None
return socket.ntohs(self.dwRemotePort)
@property
def local_port(self):
""":type: :class:`int`"""
return socket.ntohs(self.dwLocalPort)
@property
def local_addr(self):
"""Local address IP
:type: :class:`str`"""
return self._str_ipv6_addr(self.ucLocalAddr)
@property
def remote_addr(self):
"""remote address IP
:type: :class:`str`"""
if not self.established:
return None
return self._str_ipv6_addr(self.ucRemoteAddr)
@property
def remote_proto(self):
"""Equals to ``self.remote_port`` for Ipv6"""
return self.remote_port
@property
def remote_host(self):
"""Equals to ``self.remote_addr`` for Ipv6"""
return self.remote_addr
def close(self):
raise NotImplementedError("Closing an IPV6 connection is not implemented")
def __repr__(self):
if not self.established:
return "<TCP IPV6 Listening socket on {0}:{1}>".format(self.local_addr, self.local_port)
return "<TCP IPV6 Connection {0}:{1} -> {2}:{3}>".format(self.local_addr, self.local_port, self.remote_addr, self.remote_port)
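`_str_ipv6_addr` relies on Python 2's `str.encode('hex')`, which no longer exists on `bytes` in Python 3. A version-independent sketch of the same per-byte hex join (the function name is illustrative) uses `binascii.hexlify` instead:

```python
import binascii

def str_ipv6_addr(addr_bytes):
    # binascii.hexlify works on both Python 2 and 3; regroup the hex
    # string per byte to mirror the "aa:bb:..." form produced above.
    hexed = binascii.hexlify(bytes(bytearray(addr_bytes))).decode("ascii")
    return ":".join(hexed[i:i + 2] for i in range(0, len(hexed), 2))

print(str_ipv6_addr(b"\x20\x01\x0d\xb8"))   # -> 20:01:0d:b8
```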
def get_MIB_TCPTABLE_OWNER_PID_from_buffer(buffer):
x = windows.generated_def.winstructs.MIB_TCPTABLE_OWNER_PID.from_buffer(buffer)
nb_entry = x.dwNumEntries
class _GENERATED_MIB_TCPTABLE_OWNER_PID(ctypes.Structure):
_fields_ = [
("dwNumEntries", DWORD),
("table", TCP4Connection * nb_entry),
]
return _GENERATED_MIB_TCPTABLE_OWNER_PID.from_buffer(buffer)
def get_MIB_TCP6TABLE_OWNER_PID_from_buffer(buffer):
x = windows.generated_def.winstructs.MIB_TCP6TABLE_OWNER_PID.from_buffer(buffer)
nb_entry = x.dwNumEntries
# Struct _MIB_TCP6TABLE_OWNER_PID definitions
class _GENERATED_MIB_TCP6TABLE_OWNER_PID(Structure):
_fields_ = [
("dwNumEntries", DWORD),
("table", TCP6Connection * nb_entry),
]
return _GENERATED_MIB_TCP6TABLE_OWNER_PID.from_buffer(buffer)
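Both helpers above use the standard ctypes recipe for variable-length Win32 tables: read `dwNumEntries` from the front of the buffer, generate a structure type whose trailing array has exactly that many elements, then map the new type over the same buffer. A self-contained sketch with a toy record type (names are illustrative):

```python
import ctypes
import struct

class Entry(ctypes.Structure):
    _fields_ = [("value", ctypes.c_uint32)]

def table_from_buffer(buffer):
    # Read only the leading count, then build a structure whose trailing
    # array matches it and remap the whole buffer through that type.
    count = ctypes.c_uint32.from_buffer(buffer).value
    class Table(ctypes.Structure):
        _fields_ = [("dwNumEntries", ctypes.c_uint32),
                    ("table", Entry * count)]
    return Table.from_buffer(buffer)

# A fake 3-entry table: count followed by the entries.
raw = bytearray(struct.pack("<IIII", 3, 10, 20, 30))
tbl = table_from_buffer(raw)
print([e.value for e in tbl.table])   # -> [10, 20, 30]
```

`from_buffer` requires a writable buffer (hence `bytearray`), and the returned structure aliases it rather than copying, which is what makes this pattern cheap for large TCP tables.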
class Firewall(cominterfaces.INetFwPolicy2):
"""The windows firewall"""
@property
def rules(self):
"""The rules of the firewall
:type: [:class:`FirewallRule`] -- A list of rules
"""
ifw_rules = cominterfaces.INetFwRules()
self.get_Rules(ifw_rules)
nb_rules = gdef.LONG()
ifw_rules.get_Count(nb_rules)
unknw = cominterfaces.IUnknown()
ifw_rules.get__NewEnum(unknw)
pVariant = cominterfaces.IEnumVARIANT()
unknw.QueryInterface(pVariant.IID, pVariant)
count = gdef.ULONG()
var = windows.com.ImprovedVariant()
rules = []
for i in range(nb_rules.value):
pVariant.Next(1, var, count)
if not count.value:
break
rule = FirewallRule()
idisp = var.asdispatch
idisp.QueryInterface(rule.IID, rule)
rules.append(rule)
return rules
@property
def current_profile_types(self):
"""Mask of the profiles currently enabled
:type: :class:`long`
"""
cpt = gdef.LONG()
self.get_CurrentProfileTypes(cpt)
return cpt.value
@property
def enabled(self):
"""A maping of the active firewall profiles
{
``NET_FW_PROFILE_TYPE2_.NET_FW_PROFILE2_DOMAIN(0x1L)``: ``True`` or ``False``,
``NET_FW_PROFILE_TYPE2_.NET_FW_PROFILE2_PRIVATE(0x2L)``: ``True`` or ``False``,
``NET_FW_PROFILE_TYPE2_.NET_FW_PROFILE2_PUBLIC(0x4L)``: ``True`` or ``False``,
}
:type: :class:`dict`
"""
profiles = [gdef.NET_FW_PROFILE2_DOMAIN, gdef.NET_FW_PROFILE2_PRIVATE, gdef.NET_FW_PROFILE2_PUBLIC]
return {prof: self.enabled_for_profile_type(prof) for prof in profiles}
def enabled_for_profile_type(self, profile_type):
enabled = gdef.VARIANT_BOOL()
self.get_FirewallEnabled(profile_type, enabled)
return enabled.value
class FirewallRule(cominterfaces.INetFwRule):
"""A rule of the firewall"""
@property
def name(self):
"""Name of the rule
:type: :class:`unicode`
"""
name = gdef.BSTR()
self.get_Name(name)
return name.value
@property
def description(self):
"""Description of the rule
:type: :class:`unicode`
"""
description = gdef.BSTR()
self.get_Description(description)
return description.value
@property
def application_name(self):
"""Name of the application to which apply the rule
:type: :class:`unicode`
"""
applicationname = gdef.BSTR()
self.get_ApplicationName(applicationname)
return applicationname.value
@property
def service_name(self):
"""Name of the service to which apply the rule
:type: :class:`unicode`
"""
servicename = gdef.BSTR()
self.get_ServiceName(servicename)
return servicename.value
@property
def protocol(self):
"""Protocol to which apply the rule
:type: :class:`long`
"""
protocol = gdef.LONG()
self.get_Protocol(protocol)
return protocol.value
@property
def local_address(self):
"""Local address of the rule
:type: :class:`unicode`
"""
local_address = gdef.BSTR()
self.get_LocalAddresses(local_address)
return local_address.value
@property
def remote_address(self):
"""Remote address of the rule
:type: :class:`unicode`
"""
remote_address = gdef.BSTR()
self.get_RemoteAddresses(remote_address)
return remote_address.value
@property
def direction(self):
"""Direction of the rule, values might be:
* ``NET_FW_RULE_DIRECTION_.NET_FW_RULE_DIR_IN(0x1L)``
* ``NET_FW_RULE_DIRECTION_.NET_FW_RULE_DIR_OUT(0x2L)``
subclass of :class:`long`
"""
direction = gdef.NET_FW_RULE_DIRECTION()
self.get_Direction(direction)
return direction.value
@property
def interface_types(self):
"""Types of interface of the rule
:type: :class:`unicode`
"""
interface_type = gdef.BSTR()
self.get_InterfaceTypes(interface_type)
return interface_type.value
@property
def local_port(self):
"""Local port of the rule
:type: :class:`unicode`
"""
local_port = gdef.BSTR()
self.get_LocalPorts(local_port)
return local_port.value
@property
def remote_port(self):
"""Remote port of the rule
:type: :class:`unicode`
"""
remote_port = gdef.BSTR()
self.get_RemotePorts(remote_port)
return remote_port.value
@property
def action(self):
"""Action of the rule, values might be:
* ``NET_FW_ACTION_.NET_FW_ACTION_BLOCK(0x0L)``
* ``NET_FW_ACTION_.NET_FW_ACTION_ALLOW(0x1L)``
subclass of :class:`long`
"""
action = gdef.NET_FW_ACTION()
self.get_Action(action)
return action.value
@property
def enabled(self):
"""``True`` if rule is enabled"""
enabled = gdef.VARIANT_BOOL()
self.get_Enabled(enabled)
return enabled.value
@property
def grouping(self):
"""Grouping of the rule
:type: :class:`unicode`
"""
grouping = gdef.BSTR()
self.get_RemotePorts(grouping)
return grouping.value
@property
def icmp_type_and_code(self):
icmp_type_and_code = gdef.BSTR()
self.get_RemotePorts(icmp_type_and_code)
return icmp_type_and_code.value
def __repr__(self):
return u'<{0} "{1}">'.format(type(self).__name__, self.name).encode("ascii", errors='backslashreplace')
class Network(object):
NetFwPolicy2 = windows.com.IID.from_string("E2B3C97F-6AE1-41AC-817A-F6F92166D7DD")
@property
def firewall(self):
"""The firewall of the system
:type: :class:`Firewall`
"""
windows.com.init()
firewall = Firewall()
windows.com.create_instance(self.NetFwPolicy2, firewall)
return firewall
@staticmethod
def _get_tcp_ipv4_sockets():
size = ctypes.c_uint(0)
try:
winproxy.GetExtendedTcpTable(None, ctypes.byref(size), ulAf=AF_INET)
except winproxy.IphlpapiError:
pass # Allow us to set size to the needed value
buffer = (ctypes.c_char * size.value)()
winproxy.GetExtendedTcpTable(buffer, ctypes.byref(size), ulAf=AF_INET)
t = get_MIB_TCPTABLE_OWNER_PID_from_buffer(buffer)
return list(t.table)
@staticmethod
def _get_tcp_ipv6_sockets():
size = ctypes.c_uint(0)
try:
winproxy.GetExtendedTcpTable(None, ctypes.byref(size), ulAf=AF_INET6)
except winproxy.IphlpapiError:
pass # Allow us to set size to the needed value
buffer = (ctypes.c_char * size.value)()
winproxy.GetExtendedTcpTable(buffer, ctypes.byref(size), ulAf=AF_INET6)
t = get_MIB_TCP6TABLE_OWNER_PID_from_buffer(buffer)
return list(t.table)
ipv4 = property(lambda self: self._get_tcp_ipv4_sockets())
"""List of TCP IPv4 socket (connection and listening)
:type: [:class:`TCP4Connection`]"""
ipv6 = property(lambda self: self._get_tcp_ipv6_sockets())
"""List of TCP IPv6 socket (connection and listening)
:type: [:class:`TCP6Connection`]
"""
| 28.756696 | 134 | 0.623613 | 1,492 | 12,883 | 5.175603 | 0.164879 | 0.047009 | 0.03108 | 0.022792 | 0.478244 | 0.402745 | 0.347967 | 0.29798 | 0.249806 | 0.192696 | 0 | 0.009764 | 0.268649 | 12,883 | 447 | 135 | 28.821029 | 0.809807 | 0.195607 | 0 | 0.416667 | 1 | 0.003968 | 0.037181 | 0.010188 | 0 | 0 | 0 | 0 | 0 | 1 | 0.174603 | false | 0.007937 | 0.035714 | 0.007937 | 0.460317 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13c5d0054209f9afb389d03f1764cab446c01a96 | 742 | py | Python | src/messages.py | Ewpratten/chat | 4cc8461e442b6530b7874f234b1a2261f3db8456 | [
"MIT"
] | null | null | null | src/messages.py | Ewpratten/chat | 4cc8461e442b6530b7874f234b1a2261f3db8456 | [
"MIT"
] | null | null | null | src/messages.py | Ewpratten/chat | 4cc8461e442b6530b7874f234b1a2261f3db8456 | [
"MIT"
] | null | null | null | greeting = """
--------------- BEGIN SESSION ---------------
You have connected to a chat server. Welcome!
:: About
Chat is a small piece of server software
written by Evan Pratten to allow people to
talk to each other from any computer as long
as it has an internet connection. (Even an
arduino!). Check out the project at:
https://github.com/Ewpratten/chat
:: Disclaimer
While chatting, keep in mind that, if there
is a rule or regulation about privacy, this
server does not follow it. All data is sent
to and from this server over a raw TCP socket
and data is temporarily stored in plaintext
while the server handles message broadcasting
Now that that's out of the way, happy chatting!
---------------------------------------------
""" | 32.26087 | 45 | 0.690027 | 114 | 742 | 4.491228 | 0.719298 | 0.011719 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.165768 | 742 | 23 | 46 | 32.26087 | 0.827141 | 0 | 0 | 0 | 0 | 0 | 0.975774 | 0.060565 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13c974d988a5a072e9adfbe93d6a9ef5022a8ab3 | 1,712 | py | Python | source/dump_query_results.py | CheyenneNS/metrics | cfeeac6d01d99679897a998b193d630ada169c61 | [
"MIT"
] | null | null | null | source/dump_query_results.py | CheyenneNS/metrics | cfeeac6d01d99679897a998b193d630ada169c61 | [
"MIT"
] | null | null | null | source/dump_query_results.py | CheyenneNS/metrics | cfeeac6d01d99679897a998b193d630ada169c61 | [
"MIT"
] | null | null | null | #!/usr/local/bin/python
import os
import mysql.connector as mysql
metrics_mysql_password = os.environ['METRICS_MYSQL_PWD']
sql_host = os.environ['SQL_HOST']
metrics = os.environ['QUERY_ON']
def dump_query_results():
    """
    A simple SQL table dump of a given query, so we can supply users with custom tables.
    Note that the SQL query itself and the column-headers line need to be changed together
    if you want to change the query/results; otherwise it is good to go.
    It can be called simply with the bin shell script.
    Read the README at the top level for an example.
    """
    # Connect to MySQL
    db_connection = mysql.connect(
        host=sql_host,  # "mysql1", "localhost"
        user="metrics",  # "root"
        passwd=metrics_mysql_password,
        database="metrics",  # "datacamp"
    )
    cursor = db_connection.cursor()
    query = "use " + metrics
    cursor.execute(query)
    # CHANGE QUERY HERE
    query = "select username, display_name, email, orcid, kb_internal_user, institution, country, signup_date, last_signin_date from user_info order by signup_date"
    # CHANGE COLUMN HEADERS HERE TO MATCH QUERY HEADERS
    print("username\tdisplay_name\temail\torcid\tkb_internal_user\tinstitution\tcountry\tsignup_date\tlast_signin_date")
    cursor.execute(query)
    for row_values in cursor:
        temp_string = ""
        for i in range(len(row_values) - 1):
            if row_values[i] is not None:
                temp_string += str(row_values[i])
            temp_string += "\t"
        if row_values[-1] is not None:
            temp_string += str(row_values[-1])
        print(temp_string)
    return 1


dump_query_results()
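The row-formatting loop above — tab-join the columns, printing NULL/`None` values as empty fields — can be sketched as a tiny standalone helper (`row_to_tsv` is an illustrative name, not part of the script):

```python
def row_to_tsv(row_values):
    # None (SQL NULL) becomes an empty field; everything else is stringified
    return "\t".join("" if v is None else str(v) for v in row_values)


line = row_to_tsv(("alice", None, "alice@example.org"))
```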
| 33.568627 | 164 | 0.675234 | 242 | 1,712 | 4.603306 | 0.512397 | 0.056553 | 0.02693 | 0.023339 | 0.055655 | 0.055655 | 0.055655 | 0.055655 | 0 | 0 | 0 | 0.003817 | 0.234813 | 1,712 | 50 | 165 | 34.24 | 0.846565 | 0.28271 | 0 | 0.066667 | 0 | 0.033333 | 0.261163 | 0.090143 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033333 | false | 0.066667 | 0.066667 | 0 | 0.133333 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
13cbb884947e5c5ee43f164c1fde11e81811776b | 4,399 | py | Python | osaka/storage/sftp.py | riverma/osaka | f9ed386936500303c629d7213d91215085bcf346 | [
"Apache-2.0"
] | 2 | 2018-05-08T03:13:49.000Z | 2022-02-09T08:48:06.000Z | osaka/storage/sftp.py | riverma/osaka | f9ed386936500303c629d7213d91215085bcf346 | [
"Apache-2.0"
] | 6 | 2019-02-06T19:12:09.000Z | 2022-02-08T04:29:49.000Z | osaka/storage/sftp.py | riverma/osaka | f9ed386936500303c629d7213d91215085bcf346 | [
"Apache-2.0"
] | 12 | 2018-04-08T12:58:29.000Z | 2022-03-31T18:35:53.000Z | from __future__ import print_function
from __future__ import unicode_literals
from __future__ import division
from __future__ import absolute_import
from builtins import int
from future import standard_library
standard_library.install_aliases()
import os
import os.path
import stat
import urllib.parse
import paramiko
import traceback
import osaka.utils
"""
A backend used to handle stfp using parimiko
@author starchmd
"""
class SFTP(object):
"""
SFTP handling for Osaka
"""
def __init__(self, params={}):
"""
Constructor
"""
self.keyfile = params["keyfile"] if "keyfile" in params else None
def connect(self, host=None, port=None, user=None, password=None, secure=False):
"""
Connect to this storage medium. All data is parsed out of the url and may be None
scheme:
@param host - may be None, host to connect to
implementor must handle defaulting
@param port - may be None, port to connect to
implementor must handle a None port
@param user - may be None, user to connect as
implementor must handle a None user
@param password - may be None, password to connect with
implementor must handle a None password
"""
self.client = paramiko.client.SSHClient()
self.client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
self.client.connect(
host,
port=22 if port is None else int(port),
username=user,
password=password,
key_filename=self.keyfile,
timeout=15,
)
self.sftp = self.client.open_sftp()
@classmethod
def getSchemes(clazz):
"""
Returns a list of schemes this handler handles
Note: handling the scheme of another handler produces unknown results
@returns list of handled schemes
"""
return ["sftp"]
def put(self, path, url):
"""
Put the given path to the given url
@param path - local path of file/folder to put
@param url - url to put file/folder to
"""
rpath = urllib.parse.urlparse(url).path.lstrip("/")
print("\n\n\n\nUploading:", path)
if not os.path.isdir(path):
print("As file")
try:
self.sftp.mkdir(os.path.dirname(rpath))
except IOError:
pass
dest = rpath
try:
if stat.S_ISDIR(self.sftp.stat(rpath).st_mode) != 0:
dest = os.path.join(rpath, os.path.basename(path))
except:
pass
return self.upload(path, dest)
print("As Dir")
try:
self.sftp.mkdir(rpath)
except IOError:
pass
for dirpath, dirname, filenames in os.walk(path):
extra = os.path.relpath(dirpath, os.path.dirname(path))
try:
self.sftp.mkdir(os.path.join(rpath, extra))
except IOError:
pass
for filename in filenames:
self.upload(
os.path.join(dirpath, filename),
os.path.join(rpath, extra, filename),
)
def upload(self, path, rpath):
"""
Uploads a file to remote path
@param path - path to upload
@param rpath - remote path to upload to
"""
self.sftp.put(path, rpath)
return True
def get(self, url, path):
"""
Get the url (file/folder) to local path
@param url - url to get file/folder from
@param path - path to place fetched files
"""
rpath = urllib.parse.urlparse(url).path
try:
self.sftp.get(rpath, path)
except Exception as e:
osaka.utils.LOGGER.warning(
"Encountered exception: {}\n{}".format(e, traceback.format_exc())
)
raise osaka.utils.OsakaFileNotFound("File {} doesn't exist.".format(url))
def rm(self, url):
"""
Remove the item
@param url - url to remove
"""
rpath = urllib.parse.urlparse(url).path
self.sftp.remove(rpath)
def close(self):
"""
Close this connection
"""
self.client.close()
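`SFTP.put` mirrors a local directory tree to the remote side with `os.walk` plus `os.path.relpath`; that path arithmetic can be exercised locally without an SSH server. A sketch under that assumption (`mirror_paths` is a made-up helper name):

```python
import os
import tempfile


def mirror_paths(path):
    """Collect (local file, remote-relative path) pairs the way SFTP.put walks a folder."""
    pairs = []
    for dirpath, _dirnames, filenames in os.walk(path):
        # Same expression put() uses to rebuild the tree under the remote root
        extra = os.path.relpath(dirpath, os.path.dirname(path))
        for filename in filenames:
            pairs.append((os.path.join(dirpath, filename),
                          os.path.join(extra, filename)))
    return pairs


with tempfile.TemporaryDirectory() as tmp:
    sub = os.path.join(tmp, "upload", "sub")
    os.makedirs(sub)
    open(os.path.join(sub, "a.txt"), "w").close()
    pairs = mirror_paths(os.path.join(tmp, "upload"))
```

Because `relpath` is taken against the *parent* of the uploaded folder, the folder's own name ("upload" here) is reproduced under the remote root.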
| 30.130137 | 90 | 0.562855 | 519 | 4,399 | 4.703276 | 0.310212 | 0.02458 | 0.032773 | 0.027038 | 0.119623 | 0.082343 | 0 | 0 | 0 | 0 | 0 | 0.001745 | 0.348716 | 4,399 | 145 | 91 | 30.337931 | 0.850262 | 0.246874 | 0 | 0.17284 | 0 | 0 | 0.034852 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.098765 | false | 0.074074 | 0.160494 | 0 | 0.308642 | 0.049383 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
13ce074cf333bc82bde9c49d1dbfefb77ad96d57 | 715 | py | Python | kindler/solver/optimizer.py | mingruimingrui/kindler | 8a9c2278b607a167b0ce827b218e54949a1120e7 | [
"MIT"
] | null | null | null | kindler/solver/optimizer.py | mingruimingrui/kindler | 8a9c2278b607a167b0ce827b218e54949a1120e7 | [
"MIT"
] | null | null | null | kindler/solver/optimizer.py | mingruimingrui/kindler | 8a9c2278b607a167b0ce827b218e54949a1120e7 | [
"MIT"
] | null | null | null | import torch
def make_sgd_optimizer(
    model,
    base_lr=0.001,
    bias_lr_factor=2.0,
    momentum=0.9,
    weight_decay=0.0005,
    weight_decay_bias=0.0,
):
    params = []
    for key, value in model.named_parameters():
        if not value.requires_grad:
            continue

        param_lr = base_lr
        param_weight_decay = weight_decay
        if "bias" in key:
            param_lr = base_lr * bias_lr_factor
            param_weight_decay = weight_decay_bias

        params.append({
            'params': [value],
            'lr': param_lr,
            'weight_decay': param_weight_decay,
        })

    optimizer = torch.optim.SGD(params, base_lr, momentum=momentum)
    return optimizer
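The bias/weight split implemented by `make_sgd_optimizer` is ordinary Python over `named_parameters()`, so the grouping rule can be checked without torch. A hedged sketch (`split_param_groups` is an illustrative name, and plain strings stand in for tensors):

```python
def split_param_groups(named_params, base_lr=0.001, bias_lr_factor=2.0,
                       weight_decay=0.0005, weight_decay_bias=0.0):
    groups = []
    for key, value in named_params:
        lr, wd = base_lr, weight_decay
        if "bias" in key:  # bias parameters get a boosted lr and (by default) no decay
            lr, wd = base_lr * bias_lr_factor, weight_decay_bias
        groups.append({"params": [value], "lr": lr, "weight_decay": wd})
    return groups


groups = split_param_groups([("conv.weight", "w"), ("conv.bias", "b")])
```

Each dict in `groups` matches the per-parameter options format `torch.optim.SGD` accepts in place of a flat parameter list.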
| 22.34375 | 67 | 0.601399 | 91 | 715 | 4.417582 | 0.384615 | 0.218905 | 0.119403 | 0.064677 | 0.134328 | 0 | 0 | 0 | 0 | 0 | 0 | 0.030426 | 0.31049 | 715 | 31 | 68 | 23.064516 | 0.78499 | 0 | 0 | 0 | 0 | 0 | 0.033566 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.04 | false | 0 | 0.04 | 0 | 0.12 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13d25057738843cced8f3d82852dabf41375fb9a | 754 | py | Python | redshift_upload/base_utilities.py | douglassimonsen/redshift_upload | e549c770538f022c0b90a983ca056f3e9c16c643 | [
"MIT"
] | null | null | null | redshift_upload/base_utilities.py | douglassimonsen/redshift_upload | e549c770538f022c0b90a983ca056f3e9c16c643 | [
"MIT"
] | 1 | 2022-03-12T03:50:55.000Z | 2022-03-12T03:50:55.000Z | redshift_upload/base_utilities.py | douglassimonsen/redshift_upload | e549c770538f022c0b90a983ca056f3e9c16c643 | [
"MIT"
] | null | null | null | import inspect
import os
from pathlib import Path
class change_directory:
    """
    A class for changing the working directory using a "with" statement.
    It takes the directory to change to as an argument. If no directory is given,
    it takes the directory of the file from which this function was called.
    """

    def __init__(self, directory: str = None) -> None:
        self.old_dir = os.getcwd()
        if directory is None:
            self.new_dir = Path(inspect.getabsfile(inspect.stack()[1][0])).parent  # type: ignore
        else:
            self.new_dir = directory

    def __enter__(self, *_) -> None:
        os.chdir(self.new_dir)

    def __exit__(self, *_) -> None:
        os.chdir(self.old_dir)
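A usage sketch of the with-statement pattern `change_directory` implements, reproduced here as a standalone context manager so it runs without the surrounding package (`pushd` is a made-up name):

```python
import contextlib
import os
import tempfile


@contextlib.contextmanager
def pushd(directory):
    # Same contract as change_directory: chdir on enter, restore on exit
    old_dir = os.getcwd()
    os.chdir(directory)
    try:
        yield
    finally:
        os.chdir(old_dir)


before = os.getcwd()
with tempfile.TemporaryDirectory() as tmp:
    with pushd(tmp):
        inside = os.getcwd()  # now inside the temporary directory
after = os.getcwd()           # restored, even if the body had raised
```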
| 30.16 | 98 | 0.624668 | 102 | 754 | 4.421569 | 0.529412 | 0.046563 | 0.066519 | 0.084257 | 0.084257 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003704 | 0.28382 | 754 | 24 | 99 | 31.416667 | 0.831481 | 0.307692 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.214286 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13d5b4c27424ce3f3958c70f0a0828815649d42f | 2,971 | py | Python | homeassistant/components/zha/core/channels/lighting.py | liangleslie/core | cc807b4d597daaaadc92df4a93c6e30da4f570c6 | [
"Apache-2.0"
] | 30,023 | 2016-04-13T10:17:53.000Z | 2020-03-02T12:56:31.000Z | homeassistant/components/zha/core/channels/lighting.py | liangleslie/core | cc807b4d597daaaadc92df4a93c6e30da4f570c6 | [
"Apache-2.0"
] | 24,710 | 2016-04-13T08:27:26.000Z | 2020-03-02T12:59:13.000Z | homeassistant/components/zha/core/channels/lighting.py | liangleslie/core | cc807b4d597daaaadc92df4a93c6e30da4f570c6 | [
"Apache-2.0"
] | 11,956 | 2016-04-13T18:42:31.000Z | 2020-03-02T09:32:12.000Z | """Lighting channels module for Zigbee Home Automation."""
from __future__ import annotations
from contextlib import suppress
from zigpy.zcl.clusters import lighting
from .. import registries
from ..const import REPORT_CONFIG_DEFAULT
from .base import ClientChannel, ZigbeeChannel
@registries.ZIGBEE_CHANNEL_REGISTRY.register(lighting.Ballast.cluster_id)
class Ballast(ZigbeeChannel):
    """Ballast channel."""


@registries.CLIENT_CHANNELS_REGISTRY.register(lighting.Color.cluster_id)
class ColorClientChannel(ClientChannel):
    """Color client channel."""


@registries.BINDABLE_CLUSTERS.register(lighting.Color.cluster_id)
@registries.ZIGBEE_CHANNEL_REGISTRY.register(lighting.Color.cluster_id)
class ColorChannel(ZigbeeChannel):
    """Color channel."""

    CAPABILITIES_COLOR_XY = 0x08
    CAPABILITIES_COLOR_TEMP = 0x10
    UNSUPPORTED_ATTRIBUTE = 0x86
    REPORT_CONFIG = (
        {"attr": "current_x", "config": REPORT_CONFIG_DEFAULT},
        {"attr": "current_y", "config": REPORT_CONFIG_DEFAULT},
        {"attr": "color_temperature", "config": REPORT_CONFIG_DEFAULT},
    )
    MAX_MIREDS: int = 500
    MIN_MIREDS: int = 153
    ZCL_INIT_ATTRS = {
        "color_mode": False,
        "color_temp_physical_min": True,
        "color_temp_physical_max": True,
        "color_capabilities": True,
        "color_loop_active": False,
    }

    @property
    def color_capabilities(self) -> int:
        """Return color capabilities of the light."""
        with suppress(KeyError):
            return self.cluster["color_capabilities"]
        if self.cluster.get("color_temperature") is not None:
            return self.CAPABILITIES_COLOR_XY | self.CAPABILITIES_COLOR_TEMP
        return self.CAPABILITIES_COLOR_XY

    @property
    def color_mode(self) -> int | None:
        """Return cached value of the color_mode attribute."""
        return self.cluster.get("color_mode")

    @property
    def color_loop_active(self) -> int | None:
        """Return cached value of the color_loop_active attribute."""
        return self.cluster.get("color_loop_active")

    @property
    def color_temperature(self) -> int | None:
        """Return cached value of color temperature."""
        return self.cluster.get("color_temperature")

    @property
    def current_x(self) -> int | None:
        """Return cached value of the current_x attribute."""
        return self.cluster.get("current_x")

    @property
    def current_y(self) -> int | None:
        """Return cached value of the current_y attribute."""
        return self.cluster.get("current_y")

    @property
    def min_mireds(self) -> int:
        """Return the coldest color_temp that this channel supports."""
        return self.cluster.get("color_temp_physical_min", self.MIN_MIREDS)

    @property
    def max_mireds(self) -> int:
        """Return the warmest color_temp that this channel supports."""
        return self.cluster.get("color_temp_physical_max", self.MAX_MIREDS)
| 33.382022 | 76 | 0.693369 | 352 | 2,971 | 5.613636 | 0.238636 | 0.050607 | 0.068826 | 0.07085 | 0.44332 | 0.313765 | 0.20749 | 0.148785 | 0.148785 | 0.069838 | 0 | 0.006313 | 0.200269 | 2,971 | 88 | 77 | 33.761364 | 0.825337 | 0.169976 | 0 | 0.137931 | 0 | 0 | 0.124324 | 0.038254 | 0 | 0 | 0.00499 | 0 | 0 | 1 | 0.137931 | false | 0 | 0.103448 | 0 | 0.586207 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
13d760267b20f874fc4b087de72759e81f401445 | 6,123 | py | Python | servicedirectory/src/sd-api/users/tests/tests_serializers.py | ealogar/servicedirectory | fb4f4bfa8b499b93c03af589ef2f34c08a830b17 | [
"Apache-2.0"
] | null | null | null | servicedirectory/src/sd-api/users/tests/tests_serializers.py | ealogar/servicedirectory | fb4f4bfa8b499b93c03af589ef2f34c08a830b17 | [
"Apache-2.0"
] | null | null | null | servicedirectory/src/sd-api/users/tests/tests_serializers.py | ealogar/servicedirectory | fb4f4bfa8b499b93c03af589ef2f34c08a830b17 | [
"Apache-2.0"
] | null | null | null | '''
(c) Copyright 2013 Telefonica, I+D. Printed in Spain (Europe). All Rights
Reserved.
The copyright to the software program(s) is property of Telefonica I+D.
The program(s) may be used and or copied only with the express written
consent of Telefonica I+D or in accordance with the terms and conditions
stipulated in the agreement/contract under which the program(s) have
been supplied.
'''
from unittest import TestCase
from mock import MagicMock, patch
from commons.json_schema_validator.schema_reader import SchemaField
from commons.json_schema_validator.schema_reader import SchemaReader
from users.serializers import UserCollectionSerializer
class UserSerializerTests(TestCase):
def setUp(self):
super(UserSerializerTests, self).setUp()
mock_schema_instance = MagicMock(name='mock_schema_instance')
mock_schema_instance.return_value = [
SchemaField(name='username', field_type='string', required=True),
SchemaField(name='password', field_type='string', required=True),
SchemaField(name='is_admin', field_type='boolean', required=True, default=False)
]
mock_get_schema_fields = MagicMock(name='mock_get_schema')
mock_get_schema_fields.return_value = mock_schema_instance
# mock schema instance
schema_reader = SchemaReader()
self.patcher_validate = patch.object(schema_reader, 'validate_object') # @UndefinedVariable
self.patcher_schema = patch.object(schema_reader, # @UndefinedVariable
'get_schema_fields', mock_schema_instance)
self.patcher_schema.start()
self.patcher_validate.start()
def tearDown(self):
self.patcher_schema.stop()
self.patcher_validate.stop()
def test_deserialize_user_should_work(self):
# We need to do import here in order generic patches work
serializer = UserCollectionSerializer(data={'username': 'user', 'password': 'pass'})
self.assertEquals(True, serializer.is_valid(), "Serialization invalid")
def test_deserialize_user_invalid_is_admin_should_work(self):
# We need to do import here in order generic patches work
serializer = UserCollectionSerializer(data={'username': 'user', 'password': 'pass', 'is_admin': 'si'})
self.assertEquals(False, serializer.is_valid(), "Serialization invalid")
def test_deserialize_user_empty_user_should_give_error_invalid(self):
# We need to do import here in order generic patches work
serializer = UserCollectionSerializer(data={'username': '', 'password': 'pass'})
self.assertEquals(False, serializer.is_valid(), "Serialization invalid")
self.assertEquals(u"invalid",
serializer.errors['username'][0],
'Invalid error message')
def test_deserialize_user_null_user_should_give_required_error(self):
# We need to do import here in order generic patches work
serializer = UserCollectionSerializer(data={'password': 'pass'})
self.assertEquals(False, serializer.is_valid(), "Serialization invalid")
self.assertEquals(u"required",
serializer.errors['username'][0],
'Invalid error message')
def test_deserialize_user_large_user_ne_should_give_invalid_error(self):
# We need to do import here in order generic patches work
serializer = UserCollectionSerializer(data={'username': 'a' * 600, 'password': 'pass'})
self.assertEquals(False, serializer.is_valid(), "Serialization invalid")
self.assertEquals(u"invalid",
serializer.errors['username'][0],
'Invalid error message')
def test_deserialize_user_with_invalid_origins_should_give_error(self):
serializer = UserCollectionSerializer(data={'username': 'user', 'password': 'pass', 'origins': ["????"]})
self.assertEquals(False, serializer.is_valid())
self.assertEquals(u"invalid",
serializer.errors['origins'][0],
'Invalid error message')
serializer = UserCollectionSerializer(data={'username': 'user', 'password': 'pass', 'origins': [" tugo"]})
self.assertEquals(False, serializer.is_valid())
self.assertEquals(u"invalid",
serializer.errors['origins'][0],
'Invalid error message')
def test_deserialize_user_with_invalid_classes_should_give_error(self):
serializer = UserCollectionSerializer(data={'username': 'user', 'password': 'pass', 'classes': ["????"]})
self.assertEquals(False, serializer.is_valid())
self.assertEquals(u"invalid",
serializer.errors['classes'][0],
'Invalid error message')
serializer = UserCollectionSerializer(data={'username': 'user', 'password': 'pass', 'classes': [" sms"]})
self.assertEquals(False, serializer.is_valid())
self.assertEquals(u"invalid",
serializer.errors['classes'][0],
'Invalid error message')
def test_deserialize_user_invalid_username_should_give_error(self):
# We need to do import here in order generic patches work
serializer = UserCollectionSerializer(data={'username': 'User.user', 'password': 'pass'})
self.assertEquals(False, serializer.is_valid(), "Serialization invalid")
self.assertEquals(u"invalid",
serializer.errors['username'][0],
'Invalid error message')
def test_deserialize_user_invalid_is_admin_should_give_error(self):
# We need to do import here in order generic patches work
serializer = UserCollectionSerializer(data={'username': 'usera', 'password': 'pass', 'is_admin': 0})
self.assertEquals(False, serializer.is_valid(), "Serialization invalid")
self.assertEquals(u"invalid",
serializer.errors['is_admin'][0],
'Invalid error message')
| 52.784483 | 114 | 0.6634 | 654 | 6,123 | 6.022936 | 0.19419 | 0.081239 | 0.106118 | 0.116781 | 0.693069 | 0.687992 | 0.669713 | 0.648388 | 0.606245 | 0.567149 | 0 | 0.003606 | 0.230116 | 6,123 | 115 | 115 | 53.243478 | 0.83199 | 0.136534 | 0 | 0.421687 | 0 | 0 | 0.16926 | 0 | 0 | 0 | 0 | 0 | 0.240964 | 1 | 0.13253 | false | 0.144578 | 0.060241 | 0 | 0.204819 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
13d7896d6d799cba6c0e766504d5f3eea5f2e531 | 3,124 | py | Python | Web/notifyXAPI/app/src/users/views.py | abs0lut3pwn4g3/RootersCTF2019-challenges | 397a6fad0b03e55541df06e5103172ae850cd4e5 | [
"MIT"
] | 14 | 2019-10-13T07:38:04.000Z | 2022-02-13T09:03:50.000Z | Web/notifyXAPI/app/src/users/views.py | abs0lut3pwn4g3/RootersCTF2019-challenges | 397a6fad0b03e55541df06e5103172ae850cd4e5 | [
"MIT"
] | 1 | 2019-10-13T07:35:13.000Z | 2019-10-13T08:22:48.000Z | Web/notifyXAPI/app/src/users/views.py | abs0lut3pwn4g3/RootersCTF2019-challenges | 397a6fad0b03e55541df06e5103172ae850cd4e5 | [
"MIT"
] | 4 | 2019-10-13T08:21:43.000Z | 2022-01-09T16:39:33.000Z | ''' User views '''
from datetime import timedelta
from flask import request, jsonify, make_response, redirect, json, render_template
from flask_jwt_extended import (create_access_token, jwt_required)
from flask_restful import Resource
from flask_login import login_user, current_user
from sqlalchemy.exc import IntegrityError, InvalidRequestError
from src import db, api
from .models import User
from .schemas import UserSchema
class UserLoginResource(Resource):
    model = User
    schema = UserSchema

    def get(self):
        return make_response(render_template('login.html'))

    def post(self):
        if request.json:
            data = request.json
            user = self.model.query.filter(self.model.email == data['email']).first()
            if user and self.model.check_password(user, data['password']):
                expires = timedelta(days=365)
                user = UserSchema(only=('id', 'email', 'is_admin')).dump(user).data
                return make_response(
                    jsonify({'id': user,
                             'authentication_token': create_access_token(identity=user['id'],
                                                                         expires_delta=expires)}), 200)
            else:
                return make_response(jsonify({"error": {"code": 400, "msg": "No such user/wrong password."}}), 400)
        else:
            data = request.form
            user = self.model.query.filter(self.model.email == data['email']).first()
            if user and self.model.check_password(user, data['password']) and login_user(user):
                return make_response(redirect('/admin/', 302))
            else:
                return make_response(redirect('/api/v1/login', 403))


class UserRegisterResource(Resource):
    model = User
    schema = UserSchema

    def post(self):
        data = request.json
        if not data:
            return make_response(jsonify({'error': 'No data'}), 400)
        user = User.query.filter(User.email == data['email']).first()
        if user:
            return make_response(jsonify({'error': 'User already exists'}), 403)
        user, errors = self.schema().load(data)
        if errors:
            return make_response(jsonify(errors), 400)
        try:
            user.set_password(data['password'])
            db.session.add(user)
            db.session.commit()
        except (IntegrityError, InvalidRequestError) as e:
            print(e)
            db.session.rollback()
            return make_response(jsonify(error={'code': 400}), 400)
        expires = timedelta(days=365)
        return make_response(
            jsonify(created_user={'id': user.id,
                                  'user': self.schema(only=('id', 'email', 'is_admin')).dump(user).data,
                                  'authentication_token': create_access_token(identity=user.id,
                                                                              expires_delta=expires)}), 200)
api.add_resource(UserLoginResource, '/login/', endpoint='login')
api.add_resource(UserRegisterResource, '/register/', endpoint='register') | 40.571429 | 124 | 0.588348 | 336 | 3,124 | 5.354167 | 0.28869 | 0.073374 | 0.100056 | 0.097276 | 0.367982 | 0.31851 | 0.264591 | 0.223457 | 0.190106 | 0.190106 | 0 | 0.018165 | 0.295134 | 3,124 | 77 | 125 | 40.571429 | 0.798819 | 0.003201 | 0 | 0.274194 | 0 | 0 | 0.08336 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048387 | false | 0.064516 | 0.145161 | 0.016129 | 0.451613 | 0.016129 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
13ded3828a8c037ea4aa78b91386fb78512809eb | 326 | py | Python | tests/test-recipes/metadata/ignore_some_prefix_files/run_test.py | mbargull/conda-build | ebc56f48196774301863fecbe98a32a7ded6eb7e | [
"BSD-3-Clause"
] | null | null | null | tests/test-recipes/metadata/ignore_some_prefix_files/run_test.py | mbargull/conda-build | ebc56f48196774301863fecbe98a32a7ded6eb7e | [
"BSD-3-Clause"
] | null | null | null | tests/test-recipes/metadata/ignore_some_prefix_files/run_test.py | mbargull/conda-build | ebc56f48196774301863fecbe98a32a7ded6eb7e | [
"BSD-3-Clause"
] | null | null | null | import os
pkgs = os.path.join(os.environ["ROOT"], "pkgs")
info_dir = os.path.join(pkgs, "conda-build-test-ignore-some-prefix-files-1.0-0", "info")
has_prefix_file = os.path.join(info_dir, "has_prefix")
print(info_dir)
assert os.path.isfile(has_prefix_file)
with open(has_prefix_file) as f:
assert "test2" not in f.read()
| 32.6 | 88 | 0.733129 | 60 | 326 | 3.816667 | 0.516667 | 0.104803 | 0.131004 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.013605 | 0.09816 | 326 | 9 | 89 | 36.222222 | 0.765306 | 0 | 0 | 0 | 0 | 0 | 0.226994 | 0.144172 | 0 | 0 | 0 | 0 | 0.25 | 1 | 0 | false | 0 | 0.125 | 0 | 0.125 | 0.125 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
13e067146d5c409e953e8fe9a97ca674f7b0976f | 2,217 | py | Python | ymir/backend/src/ymir_controller/controller/utils/invoker_mapping.py | phoenix-xhuang/ymir | 537d3ac389c4a365ce4daef431c95b42ddcd5b1b | [
"Apache-2.0"
] | 64 | 2021-11-15T03:48:00.000Z | 2022-03-25T07:08:46.000Z | ymir/backend/src/ymir_controller/controller/utils/invoker_mapping.py | phoenix-xhuang/ymir | 537d3ac389c4a365ce4daef431c95b42ddcd5b1b | [
"Apache-2.0"
] | 35 | 2021-11-23T04:14:35.000Z | 2022-03-26T09:03:43.000Z | ymir/backend/src/ymir_controller/controller/utils/invoker_mapping.py | phoenix-xhuang/ymir | 537d3ac389c4a365ce4daef431c95b42ddcd5b1b | [
"Apache-2.0"
] | 57 | 2021-11-11T10:15:40.000Z | 2022-03-29T07:27:54.000Z | from controller.invoker import (
invoker_cmd_branch_checkout,
invoker_cmd_branch_commit,
invoker_cmd_branch_create,
invoker_cmd_branch_delete,
invoker_cmd_branch_list,
invoker_cmd_evaluate,
invoker_cmd_filter,
invoker_cmd_gpu_info,
invoker_cmd_inference,
invoker_cmd_init,
invoker_cmd_label_add,
invoker_cmd_label_get,
invoker_cmd_log,
invoker_cmd_merge,
invoker_cmd_pull_image,
invoker_cmd_repo_check,
invoker_cmd_repo_clear,
invoker_cmd_sampling,
invoker_cmd_terminate,
invoker_cmd_user_create,
invoker_task_factory,
)
from proto import backend_pb2
RequestTypeToInvoker = {
backend_pb2.CMD_BRANCH_CHECKOUT: invoker_cmd_branch_checkout.BranchCheckoutInvoker,
backend_pb2.CMD_BRANCH_CREATE: invoker_cmd_branch_create.BranchCreateInvoker,
backend_pb2.CMD_BRANCH_DEL: invoker_cmd_branch_delete.BranchDeleteInvoker,
backend_pb2.CMD_BRANCH_LIST: invoker_cmd_branch_list.BranchListInvoker,
backend_pb2.CMD_COMMIT: invoker_cmd_branch_commit.BranchCommitInvoker,
backend_pb2.CMD_EVALUATE: invoker_cmd_evaluate.EvaluateInvoker,
backend_pb2.CMD_FILTER: invoker_cmd_filter.FilterBranchInvoker,
backend_pb2.CMD_GPU_INFO_GET: invoker_cmd_gpu_info.GPUInfoInvoker,
backend_pb2.CMD_INFERENCE: invoker_cmd_inference.InferenceCMDInvoker,
backend_pb2.CMD_INIT: invoker_cmd_init.InitInvoker,
backend_pb2.CMD_LABEL_ADD: invoker_cmd_label_add.LabelAddInvoker,
backend_pb2.CMD_LABEL_GET: invoker_cmd_label_get.LabelGetInvoker,
backend_pb2.CMD_LOG: invoker_cmd_log.LogInvoker,
backend_pb2.CMD_MERGE: invoker_cmd_merge.MergeInvoker,
backend_pb2.CMD_PULL_IMAGE: invoker_cmd_pull_image.ImageHandler,
backend_pb2.CMD_TERMINATE: invoker_cmd_terminate.CMDTerminateInvoker,
backend_pb2.CMD_REPO_CHECK: invoker_cmd_repo_check.RepoCheckInvoker,
backend_pb2.CMD_REPO_CLEAR: invoker_cmd_repo_clear.RepoClearInvoker,
backend_pb2.REPO_CREATE: invoker_cmd_init.InitInvoker,
backend_pb2.TASK_CREATE: invoker_task_factory.CreateTaskInvokerFactory,
backend_pb2.USER_CREATE: invoker_cmd_user_create.UserCreateInvoker,
backend_pb2.CMD_SAMPLING: invoker_cmd_sampling.SamplingInvoker,
}
| 43.470588 | 87 | 0.834461 | 284 | 2,217 | 5.929577 | 0.204225 | 0.243468 | 0.146675 | 0.045131 | 0.273159 | 0.179335 | 0 | 0 | 0 | 0 | 0 | 0.011705 | 0.113667 | 2,217 | 50 | 88 | 44.34 | 0.845293 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.041667 | 0 | 0.041667 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
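The `RequestTypeToInvoker` table above is a dispatch-table pattern: protobuf request-type constants keyed to invoker classes, looked up per request instead of a long if/elif chain. A minimal self-contained sketch of the same pattern (the enum values and invoker classes below are illustrative stand-ins, not the actual ymir `backend_pb2` definitions):

```python
# Sketch of the dispatch-table pattern used by RequestTypeToInvoker.
# RequestType / InitInvoker / LogInvoker are hypothetical stand-ins.
from enum import IntEnum


class RequestType(IntEnum):
    CMD_INIT = 1
    CMD_LOG = 2


class InitInvoker:
    def call(self) -> str:
        return "initialized"


class LogInvoker:
    def call(self) -> str:
        return "log fetched"


DISPATCH = {
    RequestType.CMD_INIT: InitInvoker,
    RequestType.CMD_LOG: LogInvoker,
}


def handle(request_type: RequestType) -> str:
    # Look up the invoker class for this request type and run it.
    invoker_cls = DISPATCH.get(request_type)
    if invoker_cls is None:
        raise ValueError(f"unsupported request type: {request_type}")
    return invoker_cls().call()
```

Adding a new command then only requires one new dictionary entry, which is why the mapping file above stays flat.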
13e10f247a53a809b100dc05b97804f51f30b05a | 463 | py | Python | server/form/mongo.py | SRM-IST-KTR/ossmosis | 06e375dfdd67f91286ffbcb13e04b6543585d8ad | [
"MIT"
] | 6 | 2021-07-04T07:59:17.000Z | 2021-07-04T14:41:00.000Z | server/form/mongo.py | SRM-IST-KTR/ossmosis | 06e375dfdd67f91286ffbcb13e04b6543585d8ad | [
"MIT"
] | null | null | null | server/form/mongo.py | SRM-IST-KTR/ossmosis | 06e375dfdd67f91286ffbcb13e04b6543585d8ad | [
"MIT"
] | 1 | 2022-02-15T13:31:46.000Z | 2022-02-15T13:31:46.000Z | import os
from pymongo import MongoClient
from dotenv import load_dotenv
def database_entry(data):
try:
load_dotenv()
mongo_string = os.getenv('MONGODB_AUTH_URI')
client = MongoClient(mongo_string)
database = client[os.getenv('MONGODB_DB')]
col = database['users']
col.insert_one(data)
return True
except Exception as e:
print(e)
return False
if __name__ == "__main__":
pass
| 21.045455 | 52 | 0.637149 | 56 | 463 | 4.964286 | 0.642857 | 0.071942 | 0.107914 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.278618 | 463 | 21 | 53 | 22.047619 | 0.832335 | 0 | 0 | 0 | 0 | 0 | 0.084233 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.058824 | false | 0.058824 | 0.176471 | 0 | 0.352941 | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
13e58721f5e2c382c66ef605548dfd06ad0e1f44 | 1,363 | py | Python | utils/stg/min_jerk_traj.py | dgerod/more-dmps | 4dc886a138f289532b2672537f91ff857448ad27 | [
"BSD-2-Clause"
] | 7 | 2017-12-23T13:43:33.000Z | 2021-08-21T14:50:55.000Z | utils/stg/min_jerk_traj.py | dgerod/more-dmps | 4dc886a138f289532b2672537f91ff857448ad27 | [
"BSD-2-Clause"
] | 1 | 2016-08-21T05:05:32.000Z | 2016-08-21T05:05:32.000Z | utils/stg/min_jerk_traj.py | dgerod/more-dmps | 4dc886a138f289532b2672537f91ff857448ad27 | [
"BSD-2-Clause"
] | 2 | 2018-02-11T02:01:32.000Z | 2019-09-16T01:43:43.000Z | '''
Created on 25.07.2012
@author: karl
'''
def trajectory(start, goal, duration, delta_t):
traj = []
    # initial values
t, td, tdd = start, 0, 0
for i in range(int(2 * duration / delta_t)):
try:
t, td, tdd = _min_jerk_step(t, td, tdd, goal, duration - i * delta_t, delta_t)
        except Exception:
break
traj.append([t, td, tdd])
return traj
def _min_jerk_step(x, xd, xdd, goal, tau, dt):
#function [x,xd,xdd] = min_jerk_step(x,xd,xdd,goal,tau, dt) computes
# the update of x,xd,xdd for the next time step dt given that we are
# currently at x,xd,xdd, and that we have tau until we want to reach
# the goal
# ported from matlab dmp toolbox
if tau < dt:
        raise Exception("time left (tau) is smaller than current time (dt) - end of traj reached!")
dist = goal - x
a1 = 0
a0 = xdd * tau ** 2
v1 = 0
v0 = xd * tau
t1 = dt
t2 = dt ** 2
t3 = dt ** 3
t4 = dt ** 4
t5 = dt ** 5
c1 = (6.*dist + (a1 - a0) / 2. - 3.*(v0 + v1)) / tau ** 5
c2 = (-15.*dist + (3.*a0 - 2.*a1) / 2. + 8.*v0 + 7.*v1) / tau ** 4
c3 = (10.*dist + (a1 - 3.*a0) / 2. - 6.*v0 - 4.*v1) / tau ** 3
c4 = xdd / 2.
c5 = xd
c6 = x
x = c1 * t5 + c2 * t4 + c3 * t3 + c4 * t2 + c5 * t1 + c6
xd = 5.*c1 * t4 + 4 * c2 * t3 + 3 * c3 * t2 + 2 * c4 * t1 + c5
xdd = 20.*c1 * t3 + 12.*c2 * t2 + 6.*c3 * t1 + 2.*c4
return (x, xd, xdd)
| 24.781818 | 96 | 0.533382 | 253 | 1,363 | 2.826087 | 0.391304 | 0.025175 | 0.05035 | 0.033566 | 0.072727 | 0.072727 | 0.072727 | 0.072727 | 0.072727 | 0 | 0 | 0.10625 | 0.295671 | 1,363 | 54 | 97 | 25.240741 | 0.638542 | 0.186354 | 0 | 0 | 0 | 0 | 0.068053 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
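The minimum-jerk stepper above can be exercised end-to-end. The sketch below restates the two functions in Python 3 syntax (the original uses the Python 2 `raise` form) and drives a trajectory from 0 to 1; the coefficients are the standard fifth-order minimum-jerk polynomial, so the final position lands on the goal:

```python
def _min_jerk_step(x, xd, xdd, goal, tau, dt):
    # One step of the 5th-order minimum-jerk polynomial toward goal,
    # with tau seconds remaining and a step of dt.
    if tau < dt:
        raise Exception("time left (tau) is smaller than current time (dt) - end of traj reached!")
    dist = goal - x
    a1 = 0.0
    a0 = xdd * tau ** 2
    v1 = 0.0
    v0 = xd * tau
    t1, t2, t3, t4, t5 = dt, dt ** 2, dt ** 3, dt ** 4, dt ** 5
    c1 = (6.0 * dist + (a1 - a0) / 2.0 - 3.0 * (v0 + v1)) / tau ** 5
    c2 = (-15.0 * dist + (3.0 * a0 - 2.0 * a1) / 2.0 + 8.0 * v0 + 7.0 * v1) / tau ** 4
    c3 = (10.0 * dist + (a1 - 3.0 * a0) / 2.0 - 6.0 * v0 - 4.0 * v1) / tau ** 3
    c4 = xdd / 2.0
    c5 = xd
    c6 = x
    x = c1 * t5 + c2 * t4 + c3 * t3 + c4 * t2 + c5 * t1 + c6
    xd = 5.0 * c1 * t4 + 4.0 * c2 * t3 + 3.0 * c3 * t2 + 2.0 * c4 * t1 + c5
    xdd = 20.0 * c1 * t3 + 12.0 * c2 * t2 + 6.0 * c3 * t1 + 2.0 * c4
    return x, xd, xdd


def trajectory(start, goal, duration, delta_t):
    traj = []
    t, td, tdd = start, 0.0, 0.0
    for i in range(int(2 * duration / delta_t)):
        try:
            t, td, tdd = _min_jerk_step(t, td, tdd, goal, duration - i * delta_t, delta_t)
        except Exception:
            break  # remaining time dropped below one step
        traj.append([t, td, tdd])
    return traj


# Drive a 1-second trajectory from 0 to 1 at 10 ms resolution.
traj = trajectory(0.0, 1.0, 1.0, 0.01)
```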
13e74a7f98e6571f4fc714e2743e38c7eafbf58e | 2,607 | py | Python | orthoexon/tests/test_util.py | jessicalettes/orthoexon | 463ad1908364c602cf75dbddb0b16a42f4100a36 | [
"BSD-3-Clause"
] | null | null | null | orthoexon/tests/test_util.py | jessicalettes/orthoexon | 463ad1908364c602cf75dbddb0b16a42f4100a36 | [
"BSD-3-Clause"
] | null | null | null | orthoexon/tests/test_util.py | jessicalettes/orthoexon | 463ad1908364c602cf75dbddb0b16a42f4100a36 | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
test_orthoexon
----------------------------------
Tests for `orthoexon` module.
"""
import os
import pytest
@pytest.fixture
def exon_id_with_quotes():
return "'ENSE00001229068.1'"
@pytest.fixture
def exon_id():
return "ENSE00001229068.1"
def test_separate_with_quotes(exon_id_with_quotes):
from orthoexon.util import separate
test = separate(exon_id_with_quotes)
true = "ENSE00001229068"
assert test == true
def test_separate(exon_id):
from orthoexon.util import separate
test = separate(exon_id)
true = "ENSE00001229068"
assert test == true
@pytest.fixture
def location():
return "chr20:10256140-10256211:+:0"
def test_splitstart(location):
from orthoexon.util import splitstart
test = splitstart(location)
true = '10256140'
assert test == true
def test_splitend(location):
from orthoexon.util import splitend
test = splitend(location)
true = '10256211'
assert test == true
@pytest.fixture
def human_gtf_filename(table_folder):
return os.path.join(table_folder, 'humanrbfox2andfmr1andsnap25.gtf')
@pytest.fixture
def human_gtf_database(table_folder):
return os.path.join(table_folder, 'humanrbfox2andfmr1andsnap25.gtf.db')
@pytest.fixture
def human_fasta(table_folder):
return os.path.join(table_folder, 'GRCm38.p3.genome.fa')
def test_translate(exon_id, human_fasta, human_gtf_database):
from orthoexon.util import translate
from orthoexon.util import separate
for index, species1gene in enumerate(human_gtf_database.features_of_type('gene')):
species1gffutilsgeneid = str(species1gene['gene_id'])
species1geneid = separate(species1gffutilsgeneid)
for exon in human_gtf_database.children(species1geneid,
featuretype='CDS',
order_by='start'):
if exon_id == exon:
test = translate(exon, human_fasta)
break
break
true = 'MAEDADMRNELEEMQRRADQLADE'
assert test == true
# def test_getsequence(exon, human_gtf_database):
# from orthoexon.util import getsequence
#
# test = getsequence(exon, human_gtf_database)
# true = 'ATGGCCGAAGACGCAGACATGCGCAATGAGCTGGAGGAGATGCAGCGAAGGGCTGACCAGTT' \
# 'GGCTGATGAG'
#
# assert test == true
# def test_make_sequence_array(finalsequencedf):
# from orthoexon.util import make_sequence_array
#
# test = make_sequence_array(finalsequencedf)
# true = ......
#
# assert test == true | 23.917431 | 86 | 0.678558 | 287 | 2,607 | 5.965157 | 0.275261 | 0.028037 | 0.079439 | 0.107477 | 0.478388 | 0.271028 | 0.204439 | 0.158879 | 0.136682 | 0.079439 | 0 | 0.048434 | 0.215957 | 2,607 | 109 | 87 | 23.917431 | 0.789139 | 0.221327 | 0 | 0.327273 | 0 | 0 | 0.117588 | 0.057798 | 0 | 0 | 0 | 0 | 0.090909 | 1 | 0.2 | false | 0 | 0.145455 | 0.109091 | 0.454545 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
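The fixtures above pin down the contract of `orthoexon.util.separate`, `splitstart`, and `splitend`: strip quotes and the Ensembl version suffix, and pull the start/end coordinates out of a `chr:start-end:strand:frame` location string. The sketch below is a hypothetical implementation satisfying those fixtures, not the library's actual code:

```python
def separate(exon_id: str) -> str:
    # Strip surrounding quotes and a trailing ".N" version suffix,
    # e.g. "'ENSE00001229068.1'" -> "ENSE00001229068".
    stripped = exon_id.strip("'\"")
    return stripped.split(".")[0]


def splitstart(location: str) -> str:
    # "chr20:10256140-10256211:+:0" -> "10256140"
    return location.split(":")[1].split("-")[0]


def splitend(location: str) -> str:
    # "chr20:10256140-10256211:+:0" -> "10256211"
    return location.split(":")[1].split("-")[1]
```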
13efdb45818b7da3afae845201256a86d37c940d | 4,302 | py | Python | Lib/test/libregrtest/utils.py | oskomorokhov/cpython | c0e11a3ceb9427e09db4224f394c7789bf6deec5 | [
"0BSD"
] | 5 | 2017-08-25T04:31:30.000Z | 2022-03-22T15:01:56.000Z | Lib/test/libregrtest/utils.py | oskomorokhov/cpython | c0e11a3ceb9427e09db4224f394c7789bf6deec5 | [
"0BSD"
] | 20 | 2021-03-25T12:52:42.000Z | 2022-03-01T02:02:03.000Z | Lib/test/libregrtest/utils.py | oskomorokhov/cpython | c0e11a3ceb9427e09db4224f394c7789bf6deec5 | [
"0BSD"
] | 3 | 2020-04-13T14:41:31.000Z | 2022-03-02T18:56:32.000Z | import math
import os.path
import sys
import textwrap
from test import support
def format_duration(seconds):
ms = math.ceil(seconds * 1e3)
seconds, ms = divmod(ms, 1000)
minutes, seconds = divmod(seconds, 60)
hours, minutes = divmod(minutes, 60)
parts = []
if hours:
parts.append('%s hour' % hours)
if minutes:
parts.append('%s min' % minutes)
if seconds:
if parts:
# 2 min 1 sec
parts.append('%s sec' % seconds)
else:
# 1.0 sec
parts.append('%.1f sec' % (seconds + ms / 1000))
if not parts:
return '%s ms' % ms
parts = parts[:2]
return ' '.join(parts)
def removepy(names):
if not names:
return
for idx, name in enumerate(names):
basename, ext = os.path.splitext(name)
if ext == '.py':
names[idx] = basename
def count(n, word):
if n == 1:
return "%d %s" % (n, word)
else:
return "%d %ss" % (n, word)
def printlist(x, width=70, indent=4, file=None):
"""Print the elements of iterable x to stdout.
Optional arg width (default 70) is the maximum line length.
Optional arg indent (default 4) is the number of blanks with which to
begin each line.
"""
blanks = ' ' * indent
# Print the sorted list: 'x' may be a '--random' list or a set()
print(textwrap.fill(' '.join(str(elt) for elt in sorted(x)), width,
initial_indent=blanks, subsequent_indent=blanks),
file=file)
def print_warning(msg):
support.print_warning(msg)
orig_unraisablehook = None
def regrtest_unraisable_hook(unraisable):
global orig_unraisablehook
support.environment_altered = True
print_warning("Unraisable exception")
old_stderr = sys.stderr
try:
sys.stderr = sys.__stderr__
orig_unraisablehook(unraisable)
finally:
sys.stderr = old_stderr
def setup_unraisable_hook():
global orig_unraisablehook
orig_unraisablehook = sys.unraisablehook
sys.unraisablehook = regrtest_unraisable_hook
def clear_caches():
# Clear the warnings registry, so they can be displayed again
for mod in sys.modules.values():
if hasattr(mod, '__warningregistry__'):
del mod.__warningregistry__
# Flush standard output, so that buffered data is sent to the OS and
# associated Python objects are reclaimed.
for stream in (sys.stdout, sys.stderr, sys.__stdout__, sys.__stderr__):
if stream is not None:
stream.flush()
# Clear assorted module caches.
# Don't worry about resetting the cache if the module is not loaded
try:
distutils_dir_util = sys.modules['distutils.dir_util']
except KeyError:
pass
else:
distutils_dir_util._path_created.clear()
try:
re = sys.modules['re']
except KeyError:
pass
else:
re.purge()
try:
_strptime = sys.modules['_strptime']
except KeyError:
pass
else:
_strptime._regex_cache.clear()
try:
urllib_parse = sys.modules['urllib.parse']
except KeyError:
pass
else:
urllib_parse.clear_cache()
try:
urllib_request = sys.modules['urllib.request']
except KeyError:
pass
else:
urllib_request.urlcleanup()
try:
linecache = sys.modules['linecache']
except KeyError:
pass
else:
linecache.clearcache()
try:
mimetypes = sys.modules['mimetypes']
except KeyError:
pass
else:
mimetypes._default_mime_types()
try:
filecmp = sys.modules['filecmp']
except KeyError:
pass
else:
filecmp._cache.clear()
try:
struct = sys.modules['struct']
except KeyError:
pass
else:
struct._clearcache()
try:
doctest = sys.modules['doctest']
except KeyError:
pass
else:
doctest.master = None
try:
ctypes = sys.modules['ctypes']
except KeyError:
pass
else:
ctypes._reset_cache()
try:
typing = sys.modules['typing']
except KeyError:
pass
else:
for f in typing._cleanups:
f()
support.gc_collect()
| 22.761905 | 75 | 0.600418 | 510 | 4,302 | 4.933333 | 0.35098 | 0.051669 | 0.085851 | 0.104928 | 0.022258 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009015 | 0.303812 | 4,302 | 188 | 76 | 22.882979 | 0.831052 | 0.125291 | 0 | 0.381295 | 0 | 0 | 0.051701 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.057554 | false | 0.086331 | 0.035971 | 0 | 0.129496 | 0.035971 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
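`format_duration` above composes at most two unit parts from a duration in seconds, falling back to milliseconds for sub-second values. Restated here verbatim so the example calls are runnable standalone:

```python
import math


def format_duration(seconds):
    # Round up to whole milliseconds, then split into h/m/s.
    ms = math.ceil(seconds * 1e3)
    seconds, ms = divmod(ms, 1000)
    minutes, seconds = divmod(seconds, 60)
    hours, minutes = divmod(minutes, 60)

    parts = []
    if hours:
        parts.append('%s hour' % hours)
    if minutes:
        parts.append('%s min' % minutes)
    if seconds:
        if parts:
            # 2 min 1 sec
            parts.append('%s sec' % seconds)
        else:
            # 1.0 sec
            parts.append('%.1f sec' % (seconds + ms / 1000))
    if not parts:
        return '%s ms' % ms

    # Keep only the two most significant units.
    parts = parts[:2]
    return ' '.join(parts)
```

Note the `parts[:2]` truncation: a duration of 3661 seconds reports hours and minutes but drops the trailing second.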
13f954a55ebaa879400311cfe5c32a3993b29137 | 12,933 | py | Python | test/test_rimuhosting.py | shenoyn/libcloud | bd902992a658b6a99193d69323e051ffa7388253 | [
"Apache-2.0"
] | 1 | 2015-11-08T12:59:27.000Z | 2015-11-08T12:59:27.000Z | test/test_rimuhosting.py | shenoyn/libcloud | bd902992a658b6a99193d69323e051ffa7388253 | [
"Apache-2.0"
] | null | null | null | test/test_rimuhosting.py | shenoyn/libcloud | bd902992a658b6a99193d69323e051ffa7388253 | [
"Apache-2.0"
] | null | null | null | # Licensed to the Apache Software Foundation (ASF) under one or more
# contributor license agreements. See the NOTICE file distributed with
# this work for additional information regarding copyright ownership.
# libcloud.org licenses this file to You under the Apache License, Version 2.0
# (the "License"); you may not use this file except in compliance with
# the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Copyright 2009 RedRata Ltd
from libcloud.drivers.rimuhosting import RimuHostingNodeDriver
from test import MockHttp, TestCaseMixin
import unittest
import httplib
class RimuHostingTest(unittest.TestCase, TestCaseMixin):
def setUp(self):
RimuHostingNodeDriver.connectionCls.conn_classes = (None,
RimuHostingMockHttp)
self.driver = RimuHostingNodeDriver('foo')
def test_list_nodes(self):
nodes = self.driver.list_nodes()
self.assertEqual(len(nodes),1)
node = nodes[0]
self.assertEqual(node.public_ip[0], "1.2.3.4")
self.assertEqual(node.public_ip[1], "1.2.3.5")
self.assertEqual(node.extra['order_oid'], 88833465)
self.assertEqual(node.id, "order-88833465-api-ivan-net-nz")
def test_list_sizes(self):
sizes = self.driver.list_sizes()
self.assertEqual(len(sizes),1)
size = sizes[0]
self.assertEqual(size.ram,950)
self.assertEqual(size.disk,20)
self.assertEqual(size.bandwidth,75)
self.assertEqual(size.price,32.54)
def test_list_images(self):
images = self.driver.list_images()
self.assertEqual(len(images),6)
image = images[0]
self.assertEqual(image.name,"Debian 5.0 (aka Lenny, RimuHosting"\
" recommended distro)")
self.assertEqual(image.id, "lenny")
def test_reboot_node(self):
# Raises exception on failure
node = self.driver.list_nodes()[0]
self.driver.reboot_node(node)
def test_destroy_node(self):
# Raises exception on failure
node = self.driver.list_nodes()[0]
self.driver.destroy_node(node)
def test_create_node(self):
# Raises exception on failure
size = self.driver.list_sizes()[0]
image = self.driver.list_images()[0]
self.driver.create_node(name="api.ivan.net.nz", image=image, size=size)
class RimuHostingMockHttp(MockHttp):
def _r_orders(self,method,url,body,headers):
body = """
{ "get_orders_response" :
{ "status_message" : null
, "status_code" : 200
, "error_info" : null
, "response_type" : "OK"
, "human_readable_message" : "Found 15 orders"
, "response_display_duration_type" : "REGULAR",
"about_orders" :
[{ "order_oid" : 88833465
, "domain_name" : "api.ivan.net.nz"
, "slug" : "order-88833465-api-ivan-net-nz"
, "billing_oid" : 96122465
, "is_on_customers_own_physical_server" : false
, "vps_parameters" : { "memory_mb" : 160
, "disk_space_mb" : 4096
, "disk_space_2_mb" : 0}
, "host_server_oid" : "764"
, "server_type" : "VPS"
, "data_transfer_allowance" : { "data_transfer_gb" : 30
, "data_transfer" : "30"}
, "billing_info" : { }
, "allocated_ips" : { "primary_ip" : "1.2.3.4"
, "secondary_ips" : ["1.2.3.5","1.2.3.6"]}
, "running_state" : "RUNNING"}]}}"""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _r_pricing_plans(self,method,url,body,headers):
body = """
{"get_pricing_plans_response" :
{ "status_message" : null
, "status_code" : 200
, "error_info" : null
, "response_type" : "OK"
, "human_readable_message" : "Here some pricing plans we are offering on new orders. Note we offer most disk and memory sizes. So if you setup a new server feel free to vary these (e.g. different memory, disk, etc) and we will just adjust the pricing to suit. Pricing is in USD. If you are an NZ-based customer then we would need to add GST."
, "response_display_duration_type" : "REGULAR"
, "pricing_plan_infos" :
[{ "pricing_plan_code" : "MiroVPSLowContention"
, "pricing_plan_description" : "MiroVPS Semi-Dedicated Server (Dallas)"
, "monthly_recurring_fee" : 32.54
, "monthly_recurring_amt" : { "amt" : 35.0
, "currency" : "CUR_AUD"
,"amt_usd" : 32.54}
, "minimum_memory_mb" : 950
, "minimum_disk_gb" : 20
, "minimum_data_transfer_allowance_gb" : 75
, "see_also_url" : "http://rimuhosting.com/order/serverdetails.jsp?plan=MiroVPSLowContention"
, "server_type" : "VPS"
, "offered_at_data_center" :
{ "data_center_location_code" : "DCDALLAS"
, "data_center_location_name" : "Dallas"}}
]}}
"""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _r_distributions(self, method, url, body, headers):
body = """
{ "get_distros_response" : { "status_message" : null
, "status_code" : 200
, "error_info" : null
, "response_type" : "OK"
, "human_readable_message" : "Here are the distros we are offering on new orders."
, "response_display_duration_type" : "REGULAR"
, "distro_infos" : [{ "distro_code" : "lenny"
, "distro_description" : "Debian 5.0 (aka Lenny, RimuHosting recommended distro)"}
, { "distro_code" : "centos5"
, "distro_description" : "Centos5"}
, { "distro_code" : "ubuntu904"
, "distro_description" : "Ubuntu 9.04 (Jaunty Jackalope, from 2009-04)"}
, { "distro_code" : "ubuntu804"
, "distro_description" : "Ubuntu 8.04 (Hardy Heron, 5 yr long term support (LTS))"}
, { "distro_code" : "ubuntu810"
, "distro_description" : "Ubuntu 8.10 (Intrepid Ibex, from 2008-10)"}
, { "distro_code" : "fedora10"
, "distro_description" : "Fedora 10"}]}}
"""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _r_orders_new_vps(self, method, url, body, headers):
body = """
{ "post_new_vps_response" :
{ "status_message" : null
, "status_code" : 200
, "error_info" : null
, "response_type" : "OK"
, "human_readable_message" : null
, "response_display_duration_type" : "REGULAR"
, "setup_messages" :
["Using user-specified billing data: Wire Transfer" , "Selected user as the owner of the billing details: Ivan Meredith"
, "No VPS paramters provided, using default values."]
, "about_order" :
{ "order_oid" : 52255865
, "domain_name" : "api.ivan.net.nz"
, "slug" : "order-52255865-api-ivan-net-nz"
, "billing_oid" : 96122465
, "is_on_customers_own_physical_server" : false
, "vps_parameters" :
{ "memory_mb" : 160
, "disk_space_mb" : 4096
, "disk_space_2_mb" : 0}
, "host_server_oid" : "764"
, "server_type" : "VPS"
, "data_transfer_allowance" :
{ "data_transfer_gb" : 30 , "data_transfer" : "30"}
, "billing_info" : { }
, "allocated_ips" :
{ "primary_ip" : "74.50.57.80", "secondary_ips" : []}
, "running_state" : "RUNNING"}
, "new_order_request" :
{ "billing_oid" : 96122465
, "user_oid" : 0
, "host_server_oid" : null
, "vps_order_oid_to_clone" : 0
, "ip_request" :
{ "num_ips" : 1, "extra_ip_reason" : ""}
, "vps_parameters" :
{ "memory_mb" : 160
, "disk_space_mb" : 4096
, "disk_space_2_mb" : 0}
, "pricing_plan_code" : "MIRO1B"
, "instantiation_options" :
{ "control_panel" : "webmin"
, "domain_name" : "api.ivan.net.nz"
, "password" : "aruxauce27"
, "distro" : "lenny"}}
, "running_vps_info" :
{ "pings_ok" : true
, "current_kernel" : "default"
, "current_kernel_canonical" : "2.6.30.5-xenU.i386"
, "last_backup_message" : ""
, "is_console_login_enabled" : false
, "console_public_authorized_keys" : null
, "is_backup_running" : false
, "is_backups_enabled" : true
, "next_backup_time" :
{ "ms_since_epoch": 1256446800000, "iso_format" : "2009-10-25T05:00:00Z", "users_tz_offset_ms" : 46800000}
, "vps_uptime_s" : 31
, "vps_cpu_time_s" : 6
, "running_state" : "RUNNING"
, "is_suspended" : false}}}
"""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _r_orders_order_88833465_api_ivan_net_nz_vps(self, method, url, body, headers):
body = """
{ "delete_server_response" :
{ "status_message" : null
, "status_code" : 200
, "error_info" : null
, "response_type" : "OK"
, "human_readable_message" : "Server removed"
, "response_display_duration_type" : "REGULAR"
, "cancel_messages" :
["api.ivan.net.nz is being shut down."
, "A $7.98 credit has been added to your account."
, "If you need to un-cancel the server please contact our support team."]
}
}
"""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
def _r_orders_order_88833465_api_ivan_net_nz_vps_running_state(self, method,
url, body,
headers):
body = """
{ "put_running_state_response" :
{ "status_message" : null
, "status_code" : 200
, "error_info" : null
, "response_type" : "OK"
, "human_readable_message" : "api.ivan.net.nz restarted. After the reboot api.ivan.net.nz is pinging OK."
, "response_display_duration_type" : "REGULAR"
, "is_restarted" : true
, "is_pinging" : true
, "running_vps_info" :
{ "pings_ok" : true
, "current_kernel" : "default"
, "current_kernel_canonical" : "2.6.30.5-xenU.i386"
, "last_backup_message" : ""
, "is_console_login_enabled" : false
, "console_public_authorized_keys" : null
, "is_backup_running" : false
, "is_backups_enabled" : true
, "next_backup_time" :
{ "ms_since_epoch": 1256446800000, "iso_format" : "2009-10-25T05:00:00Z", "users_tz_offset_ms" : 46800000}
, "vps_uptime_s" : 19
, "vps_cpu_time_s" : 5
, "running_state" : "RUNNING"
, "is_suspended" : false}
, "host_server_info" : { "is_host64_bit_capable" : true
, "default_kernel_i386" : "2.6.30.5-xenU.i386"
, "default_kernel_x86_64" : "2.6.30.5-xenU.x86_64"
, "cpu_model_name" : "Intel(R) Xeon(R) CPU E5506 @ 2.13GHz"
, "host_num_cores" : 1
, "host_xen_version" : "3.4.1"
, "hostload" : [1.45
, 0.56
, 0.28]
, "host_uptime_s" : 3378276
, "host_mem_mb_free" : 51825
, "host_mem_mb_total" : 73719
, "running_vpss" : 34}
, "running_state_messages" : null}}
"""
return (httplib.OK, body, {}, httplib.responses[httplib.OK])
| 45.22028 | 380 | 0.540478 | 1,377 | 12,933 | 4.819898 | 0.28976 | 0.029381 | 0.01808 | 0.021697 | 0.446738 | 0.410125 | 0.361911 | 0.339009 | 0.309779 | 0.309779 | 0 | 0.050881 | 0.341993 | 12,933 | 285 | 381 | 45.378947 | 0.729025 | 0.067115 | 0 | 0.387755 | 0 | 0.036735 | 0.757244 | 0.111831 | 0 | 0 | 0 | 0 | 0.053061 | 1 | 0.053061 | false | 0.004082 | 0.020408 | 0 | 0.106122 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b913259774170b0ae117752589cf379fac40286c | 4,139 | py | Python | easyidp/core/tests/test_class_reconsproject.py | HowcanoeWang/EasyIDP | 0d0a0df1287e3c15cda17e8e4cdcbe05f21f7272 | [
"MIT"
] | null | null | null | easyidp/core/tests/test_class_reconsproject.py | HowcanoeWang/EasyIDP | 0d0a0df1287e3c15cda17e8e4cdcbe05f21f7272 | [
"MIT"
] | null | null | null | easyidp/core/tests/test_class_reconsproject.py | HowcanoeWang/EasyIDP | 0d0a0df1287e3c15cda17e8e4cdcbe05f21f7272 | [
"MIT"
] | null | null | null | import os
import numpy as np
import pytest
import easyidp
from easyidp.core.objects import ReconsProject, Points
from easyidp.io import metashape
module_path = os.path.join(easyidp.__path__[0], "io/tests")
def test_init_reconsproject():
attempt1 = ReconsProject("agisoft")
assert attempt1.software == "metashape"
attempt2 = ReconsProject("Metashape")
assert attempt2.software == "metashape"
with pytest.raises(LookupError):
attempt3 = ReconsProject("not_supported_sfm")
def test_local2world2local():
attempt1 = ReconsProject("agisoft")
attempt1.transform.matrix = np.asarray([[-0.86573098, -0.01489186, 0.08977677, 7.65034123],
[0.06972335, 0.44334391, 0.74589315, 1.85910928],
[-0.05848325, 0.74899678, -0.43972184, -0.1835615],
                                            [0., 0., 0., 1.]], dtype=np.float64)
w_pos = Points([0.5, 1, 1.5])
l_pos = Points([7.960064093299587, 1.3019528769064523, -2.6697181763370965])
w_pos_ans = Points([0.4999999999999978, 0.9999999999999993, 1.5])
world_pos = attempt1.local2world(l_pos)
np.testing.assert_array_almost_equal(w_pos_ans.values, world_pos.values, decimal=6)
local_pos = attempt1.world2local(w_pos)
np.testing.assert_array_almost_equal(l_pos.values, local_pos.values, decimal=6)
def test_metashape_project_local_points_on_raw():
test_project_folder = easyidp.test_full_path("data/metashape/goya_test.psx")
chunks = metashape.open_project(test_project_folder)
chunk = chunks[0]
# test for single point
l_pos = Points([7.960064093299587, 1.3019528769064523, -2.6697181763370965])
p_dis_out = chunk.project_local_points_on_raw(l_pos, 0, distortion_correct=False)
p_undis_out = chunk.project_local_points_on_raw(l_pos, 0, distortion_correct=True)
# pro_api_out = np.asarray([2218.883386793118, 1991.4709388015149])
my_undistort_out = Points([2220.854889556147, 1992.6933680261686])
my_distort_out = Points([2218.47960556, 1992.46356322])
np.testing.assert_array_almost_equal(p_dis_out.values, my_distort_out.values)
np.testing.assert_array_almost_equal(p_undis_out.values, my_undistort_out.values)
# test for multiple points
l_pos_points = Points([[7.960064093299587, 1.3019528769064523, -2.6697181763370965],
[7.960064093299587, 1.3019528769064523, -2.6697181763370965]])
p_dis_outs = chunk.project_local_points_on_raw(l_pos_points, 0, distortion_correct=False)
p_undis_outs = chunk.project_local_points_on_raw(l_pos_points, 0, distortion_correct=True)
my_undistort_outs = Points([[2220.854889556147, 1992.6933680261686],
[2220.854889556147, 1992.6933680261686]])
my_distort_outs = Points([[2218.47960556, 1992.46356322],
[2218.47960556, 1992.46356322]])
np.testing.assert_array_almost_equal(p_dis_outs.values, my_distort_outs.values)
np.testing.assert_array_almost_equal(p_undis_outs.values, my_undistort_outs.values)
def test_world2crs_and_on_raw_images():
test_project_folder = easyidp.test_full_path("data/metashape/wheat_tanashi.psx")
chunks = metashape.open_project(test_project_folder)
chunk = chunks[0]
local = Points([11.870130675203006, 0.858098777517136, -12.987136541275])
geocentric = Points([-3943658.7087006606, 3363404.124223561, 3704651.3067566575])
geodetic = Points([139.54033578028609, 35.73756358928734, 96.87827569602781], columns=['lon', 'lat', 'alt'])
idp_world = chunk.local2world(local)
np.testing.assert_array_almost_equal(idp_world.values, geocentric.values, decimal=1)
idp_crs = chunk.world2crs(idp_world)
np.testing.assert_array_almost_equal(idp_crs.values, geodetic.values)
camera_id = 56 # camera_label = 'DJI_0057'
camera_pix_ans = Points([2391.7104647010146, 1481.8987733175165])
idp_cam_pix = chunk.project_local_points_on_raw(local, camera_id, distortion_correct=True)
np.testing.assert_array_almost_equal(camera_pix_ans.values, idp_cam_pix.values)
| 42.234694 | 112 | 0.723846 | 535 | 4,139 | 5.297196 | 0.293458 | 0.012703 | 0.047636 | 0.063514 | 0.452717 | 0.41602 | 0.357798 | 0.290049 | 0.269584 | 0.162315 | 0 | 0.222932 | 0.167673 | 4,139 | 97 | 113 | 42.670103 | 0.59971 | 0.033341 | 0 | 0.126984 | 0 | 0 | 0.033801 | 0.015023 | 0 | 0 | 0 | 0 | 0.174603 | 1 | 0.063492 | false | 0 | 0.095238 | 0 | 0.15873 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
b915eeed88fbfbe46318454fd21bc9db43d6d639 | 6,023 | py | Python | utils/utils_bbox.py | MasoonZhang/FasterRConvMixer | a7a17d00f716a28a5b301088053e00840c222524 | [
"MIT"
] | null | null | null | utils/utils_bbox.py | MasoonZhang/FasterRConvMixer | a7a17d00f716a28a5b301088053e00840c222524 | [
"MIT"
] | null | null | null | utils/utils_bbox.py | MasoonZhang/FasterRConvMixer | a7a17d00f716a28a5b301088053e00840c222524 | [
"MIT"
] | 1 | 2022-03-14T05:29:42.000Z | 2022-03-14T05:29:42.000Z | import numpy as np
import torch
from torch.nn import functional as F
from torchvision.ops import nms
def loc2bbox(src_bbox, loc):
if src_bbox.size()[0] == 0:
return torch.zeros((0, 4), dtype=loc.dtype)
src_width = torch.unsqueeze(src_bbox[:, 2] - src_bbox[:, 0], -1)
src_height = torch.unsqueeze(src_bbox[:, 3] - src_bbox[:, 1], -1)
src_ctr_x = torch.unsqueeze(src_bbox[:, 0], -1) + 0.5 * src_width
src_ctr_y = torch.unsqueeze(src_bbox[:, 1], -1) + 0.5 * src_height
dx = loc[:, 0::4]
dy = loc[:, 1::4]
dw = loc[:, 2::4]
dh = loc[:, 3::4]
ctr_x = dx * src_width + src_ctr_x
ctr_y = dy * src_height + src_ctr_y
w = torch.exp(dw) * src_width
h = torch.exp(dh) * src_height
dst_bbox = torch.zeros_like(loc)
dst_bbox[:, 0::4] = ctr_x - 0.5 * w
dst_bbox[:, 1::4] = ctr_y - 0.5 * h
dst_bbox[:, 2::4] = ctr_x + 0.5 * w
dst_bbox[:, 3::4] = ctr_y + 0.5 * h
return dst_bbox
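`loc2bbox` decodes regression offsets `(dx, dy, dw, dh)` relative to source boxes into absolute `(x1, y1, x2, y2)` boxes: the center is shifted proportionally to the source width/height, and the size is rescaled by `exp(dw)`, `exp(dh)`. An equivalent NumPy sketch of the same math, simplified to one box per row (the torch version above additionally handles the per-class `0::4` stride):

```python
import numpy as np


def loc2bbox_np(src_bbox, loc):
    # src_bbox: (N, 4) as (x1, y1, x2, y2); loc: (N, 4) as (dx, dy, dw, dh).
    if src_bbox.shape[0] == 0:
        return np.zeros((0, 4), dtype=loc.dtype)

    src_w = src_bbox[:, 2] - src_bbox[:, 0]
    src_h = src_bbox[:, 3] - src_bbox[:, 1]
    ctr_x = src_bbox[:, 0] + 0.5 * src_w
    ctr_y = src_bbox[:, 1] + 0.5 * src_h

    dx, dy, dw, dh = loc[:, 0], loc[:, 1], loc[:, 2], loc[:, 3]
    new_ctr_x = dx * src_w + ctr_x        # shift center by offset * size
    new_ctr_y = dy * src_h + ctr_y
    new_w = np.exp(dw) * src_w            # rescale size exponentially
    new_h = np.exp(dh) * src_h

    dst = np.zeros_like(loc)
    dst[:, 0] = new_ctr_x - 0.5 * new_w
    dst[:, 1] = new_ctr_y - 0.5 * new_h
    dst[:, 2] = new_ctr_x + 0.5 * new_w
    dst[:, 3] = new_ctr_y + 0.5 * new_h
    return dst
```

With all-zero offsets the decode is the identity, which is a handy sanity check on the encoding convention.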
class DecodeBox():
def __init__(self, std, num_classes):
self.std = std
self.num_classes = num_classes + 1
def frcnn_correct_boxes(self, box_xy, box_wh, input_shape, image_shape):
        #-----------------------------------------------------------------#
        #   Put the y axis first so the boxes can be conveniently
        #   multiplied by the image height/width.
        #-----------------------------------------------------------------#
box_yx = box_xy[..., ::-1]
box_hw = box_wh[..., ::-1]
input_shape = np.array(input_shape)
image_shape = np.array(image_shape)
box_mins = box_yx - (box_hw / 2.)
box_maxes = box_yx + (box_hw / 2.)
boxes = np.concatenate([box_mins[..., 0:1], box_mins[..., 1:2], box_maxes[..., 0:1], box_maxes[..., 1:2]], axis=-1)
boxes *= np.concatenate([image_shape, image_shape], axis=-1)
return boxes
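The axis shuffle above is easy to get wrong, so here is a self-contained NumPy replay of the same mapping (the helper name `corners_np` is hypothetical, introduced only for this sketch): a normalized center/size box is swapped to (y, x) order, turned into corners, and scaled by the image's (height, width).

```python
import numpy as np

def corners_np(box_xy, box_wh, image_shape):
    # same mapping as frcnn_correct_boxes (without input_shape, which that
    # version does not use): swap (x, y) -> (y, x), build min/max corners,
    # then scale by (h, w, h, w)
    box_yx, box_hw = box_xy[..., ::-1], box_wh[..., ::-1]
    mins, maxes = box_yx - box_hw / 2., box_yx + box_hw / 2.
    boxes = np.concatenate([mins, maxes], axis=-1)  # (ymin, xmin, ymax, xmax)
    return boxes * np.concatenate([image_shape, image_shape], axis=-1)

# box centred in a 600 (h) x 800 (w) image, half the width, a quarter of the height
pixel_box = corners_np(np.array([[0.5, 0.5]]),
                       np.array([[0.5, 0.25]]),
                       np.array([600, 800]))
```

The result comes back as `(ymin, xmin, ymax, xmax)` in pixels, which is the order the drawing code downstream expects.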

    def forward(self, roi_cls_locs, roi_scores, rois, image_shape, input_shape, nms_iou = 0.3, confidence = 0.5):
        results = []
        bs = len(roi_cls_locs)
        #--------------------------------#
        #   batch_size, num_rois, 4
        #--------------------------------#
        rois = rois.view((bs, -1, 4))
        #----------------------------------------------------------------------------------------------------------------#
        #   Process each image in the batch. predict.py feeds a single image,
        #   so there this loop runs only once.
        #----------------------------------------------------------------------------------------------------------------#
        for i in range(bs):
            #----------------------------------------------------------#
            #   Rescale the regression parameters
            #----------------------------------------------------------#
            roi_cls_loc = roi_cls_locs[i] * self.std
            #----------------------------------------------------------#
            #   Dim 0 is the number of proposals, dim 1 is the class,
            #   dim 2 holds that class's adjustment parameters.
            #----------------------------------------------------------#
            roi_cls_loc = roi_cls_loc.view([-1, self.num_classes, 4])

            #-------------------------------------------------------------#
            #   Adjust the proposals with the classifier's predictions
            #   to obtain the final boxes.
            #   num_rois, 4 -> num_rois, 1, 4 -> num_rois, num_classes, 4
            #-------------------------------------------------------------#
            roi = rois[i].view((-1, 1, 4)).expand_as(roi_cls_loc)
            cls_bbox = loc2bbox(roi.contiguous().view((-1, 4)), roi_cls_loc.contiguous().view((-1, 4)))
            cls_bbox = cls_bbox.view([-1, (self.num_classes), 4])
            #-------------------------------------------------------------#
            #   Normalise the boxes to the 0-1 range
            #-------------------------------------------------------------#
            cls_bbox[..., [0, 2]] = (cls_bbox[..., [0, 2]]) / input_shape[1]
            cls_bbox[..., [1, 3]] = (cls_bbox[..., [1, 3]]) / input_shape[0]

            roi_score = roi_scores[i]
            prob = F.softmax(roi_score, dim=-1)

            results.append([])
            for c in range(1, self.num_classes):
                #--------------------------------#
                #   Take the confidences of every box for this class
                #   and compare them against the threshold.
                #--------------------------------#
                c_confs = prob[:, c]
                c_confs_m = c_confs > confidence

                if len(c_confs[c_confs_m]) > 0:
                    #-----------------------------------------#
                    #   Keep the boxes whose score exceeds confidence
                    #-----------------------------------------#
                    boxes_to_process = cls_bbox[c_confs_m, c]
                    confs_to_process = c_confs[c_confs_m]

                    keep = nms(
                        boxes_to_process,
                        confs_to_process,
                        nms_iou
                    )
                    #-----------------------------------------#
                    #   Keep the boxes that survive non-maximum suppression
                    #-----------------------------------------#
                    good_boxes = boxes_to_process[keep]
                    confs = confs_to_process[keep][:, None]
                    labels = (c - 1) * torch.ones((len(keep), 1)).cuda() if confs.is_cuda else (c - 1) * torch.ones((len(keep), 1))
                    #-----------------------------------------#
                    #   Stack the label, confidence and box coordinates
                    #-----------------------------------------#
                    c_pred = torch.cat((good_boxes, confs, labels), dim=1).cpu().numpy()
                    # Append to the results for this image
                    results[-1].extend(c_pred)

            if len(results[-1]) > 0:
                results[-1] = np.array(results[-1])
                box_xy, box_wh = (results[-1][:, 0:2] + results[-1][:, 2:4]) / 2, results[-1][:, 2:4] - results[-1][:, 0:2]
                results[-1][:, :4] = self.frcnn_correct_boxes(box_xy, box_wh, input_shape, image_shape)

        return results