hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
5df0f5315c42780b6aaa1dab76caac227ebc61e3 | 178 | py | Python | qulab/server/__main__.py | liuqichun3809/quantum-lab | 05bea707b314ea1687866f56ee439079336cfbbc | [
"MIT"
] | 3 | 2020-08-30T16:11:49.000Z | 2021-03-05T12:09:30.000Z | qulab/server/__main__.py | liuqichun3809/quantum-lab | 05bea707b314ea1687866f56ee439079336cfbbc | [
"MIT"
] | null | null | null | qulab/server/__main__.py | liuqichun3809/quantum-lab | 05bea707b314ea1687866f56ee439079336cfbbc | [
"MIT"
] | 2 | 2019-07-24T15:12:31.000Z | 2019-09-20T02:17:28.000Z | import logging
from .server import main
log = logging.getLogger()
log.setLevel(logging.DEBUG)
ch = logging.StreamHandler()
ch.setLevel(logging.DEBUG)
log.addHandler(ch)
main()
| 16.181818 | 28 | 0.775281 | 24 | 178 | 5.75 | 0.5 | 0.217391 | 0.289855 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.101124 | 178 | 10 | 29 | 17.8 | 0.8625 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
5dff91aade6b34028cd64ab1e5f8eae8a25b10cf | 469 | py | Python | eyesore/decision_graph/nodes/__init__.py | twizmwazin/hacrs | 3c9386b0fa5f5ea6b93b2bc8b3c4eed6abceec6a | [
"BSD-2-Clause"
] | 2 | 2019-11-07T02:55:40.000Z | 2021-12-30T01:37:43.000Z | eyesore/decision_graph/nodes/__init__.py | twizmwazin/hacrs | 3c9386b0fa5f5ea6b93b2bc8b3c4eed6abceec6a | [
"BSD-2-Clause"
] | null | null | null | eyesore/decision_graph/nodes/__init__.py | twizmwazin/hacrs | 3c9386b0fa5f5ea6b93b2bc8b3c4eed6abceec6a | [
"BSD-2-Clause"
] | 2 | 2019-09-27T12:01:50.000Z | 2019-10-09T21:39:52.000Z | from .actions import ActionsNode
from .decision_base import DecisionBaseNode
from .decision_node import DecisionNode
from .successor import SuccessorNode
from .successors import SuccessorsNode
from .read_eval import ReadEvalNode
from .decision_not_taken import DecisionNotTakenNode
from .read_eval_loop import ReadEvalLoopNode
from .input_byte_switch_table import InputByteSwitchTableNode
from .text_decision import TextDecisionNode
from .text_flat import TextFlatNode | 39.083333 | 61 | 0.882729 | 56 | 469 | 7.178571 | 0.535714 | 0.089552 | 0.059701 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093817 | 469 | 12 | 62 | 39.083333 | 0.945882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
f8e400f2d2957f60517ed6da46ac93fe44473998 | 97 | py | Python | level09/Ressources/script.py | SpenderJ/snowcrash | cc7019b92789d3d7405ee4320251bd0501e927b7 | [
"MIT"
] | 1 | 2021-05-13T20:12:04.000Z | 2021-05-13T20:12:04.000Z | level09/Ressources/script.py | SpenderJ/snowcrash | cc7019b92789d3d7405ee4320251bd0501e927b7 | [
"MIT"
] | null | null | null | level09/Ressources/script.py | SpenderJ/snowcrash | cc7019b92789d3d7405ee4320251bd0501e927b7 | [
"MIT"
] | null | null | null | # coding: utf-8
import sys
w=sys.argv[1]
for x in range(0,len(w)):print(chr(ord(w[x])-x),end="")
| 16.166667 | 55 | 0.628866 | 23 | 97 | 2.652174 | 0.782609 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034884 | 0.113402 | 97 | 5 | 56 | 19.4 | 0.674419 | 0.123711 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
f8f4ad97aff4a36dcc8849289524c7a14b8d69a1 | 298 | py | Python | tests/test_examples.py | nthndy/BayesianTracker | 443f984ce830373e140f744a27179debdf34ae58 | [
"MIT"
] | null | null | null | tests/test_examples.py | nthndy/BayesianTracker | 443f984ce830373e140f744a27179debdf34ae58 | [
"MIT"
] | null | null | null | tests/test_examples.py | nthndy/BayesianTracker | 443f984ce830373e140f744a27179debdf34ae58 | [
"MIT"
] | null | null | null | from btrack import datasets
def test_pooch_registry():
"""
Test that `pooch` registry is up to date with remote version.
This will fail if the remote file does not match the hash hard-coded
in btrack.datasets.
"""
registry_file = datasets._remote_registry() # noqa: F841
| 27.090909 | 72 | 0.704698 | 43 | 298 | 4.767442 | 0.72093 | 0.126829 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012987 | 0.224832 | 298 | 10 | 73 | 29.8 | 0.874459 | 0.543624 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0.333333 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
5d1205ee319347a1d7a4c8ecdf9c3c470bedd065 | 226 | py | Python | mmcls/models/utils/augment/builder.py | YuxinZou/mmclassification | 2037260ea6c98a3b115e97727e1151a1c2c32f7a | [
"Apache-2.0"
] | 1,190 | 2020-07-10T01:16:01.000Z | 2022-03-31T09:48:38.000Z | mmcls/models/utils/augment/builder.py | YuxinZou/mmclassification | 2037260ea6c98a3b115e97727e1151a1c2c32f7a | [
"Apache-2.0"
] | 702 | 2020-07-13T13:31:33.000Z | 2022-03-31T06:48:04.000Z | mmcls/models/utils/augment/builder.py | YuxinZou/mmclassification | 2037260ea6c98a3b115e97727e1151a1c2c32f7a | [
"Apache-2.0"
] | 502 | 2020-07-10T02:40:55.000Z | 2022-03-31T02:07:09.000Z | # Copyright (c) OpenMMLab. All rights reserved.
from mmcv.utils import Registry, build_from_cfg
AUGMENT = Registry('augment')
def build_augment(cfg, default_args=None):
return build_from_cfg(cfg, AUGMENT, default_args)
| 25.111111 | 53 | 0.778761 | 32 | 226 | 5.28125 | 0.59375 | 0.106509 | 0.142012 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.128319 | 226 | 8 | 54 | 28.25 | 0.857868 | 0.199115 | 0 | 0 | 0 | 0 | 0.039106 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 0.75 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
5d5555c86398d9c3c0de49db455df0439faf2e5b | 68,877 | py | Python | tests/unit/test_identity.py | rs-randallburt/pyrax | fb6eb55f8218cab7248d8f44f77148d254c5074c | [
"Apache-2.0"
] | null | null | null | tests/unit/test_identity.py | rs-randallburt/pyrax | fb6eb55f8218cab7248d8f44f77148d254c5074c | [
"Apache-2.0"
] | null | null | null | tests/unit/test_identity.py | rs-randallburt/pyrax | fb6eb55f8218cab7248d8f44f77148d254c5074c | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
import datetime
import json
import os
import random
import sys
import unittest
from six import StringIO
from mock import MagicMock as Mock
from mock import patch
import pyrax
import pyrax.utils as utils
import pyrax.exceptions as exc
from pyrax import base_identity
from pyrax.identity import rax_identity
from pyrax import fakes
class DummyResponse(object):
def read(self):
pass
def readline(self):
pass
class IdentityTest(unittest.TestCase):
def __init__(self, *args, **kwargs):
super(IdentityTest, self).__init__(*args, **kwargs)
self.username = "TESTUSER"
self.password = "TESTPASSWORD"
self.base_identity_class = base_identity.BaseIdentity
self.keystone_identity_class = pyrax.keystone_identity.KeystoneIdentity
self.rax_identity_class = pyrax.rax_identity.RaxIdentity
self.id_classes = {"keystone": self.keystone_identity_class,
"rackspace": self.rax_identity_class}
def _get_clean_identity(self):
return self.rax_identity_class()
def setUp(self):
self.identity = fakes.FakeIdentity()
self.service = fakes.FakeIdentityService(self.identity)
def tearDown(self):
pass
def test_svc_repr(self):
svc = self.service
rep = svc.__repr__()
self.assertTrue(svc.service_type in rep)
def test_svc_ep_for_region(self):
svc = self.service
region = utils.random_unicode().upper()
bad_region = utils.random_unicode().upper()
good_url = utils.random_unicode()
bad_url = utils.random_unicode()
good_ep = fakes.FakeEndpoint({"public_url": good_url}, svc.service_type,
region, self.identity)
bad_ep = fakes.FakeEndpoint({"public_url": bad_url}, svc.service_type,
bad_region, self.identity)
svc.endpoints = utils.DotDict({region: good_ep, bad_region: bad_ep})
ep = svc._ep_for_region(region)
self.assertEqual(ep, good_ep)
def test_svc_ep_for_region_all(self):
svc = self.service
region = "ALL"
good_url = utils.random_unicode()
bad_url = utils.random_unicode()
good_ep = fakes.FakeEndpoint({"public_url": good_url}, svc.service_type,
region, self.identity)
from collections import OrderedDict
svc.endpoints = utils.DotDict({region: good_ep})
ep = svc._ep_for_region("notthere")
self.assertEqual(ep, good_ep)
def test_svc_ep_for_region_not_found(self):
svc = self.service
region = utils.random_unicode().upper()
good_url = utils.random_unicode()
bad_url = utils.random_unicode()
good_ep = fakes.FakeEndpoint({"public_url": good_url}, svc.service_type,
region, self.identity)
bad_ep = fakes.FakeEndpoint({"public_url": bad_url}, svc.service_type,
region, self.identity)
svc.endpoints = utils.DotDict({region: good_ep, "other": bad_ep})
ep = svc._ep_for_region("notthere")
self.assertIsNone(ep)
def test_svc_get_client(self):
svc = self.service
clt = utils.random_unicode()
region = utils.random_unicode()
class FakeEPForRegion(object):
client = clt
svc._ep_for_region = Mock(return_value=FakeEPForRegion())
ret = svc.get_client(region)
self.assertEqual(ret, clt)
def test_svc_get_client_none(self):
svc = self.service
region = utils.random_unicode()
svc._ep_for_region = Mock(return_value=None)
self.assertRaises(exc.NoEndpointForRegion, svc.get_client, region)
def test_svc_regions(self):
svc = self.service
key1 = utils.random_unicode()
val1 = utils.random_unicode()
key2 = utils.random_unicode()
val2 = utils.random_unicode()
svc.endpoints = {key1: val1, key2: val2}
regions = svc.regions
self.assertEqual(len(regions), 2)
self.assertTrue(key1 in regions)
self.assertTrue(key2 in regions)
def test_ep_get_client_already_failed(self):
svc = self.service
ep_dict = {"publicURL": "http://example.com", "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
ep._client = exc.NoClientForService()
self.assertRaises(exc.NoClientForService, ep._get_client)
def test_ep_get_client_exists(self):
svc = self.service
ep_dict = {"publicURL": "http://example.com", "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
clt = utils.random_unicode()
ep._client = clt
ret = ep._get_client()
self.assertEqual(ret, clt)
def test_ep_get_client_none(self):
svc = self.service
ep_dict = {"publicURL": "http://example.com", "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
sav = pyrax.client_class_for_service
pyrax.client_class_for_service = Mock(return_value=None)
self.assertRaises(exc.NoClientForService, ep._get_client)
pyrax.client_class_for_service = sav
def test_ep_get_client_no_url(self):
svc = self.service
ep_dict = {"publicURL": "http://example.com", "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
sav = pyrax.client_class_for_service
ep.public_url = None
pyrax.client_class_for_service = Mock(return_value=object)
self.assertRaises(exc.NoEndpointForService, ep._get_client)
pyrax.client_class_for_service = sav
def test_ep_get_client(self):
svc = self.service
ep_dict = {"publicURL": "http://example.com", "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
sav = pyrax.client_class_for_service
ep.public_url = utils.random_unicode()
pyrax.client_class_for_service = Mock(return_value=object)
fake = utils.random_unicode()
ep._create_client = Mock(return_value=fake)
ret = ep._get_client()
self.assertEqual(ret, fake)
self.assertEqual(ep._client, fake)
pyrax.client_class_for_service = sav
def test_ep_get_new_client(self):
svc = self.service
ep_dict = {"publicURL": "http://example.com", "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
ep._get_client = Mock()
ep.get_new_client()
ep._get_client.assert_called_once_with(public=True, cached=False)
def test_ep_get(self):
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
ret = ep.get("public")
self.assertEqual(ret, pub)
ret = ep.get("private")
self.assertEqual(ret, priv)
self.assertRaises(ValueError, ep.get, "invalid")
def test_ep_getattr(self):
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
svc_att = "exists"
att_val = utils.random_unicode()
setattr(svc, svc_att, att_val)
ep._get_client = Mock(return_value=svc)
ret = ep.exists
self.assertEqual(ret, att_val)
self.assertRaises(AttributeError, getattr, ep, "bogus")
def test_ep_client_prop(self):
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
clt = utils.random_unicode()
ep._get_client = Mock(return_value=clt)
ret = ep.client
self.assertEqual(ret, clt)
def test_ep_client_private_prop(self):
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
clt = utils.random_unicode()
ep._get_client = Mock(return_value=clt)
ret = ep.client_private
self.assertEqual(ret, clt)
def test_ep_create_client_compute(self):
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
vssl = random.choice((True, False))
public = random.choice((True, False))
sav_gs = pyrax.get_setting
pyrax.get_setting = Mock(return_value=vssl)
sav_conn = pyrax.connect_to_cloudservers
fake_client = fakes.FakeClient()
fake_client.identity = self.identity
pyrax.connect_to_cloudservers = Mock(return_value=fake_client)
ep.service = "compute"
ret = ep._create_client(None, None, public)
self.assertEqual(ret, fake_client)
pyrax.connect_to_cloudservers.assert_called_once_with(region=ep.region,
context=ep.identity)
pyrax.connect_to_cloudservers = sav_conn
pyrax.get_setting = sav_gs
def test_ep_create_client_all_other(self):
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = utils.random_unicode().upper()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
vssl = random.choice((True, False))
public = random.choice((True, False))
url = utils.random_unicode()
sav_gs = pyrax.get_setting
pyrax.get_setting = Mock(return_value=vssl)
class FakeClientClass(object):
def __init__(self, identity, region_name, management_url,
verify_ssl):
self.identity = identity
self.region_name = region_name
self.management_url = management_url
self.verify_ssl = verify_ssl
ret = ep._create_client(FakeClientClass, url, public)
self.assertTrue(isinstance(ret, FakeClientClass))
pyrax.get_setting = sav_gs
def test_init(self):
for cls in self.id_classes.values():
ident = cls(username=self.username, password=self.password)
self.assertEqual(ident.username, self.username)
self.assertEqual(ident.password, self.password)
self.assertIsNone(ident.token)
self.assertIsNone(ident._creds_file)
def test_auth_with_token_name(self):
for cls in self.id_classes.values():
ident = cls()
tok = utils.random_unicode()
nm = utils.random_unicode()
resp = fakes.FakeIdentityResponse()
# Need to stuff this into the standard response
sav = resp.content["access"]["user"]["name"]
resp.content["access"]["user"]["name"] = nm
ident.method_post = Mock(return_value=(resp, resp.json()))
ident.auth_with_token(tok, tenant_name=nm)
ident.method_post.assert_called_once_with("tokens",
headers={'Content-Type': 'application/json', 'Accept':
'application/json'}, std_headers=False, data={'auth':
{'token': {'id': tok}, 'tenantName': nm}})
self.assertEqual(ident.username, nm)
resp.content["access"]["user"]["name"] = sav
def test_auth_with_token_id(self):
for cls in self.id_classes.values():
ident = cls()
tok = utils.random_unicode()
tenant_id = utils.random_unicode()
resp = fakes.FakeIdentityResponse()
# Need to stuff this into the standard response
sav = resp.content["access"]["token"]["tenant"]["id"]
resp.content["access"]["token"]["tenant"]["id"] = tenant_id
ident.method_post = Mock(return_value=(resp, resp.json()))
ident.auth_with_token(tok, tenant_id=tenant_id)
ident.method_post.assert_called_once_with("tokens",
headers={'Content-Type': 'application/json', 'Accept':
'application/json'}, std_headers=False, data={'auth':
{'token': {'id': tok}, 'tenantId': tenant_id}})
self.assertEqual(ident.tenant_id, tenant_id)
resp.content["access"]["token"]["tenant"]["id"] = sav
def test_auth_with_token_without_tenant_id(self):
for cls in self.id_classes.values():
ident = cls()
tok = utils.random_unicode()
tenant_id = None
resp = fakes.FakeIdentityResponse()
# Need to stuff this into the standard response
sav = resp.content["access"]["token"]["tenant"]["id"]
resp.content["access"]["token"]["tenant"]["id"] = tenant_id
ident.method_post = Mock(return_value=(resp, resp.json()))
ident.auth_with_token(tok, tenant_id=tenant_id)
ident.method_post.assert_called_once_with("tokens",
headers={'Content-Type': 'application/json', 'Accept':
'application/json'}, std_headers=False, data={'auth':
{'token': {'id': tok}}})
self.assertEqual(ident.tenant_id, tenant_id)
resp.content["access"]["token"]["tenant"]["id"] = sav
def test_auth_with_token_id_auth_fail(self):
for cls in self.id_classes.values():
ident = cls()
tok = utils.random_unicode()
tenant_id = utils.random_unicode()
resp = fakes.FakeIdentityResponse()
resp.status_code = 401
ident.method_post = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthenticationFailed, ident.auth_with_token,
tok, tenant_id=tenant_id)
def test_auth_with_token_id_auth_fail_general(self):
for cls in self.id_classes.values():
ident = cls()
tok = utils.random_unicode()
tenant_id = utils.random_unicode()
resp = fakes.FakeIdentityResponse()
resp.status_code = 499
resp.reason = "fake"
resp.json = Mock(return_value={"error": {"message": "fake"}})
ident.method_post = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthenticationFailed, ident.auth_with_token,
tok, tenant_id=tenant_id)
def test_auth_with_token_rax(self):
ident = self.rax_identity_class()
mid = utils.random_unicode()
oid = utils.random_unicode()
token = utils.random_unicode()
class FakeResp(object):
info = None
def json(self):
return self.info
resp_main = FakeResp()
resp_main_body = {"access": {
"serviceCatalog": [{"a": "a", "name": "a", "type": "a"},
{"b": "b", "name": "b", "type": "b"}],
"user": {"roles":
[{"tenantId": oid, "name": "object-store:default"}],
}}}
ident._call_token_auth = Mock(return_value=(resp_main, resp_main_body))
def fake_parse(dct):
svcs = dct.get("access", {}).get("serviceCatalog", {})
pyrax.services = [svc["name"] for svc in svcs]
ident._parse_response = fake_parse
ident.auth_with_token(token, tenant_id=mid)
ident._call_token_auth.assert_called_with(token, oid, None)
self.assertTrue("a" in pyrax.services)
self.assertTrue("b" in pyrax.services)
def test_get_client(self):
ident = self.identity
ident.authenticated = True
svc = "fake"
region = utils.random_unicode()
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
clt = fakes.FakeClient()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
ep._get_client = Mock(return_value=clt)
ident.services[svc].endpoints = {region: ep}
ret = ident.get_client(svc, region)
self.assertEqual(ret, clt)
@patch("pyrax.base_identity.BaseIdentity.get_client")
def test_get_client_rax(self, mock_get_client):
from pyrax.cloudnetworks import CloudNetworkClient
ident = self.rax_identity_class()
region = utils.random_unicode()
service = "networks"
ident.get_client(service, region)
mock_get_client.assert_called_with("compute", region, public=True,
cached=True, client_class=CloudNetworkClient)
# Check with any other service
service = utils.random_unicode()
ident.get_client(service, region)
mock_get_client.assert_called_with(service, region, public=True,
cached=True, client_class=None)
def test_get_client_unauthenticated(self):
ident = self.identity
ident.authenticated = False
svc = "fake"
region = utils.random_unicode()
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
clt = fakes.FakeClient()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
ep._get_client = Mock(return_value=clt)
ident.services[svc].endpoints = {region: ep}
self.assertRaises(exc.NotAuthenticated, ident.get_client, svc, region)
def test_get_client_private(self):
ident = self.identity
ident.authenticated = True
svc = "fake"
region = utils.random_unicode()
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
clt = fakes.FakeClient()
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
ep._get_client = Mock(return_value=clt)
ident.services[svc].endpoints = {region: ep}
ret = ident.get_client(svc, region, public=False)
self.assertEqual(ret, clt)
ep._get_client.assert_called_once_with(cached=True, public=False,
client_class=None)
@patch("pyrax.client_class_for_service")
def test_get_client_no_cache(self, mock_ccfs):
ident = self.identity
ident.authenticated = True
svc = "fake"
region = utils.random_unicode()
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
clt_class = fakes.FakeClient
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
mock_ccfs.return_value = clt_class
ident.services[svc].endpoints = {region: ep}
ret = ident.get_client(svc, region, cached=False)
self.assertTrue(isinstance(ret, clt_class))
mock_ccfs.assert_called_once_with(svc)
def test_get_client_no_client(self):
ident = self.identity
ident.authenticated = True
svc = "fake"
region = utils.random_unicode()
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
ep._get_client = Mock(return_value=None)
ident.services[svc].endpoints = {region: ep}
self.assertRaises(exc.NoSuchClient, ident.get_client, svc, region)
def test_set_credentials(self):
for cls in self.id_classes.values():
ident = cls()
ident.authenticate = Mock()
ident.set_credentials(self.username, self.password,
authenticate=True)
self.assertEqual(ident.username, self.username)
self.assertEqual(ident.password, self.password)
self.assertIsNone(ident.token)
self.assertIsNone(ident._creds_file)
ident.authenticate.assert_called_once_with()
def test_set_credential_file(self):
ident = self.rax_identity_class()
user = "fakeuser"
# Use percent signs in key to ensure it doesn't get interpolated.
key = "fake%api%key"
ident.authenticate = Mock()
with utils.SelfDeletingTempfile() as tmpname:
with open(tmpname, "wb") as ff:
ff.write("[rackspace_cloud]\n")
ff.write("username = %s\n" % user)
ff.write("api_key = %s\n" % key)
ident.set_credential_file(tmpname, authenticate=True)
self.assertEqual(ident.username, user)
self.assertEqual(ident.password, key)
# Using 'password' instead of 'api_key'
with utils.SelfDeletingTempfile() as tmpname:
with open(tmpname, "wb") as ff:
ff.write("[rackspace_cloud]\n")
ff.write("username = %s\n" % user)
ff.write("password = %s\n" % key)
ident.set_credential_file(tmpname)
self.assertEqual(ident.username, user)
self.assertEqual(ident.password, key)
# File doesn't exist
self.assertRaises(exc.FileNotFound, ident.set_credential_file,
"doesn't exist")
# Missing section
with utils.SelfDeletingTempfile() as tmpname:
with open(tmpname, "wb") as ff:
ff.write("user = x\n")
self.assertRaises(exc.InvalidCredentialFile,
ident.set_credential_file, tmpname)
# Incorrect section
with utils.SelfDeletingTempfile() as tmpname:
with open(tmpname, "wb") as ff:
ff.write("[bad_section]\nusername = x\napi_key = y\n")
self.assertRaises(exc.InvalidCredentialFile,
ident.set_credential_file, tmpname)
# Incorrect option
with utils.SelfDeletingTempfile() as tmpname:
with open(tmpname, "wb") as ff:
ff.write("[rackspace_cloud]\nuserbad = x\napi_key = y\n")
self.assertRaises(exc.InvalidCredentialFile,
ident.set_credential_file, tmpname)
def test_set_credential_file_keystone(self):
ident = pyrax.keystone_identity.KeystoneIdentity(username=self.username,
password=self.password)
user = "fakeuser"
password = "fakeapikey"
tenant_id = "faketenantid"
with utils.SelfDeletingTempfile() as tmpname:
            with open(tmpname, "wb") as ff:
ff.write("[keystone]\n")
ff.write("username = %s\n" % user)
ff.write("password = %s\n" % password)
ff.write("tenant_id = %s\n" % tenant_id)
ident.set_credential_file(tmpname)
self.assertEqual(ident.username, user)
self.assertEqual(ident.password, password)
def test_keyring_auth_no_keyring(self):
ident = self.identity
sav = pyrax.base_identity.keyring
pyrax.base_identity.keyring = None
self.assertRaises(exc.KeyringModuleNotInstalled, ident.keyring_auth)
pyrax.base_identity.keyring = sav
def test_keyring_auth_no_username(self):
ident = self.identity
sav = pyrax.get_setting
pyrax.get_setting = Mock(return_value=None)
self.assertRaises(exc.KeyringUsernameMissing, ident.keyring_auth)
pyrax.get_setting = sav
def test_keyring_auth_no_password(self):
ident = self.identity
sav = pyrax.base_identity.keyring.get_password
pyrax.base_identity.keyring.get_password = Mock(return_value=None)
self.assertRaises(exc.KeyringPasswordNotFound, ident.keyring_auth,
"fake")
pyrax.base_identity.keyring.get_password = sav
def test_keyring_auth_apikey(self):
ident = self.identity
ident.authenticate = Mock()
sav = pyrax.base_identity.keyring.get_password
pw = utils.random_unicode()
pyrax.base_identity.keyring.get_password = Mock(return_value=pw)
user = utils.random_unicode()
ident._creds_style = "apikey"
ident.keyring_auth(username=user)
ident.authenticate.assert_called_once_with(username=user, api_key=pw)
pyrax.base_identity.keyring.get_password = sav
def test_keyring_auth_password(self):
ident = self.identity
ident.authenticate = Mock()
sav = pyrax.base_identity.keyring.get_password
pw = utils.random_unicode()
pyrax.base_identity.keyring.get_password = Mock(return_value=pw)
user = utils.random_unicode()
ident._creds_style = "password"
ident.keyring_auth(username=user)
ident.authenticate.assert_called_once_with(username=user, password=pw)
pyrax.base_identity.keyring.get_password = sav
def test_get_extensions(self):
ident = self.identity
v1 = utils.random_unicode()
v2 = utils.random_unicode()
resp_body = {"extensions": {"values": [v1, v2]}}
ident.method_get = Mock(return_value=(None, resp_body))
ret = ident.get_extensions()
self.assertEqual(ret, [v1, v2])
def test_get_credentials_rax(self):
ident = self.rax_identity_class(username=self.username,
api_key=self.password)
ident._creds_style = "apikey"
creds = ident._format_credentials()
user = creds["auth"]["RAX-KSKEY:apiKeyCredentials"]["username"]
key = creds["auth"]["RAX-KSKEY:apiKeyCredentials"]["apiKey"]
self.assertEqual(self.username, user)
self.assertEqual(self.password, key)
def test_get_credentials_rax_password(self):
ident = self.rax_identity_class(username=self.username,
password=self.password)
ident._creds_style = "password"
creds = ident._format_credentials()
user = creds["auth"]["passwordCredentials"]["username"]
key = creds["auth"]["passwordCredentials"]["password"]
self.assertEqual(self.username, user)
self.assertEqual(self.password, key)
def test_get_credentials_keystone(self):
ident = self.keystone_identity_class(username=self.username,
password=self.password)
creds = ident._format_credentials()
user = creds["auth"]["passwordCredentials"]["username"]
key = creds["auth"]["passwordCredentials"]["password"]
self.assertEqual(self.username, user)
self.assertEqual(self.password, key)
def test_authenticate(self):
savrequest = pyrax.http.request
fake_resp = fakes.FakeIdentityResponse()
fake_body = fakes.fake_identity_response
pyrax.http.request = Mock(return_value=(fake_resp, fake_body))
for cls in self.id_classes.values():
ident = cls()
if cls is self.keystone_identity_class:
# Necessary for testing to avoid NotImplementedError.
utils.add_method(ident, lambda self: "", "_get_auth_endpoint")
ident.authenticate()
pyrax.http.request = savrequest
def test_authenticate_fail_creds(self):
ident = self.rax_identity_class(username="BAD", password="BAD")
savrequest = pyrax.http.request
fake_resp = fakes.FakeIdentityResponse()
fake_resp.status_code = 401
fake_body = fakes.fake_identity_response
pyrax.http.request = Mock(return_value=(fake_resp, fake_body))
self.assertRaises(exc.AuthenticationFailed, ident.authenticate)
pyrax.http.request = savrequest
def test_authenticate_fail_other(self):
ident = self.rax_identity_class(username="BAD", password="BAD")
savrequest = pyrax.http.request
fake_resp = fakes.FakeIdentityResponse()
fake_resp.status_code = 500
fake_body = {u'unauthorized': {
u'message': u'Username or api key is invalid', u'code': 500}}
pyrax.http.request = Mock(return_value=(fake_resp, fake_body))
self.assertRaises(exc.InternalServerError, ident.authenticate)
pyrax.http.request = savrequest
def test_authenticate_fail_no_message(self):
ident = self.rax_identity_class(username="BAD", password="BAD")
savrequest = pyrax.http.request
fake_resp = fakes.FakeIdentityResponse()
fake_resp.status_code = 500
fake_body = {u'unauthorized': {
u'bogus': u'Username or api key is invalid', u'code': 500}}
pyrax.http.request = Mock(return_value=(fake_resp, fake_body))
self.assertRaises(exc.InternalServerError, ident.authenticate)
pyrax.http.request = savrequest
def test_authenticate_fail_gt_299(self):
ident = self.rax_identity_class(username="BAD", password="BAD")
savrequest = pyrax.http.request
fake_resp = fakes.FakeIdentityResponse()
fake_resp.status_code = 444
fake_body = {u'unauthorized': {
u'message': u'Username or api key is invalid', u'code': 500}}
pyrax.http.request = Mock(return_value=(fake_resp, fake_body))
self.assertRaises(exc.AuthenticationFailed, ident.authenticate)
pyrax.http.request = savrequest
def test_authenticate_fail_gt_299_no_message(self):
ident = self.rax_identity_class(username="BAD", password="BAD")
savrequest = pyrax.http.request
fake_resp = fakes.FakeIdentityResponse()
fake_resp.status_code = 444
fake_body = {u'unauthorized': {
u'bogus': u'Username or api key is invalid', u'code': 500}}
pyrax.http.request = Mock(return_value=(fake_resp, fake_body))
self.assertRaises(exc.AuthenticationFailed, ident.authenticate)
pyrax.http.request = savrequest
def test_authenticate_backwards_compatibility_connect_param(self):
savrequest = pyrax.http.request
fake_resp = fakes.FakeIdentityResponse()
fake_body = fakes.fake_identity_response
pyrax.http.request = Mock(return_value=(fake_resp, fake_body))
for cls in self.id_classes.values():
ident = cls()
if cls is self.keystone_identity_class:
# Necessary for testing to avoid NotImplementedError.
utils.add_method(ident, lambda self: "", "_get_auth_endpoint")
ident.authenticate(connect=False)
pyrax.http.request = savrequest
def test_rax_endpoints(self):
ident = self.rax_identity_class()
sav = pyrax.get_setting("auth_endpoint")
fake_ep = utils.random_unicode()
pyrax.set_setting("auth_endpoint", fake_ep)
ep = ident._get_auth_endpoint()
self.assertEqual(ep, fake_ep)
pyrax.set_setting("auth_endpoint", sav)
def test_auth_token(self):
for cls in self.id_classes.values():
ident = cls()
test_token = utils.random_unicode()
ident.token = test_token
self.assertEqual(ident.auth_token, test_token)
def test_auth_endpoint(self):
for cls in self.id_classes.values():
ident = cls()
test_ep = utils.random_unicode()
ident._get_auth_endpoint = Mock(return_value=test_ep)
self.assertEqual(ident.auth_endpoint, test_ep)
def test_set_auth_endpoint(self):
for cls in self.id_classes.values():
ident = cls()
test_ep = utils.random_unicode()
ident.auth_endpoint = test_ep
self.assertEqual(ident._auth_endpoint, test_ep)
def test_regions(self):
ident = self.base_identity_class()
fake_resp = fakes.FakeIdentityResponse()
ident._parse_response(fake_resp.json())
expected = ("DFW", "ORD", "SYD", "FAKE")
self.assertEqual(len(ident.regions), len(expected))
for rgn in expected:
self.assertTrue(rgn in ident.regions)
def test_getattr_service(self):
ident = self.base_identity_class()
ident.authenticated = True
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
self.service.endpoints = {rgn: ep}
ident.services = {"fake": self.service}
ret = ident.fake
self.assertEqual(ret, self.service.endpoints)
def test_getattr_region(self):
ident = self.base_identity_class()
ident.authenticated = True
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
self.service.endpoints = {rgn: ep}
ident.services = {"fake": self.service}
ret = ident.FOO
self.assertEqual(ret, {"fake": ep})
def test_getattr_fail(self):
ident = self.base_identity_class()
ident.authenticated = True
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
self.service.endpoints = {rgn: ep}
ident.services = {"fake": self.service}
self.assertRaises(AttributeError, getattr, ident, "BAR")
def test_getattr_not_authed(self):
ident = self.base_identity_class()
ident.authenticated = False
svc = self.service
pub = utils.random_unicode()
priv = utils.random_unicode()
ep_dict = {"publicURL": pub, "privateURL": priv, "tenantId": "aa"}
rgn = "FOO"
ep = fakes.FakeEndpoint(ep_dict, svc, rgn, self.identity)
self.service.endpoints = {rgn: ep}
ident.services = {"fake": self.service}
self.assertRaises(exc.NotAuthenticated, getattr, ident, "BAR")
def test_http_methods(self):
ident = self.base_identity_class()
ident._call = Mock()
uri = utils.random_unicode()
dkv = utils.random_unicode()
data = {dkv: dkv}
hkv = utils.random_unicode()
headers = {hkv: hkv}
std_headers = True
ident.method_get(uri, admin=False, data=data, headers=headers,
std_headers=std_headers)
ident._call.assert_called_with("GET", uri, False, data, headers,
std_headers)
ident.method_head(uri, admin=False, data=data, headers=headers,
std_headers=std_headers)
ident._call.assert_called_with("HEAD", uri, False, data, headers,
std_headers)
ident.method_post(uri, admin=False, data=data, headers=headers,
std_headers=std_headers)
ident._call.assert_called_with("POST", uri, False, data, headers,
std_headers)
ident.method_put(uri, admin=False, data=data, headers=headers,
std_headers=std_headers)
ident._call.assert_called_with("PUT", uri, False, data, headers,
std_headers)
ident.method_delete(uri, admin=False, data=data, headers=headers,
std_headers=std_headers)
ident._call.assert_called_with("DELETE", uri, False, data,
headers, std_headers)
ident.method_patch(uri, admin=False, data=data, headers=headers,
std_headers=std_headers)
ident._call.assert_called_with("PATCH", uri, False, data,
headers, std_headers)
def test_call(self):
ident = self.base_identity_class()
sav_req = pyrax.http.request
pyrax.http.request = Mock()
sav_debug = ident.http_log_debug
ident.http_log_debug = True
uri = "https://%s/%s" % (utils.random_ascii(), utils.random_ascii())
sav_stdout = sys.stdout
out = StringIO()
sys.stdout = out
utils.add_method(ident, lambda self: "", "_get_auth_endpoint")
dkv = utils.random_ascii()
data = {dkv: dkv}
hkv = utils.random_ascii()
headers = {hkv: hkv}
for std_headers in (True, False):
expected_headers = ident._standard_headers() if std_headers else {}
expected_headers.update(headers)
for admin in (True, False):
ident.method_post(uri, data=data, headers=headers,
std_headers=std_headers, admin=admin)
pyrax.http.request.assert_called_with("POST", uri, body=data,
headers=expected_headers)
self.assertEqual(out.getvalue(), "")
out.seek(0)
out.truncate()
out.close()
pyrax.http.request = sav_req
ident.http_log_debug = sav_debug
sys.stdout = sav_stdout
def test_call_without_slash(self):
ident = self.base_identity_class()
ident._get_auth_endpoint = Mock()
ident._get_auth_endpoint.return_value = "http://example.com/v2.0"
ident.verify_ssl = False
pyrax.http.request = Mock()
ident._call("POST", "tokens", False, {}, {}, False)
pyrax.http.request.assert_called_with("POST",
"http://example.com/v2.0/tokens", headers={},
raise_exception=False)
def test_call_with_slash(self):
ident = self.base_identity_class()
ident._get_auth_endpoint = Mock()
ident._get_auth_endpoint.return_value = "http://example.com/v2.0/"
ident.verify_ssl = False
pyrax.http.request = Mock()
ident._call("POST", "tokens", False, {}, {}, False)
pyrax.http.request.assert_called_with("POST",
"http://example.com/v2.0/tokens", headers={},
raise_exception=False)
def test_list_users(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
ident.method_get = Mock(return_value=(resp, resp.json()))
ret = ident.list_users()
self.assertTrue(isinstance(ret, list))
are_users = [isinstance(itm, pyrax.rax_identity.User) for itm in ret]
self.assertTrue(all(are_users))
def test_list_users_alt_body(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
alt = fakes.fake_identity_user_response.get("users")
alt[0]["password"] = "foo"
ident.method_get = Mock(return_value=(resp, alt))
ret = ident.list_users()
self.assertTrue(isinstance(ret, list))
are_users = [isinstance(itm, pyrax.rax_identity.User) for itm in ret]
self.assertTrue(all(are_users))
def test_list_users_fail(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 401
ident.method_get = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthorizationFailure, ident.list_users)
def test_find_user_by_name_rax(self):
ident = self.rax_identity_class()
ident.get_user = Mock()
fake_name = utils.random_unicode()
ret = ident.find_user_by_name(fake_name)
ident.get_user.assert_called_with(username=fake_name)
def test_find_user_by_email_rax(self):
ident = self.rax_identity_class()
ident.get_user = Mock()
fake_email = utils.random_unicode()
ret = ident.find_user_by_email(fake_email)
ident.get_user.assert_called_with(email=fake_email)
def test_find_user_by_id_rax(self):
ident = self.rax_identity_class()
ident.get_user = Mock()
fake_id = utils.random_unicode()
ret = ident.find_user_by_id(fake_id)
ident.get_user.assert_called_with(user_id=fake_id)
def test_find_user_fail_rax(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 404
ident.method_get = Mock(return_value=(resp, resp.json()))
fake_user = utils.random_unicode()
self.assertRaises(exc.NotFound, ident.get_user, username=fake_user)
def test_find_user_fail_base(self):
ident = self.identity
fake = utils.random_unicode()
self.assertRaises(NotImplementedError, ident.find_user_by_name, fake)
self.assertRaises(NotImplementedError, ident.find_user_by_email, fake)
self.assertRaises(NotImplementedError, ident.find_user_by_id, fake)
self.assertRaises(NotImplementedError, ident.get_user, fake)
def test_get_user_by_id(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp_body = resp.json().copy()
del resp_body["users"]
fake = utils.random_unicode()
ident.method_get = Mock(return_value=(resp, resp_body))
ret = ident.get_user(user_id=fake)
self.assertTrue(isinstance(ret, base_identity.User))
def test_get_user_by_username(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp_body = resp.json().copy()
del resp_body["users"]
fake = utils.random_unicode()
ident.method_get = Mock(return_value=(resp, resp_body))
ret = ident.get_user(username=fake)
self.assertTrue(isinstance(ret, base_identity.User))
def test_get_user_by_email(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp_body = resp.json()
fake = utils.random_unicode()
ident.method_get = Mock(return_value=(resp, resp_body))
ret = ident.get_user(email=fake)
self.assertTrue(isinstance(ret[0], base_identity.User))
def test_get_user_missing_params(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
ident.method_get = Mock(return_value=(resp, resp.json()))
self.assertRaises(ValueError, ident.get_user)
def test_get_user_not_found(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp_body = resp.json().copy()
del resp_body["users"]
del resp_body["user"]
fake = utils.random_unicode()
ident.method_get = Mock(return_value=(resp, resp_body))
self.assertRaises(exc.NotFound, ident.get_user, username=fake)
def test_create_user(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 201
ident.method_post = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_password = utils.random_unicode()
ident.create_user(fake_name, fake_email, fake_password)
cargs = ident.method_post.call_args
self.assertEqual(len(cargs), 2)
self.assertEqual(cargs[0], ("users", ))
data = cargs[1]["data"]["user"]
self.assertEqual(data["username"], fake_name)
self.assertTrue(fake_password in data.values())
def test_create_user_not_authorized(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 401
ident.method_post = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_password = utils.random_unicode()
self.assertRaises(exc.AuthorizationFailure, ident.create_user,
fake_name, fake_email, fake_password)
def test_create_user_duplicate(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 409
ident.method_post = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_password = utils.random_unicode()
self.assertRaises(exc.DuplicateUser, ident.create_user,
fake_name, fake_email, fake_password)
def test_create_user_bad_email(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 400
resp_body = {"badRequest": {"message":
"Expecting valid email address"}}
ident.method_post = Mock(return_value=(resp, resp_body))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_password = utils.random_unicode()
self.assertRaises(exc.InvalidEmail, ident.create_user,
fake_name, fake_email, fake_password)
def test_create_user_not_found(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 404
ident.method_post = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_password = utils.random_unicode()
self.assertRaises(exc.AuthorizationFailure, ident.create_user,
fake_name, fake_email, fake_password)
def test_create_user_other(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 400
resp_body = {"badRequest": {"message": "fake"}}
ident.method_post = Mock(return_value=(resp, resp_body))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_password = utils.random_unicode()
self.assertRaises(exc.BadRequest, ident.create_user,
fake_name, fake_email, fake_password)
def test_update_user(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
ident.method_put = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_username = utils.random_unicode()
fake_uid = utils.random_unicode()
fake_region = utils.random_unicode()
fake_enabled = random.choice((True, False))
kwargs = {"email": fake_email, "username": fake_username,
"uid": fake_uid, "enabled": fake_enabled}
if isinstance(ident, self.rax_identity_class):
kwargs["defaultRegion"] = fake_region
ident.update_user(fake_name, **kwargs)
cargs = ident.method_put.call_args
self.assertEqual(len(cargs), 2)
self.assertEqual(cargs[0], ("users/%s" % fake_name, ))
data = cargs[1]["data"]["user"]
self.assertEqual(data["enabled"], fake_enabled)
self.assertEqual(data["username"], fake_username)
self.assertTrue(fake_email in data.values())
if isinstance(ident, self.rax_identity_class):
self.assertTrue(fake_region in data.values())
def test_update_user_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 401
ident.method_put = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
fake_email = utils.random_unicode()
fake_username = utils.random_unicode()
fake_uid = utils.random_unicode()
fake_region = utils.random_unicode()
fake_enabled = random.choice((True, False))
kwargs = {"email": fake_email, "username": fake_username,
"uid": fake_uid, "enabled": fake_enabled}
if isinstance(ident, self.rax_identity_class):
kwargs["defaultRegion"] = fake_region
self.assertRaises(exc.AuthorizationFailure, ident.update_user,
fake_name, **kwargs)
def test_delete_user(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
ident.method_delete = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
ident.delete_user(fake_name)
cargs = ident.method_delete.call_args
self.assertEqual(len(cargs), 2)
self.assertEqual(cargs[0], ("users/%s" % fake_name, ))
def test_delete_user_not_found(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 404
ident.method_delete = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
self.assertRaises(exc.UserNotFound, ident.delete_user, fake_name)
def test_delete_user_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 401
ident.method_delete = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
self.assertRaises(exc.AuthorizationFailure, ident.delete_user,
fake_name)
def test_list_roles_for_user(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 200
ident.method_get = Mock(return_value=(resp, resp.json()))
resp = ident.list_roles_for_user("fake")
self.assertTrue(isinstance(resp, list))
role = resp[0]
self.assertTrue("description" in role)
self.assertTrue("name" in role)
self.assertTrue("id" in role)
def test_list_roles_for_user_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 401
ident.method_get = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthorizationFailure,
ident.list_roles_for_user, "fake")
def test_list_credentials(self):
ident = self.rax_identity_class()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 200
ident.method_get = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
ident.list_credentials(fake_name)
cargs = ident.method_get.call_args
called_uri = cargs[0][0]
self.assertTrue("/credentials" in called_uri)
self.assertTrue("users/%s/" % fake_name in called_uri)
def test_list_credentials_no_user(self):
ident = self.identity
ident.user = fakes.FakeEntity()
resp = fakes.FakeIdentityResponse()
resp.response_type = "users"
resp.status_code = 200
ident.method_get = Mock(return_value=(resp, resp.json()))
ident.list_credentials()
cargs = ident.method_get.call_args
called_uri = cargs[0][0]
self.assertTrue("/credentials" in called_uri)
self.assertTrue("users/%s/" % ident.user.id in called_uri)
def test_get_keystone_endpoint(self):
ident = self.keystone_identity_class()
fake_ep = utils.random_unicode()
sav_setting = pyrax.get_setting
pyrax.get_setting = Mock(return_value=fake_ep)
ep = ident._get_auth_endpoint()
self.assertEqual(ep, fake_ep)
pyrax.get_setting = sav_setting
def test_get_keystone_endpoint_fail(self):
ident = self.keystone_identity_class()
sav_setting = pyrax.get_setting
pyrax.get_setting = Mock(return_value=None)
self.assertRaises(exc.EndpointNotDefined, ident._get_auth_endpoint)
pyrax.get_setting = sav_setting
def test_get_token(self):
for cls in self.id_classes.values():
ident = cls()
ident.token = "test_token"
sav_valid = ident._has_valid_token
sav_auth = ident.authenticate
ident._has_valid_token = Mock(return_value=True)
ident.authenticate = Mock()
tok = ident.get_token()
self.assertEqual(tok, "test_token")
# Force re-authentication even though the token is valid
tok = ident.get_token(force=True)
ident.authenticate.assert_called_with()
# Invalid token triggers re-authentication
ident._has_valid_token = Mock(return_value=False)
ident.authenticated = False
tok = ident.get_token()
ident.authenticate.assert_called_with()
ident._has_valid_token = sav_valid
ident.authenticate = sav_auth
def test_has_valid_token(self):
savrequest = pyrax.http.request
pyrax.http.request = Mock(return_value=(fakes.FakeIdentityResponse(),
fakes.fake_identity_response))
for cls in self.id_classes.values():
ident = cls()
if cls is self.keystone_identity_class:
# Necessary for testing to avoid NotImplementedError.
utils.add_method(ident, lambda self: "", "_get_auth_endpoint")
ident.authenticate()
valid = ident._has_valid_token()
self.assertTrue(valid)
ident.expires = datetime.datetime.now() - datetime.timedelta(1)
valid = ident._has_valid_token()
self.assertFalse(valid)
ident = self._get_clean_identity()
valid = ident._has_valid_token()
self.assertFalse(valid)
pyrax.http.request = savrequest
def test_list_tokens(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tokens"
ident.method_get = Mock(return_value=(resp, resp.json()))
tokens = ident.list_tokens()
ident.method_get.assert_called_with("tokens/%s" % ident.token,
admin=True)
self.assertTrue("token" in tokens)
def test_list_tokens_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tokens"
resp.status_code = 403
ident.method_get = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthorizationFailure, ident.list_tokens)
def test_check_token(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tokens"
ident.method_head = Mock(return_value=(resp, resp.json()))
valid = ident.check_token()
ident.method_head.assert_called_with("tokens/%s" % ident.token,
admin=True)
self.assertTrue(valid)
def test_check_token_fail_auth(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tokens"
resp.status_code = 403
ident.method_head = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthorizationFailure, ident.check_token)
def test_check_token_fail_valid(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tokens"
resp.status_code = 404
ident.method_head = Mock(return_value=(resp, resp.json()))
valid = ident.check_token()
ident.method_head.assert_called_with("tokens/%s" % ident.token,
admin=True)
self.assertFalse(valid)
def test_revoke_token(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tokens"
token = ident.token = utils.random_unicode()
ident.method_delete = Mock(return_value=(resp, resp.json()))
valid = ident.revoke_token(token)
ident.method_delete.assert_called_with("tokens/%s" % ident.token,
admin=True)
self.assertTrue(valid)
def test_revoke_token_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tokens"
resp.status_code = 401
token = ident.token = utils.random_unicode()
ident.method_delete = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthorizationFailure, ident.revoke_token,
token)
def test_get_token_endpoints(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "endpoints"
ident.method_get = Mock(return_value=(resp, resp.json()))
eps = ident.get_token_endpoints()
self.assertTrue(isinstance(eps, list))
ident.method_get.assert_called_with("tokens/%s/endpoints" %
ident.token, admin=True)
def test_get_token_endpoints_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "endpoints"
resp.status_code = 401
ident.method_get = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthorizationFailure,
ident.get_token_endpoints)
def test_reset_api_key(self):
ident = self.identity
self.assertRaises(NotImplementedError, ident.reset_api_key)
def test_reset_api_key_rax(self):
ident = self.rax_identity_class()
user = utils.random_unicode()
nm = utils.random_unicode()
key = utils.random_unicode()
resp = fakes.FakeResponse()
resp_body = {"RAX-KSKEY:apiKeyCredentials": {
"username": nm, "apiKey": key}}
ident.method_post = Mock(return_value=(resp, resp_body))
exp_uri = "users/%s/OS-KSADM/credentials/" % user
exp_uri += "RAX-KSKEY:apiKeyCredentials/RAX-AUTH/reset"
ret = ident.reset_api_key(user)
self.assertEqual(ret, key)
ident.method_post.assert_called_once_with(exp_uri)
@patch("pyrax.utils.get_id")
def test_reset_api_key_rax_no_user(self, mock_get_id):
ident = self.rax_identity_class()
user = utils.random_unicode()
mock_get_id.return_value = user
ident.authenticated = True
nm = utils.random_unicode()
key = utils.random_unicode()
resp = fakes.FakeResponse()
resp_body = {"RAX-KSKEY:apiKeyCredentials": {
"username": nm, "apiKey": key}}
ident.method_post = Mock(return_value=(resp, resp_body))
exp_uri = "users/%s/OS-KSADM/credentials/" % user
exp_uri += "RAX-KSKEY:apiKeyCredentials/RAX-AUTH/reset"
ret = ident.reset_api_key()
self.assertEqual(ret, key)
ident.method_post.assert_called_once_with(exp_uri)
def test_get_tenant(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenants"
ident.method_get = Mock(return_value=(resp, resp.json()))
tenant = ident.get_tenant()
self.assertTrue(isinstance(tenant, base_identity.Tenant))
def test_get_tenant_none(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenants"
ident._list_tenants = Mock(return_value=[])
tenant = ident.get_tenant()
self.assertIsNone(tenant)
def test_list_tenants(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenants"
ident.method_get = Mock(return_value=(resp, resp.json()))
tenants = ident.list_tenants()
self.assertTrue(isinstance(tenants, list))
are_tenants = [isinstance(itm, base_identity.Tenant)
for itm in tenants]
self.assertTrue(all(are_tenants))
def test_list_tenants_auth_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenants"
resp.status_code = 403
ident.method_get = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.AuthorizationFailure, ident.list_tenants)
def test_list_tenants_other_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenants"
resp.status_code = 404
ident.method_get = Mock(return_value=(resp, resp.json()))
self.assertRaises(exc.TenantNotFound, ident.list_tenants)
def test_create_tenant(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenant"
ident.method_post = Mock(return_value=(resp, resp.json()))
fake_name = utils.random_unicode()
fake_desc = utils.random_unicode()
tenant = ident.create_tenant(fake_name, description=fake_desc)
self.assertTrue(isinstance(tenant, base_identity.Tenant))
cargs = ident.method_post.call_args
self.assertEqual(len(cargs), 2)
self.assertEqual(cargs[0], ("tenants", ))
data = cargs[1]["data"]["tenant"]
self.assertEqual(data["name"], fake_name)
self.assertEqual(data["description"], fake_desc)
def test_update_tenant(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenant"
ident.method_put = Mock(return_value=(resp, resp.json()))
fake_id = utils.random_unicode()
fake_name = utils.random_unicode()
fake_desc = utils.random_unicode()
tenant = ident.update_tenant(fake_id, name=fake_name,
description=fake_desc)
self.assertTrue(isinstance(tenant, base_identity.Tenant))
cargs = ident.method_put.call_args
self.assertEqual(len(cargs), 2)
self.assertEqual(cargs[0], ("tenants/%s" % fake_id, ))
data = cargs[1]["data"]["tenant"]
self.assertEqual(data["name"], fake_name)
self.assertEqual(data["description"], fake_desc)
def test_delete_tenant(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenant"
ident.method_delete = Mock(return_value=(resp, resp.json()))
fake_id = utils.random_unicode()
ident.delete_tenant(fake_id)
ident.method_delete.assert_called_with("tenants/%s" % fake_id)
def test_delete_tenant_fail(self):
for cls in self.id_classes.values():
ident = cls()
resp = fakes.FakeIdentityResponse()
resp.response_type = "tenant"
resp.status_code = 404
ident.method_delete = Mock(return_value=(resp, resp.json()))
fake_id = utils.random_unicode()
self.assertRaises(exc.TenantNotFound, ident.delete_tenant, fake_id)
def test_list_roles(self):
ident = self.identity
nm = utils.random_unicode()
id_ = utils.random_unicode()
svcid = utils.random_unicode()
limit = utils.random_unicode()
marker = utils.random_unicode()
resp = fakes.FakeResponse()
resp_body = {"roles": [{"name": nm, "id": id_}]}
ident.method_get = Mock(return_value=(resp, resp_body))
exp_uri = "OS-KSADM/roles?serviceId=%s&limit=%s&marker=%s" % (svcid,
limit, marker)
ret = ident.list_roles(service_id=svcid, limit=limit, marker=marker)
ident.method_get.assert_called_once_with(exp_uri)
self.assertEqual(len(ret), 1)
role = ret[0]
self.assertTrue(isinstance(role, base_identity.Role))
def test_get_role(self):
ident = self.identity
role = utils.random_unicode()
nm = utils.random_unicode()
id_ = utils.random_unicode()
resp = fakes.FakeResponse()
resp_body = {"role": {"name": nm, "id": id_}}
ident.method_get = Mock(return_value=(resp, resp_body))
exp_uri = "OS-KSADM/roles/%s" % role
ret = ident.get_role(role)
ident.method_get.assert_called_once_with(exp_uri)
self.assertTrue(isinstance(ret, base_identity.Role))
self.assertEqual(ret.name, nm)
self.assertEqual(ret.id, id_)
def test_add_role_to_user(self):
ident = self.identity
role = utils.random_unicode()
user = utils.random_unicode()
ident.method_put = Mock(return_value=(None, None))
exp_uri = "users/%s/roles/OS-KSADM/%s" % (user, role)
ident.add_role_to_user(role, user)
ident.method_put.assert_called_once_with(exp_uri)
def test_delete_role_from_user(self):
ident = self.identity
role = utils.random_unicode()
user = utils.random_unicode()
ident.method_delete = Mock(return_value=(None, None))
exp_uri = "users/%s/roles/OS-KSADM/%s" % (user, role)
ident.delete_role_from_user(role, user)
ident.method_delete.assert_called_once_with(exp_uri)
def test_parse_api_time_us(self):
test_date = "2012-01-02T05:20:30.000-05:00"
expected = datetime.datetime(2012, 1, 2, 10, 20, 30)
for cls in self.id_classes.values():
ident = cls()
parsed = ident._parse_api_time(test_date)
self.assertEqual(parsed, expected)
def test_parse_api_time_uk(self):
test_date = "2012-01-02T10:20:30.000Z"
expected = datetime.datetime(2012, 1, 2, 10, 20, 30)
for cls in self.id_classes.values():
ident = cls()
parsed = ident._parse_api_time(test_date)
self.assertEqual(parsed, expected)
if __name__ == "__main__":
unittest.main()
# suite = unittest.TestLoader().loadTestsFromTestCase(IdentityTest)
# unittest.TextTestRunner(verbosity=2).run(suite)
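The two `_parse_api_time` tests above pin down the expected behaviour: an offset-stamped timestamp and a Zulu-suffixed one must both normalise to the same naive UTC datetime. A minimal stdlib sketch of that conversion (the `parse_api_time` helper here is illustrative, not pyrax's actual implementation) could look like:

```python
import datetime


def parse_api_time(timestr):
    """Parse an ISO-8601 API timestamp and return a naive UTC datetime."""
    # Older interpreters' fromisoformat does not accept a trailing "Z",
    # so rewrite it as an explicit UTC offset first.
    dt = datetime.datetime.fromisoformat(timestr.replace("Z", "+00:00"))
    # Shift to UTC and drop the tzinfo, matching the assertions above.
    return dt.astimezone(datetime.timezone.utc).replace(tzinfo=None)


us = parse_api_time("2012-01-02T05:20:30.000-05:00")
uk = parse_api_time("2012-01-02T10:20:30.000Z")
assert us == uk == datetime.datetime(2012, 1, 2, 10, 20, 30)
```

Both test fixtures collapse to the same naive value, which is exactly what the loop over `self.id_classes` asserts.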
5d5fca8610522739bb72d1155e20a6a8d88d13c7 | 35 | py | Python | tests/fixtures/test_title_prefix/content_03_expected.py | elifesciences/elife-tools | ee345bf0e6703ef0f7e718355e85730abbdfd117 | [
"MIT"
] | 9 | 2015-04-16T08:13:31.000Z | 2020-05-18T14:03:06.000Z | tests/fixtures/test_title_prefix/content_03_expected.py | elifesciences/elife-tools | ee345bf0e6703ef0f7e718355e85730abbdfd117 | [
"MIT"
] | 310 | 2015-02-11T00:30:09.000Z | 2021-07-14T23:58:50.000Z | tests/fixtures/test_title_prefix/content_03_expected.py | elifesciences/elife-tools | ee345bf0e6703ef0f7e718355e85730abbdfd117 | [
"MIT"
] | 9 | 2015-02-04T01:21:28.000Z | 2021-06-15T12:50:47.000Z | expected = "Scientific publishing"
5d702cee6c8ec82d463db64f4d9ab14ed032efe0 | 838 | py | Python | validation/types/number/uint_type_validation.py | uon-language/uon-parser | 666894cf4917d8da01512918a147882550382269 | [
"MIT"
] | 1 | 2020-05-29T06:47:16.000Z | 2020-05-29T06:47:16.000Z | validation/types/number/uint_type_validation.py | Cell00phane/uon-parser | 666894cf4917d8da01512918a147882550382269 | [
"MIT"
] | 1 | 2020-07-16T08:19:08.000Z | 2020-07-16T08:19:08.000Z | validation/types/number/uint_type_validation.py | uon-language/uon-parser | 666894cf4917d8da01512918a147882550382269 | [
"MIT"
] | null | null | null | from validation.types.type_validation import (
ValidationType, ValidationTypeError
)
from uontypes.scalars.uon_uint import UonUint
class UintTypeValidation(ValidationType):
def validate_type(self, input_):
if (not isinstance(input_, UonUint)):
raise ValidationTypeError("The following input {} type "
"does not correspond to "
"unsigned int".format(input_))
def __repr__(self):
return "UintTypeValidation()"
def __str__(self):
return "!uint"
def to_binary(self):
"""Binary representation of unsigned integer type validation.
b"\x30" corresponds to uint type in UON.
Returns:
bytes: binary representation of uint type validation
"""
        return b"\x30"
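The class above follows a small validate-or-raise protocol: `validate_type` is silent on success and raises `ValidationTypeError` otherwise. A self-contained sketch of the same pattern, with stand-in types (`FakeUint` replaces the real `UonUint`, which is not importable here), shows how a caller would use it:

```python
class ValidationTypeError(TypeError):
    """Stand-in for validation.types.type_validation.ValidationTypeError."""


class FakeUint(int):
    """Stand-in for uontypes.scalars.uon_uint.UonUint."""


class UintTypeValidation:
    def validate_type(self, input_):
        # Silent on success, raise on any non-uint input.
        if not isinstance(input_, FakeUint):
            raise ValidationTypeError(
                "The following input {} type does not "
                "correspond to unsigned int".format(input_))

    def to_binary(self):
        return b"\x30"  # uint type tag in UON


v = UintTypeValidation()
v.validate_type(FakeUint(7))   # correct type: no exception
try:
    v.validate_type(7)         # plain int is rejected
    raised = False
except ValidationTypeError:
    raised = True
assert raised and v.to_binary() == b"\x30"
```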
5371e9e48a1b96ec4c876a7ef39886d3f07914c6 | 1,134 | py | Python | src/tensorneko/io/image/image_reader.py | ControlNet/tensorneko | 70dfb2f6395e1703dbdf5d5adcfed7b1334efb8f | [
"MIT"
] | 9 | 2021-05-23T17:38:09.000Z | 2021-12-30T19:12:12.000Z | src/tensorneko/io/image/image_reader.py | ControlNet/tensorneko | 70dfb2f6395e1703dbdf5d5adcfed7b1334efb8f | [
"MIT"
] | null | null | null | src/tensorneko/io/image/image_reader.py | ControlNet/tensorneko | 70dfb2f6395e1703dbdf5d5adcfed7b1334efb8f | [
"MIT"
] | null | null | null | from torch import Tensor
from torchvision.io import read_image
from torchvision.io.image import ImageReadMode
class ImageReader:
"""ImageReader for reading images as :class:`~torch.Tensor`"""
@staticmethod
def of(path: str, mode: ImageReadMode = ImageReadMode.UNCHANGED) -> Tensor:
"""
Read image tensor of given file.
Args:
path (``str``): Path of the JPEG or PNG image.
mode (:class:`~torchvision.io.image.ImageReadMode`, optional):
the read mode used for optionally converting the image.
Default: :class:`~torchvision.io.image.ImageReadMode.UNCHANGED`.
See :class:`~torchvision.io.image.ImageReadMode` class for more information on various
available modes.
Returns:
:class:`~torch.Tensor`: A float tensor of image (C, H, W), with value range of 0. to 1.
"""
return read_image(path, mode) / 255
def __new__(cls, path: str, mode: ImageReadMode = ImageReadMode.UNCHANGED) -> Tensor:
"""Alias of :meth:`~ImageReader.of`"""
return cls.of(path, mode)
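Both entry points divide the raw `uint8` tensor by 255 so callers always receive values in `[0., 1.]`. The arithmetic is trivial but worth pinning down; a list-based stand-in (no torchvision needed) illustrates the scaling:

```python
def normalize_pixels(pixels):
    """Map raw 0-255 channel values to floats in [0.0, 1.0]."""
    return [p / 255 for p in pixels]


scaled = normalize_pixels([0, 51, 255])
assert scaled == [0.0, 0.2, 1.0]
```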
53843490fe92b10042aae46acc849641c0aa83d5 | 102 | py | Python | gan_mnist_rdfl_demo/utils/data_split.py | ZJU-DistributedAI/RDFL-GAN | e5f10b071d25db7931749515b1b8a3c477a91257 | [
"Apache-2.0"
] | null | null | null | gan_mnist_rdfl_demo/utils/data_split.py | ZJU-DistributedAI/RDFL-GAN | e5f10b071d25db7931749515b1b8a3c477a91257 | [
"Apache-2.0"
] | null | null | null | gan_mnist_rdfl_demo/utils/data_split.py | ZJU-DistributedAI/RDFL-GAN | e5f10b071d25db7931749515b1b8a3c477a91257 | [
"Apache-2.0"
] | 1 | 2021-04-08T12:11:58.000Z | 2021-04-08T12:11:58.000Z |
from data_util import Data
def main():
Data("mnist", 3)
if __name__ == "__main__":
main()
| 10.2 | 26 | 0.607843 | 14 | 102 | 3.785714 | 0.714286 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.012987 | 0.245098 | 102 | 9 | 27 | 11.333333 | 0.675325 | 0 | 0 | 0 | 0 | 0 | 0.13 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | true | 0 | 0.2 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
5399a0d86628a7abfda9f547e362a5f487cdcca9 | 305 | py | Python | tests/unit/test_weather.py | vyahello/async-weather-api | 03edabb9c1d29b2fe9b478d2751361d6dc1d4a08 | [
"MIT"
] | 2 | 2020-01-06T19:10:22.000Z | 2020-05-27T03:41:59.000Z | tests/unit/test_weather.py | vyahello/async-weather-api | 03edabb9c1d29b2fe9b478d2751361d6dc1d4a08 | [
"MIT"
] | 3 | 2020-10-30T19:34:54.000Z | 2021-08-14T07:20:43.000Z | tests/unit/test_weather.py | vyahello/async-weather-api | 03edabb9c1d29b2fe9b478d2751361d6dc1d4a08 | [
"MIT"
] | null | null | null | from tests.markers import unit
from weather import Bind
pytestmark = unit
def test_bind_host(bind: Bind) -> None:
assert bind.host == "0.0.0.0"
def test_bind_port(bind: Bind) -> None:
assert bind.port == 5000
def test_bind_as_str(bind: Bind) -> None:
assert str(bind) == "0.0.0.0:5000"
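The three tests fully specify the `bind` fixture's contract: a `host`, a `port`, and a `__str__` that joins them with a colon. A minimal class satisfying exactly these assertions (a sketch, not the real `weather.Bind`) would be:

```python
class Bind:
    """Minimal address binding: host, port, and a "host:port" string form."""

    def __init__(self, host: str = "0.0.0.0", port: int = 5000) -> None:
        self.host = host
        self.port = port

    def __str__(self) -> str:
        return f"{self.host}:{self.port}"


bind = Bind()
assert bind.host == "0.0.0.0"
assert bind.port == 5000
assert str(bind) == "0.0.0.0:5000"
```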
53a1b9eb05125dbabfc60bee6d02a5ecf195a79c | 192 | py | Python | xestore/__init__.py | jmosbacher/xestore | ad5451b54ec44a86a1371e5253fab3aaa8e5a111 | [
"MIT"
] | null | null | null | xestore/__init__.py | jmosbacher/xestore | ad5451b54ec44a86a1371e5253fab3aaa8e5a111 | [
"MIT"
] | null | null | null | xestore/__init__.py | jmosbacher/xestore | ad5451b54ec44a86a1371e5253fab3aaa8e5a111 | [
"MIT"
] | null | null | null | """Top-level package for xestore."""
__author__ = """Yossi Mosbacher"""
__email__ = 'joe.mosbacher@gmail.com'
__version__ = '0.1.1'
from .settings import Config
from .client import XeStore
53b1ba0dd25d34712bc019a32ce0741582f86a9a | 315 | py | Python | aulaspythonintermediario/exercicios02/exercicios02/exercicios02.py | lel352/Curso-Python | d65484c807db52d57042eee20ccbd3131825fa98 | [
"MIT"
] | 1 | 2021-09-04T14:34:34.000Z | 2021-09-04T14:34:34.000Z | aulaspythonintermediario/exercicios02/exercicios02/exercicios02.py | lel352/Curso-Python | d65484c807db52d57042eee20ccbd3131825fa98 | [
"MIT"
] | null | null | null | aulaspythonintermediario/exercicios02/exercicios02/exercicios02.py | lel352/Curso-Python | d65484c807db52d57042eee20ccbd3131825fa98 | [
"MIT"
] | null | null | null | def funcao1(funcao, *args, **kwargs):
return funcao(*args, **kwargs)
def funcao2(nome):
return f'Oi {nome}'
def funcao3(nome, saudacao):
return f'{saudacao} {nome}'
executando = funcao1(funcao2, 'Luiz')
print(executando)
executando = funcao1(funcao3, 'Luiz', saudacao='Bom dia')
print(executando)
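`funcao1` is a generic dispatcher: it forwards whatever positional and keyword arguments it receives, untouched, to the callable it is given. The same `*args, **kwargs` forwarding is what makes decorators work; a short sketch of that next step (the `anuncia` decorator is invented for this example):

```python
import functools


def anuncia(funcao):
    """Decorator that forwards all arguments unchanged, prefixing the result."""
    @functools.wraps(funcao)
    def interna(*args, **kwargs):
        return "[saudacao] " + funcao(*args, **kwargs)
    return interna


@anuncia
def saudacao(nome, saudacao="Oi"):
    return f"{saudacao} {nome}"


assert saudacao("Luiz") == "[saudacao] Oi Luiz"
assert saudacao("Luiz", saudacao="Bom dia") == "[saudacao] Bom dia Luiz"
```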
53d1d49760424b05379f7c9897153bc723c4223c | 5,418 | py | Python | project/app/forms.py | brunoliveira8/fibrando | b9bc4487ace3fd2ce4d8852fbe06377e1f9148a7 | [
"MIT"
] | null | null | null | project/app/forms.py | brunoliveira8/fibrando | b9bc4487ace3fd2ce4d8852fbe06377e1f9148a7 | [
"MIT"
] | null | null | null | project/app/forms.py | brunoliveira8/fibrando | b9bc4487ace3fd2ce4d8852fbe06377e1f9148a7 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from django import forms
from django.db import models
from django.contrib.auth.models import User
from app.models import User, Athlete, Tracker, Exercise, PersonalTrainer, BodyScreening, Gym, Subscribe
from django.db.models import Q
from registration.forms import RegistrationFormUniqueEmail
class UserForm(forms.ModelForm):
username = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
email = forms.CharField(widget=forms.EmailInput(attrs={'class': "form-control"}))
password = forms.CharField(widget=forms.PasswordInput(attrs={'class' : 'form-control'}))
first_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
last_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
def __init__(self, *args, **kwargs):
super(UserForm, self).__init__(*args, **kwargs)
for fieldname in ['username', 'password', 'email']:
self.fields[fieldname].help_text = None
class Meta:
model = User
fields = ('username','password', 'email', 'first_name', 'last_name')
class UserEditForm(forms.ModelForm):
first_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
last_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
email = forms.CharField(widget=forms.EmailInput(attrs={'class': "form-control"}))
def __init__(self, *args, **kwargs):
super(UserEditForm, self).__init__(*args, **kwargs)
class Meta:
model = User
fields = ('first_name', 'last_name', 'email')
class ChangePasswordForm(forms.ModelForm):
password = forms.CharField(widget=forms.PasswordInput(attrs={'class': "form-control"}))
class Meta:
model = User
fields = ('password', )
class AthleteForm(forms.ModelForm):
level = forms.ChoiceField(widget=forms.Select(attrs={'class': "form-control"}), choices=Athlete.LEVELS)
training_period = forms.ChoiceField(widget=forms.Select(attrs={'class': "form-control"}), choices=Athlete.TRAINING_PERIOD)
gender = forms.ChoiceField(widget=forms.Select(attrs={'class': "form-control"}), choices=Athlete.GENDERS)
class Meta:
model = Athlete
fields = ('level', 'training_period', 'gender')
class PersonalTrainerForm(forms.ModelForm):
gender = forms.ChoiceField(widget=forms.Select(attrs={'class': "form-control"}), choices=Athlete.GENDERS)
class Meta:
model = PersonalTrainer
fields = ('gender',)
class ExerciseForm(forms.ModelForm):
weight = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
repetition = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
sets = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
class Meta:
model = Exercise
fields = ('weight','repetition', 'sets')
class BodyScreeningForm(forms.ModelForm):
class Meta:
model = BodyScreening
exclude = ['criado_em', 'imc']
class AthleteSelectForm(forms.Form):
athlete = forms.ModelChoiceField(queryset=User.objects.filter(Q(groups__name='athlete')), empty_label='...', to_field_name='username', widget=forms.Select(attrs={'class': "form-control"}))
class UserTypeForm(forms.Form):
GROUPS = (
('regular', 'Aluno'),
('personal_trainer', 'Professor'),
)
group = forms.ChoiceField(widget=forms.Select(attrs={'class': "form-control"}),choices=GROUPS, required=True, label='User Type')
class UserGenderForm(forms.Form):
GENDERS = (
('F', 'Feminino'),
('M', 'Masculino'),
)
gender = forms.ChoiceField(
widget=forms.Select(attrs={'class': "form-control"}), choices=GENDERS,
required=True,
initial='F'
)
class RegisterForm(RegistrationFormUniqueEmail):
username = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}), label="Usuário")
email = forms.CharField(widget=forms.EmailInput(attrs={'class': "form-control"}))
password1 = forms.CharField(label="Senha",widget=forms.PasswordInput(attrs={'class' : 'form-control'}))
password2 = forms.CharField(label="Confirmar senha",widget=forms.PasswordInput(attrs={'class' : 'form-control'}))
#first_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
#last_name = forms.CharField(widget=forms.TextInput(attrs={'class': "form-control"}))
GENDERS = (
('F', 'Feminino'),
('M', 'Masculino'),
)
gender = forms.ChoiceField(
widget=forms.Select(attrs={'class': "form-control"}), choices=GENDERS,
required=True,
initial='F',
label="Sexo"
)
GROUPS = (
('athlete', 'Aluno'),
('trainer', 'Professor'),
)
group = forms.ChoiceField(widget=forms.Select(attrs={'class': "form-control"}),choices=GROUPS, required=True, label='Tipo de conta')
#gym = forms.ModelChoiceField(label="Academia", queryset=Gym.objects.all(), widget=forms.Select(attrs={'class': "form-control"}))
class Meta:
model = User
#fields = ('username','password1','password2', 'email', 'gender', 'group', 'gym')
fields = ('username','password1','password2', 'email', 'gender', 'group')
class SubscribeForm(forms.ModelForm):
class Meta:
model = Subscribe
fields = ('name', 'email')
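Almost every field above repeats `attrs={'class': "form-control"}`. A tiny helper that merges that default with any extra widget attributes would cut the repetition; the merge itself is plain dict logic, so Django is not needed to see the idea (the `fc_attrs` name is invented for this sketch):

```python
def fc_attrs(**extra):
    """Bootstrap widget attrs: always 'form-control', plus any extras."""
    attrs = {"class": "form-control"}
    attrs.update(extra)
    return attrs


# Intended use: forms.TextInput(attrs=fc_attrs()) or
#               forms.TextInput(attrs=fc_attrs(placeholder="Nome"))
assert fc_attrs() == {"class": "form-control"}
assert fc_attrs(placeholder="Nome") == {
    "class": "form-control", "placeholder": "Nome"}
```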
53d7d29c754277adcb7e032f2fd6287394359f46 | 180 | py | Python | amac_spider/main.py | galaxyyao/amac_spider | 271692207cc68a2495c0f0ac130b99dd1c7b957f | [
"MIT"
] | 1 | 2019-12-19T06:26:43.000Z | 2019-12-19T06:26:43.000Z | amac_spider/main.py | galaxyyao/amac_spider | 271692207cc68a2495c0f0ac130b99dd1c7b957f | [
"MIT"
] | null | null | null | amac_spider/main.py | galaxyyao/amac_spider | 271692207cc68a2495c0f0ac130b99dd1c7b957f | [
"MIT"
] | null | null | null | from scrapy.crawler import CrawlerProcess
from amac_spider.spiders.fund_spider import FundSpider
#print ("aaa")
process = CrawlerProcess()
process.crawl(FundSpider)
process.start()
9900b5064b9785746cb40bc61f4274ff3f307392 | 27 | py | Python | examples/transactions/__init__.py | adamzhang1987/py-etherscan-api | 9a3accfa455eb1d9d82ac7be41012948a58f90e3 | [
"MIT"
] | 458 | 2016-07-21T19:49:30.000Z | 2022-03-23T18:01:19.000Z | examples/transactions/__init__.py | adamzhang1987/py-etherscan-api | 9a3accfa455eb1d9d82ac7be41012948a58f90e3 | [
"MIT"
] | 71 | 2016-06-17T19:34:18.000Z | 2022-03-06T20:13:37.000Z | examples/transactions/__init__.py | adamzhang1987/py-etherscan-api | 9a3accfa455eb1d9d82ac7be41012948a58f90e3 | [
"MIT"
] | 269 | 2016-06-20T09:51:17.000Z | 2022-03-17T19:19:10.000Z | __author__ = 'Corey Petty'
54e9b989bff6d19b37896081992977d13d429fb6 | 1,095 | py | Python | api/permissions.py | bymi15/JobTrackerBoard | fe48b4276ccfe8f11555d43e5eb45589905fde3e | [
"Apache-2.0"
] | null | null | null | api/permissions.py | bymi15/JobTrackerBoard | fe48b4276ccfe8f11555d43e5eb45589905fde3e | [
"Apache-2.0"
] | 4 | 2021-04-08T22:00:18.000Z | 2021-06-10T20:38:40.000Z | api/permissions.py | bymi15/JobTrackerBoard | fe48b4276ccfe8f11555d43e5eb45589905fde3e | [
"Apache-2.0"
] | null | null | null | from rest_framework import permissions
class IsSameUser(permissions.BasePermission):
"""
Object-level permission to only allow owners of an object to edit it.
Assumes the model instance has a `user` attribute.
"""
def has_object_permission(self, request, view, obj):
# Instance must have an attribute named `user`.
return obj == request.user
class IsOwner(permissions.BasePermission):
def has_object_permission(self, request, view, obj):
# Instance must have an attribute named `user`.
return request.user.is_superuser or obj.user == request.user
class IsOwnerApplication(permissions.BasePermission):
def has_object_permission(self, request, view, obj):
# Instance must have an attribute named `application.user`.
return obj.application.user == request.user
class IsOwnerInterview(permissions.BasePermission):
def has_object_permission(self, request, view, obj):
# Instance must have an attribute named `interview.application.user`.
        return obj.interview.application.user == request.user
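All four classes reduce to the same object-level rule: compare some user attribute on the object against `request.user`, with `IsOwner` also letting superusers through. Stripped of the DRF plumbing, the check is a one-line comparison; a framework-free sketch with stub objects (`is_owner` is illustrative, not part of this codebase):

```python
from types import SimpleNamespace


def is_owner(request_user, obj_user, allow_superuser=True):
    """Mirror of IsOwner.has_object_permission, minus the DRF plumbing."""
    if allow_superuser and getattr(request_user, "is_superuser", False):
        return True
    return obj_user == request_user


alice = SimpleNamespace(name="alice", is_superuser=False)
bob = SimpleNamespace(name="bob", is_superuser=False)
admin = SimpleNamespace(name="admin", is_superuser=True)

assert is_owner(alice, alice)      # owners pass
assert not is_owner(bob, alice)    # other users are rejected
assert is_owner(admin, alice)      # superusers bypass the check
```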
0706e73e08ea2d38fbb0e0e66f4b3e79c9cb5704 | 297 | py | Python | tool/utils/generaltools.py | panmt/MRNet_for_vehicle_reID | aa97998934972cc1e034ac2b117869574d36cb90 | [
"MIT"
] | 1 | 2021-07-11T17:00:58.000Z | 2021-07-11T17:00:58.000Z | tool/utils/generaltools.py | saiftumrani/MRNet_for_vehicle_reID | aa97998934972cc1e034ac2b117869574d36cb90 | [
"MIT"
] | null | null | null | tool/utils/generaltools.py | saiftumrani/MRNet_for_vehicle_reID | aa97998934972cc1e034ac2b117869574d36cb90 | [
"MIT"
] | 1 | 2021-08-03T06:16:04.000Z | 2021-08-03T06:16:04.000Z | from __future__ import absolute_import
from __future__ import print_function
from __future__ import division
import random
import numpy as np
import torch
def set_random_seed(seed):
random.seed(seed)
np.random.seed(seed)
torch.manual_seed(seed)
torch.cuda.manual_seed_all(seed)
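`set_random_seed` seeds every RNG an experiment touches — Python's `random`, NumPy, and both CPU and CUDA Torch generators — so runs are reproducible. The effect is easiest to see with the stdlib generator alone:

```python
import random


def draw(seed, n=3):
    """Seed the stdlib RNG, then draw n floats."""
    random.seed(seed)
    return [random.random() for _ in range(n)]


# Same seed -> identical sequence; different seeds -> (almost surely) not.
assert draw(42) == draw(42)
assert draw(42) != draw(43)
```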
07572ad67ed391542b632e4c84aab396ccee241e | 130 | py | Python | easytrader/__init__.py | YuSekai/easytrader | e9317b33101de37f20da739cc588d9d7ed8b8d14 | [
"MIT"
] | null | null | null | easytrader/__init__.py | YuSekai/easytrader | e9317b33101de37f20da739cc588d9d7ed8b8d14 | [
"MIT"
] | null | null | null | easytrader/__init__.py | YuSekai/easytrader | e9317b33101de37f20da739cc588d9d7ed8b8d14 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from .api import use, follower
from . import exceptions
__version__ = "0.15.2"
__author__ = "shidenggui"
4ae461cab2c9881868e92293a5539d686f742aa2 | 416 | py | Python | pycmbs/tests/test_hov.py | pygeo/pycmbs | 0df863e1575ffad21c1ea9790bcbd3a7982d99c6 | [
"MIT"
] | 9 | 2015-04-01T04:22:25.000Z | 2018-08-31T03:51:34.000Z | pycmbs/tests/test_hov.py | pygeo/pycmbs | 0df863e1575ffad21c1ea9790bcbd3a7982d99c6 | [
"MIT"
] | 14 | 2015-01-27T20:33:10.000Z | 2016-06-02T07:23:25.000Z | pycmbs/tests/test_hov.py | pygeo/pycmbs | 0df863e1575ffad21c1ea9790bcbd3a7982d99c6 | [
"MIT"
] | 8 | 2015-02-07T20:46:42.000Z | 2019-10-25T00:36:32.000Z | # -*- coding: utf-8 -*-
"""
This file is part of pyCMBS.
(c) 2012- Alexander Loew
For COPYING and LICENSE details, please refer to the LICENSE file
"""
import unittest
from pycmbs import hov
class TestHov(unittest.TestCase):
def setUp(self):
pass
def test_StubTest(self):
self.assertEqual(1, 1)
if __name__ == "__main__":
unittest.main()
# vim: expandtab shiftwidth=4 softtabstop=4
ab0c02ecfa4cb95b9388aa5a10ed4443e27a91a0 | 748 | py | Python | fwrite/filters/__init__.py | fooying/fwrite | e370ed66c2410b78db5db3fda2967f7b9a93e417 | [
"MIT"
] | 9 | 2015-05-05T02:03:42.000Z | 2019-05-14T11:51:59.000Z | fwrite/filters/__init__.py | fooying/fwrite | e370ed66c2410b78db5db3fda2967f7b9a93e417 | [
"MIT"
] | null | null | null | fwrite/filters/__init__.py | fooying/fwrite | e370ed66c2410b78db5db3fda2967f7b9a93e417 | [
"MIT"
] | 5 | 2015-06-10T01:09:20.000Z | 2018-12-01T17:18:18.000Z | #!/usr/bin/env python
#encoding=utf-8
#by Fooying 2013-11-17 01:57:59
'''
Filters.
'''
import datetime
from ..config import read_config
FILTERS = ['get_base', 'get_url', 'get_version', 'get_year', 'get_this_site', 'get_icp']
def get_base(name):
if name:
value = read_config('base', name)
else:
value = ''
return value
def get_url(name):
value = read_config('system', 'office_url')
return value
def get_version(name):
value = read_config('system', 'version')
return value
def get_year(name):
return datetime.datetime.now().strftime('%Y')
def get_this_site(name):
value = read_config('base', 'name')
return value
def get_icp(name):
value = read_config('base', 'icp')
return value
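`FILTERS` lists the filter functions by name, which suggests they are looked up dynamically at render time. One common way to do that lookup is via the module's `globals()`; a self-contained sketch of the dispatch, with toy filters standing in for the `read_config`-backed ones above (`apply_filter` and `"example-site"` are invented for this example):

```python
import datetime


def get_year(name):
    return datetime.datetime.now().strftime('%Y')


def get_this_site(name):
    return "example-site"  # stand-in for read_config('base', 'name')


FILTERS = ['get_year', 'get_this_site']


def apply_filter(filter_name, arg=''):
    """Look the filter up by name and call it, as a template engine might."""
    if filter_name not in FILTERS:
        raise KeyError(filter_name)
    return globals()[filter_name](arg)


assert apply_filter('get_this_site') == "example-site"
assert len(apply_filter('get_year')) == 4
```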
ab13d751c0b1b805d742342e45811a42318d6672 | 11,604 | py | Python | tests/cleanup_test.py | bretttegart/treadmill | 812109e31c503a6eddaee2d3f2e1faf2833b6aaf | [
"Apache-2.0"
] | 2 | 2017-10-31T18:48:20.000Z | 2018-03-04T20:35:20.000Z | tests/cleanup_test.py | bretttegart/treadmill | 812109e31c503a6eddaee2d3f2e1faf2833b6aaf | [
"Apache-2.0"
] | null | null | null | tests/cleanup_test.py | bretttegart/treadmill | 812109e31c503a6eddaee2d3f2e1faf2833b6aaf | [
"Apache-2.0"
] | null | null | null | """Unit test for cleanup - cleanup node apps
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from __future__ import unicode_literals
import glob
import os
import shutil
import tempfile
import unittest
import mock
import treadmill
import treadmill.runtime.runtime_base
from treadmill import cleanup
class CleanupTest(unittest.TestCase):
"""Mock test for treadmill.cleanup.Cleanup.
"""
@mock.patch('treadmill.appenv.AppEnvironment', mock.Mock(autospec=True))
@mock.patch('treadmill.watchdog.Watchdog', mock.Mock(autospec=True))
def setUp(self):
self.root = tempfile.mkdtemp()
self.cleanup_dir = os.path.join(self.root, 'cleanup')
self.cleaning_dir = os.path.join(self.root, 'cleaning')
self.cleanup_apps_dir = os.path.join(self.root, 'cleanup_apps')
self.cleanup_tombstone_dir = os.path.join(self.root, 'tombstones')
for tmp_dir in [self.cleanup_dir, self.cleaning_dir,
self.cleanup_apps_dir]:
os.mkdir(tmp_dir)
self.tm_env = mock.Mock(
root=self.root,
cleanup_dir=self.cleanup_dir,
cleaning_dir=self.cleaning_dir,
cleanup_apps_dir=self.cleanup_apps_dir,
cleanup_tombstone_dir=self.cleanup_tombstone_dir
)
self.cleanup = cleanup.Cleanup(self.tm_env)
def tearDown(self):
if self.root and os.path.isdir(self.root):
shutil.rmtree(self.root)
@mock.patch('treadmill.supervisor.control_svscan', mock.Mock())
def test__refresh_supervisor(self):
"""Check how the supervisor is being refreshed.
"""
# Access to a protected member _refresh_supervisor of a client class
# pylint: disable=W0212
self.cleanup._refresh_supervisor()
treadmill.supervisor.control_svscan.assert_called_with(
self.cleaning_dir, (
treadmill.supervisor.SvscanControlAction.alarm,
treadmill.supervisor.SvscanControlAction.nuke
)
)
@mock.patch('os.path.islink', mock.Mock())
@mock.patch('treadmill.supervisor.create_service', mock.Mock())
@mock.patch('treadmill.fs.symlink_safe', mock.Mock())
@mock.patch('treadmill.cleanup.Cleanup._refresh_supervisor', mock.Mock())
def test__add_cleanup_app(self):
"""Tests that a new cleanup app is correctly configured.
"""
# Access to a protected member _add_cleanup_app of a client class
# pylint: disable=W0212
os.path.islink.side_effect = [False, True]
self.cleanup._add_cleanup_app(
os.path.join(self.cleanup_dir, 'proid.app#0000000000001'))
treadmill.supervisor.create_service.assert_called_with(
self.cleanup_apps_dir,
name='proid.app#0000000000001',
app_run_script=mock.ANY,
userid='root',
monitor_policy={
'limit': 5,
'interval': 60,
'tombstone': os.path.join(self.cleanup_tombstone_dir,
'proid.app#0000000000001'),
'skip_path': os.path.join(self.cleanup_dir,
'proid.app#0000000000001')
},
log_run_script=None,
)
treadmill.fs.symlink_safe.assert_called_with(
os.path.join(self.cleaning_dir, 'proid.app#0000000000001'),
os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001')
)
treadmill.cleanup.Cleanup._refresh_supervisor.assert_called()
@mock.patch('os.path.islink', mock.Mock())
@mock.patch('treadmill.supervisor.create_service', mock.Mock())
def test__add_cleanup_app_exists(self):
"""Tests add app when already exists.
"""
# Access to a protected member _add_cleanup_app of a client class
# pylint: disable=W0212
os.path.islink.side_effect = [True]
self.cleanup._add_cleanup_app(
os.path.join(self.cleanup_dir, 'proid.app#0000000000001'))
treadmill.supervisor.create_service.assert_not_called()

    # Disable C0103(Invalid method name)
    # pylint: disable=C0103
    @mock.patch('os.path.islink', mock.Mock())
    @mock.patch('treadmill.supervisor.create_service', mock.Mock())
    def test__add_cleanup_app_not_exists(self):
        """Tests add app when cleanup link does not exist.
        """
        # Access to a protected member _add_cleanup_app of a client class
        # pylint: disable=W0212
        os.path.islink.side_effect = [False, False]

        self.cleanup._add_cleanup_app(
            os.path.join(self.cleanup_dir, 'proid.app#0000000000001'))

        treadmill.supervisor.create_service.assert_not_called()

    @mock.patch('treadmill.supervisor.create_service', mock.Mock())
    def test__add_cleanup_app_temp(self):
        """Tests add app when cleanup link is a temp file.
        """
        # Access to a protected member _add_cleanup_app of a client class
        # pylint: disable=W0212
        self.cleanup._add_cleanup_app(
            os.path.join(self.cleanup_dir, '.sdfasdfds'))

        treadmill.supervisor.create_service.assert_not_called()

    @mock.patch('os.path.exists', mock.Mock())
    @mock.patch('treadmill.supervisor.ensure_not_supervised', mock.Mock())
    @mock.patch('treadmill.fs.rm_safe', mock.Mock())
    @mock.patch('treadmill.fs.rmtree_safe', mock.Mock())
    @mock.patch('treadmill.cleanup.Cleanup._refresh_supervisor', mock.Mock())
    def test__remove_cleanup_app(self):
        """Tests that a cleanup app is properly removed.
        """
        # Access to a protected member _remove_cleanup_app of a client class
        # pylint: disable=W0212
        os.path.exists.side_effect = [True]

        self.cleanup._remove_cleanup_app(
            os.path.join(self.cleanup_dir, 'proid.app#0000000000001'))

        treadmill.fs.rm_safe.assert_called_with(
            os.path.join(self.cleaning_dir, 'proid.app#0000000000001')
        )
        treadmill.cleanup.Cleanup._refresh_supervisor.assert_called()
        treadmill.supervisor.ensure_not_supervised.assert_called()
        treadmill.fs.rmtree_safe.assert_called_with(
            os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001')
        )

    # Disable C0103(Invalid method name)
    # pylint: disable=C0103
    @mock.patch('os.path.exists', mock.Mock())
    @mock.patch('treadmill.supervisor.ensure_not_supervised', mock.Mock())
    @mock.patch('treadmill.fs.rm_safe', mock.Mock())
    @mock.patch('treadmill.fs.rmtree_safe', mock.Mock())
    @mock.patch('treadmill.cleanup.Cleanup._refresh_supervisor', mock.Mock())
    def test__remove_cleanup_app_no_link(self):
        """Tests that a cleanup app is removed even if the cleaning link
        has been removed.
        """
        # Access to a protected member _remove_cleanup_app of a client class
        # pylint: disable=W0212
        os.path.exists.side_effect = [False]

        self.cleanup._remove_cleanup_app(
            os.path.join(self.cleanup_dir, 'proid.app#0000000000001'))

        treadmill.fs.rm_safe.assert_not_called()
        treadmill.cleanup.Cleanup._refresh_supervisor.assert_not_called()
        treadmill.supervisor.ensure_not_supervised.assert_not_called()
        treadmill.fs.rmtree_safe.assert_called_with(
            os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001')
        )

    @mock.patch('os.path.exists', mock.Mock())
    @mock.patch('treadmill.supervisor.ensure_not_supervised', mock.Mock())
    @mock.patch('treadmill.fs.rm_safe', mock.Mock())
    @mock.patch('treadmill.fs.rmtree_safe', mock.Mock())
    @mock.patch('treadmill.cleanup.Cleanup._refresh_supervisor', mock.Mock())
    def test__remove_cleanup_app_temp(self):
        """Tests removing a cleanup app when the link is a temp file.
        """
        # Access to a protected member _remove_cleanup_app of a client class
        # pylint: disable=W0212
        os.path.exists.side_effect = [False]

        self.cleanup._remove_cleanup_app(
            os.path.join(self.cleanup_dir, '.sdfasdfds'))

        treadmill.fs.rm_safe.assert_not_called()
        treadmill.cleanup.Cleanup._refresh_supervisor.assert_not_called()
        treadmill.supervisor.ensure_not_supervised.assert_not_called()
        treadmill.fs.rmtree_safe.assert_not_called()

    @mock.patch('os.readlink', mock.Mock())
    @mock.patch('os.path.exists', mock.Mock())
    @mock.patch('treadmill.runtime.get_runtime', mock.Mock(
        return_value=mock.Mock(
            spec_set=treadmill.runtime.runtime_base.RuntimeBase)))
    @mock.patch('treadmill.fs.rm_safe', mock.Mock())
    def test_invoke(self):
        """Tests invoking the cleanup action.
        """
        os.readlink.side_effect = [
            os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001')
        ]
        os.path.exists.side_effect = [True]

        self.cleanup.invoke('test', 'proid.app#0000000000001')

        mock_runtime = treadmill.runtime.get_runtime(
            'test',
            self.cleanup.tm_env,
            os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001')
        )
        mock_runtime.finish.assert_called()
        treadmill.fs.rm_safe.assert_called_with(
            os.path.join(self.cleanup_dir, 'proid.app#0000000000001')
        )

    @mock.patch('os.readlink', mock.Mock())
    @mock.patch('os.path.exists', mock.Mock())
    @mock.patch('treadmill.runtime.get_runtime', mock.Mock(
        return_value=mock.Mock(
            spec_set=treadmill.runtime.runtime_base.RuntimeBase)))
    @mock.patch('treadmill.fs.rm_safe', mock.Mock())
    def test_invoke_not_exists(self):
        """Tests invoking the cleanup action when the app dir does not exist
        anymore.
        """
        os.readlink.side_effect = [
            os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001')
        ]
        os.path.exists.side_effect = [False]

        self.cleanup.invoke('test', 'proid.app#0000000000001')

        mock_runtime = treadmill.runtime.get_runtime(
            'test',
            self.cleanup.tm_env,
            os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001')
        )
        mock_runtime.finish.assert_not_called()
        treadmill.fs.rm_safe.assert_called_with(
            os.path.join(self.cleanup_dir, 'proid.app#0000000000001')
        )

    @mock.patch('glob.glob', mock.Mock())
    @mock.patch('treadmill.cleanup.Cleanup._add_cleanup_app', mock.Mock())
    @mock.patch('treadmill.cleanup.Cleanup._remove_cleanup_app', mock.Mock())
    def test__sync(self):
        """Tests a full sync of cleanup apps.
        """
        # Access to a protected member _sync of a client class
        # pylint: disable=W0212
        glob.glob.side_effect = [
            [
                os.path.join(self.cleanup_dir, 'proid.app#0000000000002'),
                os.path.join(self.cleanup_dir, 'proid.app#0000000000003')
            ],
            [
                os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000001'),
                os.path.join(self.cleanup_apps_dir, 'proid.app#0000000000002')
            ]
        ]

        self.cleanup._sync()

        treadmill.cleanup.Cleanup._add_cleanup_app.assert_has_calls([
            mock.call('proid.app#0000000000003')
        ])
        treadmill.cleanup.Cleanup._remove_cleanup_app.assert_has_calls([
            mock.call('proid.app#0000000000001')
        ])


if __name__ == '__main__':
    unittest.main()
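The tests above drive branching through `Mock.side_effect` lists (e.g. `os.path.islink.side_effect = [False, True]`): each call to the patched function consumes the next value in the list. A minimal standalone sketch of that pattern, with illustrative names that are not from the treadmill code:

```python
from unittest import mock

# Each call to the mock pops the next value from side_effect,
# so consecutive calls can exercise different branches.
path_check = mock.Mock(side_effect=[False, True])

first = path_check('/tmp/a')   # returns False
second = path_check('/tmp/b')  # returns True
print(first, second)  # -> False True
```

A third call would raise `StopIteration`, which makes over-consumption of the scripted values fail loudly in a test.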

# File: CodeChef/FLOW017.py
# Repo: PROxZIMA/Competitive-Coding (license: MIT)

# https://www.codechef.com/viewsolution/43728670
for _ in range(int(input())):
    print(sorted(map(int, input().split()))[1])
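The one-liner above reads three integers per test case and prints the middle value: for exactly three numbers, `sorted(...)[1]` is the second largest (the median). An input-free equivalent, spelled out for clarity:

```python
def middle_of_three(a, b, c):
    """Return the middle (second largest) of three integers."""
    return sorted((a, b, c))[1]

print(middle_of_three(920, 8458, 11))  # -> 920
```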

# File: env/Lib/site-packages/OpenGL/raw/GL/EXT/memory_object.py
# Repo: 5gconnectedbike/Navio2 (license: BSD-3-Clause)

'''Autogenerated by xml_generate script, do not edit!'''
from OpenGL import platform as _p, arrays
# Code generation uses this
from OpenGL.raw.GL import _types as _cs
# End users want this...
from OpenGL.raw.GL._types import *
from OpenGL.raw.GL import _errors
from OpenGL.constant import Constant as _C
import ctypes
_EXTENSION_NAME = 'GL_EXT_memory_object'
def _f( function ):
    return _p.createFunction( function,_p.PLATFORM.GL,'GL_EXT_memory_object',error_checker=_errors._error_checker)
GL_DEDICATED_MEMORY_OBJECT_EXT=_C('GL_DEDICATED_MEMORY_OBJECT_EXT',0x9581)
GL_DEVICE_UUID_EXT=_C('GL_DEVICE_UUID_EXT',0x9597)
GL_DRIVER_UUID_EXT=_C('GL_DRIVER_UUID_EXT',0x9598)
GL_LINEAR_TILING_EXT=_C('GL_LINEAR_TILING_EXT',0x9585)
GL_NUM_DEVICE_UUIDS_EXT=_C('GL_NUM_DEVICE_UUIDS_EXT',0x9596)
GL_NUM_TILING_TYPES_EXT=_C('GL_NUM_TILING_TYPES_EXT',0x9582)
GL_OPTIMAL_TILING_EXT=_C('GL_OPTIMAL_TILING_EXT',0x9584)
GL_PROTECTED_MEMORY_OBJECT_EXT=_C('GL_PROTECTED_MEMORY_OBJECT_EXT',0x959B)
GL_TEXTURE_TILING_EXT=_C('GL_TEXTURE_TILING_EXT',0x9580)
GL_TILING_TYPES_EXT=_C('GL_TILING_TYPES_EXT',0x9583)
GL_UUID_SIZE_EXT=_C('GL_UUID_SIZE_EXT',16)
@_f
@_p.types(None,_cs.GLenum,_cs.GLsizeiptr,_cs.GLuint,_cs.GLuint64)
def glBufferStorageMemEXT(target,size,memory,offset):pass
@_f
@_p.types(None,_cs.GLsizei,arrays.GLuintArray)
def glCreateMemoryObjectsEXT(n,memoryObjects):pass
@_f
@_p.types(None,_cs.GLsizei,arrays.GLuintArray)
def glDeleteMemoryObjectsEXT(n,memoryObjects):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLenum,arrays.GLintArray)
def glGetMemoryObjectParameterivEXT(memoryObject,pname,params):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLuint,arrays.GLubyteArray)
def glGetUnsignedBytei_vEXT(target,index,data):pass
@_f
@_p.types(None,_cs.GLenum,arrays.GLubyteArray)
def glGetUnsignedBytevEXT(pname,data):pass
@_f
@_p.types(_cs.GLboolean,_cs.GLuint)
def glIsMemoryObjectEXT(memoryObject):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLenum,arrays.GLintArray)
def glMemoryObjectParameterivEXT(memoryObject,pname,params):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLsizeiptr,_cs.GLuint,_cs.GLuint64)
def glNamedBufferStorageMemEXT(buffer,size,memory,offset):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLuint,_cs.GLuint64)
def glTexStorageMem1DEXT(target,levels,internalFormat,width,memory,offset):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLuint,_cs.GLuint64)
def glTexStorageMem2DEXT(target,levels,internalFormat,width,height,memory,offset):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLboolean,_cs.GLuint,_cs.GLuint64)
def glTexStorageMem2DMultisampleEXT(target,samples,internalFormat,width,height,fixedSampleLocations,memory,offset):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLsizei,_cs.GLuint,_cs.GLuint64)
def glTexStorageMem3DEXT(target,levels,internalFormat,width,height,depth,memory,offset):pass
@_f
@_p.types(None,_cs.GLenum,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLsizei,_cs.GLboolean,_cs.GLuint,_cs.GLuint64)
def glTexStorageMem3DMultisampleEXT(target,samples,internalFormat,width,height,depth,fixedSampleLocations,memory,offset):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLuint,_cs.GLuint64)
def glTextureStorageMem1DEXT(texture,levels,internalFormat,width,memory,offset):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLuint,_cs.GLuint64)
def glTextureStorageMem2DEXT(texture,levels,internalFormat,width,height,memory,offset):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLboolean,_cs.GLuint,_cs.GLuint64)
def glTextureStorageMem2DMultisampleEXT(texture,samples,internalFormat,width,height,fixedSampleLocations,memory,offset):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLsizei,_cs.GLuint,_cs.GLuint64)
def glTextureStorageMem3DEXT(texture,levels,internalFormat,width,height,depth,memory,offset):pass
@_f
@_p.types(None,_cs.GLuint,_cs.GLsizei,_cs.GLenum,_cs.GLsizei,_cs.GLsizei,_cs.GLsizei,_cs.GLboolean,_cs.GLuint,_cs.GLuint64)
def glTextureStorageMem3DMultisampleEXT(texture,samples,internalFormat,width,height,depth,fixedSampleLocations,memory,offset):pass
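Each `@_p.types(...)` / `@_f` pair above replaces a `pass` stub with a platform-dispatched ctypes binding whose signature matches the declared GL types. A generic, stdlib-only sketch of that decorator-replaces-stub pattern (the `bind` helper and its return value are illustrative, not PyOpenGL's actual machinery):

```python
def bind(restype, *argtypes):
    """Record a signature, then swap the stub for a real callable."""
    def wrap(stub):
        def bound(*args):
            # A real binding would dispatch into the GL driver here;
            # this sketch just echoes what was declared and called.
            return (stub.__name__, restype, len(args))
        bound.__name__ = stub.__name__
        return bound
    return wrap

@bind('GLboolean', 'GLuint')
def glIsMemoryObjectEXT(memoryObject): pass

print(glIsMemoryObjectEXT(42))  # -> ('glIsMemoryObjectEXT', 'GLboolean', 1)
```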

# File: application/controllers/site.py
# Repo: yongli82/FlaskBookRead (license: MIT)

# coding: utf-8
from flask import render_template, Blueprint

bp = Blueprint('site', __name__)


@bp.route('/')
def index():
    """Index page."""
    return render_template('site/index/index.html')


@bp.route('/about')
def about():
    """About page."""
    return render_template('site/about/about.html')

# File: settings.py
# Repo: Wilfongjt/lb-data (license: MIT)

import os
#from pathlib import Path
from dotenv import load_dotenv
import sys
# sys.path.insert(0, '{}/__classes__'.format(os.getcwd()))
print('app', '{}/_app'.format(os.getcwd()) )
#sys.path.insert(0, '{}'.format(os.getcwd().replace('/pg-dev','')))
sys.path.insert(0, '{}/_app'.format(os.getcwd()))
load_dotenv()
#print('WORKING_FOLDER_NAME',os.getenv('WORKING_FOLDER_NAME'))
#print(sys.path)
#print('settings')

# File: verstack/lgbm_optuna_tuning/optuna_tools.py
# Repo: DanilZherebtsov/verstack (license: MIT)

from enum import Enum


# define optuna distribution pattern for params
class Distribution(Enum):
    CHOICE = 0
    UNIFORM = 1
    INTUNIFORM = 2
    QUNIFORM = 3
    LOGUNIFORM = 4
    DISCRETEUNIFORM = 5
    NORMAL = 6
    QNORMAL = 7
    LOGNORMAL = 8


OPTUNA_DISTRIBUTIONS_MAP = {Distribution.CHOICE: "suggest_categorical",
                            Distribution.UNIFORM: "suggest_uniform",
                            Distribution.LOGUNIFORM: "suggest_loguniform",
                            Distribution.INTUNIFORM: "suggest_int",
                            Distribution.DISCRETEUNIFORM: "suggest_discrete_uniform"}


class SearchSpace:
    distribution_type: Distribution = None
    params: {}

    def __init__(self, distribution_type: Distribution, *args, **kwargs):
        self.distribution_type = distribution_type
        self.params = kwargs
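How a tuner consumes `SearchSpace` and `OPTUNA_DISTRIBUTIONS_MAP` is not shown in this file; the sketch below is a hedged illustration (it restates trimmed copies of the definitions so it runs standalone, and the trial call in the comment is hypothetical) of resolving an optuna suggest-method name from a search-space entry:

```python
from enum import Enum

# Trimmed restatement of the definitions above, for a standalone sketch.
class Distribution(Enum):
    CHOICE = 0
    UNIFORM = 1
    INTUNIFORM = 2

OPTUNA_DISTRIBUTIONS_MAP = {Distribution.CHOICE: "suggest_categorical",
                            Distribution.UNIFORM: "suggest_uniform",
                            Distribution.INTUNIFORM: "suggest_int"}

class SearchSpace:
    def __init__(self, distribution_type, *args, **kwargs):
        self.distribution_type = distribution_type
        self.params = kwargs

# A tuner would look up the suggest-method name and call it on a trial,
# e.g. value = getattr(trial, name)('num_leaves', **space.params).
space = SearchSpace(Distribution.INTUNIFORM, low=16, high=255)
name = OPTUNA_DISTRIBUTIONS_MAP[space.distribution_type]
print(name, space.params)  # -> suggest_int {'low': 16, 'high': 255}
```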

# File: api/auth/__init__.py
# Repo: pypeclub/openpype4-backend (license: Apache-2.0)

__all__ = ["router"]

from auth.auth import router

# File: main.py
# Repo: kwanj-k/Jacaranda_test_api (license: MIT)

from flask_marshmallow import Marshmallow

ma = Marshmallow()

# File: slamcore_utils/__init__.py
# Repo: slamcore/slamcore_utils (license: BSD-3-Clause)

from slamcore_utils.data_converter import DataConverter
from slamcore_utils.dataset_dir_converter import DatasetDirConverter
from slamcore_utils.openloris_converter import OpenLORISConverter
__all__ = [
    "DataConverter",
    "DatasetDirConverter",
    "OpenLORISConverter",
]
# TODO Find a way to use this version in pyproject.toml
__version__ = "0.1.4"

# File: fdk_client/platform/models/EditTicketPayload.py
# Repo: kavish-d/fdk-client-python (license: MIT)

"""Platform Models."""
from marshmallow import fields, Schema
from marshmallow.validate import OneOf
from ..enums import *
from ..models.BaseSchema import BaseSchema
from .TicketContent import TicketContent
from .AgentChangePayload import AgentChangePayload


class EditTicketPayload(BaseSchema):
    # Lead swagger.json

    content = fields.Nested(TicketContent, required=False)
    category = fields.Str(required=False)
    sub_category = fields.Str(required=False)
    source = fields.Str(required=False)
    status = fields.Str(required=False)
    priority = fields.Str(required=False, validate=OneOf(
        [val.value for val in PriorityEnum.__members__.values()]))
    assigned_to = fields.Nested(AgentChangePayload, required=False)
    tags = fields.List(fields.Str(required=False), required=False)

# File: api/alembic/versions/88ed88c89cb3_init_db.py
# Repo: sedlar/work-tracking (license: BSD-2-Clause)

"""Init db
Revision ID: 88ed88c89cb3
Revises:
Create Date: 2019-01-13 21:48:04.041794
"""
from alembic import op
import sqlalchemy as sa

# revision identifiers, used by Alembic.
revision = '88ed88c89cb3'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('deliverables',
    sa.Column('object_id', sa.String(length=15), nullable=False),
    sa.Column('project_id', sa.String(length=5), nullable=False),
    sa.Column('name', sa.String(length=128), nullable=False),
    sa.Column('status', sa.String(length=9), nullable=False),
    sa.Column('description', sa.String(), nullable=False),
    sa.Column('date_opened', sa.DateTime(), nullable=False),
    sa.Column('date_closed', sa.DateTime(), nullable=True),
    sa.Column('deadline', sa.DateTime(), nullable=True),
    sa.Column('created_on', sa.DateTime(), nullable=False),
    sa.PrimaryKeyConstraint('object_id')
    )
    op.create_index(op.f('ix_deliverables_project_id'), 'deliverables', ['project_id'], unique=False)
    op.create_table('field_files',
    sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
    sa.Column('parent_id', sa.String(length=15), nullable=False),
    sa.Column('uri', sa.String(length=2048), nullable=False),
    sa.Column('created_on', sa.DateTime(), nullable=False),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('parent_id', 'uri')
    )
    op.create_index(op.f('ix_field_files_parent_id'), 'field_files', ['parent_id'], unique=False)
    op.create_index(op.f('ix_field_files_uri'), 'field_files', ['uri'], unique=False)
    op.create_table('field_links',
    sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
    sa.Column('parent_id', sa.String(length=15), nullable=False),
    sa.Column('uri', sa.String(length=2048), nullable=False),
    sa.Column('title', sa.String(length=126), nullable=False),
    sa.Column('description', sa.String(length=4096), nullable=False),
    sa.Column('created_on', sa.DateTime(), nullable=False),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('parent_id', 'uri')
    )
    op.create_index(op.f('ix_field_links_parent_id'), 'field_links', ['parent_id'], unique=False)
    op.create_table('field_tags',
    sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
    sa.Column('parent_id', sa.String(length=15), nullable=False),
    sa.Column('tag', sa.String(length=50), nullable=False),
    sa.Column('created_on', sa.DateTime(), nullable=False),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('parent_id', 'tag')
    )
    op.create_index(op.f('ix_field_tags_parent_id'), 'field_tags', ['parent_id'], unique=False)
    op.create_index(op.f('ix_field_tags_tag'), 'field_tags', ['tag'], unique=False)
    op.create_table('field_tasks',
    sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
    sa.Column('parent_id', sa.String(length=15), nullable=False),
    sa.Column('task', sa.String(length=1024), nullable=False),
    sa.Column('completed', sa.Boolean(), nullable=False),
    sa.Column('created_on', sa.DateTime(), nullable=False),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('parent_id', 'task')
    )
    op.create_index(op.f('ix_field_tasks_parent_id'), 'field_tasks', ['parent_id'], unique=False)
    op.create_table('ids_counter',
    sa.Column('project_id', sa.String(length=15), nullable=False),
    sa.Column('next_id', sa.Integer(), nullable=False),
    sa.PrimaryKeyConstraint('project_id')
    )
    op.create_table('issues',
    sa.Column('object_id', sa.String(length=15), nullable=False),
    sa.Column('project_id', sa.String(length=5), nullable=False),
    sa.Column('name', sa.String(length=128), nullable=False),
    sa.Column('description', sa.String(), nullable=False),
    sa.Column('external_type', sa.String(length=256), nullable=False),
    sa.Column('status', sa.String(length=11), nullable=False),
    sa.Column('priority', sa.String(length=11), nullable=False),
    sa.Column('type', sa.String(length=11), nullable=False),
    sa.Column('date_opened', sa.DateTime(), nullable=False),
    sa.Column('date_closed', sa.DateTime(), nullable=True),
    sa.Column('deadline', sa.DateTime(), nullable=True),
    sa.Column('hour_rate_amount', sa.DECIMAL(), nullable=True),
    sa.Column('hour_rate_currency', sa.String(length=3), nullable=True),
    sa.Column('estimated_duration', sa.DECIMAL(), nullable=True),
    sa.Column('created_on', sa.DateTime(), nullable=False),
    sa.PrimaryKeyConstraint('object_id')
    )
    op.create_index(op.f('ix_issues_project_id'), 'issues', ['project_id'], unique=False)
    op.create_table('objects_tracker',
    sa.Column('id', sa.String(length=15), nullable=False),
    sa.Column('project_id', sa.String(length=5), nullable=False),
    sa.Column('type', sa.String(length=11), nullable=False),
    sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_objects_tracker_project_id'), 'objects_tracker', ['project_id'], unique=False)
    op.create_table('projects',
    sa.Column('project_id', sa.String(length=5), nullable=False),
    sa.Column('name', sa.String(length=128), nullable=False),
    sa.Column('status', sa.String(length=9), nullable=False),
    sa.Column('date_opened', sa.DateTime(), nullable=False),
    sa.Column('date_closed', sa.DateTime(), nullable=True),
    sa.Column('deadline', sa.DateTime(), nullable=True),
    sa.Column('hour_rate_amount', sa.DECIMAL(), nullable=True),
    sa.Column('hour_rate_currency', sa.String(length=3), nullable=True),
    sa.Column('description', sa.String(), nullable=False),
    sa.Column('limitations_and_restrictions', sa.String(), nullable=False),
    sa.Column('goals_and_metrics', sa.String(), nullable=False),
    sa.Column('primary_color', sa.String(length=7), nullable=False),
    sa.Column('secondary_color', sa.String(length=7), nullable=False),
    sa.Column('created_on', sa.DateTime(), nullable=False),
    sa.PrimaryKeyConstraint('project_id')
    )
    op.create_table('users',
    sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
    sa.Column('username', sa.String(length=64), nullable=False),
    sa.Column('password', sa.LargeBinary(length=256), nullable=False),
    sa.PrimaryKeyConstraint('id'),
    sa.UniqueConstraint('username')
    )
    op.create_table('entity_links',
    sa.Column('object_id', sa.String(length=15), nullable=False),
    sa.Column('other_object_id', sa.String(length=15), nullable=False),
    sa.ForeignKeyConstraint(['object_id'], ['objects_tracker.id'], ondelete='RESTRICT'),
    sa.ForeignKeyConstraint(['other_object_id'], ['objects_tracker.id'], ondelete='RESTRICT'),
    sa.PrimaryKeyConstraint('object_id', 'other_object_id')
    )
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table('entity_links')
    op.drop_table('users')
    op.drop_table('projects')
    op.drop_index(op.f('ix_objects_tracker_project_id'), table_name='objects_tracker')
    op.drop_table('objects_tracker')
    op.drop_index(op.f('ix_issues_project_id'), table_name='issues')
    op.drop_table('issues')
    op.drop_table('ids_counter')
    op.drop_index(op.f('ix_field_tasks_parent_id'), table_name='field_tasks')
    op.drop_table('field_tasks')
    op.drop_index(op.f('ix_field_tags_tag'), table_name='field_tags')
    op.drop_index(op.f('ix_field_tags_parent_id'), table_name='field_tags')
    op.drop_table('field_tags')
    op.drop_index(op.f('ix_field_links_parent_id'), table_name='field_links')
    op.drop_table('field_links')
    op.drop_index(op.f('ix_field_files_uri'), table_name='field_files')
    op.drop_index(op.f('ix_field_files_parent_id'), table_name='field_files')
    op.drop_table('field_files')
    op.drop_index(op.f('ix_deliverables_project_id'), table_name='deliverables')
    op.drop_table('deliverables')
    # ### end Alembic commands ###

# File: configs/ModianConfig-demo.py
# Repo: IcyBiscuit/modian-lottery-bot (license: MIT)

config = {
"qqGroup": "",
"pro_ids": ['11111', '22222'],
"daily": {
"pro_ids": ['11111']
},
# "pk": {
# "me": "22222",
# "vs": ['33333']
# }
"dailyInterval": 25,
"pkInterval": 30
}
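A hypothetical consumer of this demo config (restating a trimmed copy of the dict so the sketch runs standalone) would read the polling intervals and project ids like so:

```python
# Trimmed copy of the demo config above, for a standalone sketch.
config = {
    "pro_ids": ['11111', '22222'],
    "daily": {"pro_ids": ['11111']},
    "dailyInterval": 25,
    "pkInterval": 30,
}

# Projects polled daily should be a subset of all monitored projects.
daily_ids = set(config["daily"]["pro_ids"])
all_ids = set(config["pro_ids"])
print(daily_ids.issubset(all_ids), config["dailyInterval"])  # -> True 25
```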
db9f6a4b796b2b598d84c72f996c42f6337233e9 | 2,497 | py | Python | var/spack/repos/builtin/packages/openexr/package.py | kkauder/spack | 6ae8d5c380c1f42094b05d38be26b03650aafb39 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 2 | 2020-09-10T22:50:08.000Z | 2021-01-12T22:18:54.000Z | var/spack/repos/builtin/packages/openexr/package.py | kkauder/spack | 6ae8d5c380c1f42094b05d38be26b03650aafb39 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 32 | 2020-12-15T17:29:20.000Z | 2022-03-21T15:08:31.000Z | var/spack/repos/builtin/packages/openexr/package.py | kkauder/spack | 6ae8d5c380c1f42094b05d38be26b03650aafb39 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 2 | 2018-04-06T09:04:11.000Z | 2020-01-24T12:52:12.000Z | # Copyright 2013-2021 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *

class Openexr(AutotoolsPackage):
    """OpenEXR Graphics Tools (high dynamic-range image file format)"""

    homepage = "http://www.openexr.com/"
    url = "https://github.com/openexr/openexr/releases/download/v2.3.0/openexr-2.3.0.tar.gz"

    # New versions should come from github now
    version('2.3.0', sha256='fd6cb3a87f8c1a233be17b94c74799e6241d50fc5efd4df75c7a4b9cf4e25ea6')
    version('2.2.0', sha256='36a012f6c43213f840ce29a8b182700f6cf6b214bea0d5735594136b44914231',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-2.2.0.tar.gz")
    version('2.1.0', sha256='54486b454073c1dcb5ae9892cf0f730ffefe62f38176325281505093fd218a14',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-2.1.0.tar.gz")
    version('2.0.1', sha256='b9924d2f9d57376ff99234209231ad97a47f5cfebd18a5d0570db6d1a220685a',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-2.0.1.tar.gz")
    version('1.7.0', sha256='b68a2164d01bd028d15bd96af2704634a344e291dc7cc2019a662045d8c52ca4',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-1.7.0.tar.gz")
    version('1.6.1', sha256='c616906ab958de9c37bb86ca7547cfedbdfbad5e1ca2a4ab98983c9afa6a5950',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-1.6.1.tar.gz")
    version('1.5.0', sha256='5a745eee4b8ab94cd16f85528c2debfebe6aa1ba23f5b8fc7933d4aa5c3c3416',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-1.5.0.tar.gz")
    version('1.4.0a', sha256='5d8a7327bd28eeb5d3064640d8eb32c3cd8c5a15999c70b0afa9f8af851936d1',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-1.4.0a.tar.gz")
    version('1.3.2', sha256='fa08ad904bf89e2968078d25d1d9817f5bc17f372d1bafabf82e8f08ca2adc20',
            url="http://download.savannah.nongnu.org/releases/openexr/openexr-1.3.2.tar.gz")

    variant('debug', default=False,
            description='Builds a debug version of the libraries')

    depends_on('pkgconfig', type='build')
    depends_on('ilmbase')
    depends_on('zlib', type=('build', 'link'))

    def configure_args(self):
        configure_options = []
        configure_options += self.enable_or_disable('debug')
        return configure_options
# Cursos-Extras/Python/ex014.py (talessantos49/Primeiros_Passos, MIT)
temp = float(input('What is the current temperature in °C? '))
n1 = (temp * (9/5)) + 32
print('The temperature {:.2f}°C is {:.2f}°F in Fahrenheit'.format(temp, n1))
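The conversion above can be factored into a small function so the formula is testable without `input()`; `celsius_to_fahrenheit` is a name added here for illustration:

```python
def celsius_to_fahrenheit(c):
    # Same formula as the script: F = C * 9/5 + 32
    return c * 9 / 5 + 32
```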
# mayan/apps/autoadmin/management/commands/createautoadmin.py (nattangwiwat/Mayan-EDMS-recitation, Apache-2.0)
from django.core.management.base import BaseCommand
from ...models import AutoAdminSingleton

class Command(BaseCommand):
    help = 'Used to create a superuser with a secure and automatic password.'

    def handle(self, *args, **options):
        AutoAdminSingleton.objects.create_autoadmin()
# commands/pub_add_command.py (kennethnym/Subliminal, MIT)
from .dart_command import DartCommand
import sublime_plugin

class PubAddCommand(DartCommand):
    def run(self, package_name):
        if project := super().project():
            project.pub_add(package_name)

    def input(self, _):
        return PackageNameInputHandler()


class PackageNameInputHandler(sublime_plugin.TextInputHandler):
    def name(self):
        return "package_name"

    def validate(self, arg: str) -> bool:
        return bool(arg)

    def placeholder(self):
        return "package_name[:@x.y.z]"
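`PubAddCommand.run` guards its body with an assignment expression (the walrus operator). A plain-Python sketch of the same pattern, with a hypothetical `lookup` standing in for `super().project()`:

```python
TABLE = {'dart': 'pub'}  # stand-in data; names here are hypothetical

def lookup(key):
    # Plays the role of super().project(): a value when found, None otherwise.
    return TABLE.get(key)

def run(key):
    # Same guard shape as PubAddCommand.run: bind and test in one expression.
    if result := lookup(key):
        return 'add via %s' % result
    return None
```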
# Exercise-1/Q10_sumReturn.py (abhay-lal/18CSC207J-APP, MIT)
int1 = input("Enter first integer: ")
int2 = input("Enter second integer: ")
total = int(int1) + int(int2)  # 'total' avoids shadowing the built-in sum()
if total in range(105, 201):
    print(200)
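The check above can be wrapped in a function so it runs without `input()`. This is a hypothetical variant that also returns the raw sum outside the band (the script only prints the in-band case); the `range(105, 201)` bounds are kept from the exercise:

```python
def sum_in_band(a, b, lo=105, hi=200):
    # 200 when the sum lands in [lo, hi], otherwise the raw sum (assumption).
    total = a + b
    return 200 if lo <= total <= hi else total
```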
# src/data/margy_filtering.py (Quentindcf-grindstone/member_value_estimator, MIT)
import pandas as pd
raw_member_df = pd.read_csv('../../data/raw/member_data_raw.csv')
raw_member_df = raw_member_df[raw_member_df['start_date'] > '2021-03-15']
print(raw_member_df.head())
raw_member_df = raw_member_df[raw_member_df['amount_paid'] > 0]
raw_member_df = raw_member_df[raw_member_df['affiliate_id'] == 129304]
print(raw_member_df.head())
raw_member_df.to_csv('../../data/raw/v.csv')
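The script chains three boolean-mask filters over the frame. The same logic, sketched in plain Python (no pandas, made-up rows) so it runs without the CSV:

```python
rows = [
    {'start_date': '2021-03-10', 'amount_paid': 0,  'affiliate_id': 129304},
    {'start_date': '2021-03-20', 'amount_paid': 50, 'affiliate_id': 129304},
    {'start_date': '2021-04-01', 'amount_paid': 75, 'affiliate_id': 999},
]

# The three DataFrame masks above, expressed as one predicate.
filtered = [r for r in rows
            if r['start_date'] > '2021-03-15'   # ISO dates compare lexically
            and r['amount_paid'] > 0
            and r['affiliate_id'] == 129304]
```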
# ndarray/list_().py (Hupengyu/Paddle_learning, Apache-2.0)
movie_name = ['加勒比海盗', '骇客帝国', '第一滴血', '指环王', '霍比特人', '速度与激情']
print('------ before deletion ------')
for temp in movie_name:
    print(temp)
del movie_name[2]
print('------ after deletion ------')
for temp in movie_name:
    print(temp)
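`del movie_name[2]` removes by position. A small contrast with `list.remove`, which removes the first matching value instead:

```python
names = ['a', 'b', 'c', 'b']
del names[2]        # removes by index -> ['a', 'b', 'b']
names.remove('b')   # removes the first matching value -> ['a', 'b']
```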
#!/usr/bin/env pypy
# oi/51nod/P1065/gen.py (Riteme/test, BSD-Source-Code)
from sys import argv
from random import *
n, m = map(int, argv[1:])
print(n)
print(" ".join(map(str, [randint(-m, m) for i in range(n)])))
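Seeding the RNG makes a generator like this reproducible. A sketch using `random.Random` so the same `n`, `m`, and seed always yield the same case (`make_case` is a name added for illustration):

```python
from random import Random

def make_case(n, m, seed=0):
    # Deterministic variant of the generator above: line 1 is n,
    # line 2 is n integers drawn uniformly from [-m, m].
    rng = Random(seed)
    nums = [rng.randint(-m, m) for _ in range(n)]
    return '%d\n%s' % (n, ' '.join(map(str, nums)))

case = make_case(5, 10, seed=1)
```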
# app/entities/Runtime/dump.py (u8sand/FAIRshake, Apache-2.0)
from injector import Injector, Module, provider, singleton, inject
from ...ioc import injector
from ...types import Apps

class CommandLineDump:
    def run():
        import os
        from app.util.generate_spec import json_to_yml
        from app.interfaces.Assessment import AssessmentSpec
        from app.interfaces.Repository import RepositorySpec
        from app.interfaces.Rubric import RubricSpec
        from app.interfaces.Score import ScoreSpec

        if not os.path.isdir('spec'):
            os.mkdir('spec')

        for name, spec in {
            'assessment': AssessmentSpec,
            'repository': RepositorySpec,
            'rubric': RubricSpec,
            'score': ScoreSpec,
        }.items():
            with open('spec/%s.yml' % (name), 'w') as fh:
                print(
                    json_to_yml(
                        spec
                    ),
                    file=fh,
                )

injector.binder.bind(Apps, to={'dump': CommandLineDump}, scope=singleton)
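`run()` writes one `spec/<name>.yml` per spec name. The same write-per-key pattern, sketched with a temp directory and plain strings standing in for `json_to_yml` and the spec classes:

```python
import os
import tempfile

# Stand-ins: plain strings in place of json_to_yml(spec) output.
specs = {'assessment': 'kind: assessment', 'score': 'kind: score'}

outdir = tempfile.mkdtemp()
for name, text in specs.items():
    # One <name>.yml per key, mirroring spec/<name>.yml in run() above.
    with open(os.path.join(outdir, '%s.yml' % name), 'w') as fh:
        print(text, file=fh)
```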
# models_nonconvex_simple/autocorr_bern55-06.py (grossmann-group/pyomo-MINLP-benchmarking, MIT)
# MINLP written by GAMS Convert at 08/13/20 17:37:53
#
# Equation counts
#     Total        E        G        L        N        X        C        B
#         1        0        0        1        0        0        0        0
#
# Variable counts
#                  x        b        i      s1s      s2s       sc       si
#     Total     cont   binary  integer     sos1     sos2    scont     sint
#        56        1       55        0        0        0        0        0
# FX      0        0        0        0        0        0        0        0
#
# Nonzero counts
#     Total    const       NL      DLL
#        56        1       55        0
from pyomo.environ import *
model = m = ConcreteModel()
m.b1 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b2 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b3 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b4 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b5 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b6 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b7 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b8 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b9 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b10 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b11 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b12 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b13 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b14 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b15 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b16 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b17 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b18 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b19 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b20 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b21 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b22 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b23 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b24 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b25 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b26 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b27 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b28 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b29 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b30 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b31 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b32 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b33 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b34 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b35 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b36 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b37 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b38 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b39 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b40 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b41 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b42 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b43 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b44 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b45 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b46 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b47 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b48 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b49 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b50 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b51 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b52 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b53 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b54 = Var(within=Binary,bounds=(0,1),initialize=0)
m.b55 = Var(within=Binary,bounds=(0,1),initialize=0)
m.x56 = Var(within=Reals,bounds=(None,None),initialize=0)
m.obj = Objective(expr=m.x56, sense=minimize)
m.c1 = Constraint(expr=64*m.b1*m.b2*m.b3*m.b4 + 64*m.b1*m.b2*m.b4*m.b5 + 64*m.b1*m.b2*m.b5*m.b6 + 64*m.b1*m.b3*m.b4*m.b6
+ 128*m.b2*m.b3*m.b4*m.b5 + 128*m.b2*m.b3*m.b5*m.b6 + 64*m.b2*m.b3*m.b6*m.b7 + 64*m.b2*m.b4*m.b5
*m.b7 + 192*m.b3*m.b4*m.b5*m.b6 + 128*m.b3*m.b4*m.b6*m.b7 + 64*m.b3*m.b4*m.b7*m.b8 + 64*m.b3*m.b5
*m.b6*m.b8 + 192*m.b4*m.b5*m.b6*m.b7 + 128*m.b4*m.b5*m.b7*m.b8 + 64*m.b4*m.b5*m.b8*m.b9 + 64*m.b4
*m.b6*m.b7*m.b9 + 192*m.b5*m.b6*m.b7*m.b8 + 128*m.b5*m.b6*m.b8*m.b9 + 64*m.b5*m.b6*m.b9*m.b10 +
64*m.b5*m.b7*m.b8*m.b10 + 192*m.b6*m.b7*m.b8*m.b9 + 128*m.b6*m.b7*m.b9*m.b10 + 64*m.b6*m.b7*m.b10
*m.b11 + 64*m.b6*m.b8*m.b9*m.b11 + 192*m.b7*m.b8*m.b9*m.b10 + 128*m.b7*m.b8*m.b10*m.b11 + 64*m.b7
*m.b8*m.b11*m.b12 + 64*m.b7*m.b9*m.b10*m.b12 + 192*m.b8*m.b9*m.b10*m.b11 + 128*m.b8*m.b9*m.b11*
m.b12 + 64*m.b8*m.b9*m.b12*m.b13 + 64*m.b8*m.b10*m.b11*m.b13 + 192*m.b9*m.b10*m.b11*m.b12 + 128*
m.b9*m.b10*m.b12*m.b13 + 64*m.b9*m.b10*m.b13*m.b14 + 64*m.b9*m.b11*m.b12*m.b14 + 192*m.b10*m.b11*
m.b12*m.b13 + 128*m.b10*m.b11*m.b13*m.b14 + 64*m.b10*m.b11*m.b14*m.b15 + 64*m.b10*m.b12*m.b13*
m.b15 + 192*m.b11*m.b12*m.b13*m.b14 + 128*m.b11*m.b12*m.b14*m.b15 + 64*m.b11*m.b12*m.b15*m.b16 +
64*m.b11*m.b13*m.b14*m.b16 + 192*m.b12*m.b13*m.b14*m.b15 + 128*m.b12*m.b13*m.b15*m.b16 + 64*m.b12
*m.b13*m.b16*m.b17 + 64*m.b12*m.b14*m.b15*m.b17 + 192*m.b13*m.b14*m.b15*m.b16 + 128*m.b13*m.b14*
m.b16*m.b17 + 64*m.b13*m.b14*m.b17*m.b18 + 64*m.b13*m.b15*m.b16*m.b18 + 192*m.b14*m.b15*m.b16*
m.b17 + 128*m.b14*m.b15*m.b17*m.b18 + 64*m.b14*m.b15*m.b18*m.b19 + 64*m.b14*m.b16*m.b17*m.b19 +
192*m.b15*m.b16*m.b17*m.b18 + 128*m.b15*m.b16*m.b18*m.b19 + 64*m.b15*m.b16*m.b19*m.b20 + 64*m.b15
*m.b17*m.b18*m.b20 + 192*m.b16*m.b17*m.b18*m.b19 + 128*m.b16*m.b17*m.b19*m.b20 + 64*m.b16*m.b17*
m.b20*m.b21 + 64*m.b16*m.b18*m.b19*m.b21 + 192*m.b17*m.b18*m.b19*m.b20 + 128*m.b17*m.b18*m.b20*
m.b21 + 64*m.b17*m.b18*m.b21*m.b22 + 64*m.b17*m.b19*m.b20*m.b22 + 192*m.b18*m.b19*m.b20*m.b21 +
128*m.b18*m.b19*m.b21*m.b22 + 64*m.b18*m.b19*m.b22*m.b23 + 64*m.b18*m.b20*m.b21*m.b23 + 192*m.b19
*m.b20*m.b21*m.b22 + 128*m.b19*m.b20*m.b22*m.b23 + 64*m.b19*m.b20*m.b23*m.b24 + 64*m.b19*m.b21*
m.b22*m.b24 + 192*m.b20*m.b21*m.b22*m.b23 + 128*m.b20*m.b21*m.b23*m.b24 + 64*m.b20*m.b21*m.b24*
m.b25 + 64*m.b20*m.b22*m.b23*m.b25 + 192*m.b21*m.b22*m.b23*m.b24 + 128*m.b21*m.b22*m.b24*m.b25 +
64*m.b21*m.b22*m.b25*m.b26 + 64*m.b21*m.b23*m.b24*m.b26 + 192*m.b22*m.b23*m.b24*m.b25 + 128*m.b22
*m.b23*m.b25*m.b26 + 64*m.b22*m.b23*m.b26*m.b27 + 64*m.b22*m.b24*m.b25*m.b27 + 192*m.b23*m.b24*
m.b25*m.b26 + 128*m.b23*m.b24*m.b26*m.b27 + 64*m.b23*m.b24*m.b27*m.b28 + 64*m.b23*m.b25*m.b26*
m.b28 + 192*m.b24*m.b25*m.b26*m.b27 + 128*m.b24*m.b25*m.b27*m.b28 + 64*m.b24*m.b25*m.b28*m.b29 +
64*m.b24*m.b26*m.b27*m.b29 + 192*m.b25*m.b26*m.b27*m.b28 + 128*m.b25*m.b26*m.b28*m.b29 + 64*m.b25
*m.b26*m.b29*m.b30 + 64*m.b25*m.b27*m.b28*m.b30 + 192*m.b26*m.b27*m.b28*m.b29 + 128*m.b26*m.b27*
m.b29*m.b30 + 64*m.b26*m.b27*m.b30*m.b31 + 64*m.b26*m.b28*m.b29*m.b31 + 192*m.b27*m.b28*m.b29*
m.b30 + 128*m.b27*m.b28*m.b30*m.b31 + 64*m.b27*m.b28*m.b31*m.b32 + 64*m.b27*m.b29*m.b30*m.b32 +
192*m.b28*m.b29*m.b30*m.b31 + 128*m.b28*m.b29*m.b31*m.b32 + 64*m.b28*m.b29*m.b32*m.b33 + 64*m.b28
*m.b30*m.b31*m.b33 + 192*m.b29*m.b30*m.b31*m.b32 + 128*m.b29*m.b30*m.b32*m.b33 + 64*m.b29*m.b30*
m.b33*m.b34 + 64*m.b29*m.b31*m.b32*m.b34 + 192*m.b30*m.b31*m.b32*m.b33 + 128*m.b30*m.b31*m.b33*
m.b34 + 64*m.b30*m.b31*m.b34*m.b35 + 64*m.b30*m.b32*m.b33*m.b35 + 192*m.b31*m.b32*m.b33*m.b34 +
128*m.b31*m.b32*m.b34*m.b35 + 64*m.b31*m.b32*m.b35*m.b36 + 64*m.b31*m.b33*m.b34*m.b36 + 192*m.b32
*m.b33*m.b34*m.b35 + 128*m.b32*m.b33*m.b35*m.b36 + 64*m.b32*m.b33*m.b36*m.b37 + 64*m.b32*m.b34*
m.b35*m.b37 + 192*m.b33*m.b34*m.b35*m.b36 + 128*m.b33*m.b34*m.b36*m.b37 + 64*m.b33*m.b34*m.b37*
m.b38 + 64*m.b33*m.b35*m.b36*m.b38 + 192*m.b34*m.b35*m.b36*m.b37 + 128*m.b34*m.b35*m.b37*m.b38 +
64*m.b34*m.b35*m.b38*m.b39 + 64*m.b34*m.b36*m.b37*m.b39 + 192*m.b35*m.b36*m.b37*m.b38 + 128*m.b35
*m.b36*m.b38*m.b39 + 64*m.b35*m.b36*m.b39*m.b40 + 64*m.b35*m.b37*m.b38*m.b40 + 192*m.b36*m.b37*
m.b38*m.b39 + 128*m.b36*m.b37*m.b39*m.b40 + 64*m.b36*m.b37*m.b40*m.b41 + 64*m.b36*m.b38*m.b39*
m.b41 + 192*m.b37*m.b38*m.b39*m.b40 + 128*m.b37*m.b38*m.b40*m.b41 + 64*m.b37*m.b38*m.b41*m.b42 +
64*m.b37*m.b39*m.b40*m.b42 + 192*m.b38*m.b39*m.b40*m.b41 + 128*m.b38*m.b39*m.b41*m.b42 + 64*m.b38
*m.b39*m.b42*m.b43 + 64*m.b38*m.b40*m.b41*m.b43 + 192*m.b39*m.b40*m.b41*m.b42 + 128*m.b39*m.b40*
m.b42*m.b43 + 64*m.b39*m.b40*m.b43*m.b44 + 64*m.b39*m.b41*m.b42*m.b44 + 192*m.b40*m.b41*m.b42*
m.b43 + 128*m.b40*m.b41*m.b43*m.b44 + 64*m.b40*m.b41*m.b44*m.b45 + 64*m.b40*m.b42*m.b43*m.b45 +
192*m.b41*m.b42*m.b43*m.b44 + 128*m.b41*m.b42*m.b44*m.b45 + 64*m.b41*m.b42*m.b45*m.b46 + 64*m.b41
*m.b43*m.b44*m.b46 + 192*m.b42*m.b43*m.b44*m.b45 + 128*m.b42*m.b43*m.b45*m.b46 + 64*m.b42*m.b43*
m.b46*m.b47 + 64*m.b42*m.b44*m.b45*m.b47 + 192*m.b43*m.b44*m.b45*m.b46 + 128*m.b43*m.b44*m.b46*
m.b47 + 64*m.b43*m.b44*m.b47*m.b48 + 64*m.b43*m.b45*m.b46*m.b48 + 192*m.b44*m.b45*m.b46*m.b47 +
128*m.b44*m.b45*m.b47*m.b48 + 64*m.b44*m.b45*m.b48*m.b49 + 64*m.b44*m.b46*m.b47*m.b49 + 192*m.b45
*m.b46*m.b47*m.b48 + 128*m.b45*m.b46*m.b48*m.b49 + 64*m.b45*m.b46*m.b49*m.b50 + 64*m.b45*m.b47*
m.b48*m.b50 + 192*m.b46*m.b47*m.b48*m.b49 + 128*m.b46*m.b47*m.b49*m.b50 + 64*m.b46*m.b47*m.b50*
m.b51 + 64*m.b46*m.b48*m.b49*m.b51 + 192*m.b47*m.b48*m.b49*m.b50 + 128*m.b47*m.b48*m.b50*m.b51 +
64*m.b47*m.b48*m.b51*m.b52 + 64*m.b47*m.b49*m.b50*m.b52 + 192*m.b48*m.b49*m.b50*m.b51 + 128*m.b48
*m.b49*m.b51*m.b52 + 64*m.b48*m.b49*m.b52*m.b53 + 64*m.b48*m.b50*m.b51*m.b53 + 192*m.b49*m.b50*
m.b51*m.b52 + 128*m.b49*m.b50*m.b52*m.b53 + 64*m.b49*m.b50*m.b53*m.b54 + 64*m.b49*m.b51*m.b52*
m.b54 + 192*m.b50*m.b51*m.b52*m.b53 + 128*m.b50*m.b51*m.b53*m.b54 + 64*m.b50*m.b51*m.b54*m.b55 +
64*m.b50*m.b52*m.b53*m.b55 + 128*m.b51*m.b52*m.b53*m.b54 + 64*m.b51*m.b52*m.b54*m.b55 + 64*m.b52*
m.b53*m.b54*m.b55 - 32*m.b1*m.b2*m.b3 - 64*m.b1*m.b2*m.b4 - 64*m.b1*m.b2*m.b5 - 32*m.b1*m.b2*m.b6
- 64*m.b1*m.b3*m.b4 - 32*m.b1*m.b3*m.b6 - 32*m.b1*m.b4*m.b5 - 32*m.b1*m.b4*m.b6 - 32*m.b1*m.b5*
m.b6 - 96*m.b2*m.b3*m.b4 - 128*m.b2*m.b3*m.b5 - 96*m.b2*m.b3*m.b6 - 32*m.b2*m.b3*m.b7 - 128*m.b2*
m.b4*m.b5 - 32*m.b2*m.b4*m.b7 - 96*m.b2*m.b5*m.b6 - 32*m.b2*m.b5*m.b7 - 32*m.b2*m.b6*m.b7 - 160*
m.b3*m.b4*m.b5 - 192*m.b3*m.b4*m.b6 - 96*m.b3*m.b4*m.b7 - 32*m.b3*m.b4*m.b8 - 192*m.b3*m.b5*m.b6
- 32*m.b3*m.b5*m.b8 - 96*m.b3*m.b6*m.b7 - 32*m.b3*m.b6*m.b8 - 32*m.b3*m.b7*m.b8 - 192*m.b4*m.b5*
m.b6 - 192*m.b4*m.b5*m.b7 - 96*m.b4*m.b5*m.b8 - 32*m.b4*m.b5*m.b9 - 192*m.b4*m.b6*m.b7 - 32*m.b4*
m.b6*m.b9 - 96*m.b4*m.b7*m.b8 - 32*m.b4*m.b7*m.b9 - 32*m.b4*m.b8*m.b9 - 192*m.b5*m.b6*m.b7 - 192*
m.b5*m.b6*m.b8 - 96*m.b5*m.b6*m.b9 - 32*m.b5*m.b6*m.b10 - 192*m.b5*m.b7*m.b8 - 32*m.b5*m.b7*m.b10
- 96*m.b5*m.b8*m.b9 - 32*m.b5*m.b8*m.b10 - 32*m.b5*m.b9*m.b10 - 192*m.b6*m.b7*m.b8 - 192*m.b6*
m.b7*m.b9 - 96*m.b6*m.b7*m.b10 - 32*m.b6*m.b7*m.b11 - 192*m.b6*m.b8*m.b9 - 32*m.b6*m.b8*m.b11 -
96*m.b6*m.b9*m.b10 - 32*m.b6*m.b9*m.b11 - 32*m.b6*m.b10*m.b11 - 192*m.b7*m.b8*m.b9 - 192*m.b7*
m.b8*m.b10 - 96*m.b7*m.b8*m.b11 - 32*m.b7*m.b8*m.b12 - 192*m.b7*m.b9*m.b10 - 32*m.b7*m.b9*m.b12
- 96*m.b7*m.b10*m.b11 - 32*m.b7*m.b10*m.b12 - 32*m.b7*m.b11*m.b12 - 192*m.b8*m.b9*m.b10 - 192*
m.b8*m.b9*m.b11 - 96*m.b8*m.b9*m.b12 - 32*m.b8*m.b9*m.b13 - 192*m.b8*m.b10*m.b11 - 32*m.b8*m.b10*
m.b13 - 96*m.b8*m.b11*m.b12 - 32*m.b8*m.b11*m.b13 - 32*m.b8*m.b12*m.b13 - 192*m.b9*m.b10*m.b11 -
192*m.b9*m.b10*m.b12 - 96*m.b9*m.b10*m.b13 - 32*m.b9*m.b10*m.b14 - 192*m.b9*m.b11*m.b12 - 32*m.b9
*m.b11*m.b14 - 96*m.b9*m.b12*m.b13 - 32*m.b9*m.b12*m.b14 - 32*m.b9*m.b13*m.b14 - 192*m.b10*m.b11*
m.b12 - 192*m.b10*m.b11*m.b13 - 96*m.b10*m.b11*m.b14 - 32*m.b10*m.b11*m.b15 - 192*m.b10*m.b12*
m.b13 - 32*m.b10*m.b12*m.b15 - 96*m.b10*m.b13*m.b14 - 32*m.b10*m.b13*m.b15 - 32*m.b10*m.b14*m.b15
- 192*m.b11*m.b12*m.b13 - 192*m.b11*m.b12*m.b14 - 96*m.b11*m.b12*m.b15 - 32*m.b11*m.b12*m.b16 -
192*m.b11*m.b13*m.b14 - 32*m.b11*m.b13*m.b16 - 96*m.b11*m.b14*m.b15 - 32*m.b11*m.b14*m.b16 - 32*
m.b11*m.b15*m.b16 - 192*m.b12*m.b13*m.b14 - 192*m.b12*m.b13*m.b15 - 96*m.b12*m.b13*m.b16 - 32*
m.b12*m.b13*m.b17 - 192*m.b12*m.b14*m.b15 - 32*m.b12*m.b14*m.b17 - 96*m.b12*m.b15*m.b16 - 32*
m.b12*m.b15*m.b17 - 32*m.b12*m.b16*m.b17 - 192*m.b13*m.b14*m.b15 - 192*m.b13*m.b14*m.b16 - 96*
m.b13*m.b14*m.b17 - 32*m.b13*m.b14*m.b18 - 192*m.b13*m.b15*m.b16 - 32*m.b13*m.b15*m.b18 - 96*
m.b13*m.b16*m.b17 - 32*m.b13*m.b16*m.b18 - 32*m.b13*m.b17*m.b18 - 192*m.b14*m.b15*m.b16 - 192*
m.b14*m.b15*m.b17 - 96*m.b14*m.b15*m.b18 - 32*m.b14*m.b15*m.b19 - 192*m.b14*m.b16*m.b17 - 32*
m.b14*m.b16*m.b19 - 96*m.b14*m.b17*m.b18 - 32*m.b14*m.b17*m.b19 - 32*m.b14*m.b18*m.b19 - 192*
m.b15*m.b16*m.b17 - 192*m.b15*m.b16*m.b18 - 96*m.b15*m.b16*m.b19 - 32*m.b15*m.b16*m.b20 - 192*
m.b15*m.b17*m.b18 - 32*m.b15*m.b17*m.b20 - 96*m.b15*m.b18*m.b19 - 32*m.b15*m.b18*m.b20 - 32*m.b15
*m.b19*m.b20 - 192*m.b16*m.b17*m.b18 - 192*m.b16*m.b17*m.b19 - 96*m.b16*m.b17*m.b20 - 32*m.b16*
m.b17*m.b21 - 192*m.b16*m.b18*m.b19 - 32*m.b16*m.b18*m.b21 - 96*m.b16*m.b19*m.b20 - 32*m.b16*
m.b19*m.b21 - 32*m.b16*m.b20*m.b21 - 192*m.b17*m.b18*m.b19 - 192*m.b17*m.b18*m.b20 - 96*m.b17*
m.b18*m.b21 - 32*m.b17*m.b18*m.b22 - 192*m.b17*m.b19*m.b20 - 32*m.b17*m.b19*m.b22 - 96*m.b17*
m.b20*m.b21 - 32*m.b17*m.b20*m.b22 - 32*m.b17*m.b21*m.b22 - 192*m.b18*m.b19*m.b20 - 192*m.b18*
m.b19*m.b21 - 96*m.b18*m.b19*m.b22 - 32*m.b18*m.b19*m.b23 - 192*m.b18*m.b20*m.b21 - 32*m.b18*
m.b20*m.b23 - 96*m.b18*m.b21*m.b22 - 32*m.b18*m.b21*m.b23 - 32*m.b18*m.b22*m.b23 - 192*m.b19*
m.b20*m.b21 - 192*m.b19*m.b20*m.b22 - 96*m.b19*m.b20*m.b23 - 32*m.b19*m.b20*m.b24 - 192*m.b19*
m.b21*m.b22 - 32*m.b19*m.b21*m.b24 - 96*m.b19*m.b22*m.b23 - 32*m.b19*m.b22*m.b24 - 32*m.b19*m.b23
*m.b24 - 192*m.b20*m.b21*m.b22 - 192*m.b20*m.b21*m.b23 - 96*m.b20*m.b21*m.b24 - 32*m.b20*m.b21*
m.b25 - 192*m.b20*m.b22*m.b23 - 32*m.b20*m.b22*m.b25 - 96*m.b20*m.b23*m.b24 - 32*m.b20*m.b23*
m.b25 - 32*m.b20*m.b24*m.b25 - 192*m.b21*m.b22*m.b23 - 192*m.b21*m.b22*m.b24 - 96*m.b21*m.b22*
m.b25 - 32*m.b21*m.b22*m.b26 - 192*m.b21*m.b23*m.b24 - 32*m.b21*m.b23*m.b26 - 96*m.b21*m.b24*
m.b25 - 32*m.b21*m.b24*m.b26 - 32*m.b21*m.b25*m.b26 - 192*m.b22*m.b23*m.b24 - 192*m.b22*m.b23*
m.b25 - 96*m.b22*m.b23*m.b26 - 32*m.b22*m.b23*m.b27 - 192*m.b22*m.b24*m.b25 - 32*m.b22*m.b24*
m.b27 - 96*m.b22*m.b25*m.b26 - 32*m.b22*m.b25*m.b27 - 32*m.b22*m.b26*m.b27 - 192*m.b23*m.b24*
m.b25 - 192*m.b23*m.b24*m.b26 - 96*m.b23*m.b24*m.b27 - 32*m.b23*m.b24*m.b28 - 192*m.b23*m.b25*
m.b26 - 32*m.b23*m.b25*m.b28 - 96*m.b23*m.b26*m.b27 - 32*m.b23*m.b26*m.b28 - 32*m.b23*m.b27*m.b28
- 192*m.b24*m.b25*m.b26 - 192*m.b24*m.b25*m.b27 - 96*m.b24*m.b25*m.b28 - 32*m.b24*m.b25*m.b29 -
192*m.b24*m.b26*m.b27 - 32*m.b24*m.b26*m.b29 - 96*m.b24*m.b27*m.b28 - 32*m.b24*m.b27*m.b29 - 32*
m.b24*m.b28*m.b29 - 192*m.b25*m.b26*m.b27 - 192*m.b25*m.b26*m.b28 - 96*m.b25*m.b26*m.b29 - 32*
m.b25*m.b26*m.b30 - 192*m.b25*m.b27*m.b28 - 32*m.b25*m.b27*m.b30 - 96*m.b25*m.b28*m.b29 - 32*
m.b25*m.b28*m.b30 - 32*m.b25*m.b29*m.b30 - 192*m.b26*m.b27*m.b28 - 192*m.b26*m.b27*m.b29 - 96*
m.b26*m.b27*m.b30 - 32*m.b26*m.b27*m.b31 - 192*m.b26*m.b28*m.b29 - 32*m.b26*m.b28*m.b31 - 96*
m.b26*m.b29*m.b30 - 32*m.b26*m.b29*m.b31 - 32*m.b26*m.b30*m.b31 - 192*m.b27*m.b28*m.b29 - 192*
m.b27*m.b28*m.b30 - 96*m.b27*m.b28*m.b31 - 32*m.b27*m.b28*m.b32 - 192*m.b27*m.b29*m.b30 - 32*
m.b27*m.b29*m.b32 - 96*m.b27*m.b30*m.b31 - 32*m.b27*m.b30*m.b32 - 32*m.b27*m.b31*m.b32 - 192*
m.b28*m.b29*m.b30 - 192*m.b28*m.b29*m.b31 - 96*m.b28*m.b29*m.b32 - 32*m.b28*m.b29*m.b33 - 192*
m.b28*m.b30*m.b31 - 32*m.b28*m.b30*m.b33 - 96*m.b28*m.b31*m.b32 - 32*m.b28*m.b31*m.b33 - 32*m.b28
*m.b32*m.b33 - 192*m.b29*m.b30*m.b31 - 192*m.b29*m.b30*m.b32 - 96*m.b29*m.b30*m.b33 - 32*m.b29*
m.b30*m.b34 - 192*m.b29*m.b31*m.b32 - 32*m.b29*m.b31*m.b34 - 96*m.b29*m.b32*m.b33 - 32*m.b29*
m.b32*m.b34 - 32*m.b29*m.b33*m.b34 - 192*m.b30*m.b31*m.b32 - 192*m.b30*m.b31*m.b33 - 96*m.b30*
m.b31*m.b34 - 32*m.b30*m.b31*m.b35 - 192*m.b30*m.b32*m.b33 - 32*m.b30*m.b32*m.b35 - 96*m.b30*
m.b33*m.b34 - 32*m.b30*m.b33*m.b35 - 32*m.b30*m.b34*m.b35 - 192*m.b31*m.b32*m.b33 - 192*m.b31*
m.b32*m.b34 - 96*m.b31*m.b32*m.b35 - 32*m.b31*m.b32*m.b36 - 192*m.b31*m.b33*m.b34 - 32*m.b31*
m.b33*m.b36 - 96*m.b31*m.b34*m.b35 - 32*m.b31*m.b34*m.b36 - 32*m.b31*m.b35*m.b36 - 192*m.b32*
m.b33*m.b34 - 192*m.b32*m.b33*m.b35 - 96*m.b32*m.b33*m.b36 - 32*m.b32*m.b33*m.b37 - 192*m.b32*
m.b34*m.b35 - 32*m.b32*m.b34*m.b37 - 96*m.b32*m.b35*m.b36 - 32*m.b32*m.b35*m.b37 - 32*m.b32*m.b36
*m.b37 - 192*m.b33*m.b34*m.b35 - 192*m.b33*m.b34*m.b36 - 96*m.b33*m.b34*m.b37 - 32*m.b33*m.b34*
m.b38 - 192*m.b33*m.b35*m.b36 - 32*m.b33*m.b35*m.b38 - 96*m.b33*m.b36*m.b37 - 32*m.b33*m.b36*
m.b38 - 32*m.b33*m.b37*m.b38 - 192*m.b34*m.b35*m.b36 - 192*m.b34*m.b35*m.b37 - 96*m.b34*m.b35*
m.b38 - 32*m.b34*m.b35*m.b39 - 192*m.b34*m.b36*m.b37 - 32*m.b34*m.b36*m.b39 - 96*m.b34*m.b37*
m.b38 - 32*m.b34*m.b37*m.b39 - 32*m.b34*m.b38*m.b39 - 192*m.b35*m.b36*m.b37 - 192*m.b35*m.b36*
m.b38 - 96*m.b35*m.b36*m.b39 - 32*m.b35*m.b36*m.b40 - 192*m.b35*m.b37*m.b38 - 32*m.b35*m.b37*
m.b40 - 96*m.b35*m.b38*m.b39 - 32*m.b35*m.b38*m.b40 - 32*m.b35*m.b39*m.b40 - 192*m.b36*m.b37*
m.b38 - 192*m.b36*m.b37*m.b39 - 96*m.b36*m.b37*m.b40 - 32*m.b36*m.b37*m.b41 - 192*m.b36*m.b38*
m.b39 - 32*m.b36*m.b38*m.b41 - 96*m.b36*m.b39*m.b40 - 32*m.b36*m.b39*m.b41 - 32*m.b36*m.b40*m.b41
- 192*m.b37*m.b38*m.b39 - 192*m.b37*m.b38*m.b40 - 96*m.b37*m.b38*m.b41 - 32*m.b37*m.b38*m.b42 -
192*m.b37*m.b39*m.b40 - 32*m.b37*m.b39*m.b42 - 96*m.b37*m.b40*m.b41 - 32*m.b37*m.b40*m.b42 - 32*
m.b37*m.b41*m.b42 - 192*m.b38*m.b39*m.b40 - 192*m.b38*m.b39*m.b41 - 96*m.b38*m.b39*m.b42 - 32*
m.b38*m.b39*m.b43 - 192*m.b38*m.b40*m.b41 - 32*m.b38*m.b40*m.b43 - 96*m.b38*m.b41*m.b42 - 32*
m.b38*m.b41*m.b43 - 32*m.b38*m.b42*m.b43 - 192*m.b39*m.b40*m.b41 - 192*m.b39*m.b40*m.b42 - 96*
m.b39*m.b40*m.b43 - 32*m.b39*m.b40*m.b44 - 192*m.b39*m.b41*m.b42 - 32*m.b39*m.b41*m.b44 - 96*
m.b39*m.b42*m.b43 - 32*m.b39*m.b42*m.b44 - 32*m.b39*m.b43*m.b44 - 192*m.b40*m.b41*m.b42 - 192*
m.b40*m.b41*m.b43 - 96*m.b40*m.b41*m.b44 - 32*m.b40*m.b41*m.b45 - 192*m.b40*m.b42*m.b43 - 32*
m.b40*m.b42*m.b45 - 96*m.b40*m.b43*m.b44 - 32*m.b40*m.b43*m.b45 - 32*m.b40*m.b44*m.b45 - 192*
m.b41*m.b42*m.b43 - 192*m.b41*m.b42*m.b44 - 96*m.b41*m.b42*m.b45 - 32*m.b41*m.b42*m.b46 - 192*
m.b41*m.b43*m.b44 - 32*m.b41*m.b43*m.b46 - 96*m.b41*m.b44*m.b45 - 32*m.b41*m.b44*m.b46 - 32*m.b41
*m.b45*m.b46 - 192*m.b42*m.b43*m.b44 - 192*m.b42*m.b43*m.b45 - 96*m.b42*m.b43*m.b46 - 32*m.b42*
m.b43*m.b47 - 192*m.b42*m.b44*m.b45 - 32*m.b42*m.b44*m.b47 - 96*m.b42*m.b45*m.b46 - 32*m.b42*
m.b45*m.b47 - 32*m.b42*m.b46*m.b47 - 192*m.b43*m.b44*m.b45 - 192*m.b43*m.b44*m.b46 - 96*m.b43*
m.b44*m.b47 - 32*m.b43*m.b44*m.b48 - 192*m.b43*m.b45*m.b46 - 32*m.b43*m.b45*m.b48 - 96*m.b43*
m.b46*m.b47 - 32*m.b43*m.b46*m.b48 - 32*m.b43*m.b47*m.b48 - 192*m.b44*m.b45*m.b46 - 192*m.b44*
m.b45*m.b47 - 96*m.b44*m.b45*m.b48 - 32*m.b44*m.b45*m.b49 - 192*m.b44*m.b46*m.b47 - 32*m.b44*
m.b46*m.b49 - 96*m.b44*m.b47*m.b48 - 32*m.b44*m.b47*m.b49 - 32*m.b44*m.b48*m.b49 - 192*m.b45*
m.b46*m.b47 - 192*m.b45*m.b46*m.b48 - 96*m.b45*m.b46*m.b49 - 32*m.b45*m.b46*m.b50 - 192*m.b45*
m.b47*m.b48 - 32*m.b45*m.b47*m.b50 - 96*m.b45*m.b48*m.b49 - 32*m.b45*m.b48*m.b50 - 32*m.b45*m.b49
*m.b50 - 192*m.b46*m.b47*m.b48 - 192*m.b46*m.b47*m.b49 - 96*m.b46*m.b47*m.b50 - 32*m.b46*m.b47*
m.b51 - 192*m.b46*m.b48*m.b49 - 32*m.b46*m.b48*m.b51 - 96*m.b46*m.b49*m.b50 - 32*m.b46*m.b49*
m.b51 - 32*m.b46*m.b50*m.b51 - 192*m.b47*m.b48*m.b49 - 192*m.b47*m.b48*m.b50 - 96*m.b47*m.b48*
m.b51 - 32*m.b47*m.b48*m.b52 - 192*m.b47*m.b49*m.b50 - 32*m.b47*m.b49*m.b52 - 96*m.b47*m.b50*
m.b51 - 32*m.b47*m.b50*m.b52 - 32*m.b47*m.b51*m.b52 - 192*m.b48*m.b49*m.b50 - 192*m.b48*m.b49*
m.b51 - 96*m.b48*m.b49*m.b52 - 32*m.b48*m.b49*m.b53 - 192*m.b48*m.b50*m.b51 - 32*m.b48*m.b50*
m.b53 - 96*m.b48*m.b51*m.b52 - 32*m.b48*m.b51*m.b53 - 32*m.b48*m.b52*m.b53 - 192*m.b49*m.b50*
m.b51 - 192*m.b49*m.b50*m.b52 - 96*m.b49*m.b50*m.b53 - 32*m.b49*m.b50*m.b54 - 192*m.b49*m.b51*
m.b52 - 32*m.b49*m.b51*m.b54 - 96*m.b49*m.b52*m.b53 - 32*m.b49*m.b52*m.b54 - 32*m.b49*m.b53*m.b54
- 192*m.b50*m.b51*m.b52 - 192*m.b50*m.b51*m.b53 - 96*m.b50*m.b51*m.b54 - 32*m.b50*m.b51*m.b55 -
192*m.b50*m.b52*m.b53 - 32*m.b50*m.b52*m.b55 - 96*m.b50*m.b53*m.b54 - 32*m.b50*m.b53*m.b55 - 32*
m.b50*m.b54*m.b55 - 160*m.b51*m.b52*m.b53 - 128*m.b51*m.b52*m.b54 - 32*m.b51*m.b52*m.b55 - 128*
m.b51*m.b53*m.b54 - 64*m.b51*m.b54*m.b55 - 96*m.b52*m.b53*m.b54 - 64*m.b52*m.b53*m.b55 - 64*m.b52
*m.b54*m.b55 - 32*m.b53*m.b54*m.b55 + 48*m.b1*m.b2 + 40*m.b1*m.b3 + 48*m.b1*m.b4 + 40*m.b1*m.b5
+ 32*m.b1*m.b6 + 96*m.b2*m.b3 + 96*m.b2*m.b4 + 112*m.b2*m.b5 + 80*m.b2*m.b6 + 32*m.b2*m.b7 + 160
*m.b3*m.b4 + 152*m.b3*m.b5 + 160*m.b3*m.b6 + 80*m.b3*m.b7 + 32*m.b3*m.b8 + 208*m.b4*m.b5 + 192*
m.b4*m.b6 + 160*m.b4*m.b7 + 80*m.b4*m.b8 + 32*m.b4*m.b9 + 256*m.b5*m.b6 + 192*m.b5*m.b7 + 160*
m.b5*m.b8 + 80*m.b5*m.b9 + 32*m.b5*m.b10 + 256*m.b6*m.b7 + 192*m.b6*m.b8 + 160*m.b6*m.b9 + 80*
m.b6*m.b10 + 32*m.b6*m.b11 + 256*m.b7*m.b8 + 192*m.b7*m.b9 + 160*m.b7*m.b10 + 80*m.b7*m.b11 + 32*
m.b7*m.b12 + 256*m.b8*m.b9 + 192*m.b8*m.b10 + 160*m.b8*m.b11 + 80*m.b8*m.b12 + 32*m.b8*m.b13 +
256*m.b9*m.b10 + 192*m.b9*m.b11 + 160*m.b9*m.b12 + 80*m.b9*m.b13 + 32*m.b9*m.b14 + 256*m.b10*
m.b11 + 192*m.b10*m.b12 + 160*m.b10*m.b13 + 80*m.b10*m.b14 + 32*m.b10*m.b15 + 256*m.b11*m.b12 +
192*m.b11*m.b13 + 160*m.b11*m.b14 + 80*m.b11*m.b15 + 32*m.b11*m.b16 + 256*m.b12*m.b13 + 192*m.b12
*m.b14 + 160*m.b12*m.b15 + 80*m.b12*m.b16 + 32*m.b12*m.b17 + 256*m.b13*m.b14 + 192*m.b13*m.b15 +
160*m.b13*m.b16 + 80*m.b13*m.b17 + 32*m.b13*m.b18 + 256*m.b14*m.b15 + 192*m.b14*m.b16 + 160*m.b14
*m.b17 + 80*m.b14*m.b18 + 32*m.b14*m.b19 + 256*m.b15*m.b16 + 192*m.b15*m.b17 + 160*m.b15*m.b18 +
80*m.b15*m.b19 + 32*m.b15*m.b20 + 256*m.b16*m.b17 + 192*m.b16*m.b18 + 160*m.b16*m.b19 + 80*m.b16*
m.b20 + 32*m.b16*m.b21 + 256*m.b17*m.b18 + 192*m.b17*m.b19 + 160*m.b17*m.b20 + 80*m.b17*m.b21 +
32*m.b17*m.b22 + 256*m.b18*m.b19 + 192*m.b18*m.b20 + 160*m.b18*m.b21 + 80*m.b18*m.b22 + 32*m.b18*
m.b23 + 256*m.b19*m.b20 + 192*m.b19*m.b21 + 160*m.b19*m.b22 + 80*m.b19*m.b23 + 32*m.b19*m.b24 +
256*m.b20*m.b21 + 192*m.b20*m.b22 + 160*m.b20*m.b23 + 80*m.b20*m.b24 + 32*m.b20*m.b25 + 256*m.b21
*m.b22 + 192*m.b21*m.b23 + 160*m.b21*m.b24 + 80*m.b21*m.b25 + 32*m.b21*m.b26 + 256*m.b22*m.b23 +
192*m.b22*m.b24 + 160*m.b22*m.b25 + 80*m.b22*m.b26 + 32*m.b22*m.b27 + 256*m.b23*m.b24 + 192*m.b23
*m.b25 + 160*m.b23*m.b26 + 80*m.b23*m.b27 + 32*m.b23*m.b28 + 256*m.b24*m.b25 + 192*m.b24*m.b26 +
160*m.b24*m.b27 + 80*m.b24*m.b28 + 32*m.b24*m.b29 + 256*m.b25*m.b26 + 192*m.b25*m.b27 + 160*m.b25
*m.b28 + 80*m.b25*m.b29 + 32*m.b25*m.b30 + 256*m.b26*m.b27 + 192*m.b26*m.b28 + 160*m.b26*m.b29 +
80*m.b26*m.b30 + 32*m.b26*m.b31 + 256*m.b27*m.b28 + 192*m.b27*m.b29 + 160*m.b27*m.b30 + 80*m.b27*
m.b31 + 32*m.b27*m.b32 + 256*m.b28*m.b29 + 192*m.b28*m.b30 + 160*m.b28*m.b31 + 80*m.b28*m.b32 +
32*m.b28*m.b33 + 256*m.b29*m.b30 + 192*m.b29*m.b31 + 160*m.b29*m.b32 + 80*m.b29*m.b33 + 32*m.b29*
m.b34 + 256*m.b30*m.b31 + 192*m.b30*m.b32 + 160*m.b30*m.b33 + 80*m.b30*m.b34 + 32*m.b30*m.b35 +
256*m.b31*m.b32 + 192*m.b31*m.b33 + 160*m.b31*m.b34 + 80*m.b31*m.b35 + 32*m.b31*m.b36 + 256*m.b32
*m.b33 + 192*m.b32*m.b34 + 160*m.b32*m.b35 + 80*m.b32*m.b36 + 32*m.b32*m.b37 + 256*m.b33*m.b34 +
192*m.b33*m.b35 + 160*m.b33*m.b36 + 80*m.b33*m.b37 + 32*m.b33*m.b38 + 256*m.b34*m.b35 + 192*m.b34
*m.b36 + 160*m.b34*m.b37 + 80*m.b34*m.b38 + 32*m.b34*m.b39 + 256*m.b35*m.b36 + 192*m.b35*m.b37 +
160*m.b35*m.b38 + 80*m.b35*m.b39 + 32*m.b35*m.b40 + 256*m.b36*m.b37 + 192*m.b36*m.b38 + 160*m.b36
*m.b39 + 80*m.b36*m.b40 + 32*m.b36*m.b41 + 256*m.b37*m.b38 + 192*m.b37*m.b39 + 160*m.b37*m.b40 +
80*m.b37*m.b41 + 32*m.b37*m.b42 + 256*m.b38*m.b39 + 192*m.b38*m.b40 + 160*m.b38*m.b41 + 80*m.b38*
m.b42 + 32*m.b38*m.b43 + 256*m.b39*m.b40 + 192*m.b39*m.b41 + 160*m.b39*m.b42 + 80*m.b39*m.b43 +
32*m.b39*m.b44 + 256*m.b40*m.b41 + 192*m.b40*m.b42 + 160*m.b40*m.b43 + 80*m.b40*m.b44 + 32*m.b40*
m.b45 + 256*m.b41*m.b42 + 192*m.b41*m.b43 + 160*m.b41*m.b44 + 80*m.b41*m.b45 + 32*m.b41*m.b46 +
256*m.b42*m.b43 + 192*m.b42*m.b44 + 160*m.b42*m.b45 + 80*m.b42*m.b46 + 32*m.b42*m.b47 + 256*m.b43
*m.b44 + 192*m.b43*m.b45 + 160*m.b43*m.b46 + 80*m.b43*m.b47 + 32*m.b43*m.b48 + 256*m.b44*m.b45 +
192*m.b44*m.b46 + 160*m.b44*m.b47 + 80*m.b44*m.b48 + 32*m.b44*m.b49 + 256*m.b45*m.b46 + 192*m.b45
*m.b47 + 160*m.b45*m.b48 + 80*m.b45*m.b49 + 32*m.b45*m.b50 + 256*m.b46*m.b47 + 192*m.b46*m.b48 +
160*m.b46*m.b49 + 80*m.b46*m.b50 + 32*m.b46*m.b51 + 256*m.b47*m.b48 + 192*m.b47*m.b49 + 160*m.b47
*m.b50 + 80*m.b47*m.b51 + 32*m.b47*m.b52 + 256*m.b48*m.b49 + 192*m.b48*m.b50 + 160*m.b48*m.b51 +
80*m.b48*m.b52 + 32*m.b48*m.b53 + 256*m.b49*m.b50 + 192*m.b49*m.b51 + 160*m.b49*m.b52 + 80*m.b49*
m.b53 + 32*m.b49*m.b54 + 256*m.b50*m.b51 + 192*m.b50*m.b52 + 160*m.b50*m.b53 + 80*m.b50*m.b54 +
32*m.b50*m.b55 + 208*m.b51*m.b52 + 152*m.b51*m.b53 + 112*m.b51*m.b54 + 40*m.b51*m.b55 + 160*m.b52
*m.b53 + 96*m.b52*m.b54 + 48*m.b52*m.b55 + 96*m.b53*m.b54 + 40*m.b53*m.b55 + 48*m.b54*m.b55 - 40*
m.b1 - 88*m.b2 - 136*m.b3 - 184*m.b4 - 232*m.b5 - 272*m.b6 - 272*m.b7 - 272*m.b8 - 272*m.b9 - 272
*m.b10 - 272*m.b11 - 272*m.b12 - 272*m.b13 - 272*m.b14 - 272*m.b15 - 272*m.b16 - 272*m.b17 - 272*
m.b18 - 272*m.b19 - 272*m.b20 - 272*m.b21 - 272*m.b22 - 272*m.b23 - 272*m.b24 - 272*m.b25 - 272*
m.b26 - 272*m.b27 - 272*m.b28 - 272*m.b29 - 272*m.b30 - 272*m.b31 - 272*m.b32 - 272*m.b33 - 272*
m.b34 - 272*m.b35 - 272*m.b36 - 272*m.b37 - 272*m.b38 - 272*m.b39 - 272*m.b40 - 272*m.b41 - 272*
m.b42 - 272*m.b43 - 272*m.b44 - 272*m.b45 - 272*m.b46 - 272*m.b47 - 272*m.b48 - 272*m.b49 - 272*
m.b50 - 232*m.b51 - 184*m.b52 - 136*m.b53 - 88*m.b54 - 40*m.b55 - m.x56 <= 0)
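The statement above is the tail of a machine-generated Pyomo constraint: a polynomial in the binary variables m.b1..m.b55, minus the continuous variable m.x56, required to be at most zero. As a minimal stand-alone sketch (toy coefficients, indices, and values — not the model's actual data), evaluating the left-hand side of such a constraint reduces to summing coefficient-weighted products of 0/1 variables:

```python
import math

def constraint_lhs(terms, b, x):
    """Evaluate sum(c * prod(b[i] for i in idxs)) - x for a polynomial over
    binary variables, mirroring the shape of the generated expression above.
    terms: list of (coefficient, [variable indices]); b: dict index -> 0/1."""
    poly = sum(c * math.prod(b[i] for i in idxs) for c, idxs in terms)
    return poly - x

# Toy instance: one cubic, one quadratic, and one linear monomial.
terms = [(-192, [1, 2, 3]), (256, [1, 2]), (-40, [1])]
b = {1: 1, 2: 1, 3: 0}            # a candidate 0/1 assignment
lhs = constraint_lhs(terms, b, x=300)
satisfied = lhs <= 0              # the generated constraint requires lhs <= 0
```

Because the variables are binary, each monomial contributes its coefficient exactly when every variable in it is 1, which is why a solver can treat these products with standard multilinear reformulations.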
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: pogoprotos/data/telemetry/client_telemetry_omni.proto
import sys
_b=sys.version_info[0]<3 and (lambda x:x) or (lambda x:x.encode('latin1'))
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
from google.protobuf import descriptor_pb2
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
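The `_b` helper defined above is the stock protoc Python 2/3 compatibility shim: on Python 2 it is the identity, while on Python 3 it encodes the string literal to bytes via latin-1, whose 1:1 code-point-to-byte mapping turns each `\xNN` escape into exactly one byte. A small illustrative check (re-declaring the helper so the snippet is self-contained; the literal is a made-up prefix in the same style as the serialized descriptor below):

```python
import sys

# Same one-liner as in the generated module above: identity on Python 2,
# latin-1 encode on Python 3.
_b = sys.version_info[0] < 3 and (lambda x: x) or (lambda x: x.encode('latin1'))

# A short string in the same style as the serialized_pb literal.
raw = _b('\n5pogoprotos\x12\x19data')

# On Python 3, latin-1 maps every code point 0-255 to the identical byte,
# so the result has exactly one byte per character of the literal.
```

This is what lets the same source file carry the serialized FileDescriptorProto as a plain string literal on both interpreter versions.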
from pogoprotos.data.telemetry import boot_time_pb2 as pogoprotos_dot_data_dot_telemetry_dot_boot__time__pb2
from pogoprotos.data.telemetry import frame_rate_pb2 as pogoprotos_dot_data_dot_telemetry_dot_frame__rate__pb2
from pogoprotos.data.telemetry import generic_click_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_generic__click__telemetry__pb2
from pogoprotos.data.telemetry import map_events_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_map__events__telemetry__pb2
from pogoprotos.data.telemetry import spin_pokestop_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_spin__pokestop__telemetry__pb2
from pogoprotos.data.telemetry import profile_page_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_profile__page__telemetry__pb2
from pogoprotos.data.telemetry import shopping_page_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_shopping__page__telemetry__pb2
from pogoprotos.data.telemetry import encounter_pokemon_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_encounter__pokemon__telemetry__pb2
from pogoprotos.data.telemetry import catch_pokemon_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_catch__pokemon__telemetry__pb2
from pogoprotos.data.telemetry import deploy_pokemon_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_deploy__pokemon__telemetry__pb2
from pogoprotos.data.telemetry import feed_pokemon_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_feed__pokemon__telemetry__pb2
from pogoprotos.data.telemetry import evolve_pokemon_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_evolve__pokemon__telemetry__pb2
from pogoprotos.data.telemetry import release_pokemon_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_release__pokemon__telemetry__pb2
from pogoprotos.data.telemetry import nickname_pokemon_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_nickname__pokemon__telemetry__pb2
from pogoprotos.data.telemetry import news_page_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_news__page__telemetry__pb2
from pogoprotos.data.telemetry import item_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_item__telemetry__pb2
from pogoprotos.data.telemetry import battle_party_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_battle__party__telemetry__pb2
from pogoprotos.data.telemetry import passcode_redeem_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_passcode__redeem__telemetry__pb2
from pogoprotos.data.telemetry import link_login_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_link__login__telemetry__pb2
from pogoprotos.data.telemetry import raid_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_raid__telemetry__pb2
from pogoprotos.data.telemetry import push_notification_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_push__notification__telemetry__pb2
from pogoprotos.data.telemetry import avatar_customization_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_avatar__customization__telemetry__pb2
from pogoprotos.data.telemetry import read_point_of_interest_description_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_read__point__of__interest__description__telemetry__pb2
from pogoprotos.data.telemetry import web_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_web__telemetry__pb2
from pogoprotos.data.telemetry import change_ar_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_change__ar__telemetry__pb2
from pogoprotos.data.telemetry import weather_detail_click_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_weather__detail__click__telemetry__pb2
from pogoprotos.data.player import user_issue_weather_report_pb2 as pogoprotos_dot_data_dot_player_dot_user__issue__weather__report__pb2
from pogoprotos.data.telemetry import pokemon_inventory_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_pokemon__inventory__telemetry__pb2
from pogoprotos.data.telemetry import social_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_social__telemetry__pb2
from pogoprotos.data.telemetry import check_encounter_tray_info_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_check__encounter__tray__info__telemetry__pb2
from pogoprotos.networking.platform.telemetry import platform_server_data_pb2 as pogoprotos_dot_networking_dot_platform_dot_telemetry_dot_platform__server__data__pb2
from pogoprotos.data.telemetry import pokemon_go_plus_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_pokemon__go__plus__telemetry__pb2
from pogoprotos.data.telemetry import rpc_response_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_rpc__response__telemetry__pb2
from pogoprotos.data.telemetry import asset_bundle_download_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_asset__bundle__download__telemetry__pb2
from pogoprotos.data.telemetry import asset_poi_download_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_asset__poi__download__telemetry__pb2
from pogoprotos.data.telemetry import asset_stream_download_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_asset__stream__download__telemetry__pb2
from pogoprotos.data.telemetry import asset_stream_cache_culled_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_asset__stream__cache__culled__telemetry__pb2
from pogoprotos.data.telemetry import rpc_socket_response_telemetry_pb2 as pogoprotos_dot_data_dot_telemetry_dot_rpc__socket__response__telemetry__pb2
from pogoprotos.settings import social_client_settings_pb2 as pogoprotos_dot_settings_dot_social__client__settings__pb2
DESCRIPTOR = _descriptor.FileDescriptor(
name='pogoprotos/data/telemetry/client_telemetry_omni.proto',
package='pogoprotos.data.telemetry',
syntax='proto3',
  serialized_pb=_b('\n5pogoprotos/data/telemetry/client_telemetry_omni.proto\x12\x19pogoprotos.data.telemetry\x1a)pogoprotos/data/telemetry/boot_time.proto\x1a*pogoprotos/data/telemetry/frame_rate.proto\x1a\x37pogoprotos/data/telemetry/generic_click_telemetry.proto\x1a\x34pogoprotos/data/telemetry/map_events_telemetry.proto\x1a\x37pogoprotos/data/telemetry/spin_pokestop_telemetry.proto\x1a\x36pogoprotos/data/telemetry/profile_page_telemetry.proto\x1a\x37pogoprotos/data/telemetry/shopping_page_telemetry.proto\x1a;pogoprotos/data/telemetry/encounter_pokemon_telemetry.proto\x1a\x37pogoprotos/data/telemetry/catch_pokemon_telemetry.proto\x1a\x38pogoprotos/data/telemetry/deploy_pokemon_telemetry.proto\x1a\x36pogoprotos/data/telemetry/feed_pokemon_telemetry.proto\x1a\x38pogoprotos/data/telemetry/evolve_pokemon_telemetry.proto\x1a\x39pogoprotos/data/telemetry/release_pokemon_telemetry.proto\x1a:pogoprotos/data/telemetry/nickname_pokemon_telemetry.proto\x1a\x33pogoprotos/data/telemetry/news_page_telemetry.proto\x1a.pogoprotos/data/telemetry/item_telemetry.proto\x1a\x36pogoprotos/data/telemetry/battle_party_telemetry.proto\x1a\x39pogoprotos/data/telemetry/passcode_redeem_telemetry.proto\x1a\x34pogoprotos/data/telemetry/link_login_telemetry.proto\x1a.pogoprotos/data/telemetry/raid_telemetry.proto\x1a;pogoprotos/data/telemetry/push_notification_telemetry.proto\x1a>pogoprotos/data/telemetry/avatar_customization_telemetry.proto\x1aLpogoprotos/data/telemetry/read_point_of_interest_description_telemetry.proto\x1a-pogoprotos/data/telemetry/web_telemetry.proto\x1a\x33pogoprotos/data/telemetry/change_ar_telemetry.proto\x1a>pogoprotos/data/telemetry/weather_detail_click_telemetry.proto\x1a\x36pogoprotos/data/player/user_issue_weather_report.proto\x1a;pogoprotos/data/telemetry/pokemon_inventory_telemetry.proto\x1a\x30pogoprotos/data/telemetry/social_telemetry.proto\x1a\x43pogoprotos/data/telemetry/check_encounter_tray_info_telemetry.proto\x1a\x43pogoprotos/networking/platform/telemetry/platform_server_data.proto\x1a\x39pogoprotos/data/telemetry/pokemon_go_plus_telemetry.proto\x1a\x36pogoprotos/data/telemetry/rpc_response_telemetry.proto\x1a?pogoprotos/data/telemetry/asset_bundle_download_telemetry.proto\x1a<pogoprotos/data/telemetry/asset_poi_download_telemetry.proto\x1a?pogoprotos/data/telemetry/asset_stream_download_telemetry.proto\x1a\x43pogoprotos/data/telemetry/asset_stream_cache_culled_telemetry.proto\x1a=pogoprotos/data/telemetry/rpc_socket_response_telemetry.proto\x1a\x30pogoprotos/settings/social_client_settings.proto\"\xdd\x19\n\x13\x43lientTelemetryOmni\x12\x36\n\tboot_time\x18\x01 \x01(\x0b\x32#.pogoprotos.data.telemetry.BootTime\x12\x38\n\nframe_rate\x18\x02 \x01(\x0b\x32$.pogoprotos.data.telemetry.FrameRate\x12Q\n\x17generic_click_telemetry\x18\x03 \x01(\x0b\x32\x30.pogoprotos.data.telemetry.GenericClickTelemetry\x12K\n\x14map_events_telemetry\x18\x04 \x01(\x0b\x32-.pogoprotos.data.telemetry.MapEventsTelemetry\x12Q\n\x17spin_pokestop_telemetry\x18\x05 \x01(\x0b\x32\x30.pogoprotos.data.telemetry.SpinPokestopTelemetry\x12O\n\x16profile_page_telemetry\x18\x06 \x01(\x0b\x32/.pogoprotos.data.telemetry.ProfilePageTelemetry\x12Q\n\x17shopping_page_telemetry\x18\x07 \x01(\x0b\x32\x30.pogoprotos.data.telemetry.ShoppingPageTelemetry\x12Y\n\x1b\x65ncounter_pokemon_telemetry\x18\x08 \x01(\x0b\x32\x34.pogoprotos.data.telemetry.EncounterPokemonTelemetry\x12Q\n\x17\x63\x61tch_pokemon_telemetry\x18\t \x01(\x0b\x32\x30.pogoprotos.data.telemetry.CatchPokemonTelemetry\x12S\n\x18\x64\x65ploy_pokemon_telemetry\x18\n \x01(\x0b\x32\x31.pogoprotos.data.telemetry.DeployPokemonTelemetry\x12O\n\x16\x66\x65\x65\x64_pokemon_telemetry\x18\x0b \x01(\x0b\x32/.pogoprotos.data.telemetry.FeedPokemonTelemetry\x12S\n\x18\x65volve_pokemon_telemetry\x18\x0c \x01(\x0b\x32\x31.pogoprotos.data.telemetry.EvolvePokemonTelemetry\x12U\n\x19release_pokemon_telemetry\x18\r \x01(\x0b\x32\x32.pogoprotos.data.telemetry.ReleasePokemonTelemetry\x12W\n\x1anickname_pokemon_telemetry\x18\x0e \x01(\x0b\x32\x33.pogoprotos.data.telemetry.NicknamePokemonTelemetry\x12I\n\x13news_page_telemetry\x18\x0f \x01(\x0b\x32,.pogoprotos.data.telemetry.NewsPageTelemetry\x12@\n\x0eitem_telemetry\x18\x10 \x01(\x0b\x32(.pogoprotos.data.telemetry.ItemTelemetry\x12O\n\x16\x62\x61ttle_party_telemetry\x18\x11 \x01(\x0b\x32/.pogoprotos.data.telemetry.BattlePartyTelemetry\x12U\n\x19passcode_redeem_telemetry\x18\x12 \x01(\x0b\x32\x32.pogoprotos.data.telemetry.PasscodeRedeemTelemetry\x12K\n\x14link_login_telemetry\x18\x13 \x01(\x0b\x32-.pogoprotos.data.telemetry.LinkLoginTelemetry\x12@\n\x0eraid_telemetry\x18\x14 \x01(\x0b\x32(.pogoprotos.data.telemetry.RaidTelemetry\x12Y\n\x1bpush_notification_telemetry\x18\x15 \x01(\x0b\x32\x34.pogoprotos.data.telemetry.PushNotificationTelemetry\x12_\n\x1e\x61vatar_customization_telemetry\x18\x16 \x01(\x0b\x32\x37.pogoprotos.data.telemetry.AvatarCustomizationTelemetry\x12x\n,read_point_of_interest_description_telemetry\x18\x17 \x01(\x0b\x32\x42.pogoprotos.data.telemetry.ReadPointOfInterestDescriptionTelemetry\x12>\n\rweb_telemetry\x18\x18 \x01(\x0b\x32\'.pogoprotos.data.telemetry.WebTelemetry\x12I\n\x13\x63hange_ar_telemetry\x18\x19 \x01(\x0b\x32,.pogoprotos.data.telemetry.ChangeArTelemetry\x12^\n\x1eweather_detail_click_telemetry\x18\x1a \x01(\x0b\x32\x36.pogoprotos.data.telemetry.WeatherDetailClickTelemetry\x12Q\n\x19user_issue_weather_report\x18\x1b \x01(\x0b\x32..pogoprotos.data.player.UserIssueWeatherReport\x12Y\n\x1bpokemon_inventory_telemetry\x18\x1c \x01(\x0b\x32\x34.pogoprotos.data.telemetry.PokemonInventoryTelemetry\x12\x44\n\x10social_telemetry\x18\x1d \x01(\x0b\x32*.pogoprotos.data.telemetry.SocialTelemetry\x12\x62\n\x1e\x63heck_encounter_info_telemetry\x18\x1e \x01(\x0b\x32:.pogoprotos.data.telemetry.CheckEncounterTrayInfoTelemetry\x12T\n\x19pokemon_go_plus_telemetry\x18\x1f \x01(\x0b\x32\x31.pogoprotos.data.telemetry.PokemonGoPlusTelemetry\x12M\n\x14rpc_timing_telemetry\x18 \x01(\x0b\x32/.pogoprotos.data.telemetry.RpcResponseTelemetry\x12R\n\x1bsocial_gift_count_telemetry\x18! \x01(\x0b\x32-.pogoprotos.settings.SocialGiftCountTelemetry\x12W\n\x16\x61sset_bundle_telemetry\x18\" \x01(\x0b\x32\x37.pogoprotos.data.telemetry.AssetBundleDownloadTelemetry\x12Z\n\x1c\x61sset_poi_download_telemetry\x18# \x01(\x0b\x32\x34.pogoprotos.data.telemetry.AssetPoiDownloadTelemetry\x12`\n\x1f\x61sset_stream_download_telemetry\x18$ \x01(\x0b\x32\x37.pogoprotos.data.telemetry.AssetStreamDownloadTelemetry\x12g\n#asset_stream_cache_culled_telemetry\x18% \x01(\x0b\x32:.pogoprotos.data.telemetry.AssetStreamCacheCulledTelemetry\x12Z\n\x1brpc_socket_timing_telemetry\x18& \x01(\x0b\x32\x35.pogoprotos.data.telemetry.RpcSocketResponseTelemetry\x12R\n\x0bserver_data\x18\xe9\x07 \x01(\x0b\x32<.pogoprotos.networking.platform.telemetry.PlatformServerDatab\x06proto3')
,
  dependencies=[pogoprotos_dot_data_dot_telemetry_dot_boot__time__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_frame__rate__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_generic__click__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_map__events__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_spin__pokestop__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_profile__page__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_shopping__page__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_encounter__pokemon__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_catch__pokemon__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_deploy__pokemon__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_feed__pokemon__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_evolve__pokemon__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_release__pokemon__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_nickname__pokemon__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_news__page__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_item__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_battle__party__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_passcode__redeem__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_link__login__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_raid__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_push__notification__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_avatar__customization__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_read__point__of__interest__description__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_web__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_change__ar__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_weather__detail__click__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_player_dot_user__issue__weather__report__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_pokemon__inventory__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_social__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_check__encounter__tray__info__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_networking_dot_platform_dot_telemetry_dot_platform__server__data__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_pokemon__go__plus__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_rpc__response__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_asset__bundle__download__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_asset__poi__download__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_asset__stream__download__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_asset__stream__cache__culled__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_data_dot_telemetry_dot_rpc__socket__response__telemetry__pb2.DESCRIPTOR,pogoprotos_dot_settings_dot_social__client__settings__pb2.DESCRIPTOR,])
_CLIENTTELEMETRYOMNI = _descriptor.Descriptor(
name='ClientTelemetryOmni',
full_name='pogoprotos.data.telemetry.ClientTelemetryOmni',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='boot_time', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.boot_time', index=0,
number=1, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='frame_rate', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.frame_rate', index=1,
number=2, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='generic_click_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.generic_click_telemetry', index=2,
number=3, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='map_events_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.map_events_telemetry', index=3,
number=4, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='spin_pokestop_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.spin_pokestop_telemetry', index=4,
number=5, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='profile_page_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.profile_page_telemetry', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='shopping_page_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.shopping_page_telemetry', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='encounter_pokemon_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.encounter_pokemon_telemetry', index=7,
number=8, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='catch_pokemon_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.catch_pokemon_telemetry', index=8,
number=9, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='deploy_pokemon_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.deploy_pokemon_telemetry', index=9,
number=10, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='feed_pokemon_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.feed_pokemon_telemetry', index=10,
number=11, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='evolve_pokemon_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.evolve_pokemon_telemetry', index=11,
number=12, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='release_pokemon_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.release_pokemon_telemetry', index=12,
number=13, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='nickname_pokemon_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.nickname_pokemon_telemetry', index=13,
number=14, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='news_page_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.news_page_telemetry', index=14,
number=15, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='item_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.item_telemetry', index=15,
number=16, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='battle_party_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.battle_party_telemetry', index=16,
number=17, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='passcode_redeem_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.passcode_redeem_telemetry', index=17,
number=18, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='link_login_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.link_login_telemetry', index=18,
number=19, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='raid_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.raid_telemetry', index=19,
number=20, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='push_notification_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.push_notification_telemetry', index=20,
number=21, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='avatar_customization_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.avatar_customization_telemetry', index=21,
number=22, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='read_point_of_interest_description_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.read_point_of_interest_description_telemetry', index=22,
number=23, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='web_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.web_telemetry', index=23,
number=24, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='change_ar_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.change_ar_telemetry', index=24,
number=25, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='weather_detail_click_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.weather_detail_click_telemetry', index=25,
number=26, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='user_issue_weather_report', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.user_issue_weather_report', index=26,
number=27, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='pokemon_inventory_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.pokemon_inventory_telemetry', index=27,
number=28, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='social_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.social_telemetry', index=28,
number=29, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='check_encounter_info_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.check_encounter_info_telemetry', index=29,
number=30, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='pokemon_go_plus_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.pokemon_go_plus_telemetry', index=30,
number=31, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rpc_timing_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.rpc_timing_telemetry', index=31,
number=32, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='social_gift_count_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.social_gift_count_telemetry', index=32,
number=33, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='asset_bundle_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.asset_bundle_telemetry', index=33,
number=34, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='asset_poi_download_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.asset_poi_download_telemetry', index=34,
number=35, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='asset_stream_download_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.asset_stream_download_telemetry', index=35,
number=36, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='asset_stream_cache_culled_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.asset_stream_cache_culled_telemetry', index=36,
number=37, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='rpc_socket_timing_telemetry', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.rpc_socket_timing_telemetry', index=37,
number=38, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
_descriptor.FieldDescriptor(
name='server_data', full_name='pogoprotos.data.telemetry.ClientTelemetryOmni.server_data', index=38,
number=1001, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
options=None),
],
extensions=[
],
nested_types=[],
enum_types=[
],
options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=2341,
serialized_end=5634,
)
_CLIENTTELEMETRYOMNI.fields_by_name['boot_time'].message_type = pogoprotos_dot_data_dot_telemetry_dot_boot__time__pb2._BOOTTIME
_CLIENTTELEMETRYOMNI.fields_by_name['frame_rate'].message_type = pogoprotos_dot_data_dot_telemetry_dot_frame__rate__pb2._FRAMERATE
_CLIENTTELEMETRYOMNI.fields_by_name['generic_click_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_generic__click__telemetry__pb2._GENERICCLICKTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['map_events_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_map__events__telemetry__pb2._MAPEVENTSTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['spin_pokestop_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_spin__pokestop__telemetry__pb2._SPINPOKESTOPTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['profile_page_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_profile__page__telemetry__pb2._PROFILEPAGETELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['shopping_page_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_shopping__page__telemetry__pb2._SHOPPINGPAGETELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['encounter_pokemon_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_encounter__pokemon__telemetry__pb2._ENCOUNTERPOKEMONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['catch_pokemon_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_catch__pokemon__telemetry__pb2._CATCHPOKEMONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['deploy_pokemon_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_deploy__pokemon__telemetry__pb2._DEPLOYPOKEMONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['feed_pokemon_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_feed__pokemon__telemetry__pb2._FEEDPOKEMONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['evolve_pokemon_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_evolve__pokemon__telemetry__pb2._EVOLVEPOKEMONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['release_pokemon_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_release__pokemon__telemetry__pb2._RELEASEPOKEMONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['nickname_pokemon_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_nickname__pokemon__telemetry__pb2._NICKNAMEPOKEMONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['news_page_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_news__page__telemetry__pb2._NEWSPAGETELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['item_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_item__telemetry__pb2._ITEMTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['battle_party_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_battle__party__telemetry__pb2._BATTLEPARTYTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['passcode_redeem_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_passcode__redeem__telemetry__pb2._PASSCODEREDEEMTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['link_login_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_link__login__telemetry__pb2._LINKLOGINTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['raid_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_raid__telemetry__pb2._RAIDTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['push_notification_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_push__notification__telemetry__pb2._PUSHNOTIFICATIONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['avatar_customization_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_avatar__customization__telemetry__pb2._AVATARCUSTOMIZATIONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['read_point_of_interest_description_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_read__point__of__interest__description__telemetry__pb2._READPOINTOFINTERESTDESCRIPTIONTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['web_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_web__telemetry__pb2._WEBTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['change_ar_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_change__ar__telemetry__pb2._CHANGEARTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['weather_detail_click_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_weather__detail__click__telemetry__pb2._WEATHERDETAILCLICKTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['user_issue_weather_report'].message_type = pogoprotos_dot_data_dot_player_dot_user__issue__weather__report__pb2._USERISSUEWEATHERREPORT
_CLIENTTELEMETRYOMNI.fields_by_name['pokemon_inventory_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_pokemon__inventory__telemetry__pb2._POKEMONINVENTORYTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['social_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_social__telemetry__pb2._SOCIALTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['check_encounter_info_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_check__encounter__tray__info__telemetry__pb2._CHECKENCOUNTERTRAYINFOTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['pokemon_go_plus_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_pokemon__go__plus__telemetry__pb2._POKEMONGOPLUSTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['rpc_timing_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_rpc__response__telemetry__pb2._RPCRESPONSETELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['social_gift_count_telemetry'].message_type = pogoprotos_dot_settings_dot_social__client__settings__pb2._SOCIALGIFTCOUNTTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['asset_bundle_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_asset__bundle__download__telemetry__pb2._ASSETBUNDLEDOWNLOADTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['asset_poi_download_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_asset__poi__download__telemetry__pb2._ASSETPOIDOWNLOADTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['asset_stream_download_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_asset__stream__download__telemetry__pb2._ASSETSTREAMDOWNLOADTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['asset_stream_cache_culled_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_asset__stream__cache__culled__telemetry__pb2._ASSETSTREAMCACHECULLEDTELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['rpc_socket_timing_telemetry'].message_type = pogoprotos_dot_data_dot_telemetry_dot_rpc__socket__response__telemetry__pb2._RPCSOCKETRESPONSETELEMETRY
_CLIENTTELEMETRYOMNI.fields_by_name['server_data'].message_type = pogoprotos_dot_networking_dot_platform_dot_telemetry_dot_platform__server__data__pb2._PLATFORMSERVERDATA
DESCRIPTOR.message_types_by_name['ClientTelemetryOmni'] = _CLIENTTELEMETRYOMNI
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
ClientTelemetryOmni = _reflection.GeneratedProtocolMessageType('ClientTelemetryOmni', (_message.Message,), dict(
DESCRIPTOR = _CLIENTTELEMETRYOMNI,
__module__ = 'pogoprotos.data.telemetry.client_telemetry_omni_pb2'
# @@protoc_insertion_point(class_scope:pogoprotos.data.telemetry.ClientTelemetryOmni)
))
_sym_db.RegisterMessage(ClientTelemetryOmni)
# @@protoc_insertion_point(module_scope)
# === Challenge3.py (dzh123xt/PythonChallenge, MIT) ===
import re
f = open("./Challenge3ocr.html", "r")
result = ""
for line in f.readlines():
    # Keep only the alphabetic characters of each line.
    content = re.sub("[^a-zA-Z]+", "", line)
    if content != "":
        result += content
print(result)
f.close()
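The letter-filtering step above can be sanity-checked in isolation (illustrative only; the sample string is made up):

```python
import re

# Strip everything except ASCII letters, as the loop above does per line.
sample = "a1b2 c3!"
letters_only = re.sub("[^a-zA-Z]+", "", sample)
print(letters_only)  # abc
```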
# === Reports/lib/manageSec.py (unixutils/projects, MIT) ===
#!/usr/bin/python
from cryptography.fernet import Fernet
from manageInventory import manage_host_config
class manage_sec(manage_host_config):
def __init__(self, **kwargs):
super(manage_sec, self).__init__()
def generateKey(self):
return Fernet.generate_key()
def createKeyFile(self):
with open("key.key", "wb") as key_file:
key_file.write(self.generateKey())
    def getKeyFromFile(self):
        # Use a context manager so the key file handle is closed promptly.
        with open("key.key", "rb") as key_file:
            cip = Fernet(key_file.read())
        return cip
def encryptPassword(self, defaultSshPassword):
self.createKeyFile()
key = self.getKeyFromFile()
encodedSecret = key.encrypt(defaultSshPassword)
return encodedSecret
def decryptPassword(self):
key = self.getKeyFromFile()
return key.decrypt(self.getDefaultEncryptedSshPassword())
    def storeEncryptedPassword(self, defaultSshPassword):
        # encryptPassword requires the plaintext password argument.
        encryptedPassword = self.encryptPassword(defaultSshPassword)
        self.setDefaultEncryptedSshPassword(encryptedPassword)
# === models/exporttask.py (jonchui/MyLife, MIT) ===
import datetime
from google.appengine.ext import ndb
class ExportTask(ndb.Model):
blob_key = ndb.BlobKeyProperty()
total_posts = ndb.IntegerProperty(default=0)
total_photos = ndb.IntegerProperty(default=0)
exported_posts = ndb.IntegerProperty(default=0)
exported_photos = ndb.IntegerProperty(default=0)
created = ndb.DateTimeProperty(auto_now_add=True)
updated = ndb.DateTimeProperty(auto_now=True)
filename = ndb.StringProperty()
status = ndb.StringProperty(choices=['new', 'inprogress', 'finished', 'failed'],default='new')
message = ndb.TextProperty(default='Waiting for task to start...')
    def update(self, message, **kwargs):
        self.message = message
        for k, v in kwargs.items():
            setattr(self, k, v)
        self.put()
# === src/views/index.py (aiweithesushigirl/personal-portfolio, MIT) ===
"""Search index view."""
import flask
import src
@src.app.route('/', methods=['GET', 'POST'])
def show_index():
"""Display / route."""
context = {}
return flask.render_template("index.html", **context)
# === setup.py (mattjmuw/iam-resttools, Apache-2.0) ===
# from distutils.core import setup
from setuptools import setup, find_packages
setup(name='resttools',
version='1.0',
description='UW-IT REST library',
packages=find_packages(),
include_package_data=True,
)
# === stubs.min/Autodesk/Revit/DB/__init___parts/ViewDisplaySketchyLines.py (ricardyn/ironpython-stubs, MIT) ===
class ViewDisplaySketchyLines(object,IDisposable):
""" Represents the settings for sketchy lines. """
def Dispose(self):
""" Dispose(self: ViewDisplaySketchyLines) """
pass
def ReleaseUnmanagedResources(self,*args):
""" ReleaseUnmanagedResources(self: ViewDisplaySketchyLines,disposing: bool) """
pass
def __enter__(self,*args):
""" __enter__(self: IDisposable) -> object """
pass
def __exit__(self,*args):
""" __exit__(self: IDisposable,exc_type: object,exc_value: object,exc_back: object) """
pass
def __init__(self,*args):
""" x.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signaturex.__init__(...) initializes x; see x.__class__.__doc__ for signature """
pass
def __repr__(self,*args):
""" __repr__(self: object) -> str """
pass
EnableSketchyLines=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""True to enable sketchy lines visibility. False to disable it.
Get: EnableSketchyLines(self: ViewDisplaySketchyLines) -> bool
Set: EnableSketchyLines(self: ViewDisplaySketchyLines)=value
"""
Extension=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""The extension scale value. Controls the magnitude of line's extension.
Values between 0 and 10.
Get: Extension(self: ViewDisplaySketchyLines) -> int
Set: Extension(self: ViewDisplaySketchyLines)=value
"""
IsValidObject=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""Specifies whether the .NET object represents a valid Revit entity.
Get: IsValidObject(self: ViewDisplaySketchyLines) -> bool
"""
Jitter=property(lambda self: object(),lambda self,v: None,lambda self: None)
"""The jitter defines jitteriness of the line.
Values between 0 and 10.
Get: Jitter(self: ViewDisplaySketchyLines) -> int
Set: Jitter(self: ViewDisplaySketchyLines)=value
"""
| 36.072727 | 215 | 0.715726 | 226 | 1,984 | 5.986726 | 0.314159 | 0.088692 | 0.053215 | 0.070953 | 0.277162 | 0.277162 | 0.244642 | 0.244642 | 0.244642 | 0.244642 | 0 | 0.003601 | 0.160282 | 1,984 | 54 | 216 | 36.740741 | 0.808523 | 0.259073 | 0 | 0.352941 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.352941 | false | 0.352941 | 0 | 0 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
91ae80095f294482c917adb64643192cba62c63b | 625 | py | Python | sympy/solvers/__init__.py | jegerjensen/sympy | 3a43310f1957a21a6f095fe2801cc05b5268a2c7 | [
"BSD-3-Clause"
] | 1 | 2016-07-13T04:30:25.000Z | 2016-07-13T04:30:25.000Z | sympy/solvers/__init__.py | jegerjensen/sympy | 3a43310f1957a21a6f095fe2801cc05b5268a2c7 | [
"BSD-3-Clause"
] | null | null | null | sympy/solvers/__init__.py | jegerjensen/sympy | 3a43310f1957a21a6f095fe2801cc05b5268a2c7 | [
"BSD-3-Clause"
] | null | null | null | """A module for solving all kinds of equations.
Examples
--------
>>> from sympy.solvers import solve
>>> from sympy.abc import x
>>> solve(x**5+5*x**4+10*x**3+10*x**2+5*x+1,x)
[-1]
"""
from solvers import solve, solve_linear_system, solve_linear_system_LU, \
solve_undetermined_coeffs, tsolve, nsolve, solve_linear
from recurr import rsolve, rsolve_poly, rsolve_ratio, rsolve_hyper
from ode import checkodesol, classify_ode, ode_order, dsolve, \
homogeneous_order
from polysys import solve_poly_system, solve_triangulated
from pde import pde_separate, pde_separate_add, pde_separate_mul
# === leetcode-algorithms/977. Squares of a Sorted Array/977.squares-of-a-sorted-array.py (cnyy7/LeetCode_EY, MIT) ===
#
# @lc app=leetcode id=977 lang=python3
#
# [977] Squares of a Sorted Array
#
# https://leetcode.com/problems/squares-of-a-sorted-array/description/
#
# algorithms
# Easy (72.86%)
# Total Accepted: 56.2K
# Total Submissions: 77.7K
# Testcase Example: '[-4,-1,0,3,10]'
#
# Given an array of integers A sorted in non-decreasing order, return an array
# of the squares of each number, also in sorted non-decreasing order.
#
#
#
#
# Example 1:
#
#
# Input: [-4,-1,0,3,10]
# Output: [0,1,9,16,100]
#
#
#
# Example 2:
#
#
# Input: [-7,-3,2,3,11]
# Output: [4,9,9,49,121]
#
#
#
#
# Note:
#
#
# 1 <= A.length <= 10000
# -10000 <= A[i] <= 10000
# A is sorted in non-decreasing order.
#
#
#
#
from typing import List


class Solution:
    def sortedSquares(self, A: List[int]) -> List[int]:
# left,right=0,len(A)-1
# sol=[_ for _ in range(len(A))]
# for i in range(len(A)-1,-1,-1):
# if abs(A[left])>abs(A[right]):
# sol[i]=A[left]*A[left]
# left+=1
# else:
# sol[i]=A[right]*A[right]
# right-=1
# return sol
return sorted(x*x for x in A)
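The one-liner can be checked against the two worked examples from the problem header. A quick illustrative check, not part of the original submission:

```python
# Squaring then sorting reproduces both sample outputs.
example_one = sorted(x * x for x in [-4, -1, 0, 3, 10])
example_two = sorted(x * x for x in [-7, -3, 2, 3, 11])
print(example_one)  # [0, 1, 9, 16, 100]
print(example_two)  # [4, 9, 9, 49, 121]
```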
# === zfit/util/exception.py (kailiu77/zfit, BSD-3-Clause) ===
# Copyright (c) 2019 zfit
# TODO: improve errors of models. Generate more general error, inherit and use more specific?
class PDFCompatibilityError(Exception):
pass
class LogicalUndefinedOperationError(Exception):
pass
class ExtendedPDFError(Exception):
pass
class AlreadyExtendedPDFError(ExtendedPDFError):
pass
class NotExtendedPDFError(ExtendedPDFError):
pass
class ConversionError(Exception):
pass
class SubclassingError(Exception):
pass
class BasePDFSubclassingError(SubclassingError):
pass
class IntentionNotUnambiguousError(Exception):
pass
class UnderdefinedError(IntentionNotUnambiguousError):
pass
class LimitsUnderdefinedError(UnderdefinedError):
pass
class OverdefinedError(IntentionNotUnambiguousError):
pass
class LimitsOverdefinedError(OverdefinedError):
pass
class AxesNotUnambiguousError(IntentionNotUnambiguousError):
pass
class NotSpecifiedError(Exception):
pass
class LimitsNotSpecifiedError(NotSpecifiedError):
pass
class NormRangeNotSpecifiedError(NotSpecifiedError):
pass
class AxesNotSpecifiedError(NotSpecifiedError):
pass
class ObsNotSpecifiedError(NotSpecifiedError):
pass
# Parameter Errors
class NameAlreadyTakenError(Exception):
pass
# Operation errors
class IncompatibleError(Exception):
pass
class ShapeIncompatibleError(IncompatibleError):
pass
class ObsIncompatibleError(IncompatibleError):
pass
class SpaceIncompatibleError(IncompatibleError):
pass
class LimitsIncompatibleError(IncompatibleError):
pass
class ModelIncompatibleError(IncompatibleError):
pass
# Data errors
class WeightsNotImplementedError(Exception):
pass
# Minimizer errors
class NotMinimizedError(Exception):
pass
# Runtime Errors
class NoSessionSpecifiedError(Exception):
pass
# PDF class internal handling errors
class NormRangeNotImplementedError(Exception):
"""Indicates that a function does not support the normalization range argument `norm_range`."""
pass
class MultipleLimitsNotImplementedError(Exception):
"""Indicates that a function does not support several limits in a :py:class:`~zfit.Space`."""
pass
# Developer verbose messages
class DueToLazynessNotImplementedError(Exception):
"""Only for developing purpose! Does not serve as a 'real' Exception."""
pass
# === python/covid_web/covid_web/wsgi.py (lukipuki/COVID-19-simulation, Unlicense) ===
from os import getenv
from pathlib import Path
from .server import setup_server
data_path = Path(getenv(key="DATA_PATH", default="data"))
app = setup_server(data_path, data_path / "predictions")
if __name__ == "__main__":
app.run()
# --- File: submissions/hello-django/mysite/myapp/models.py | repo: hcktheheaven/csoc18-week5 | license: MIT ---
from django.db import models
from django.urls import reverse
# Create your models here.
class Product(models.Model):
name = models.CharField(max_length=250)
price = models.CharField(max_length=250)
description = models.CharField(max_length=500)
image = models.FileField()
def get_absolute_url(self):
return reverse('myapp:productdetail', kwargs={'pk':self.pk})
def __str__(self):
return self.name + ' - ' + self.price
37fe1348639e401f1ce1827f33b57f47b8dec989 | 2,736 | py | Python | bulletins/migrations/0001_initial.py | Kgermando/catsr | 52da79130a4bb7c15b59e3690c8df8d23573d728 | [
"Apache-2.0"
] | null | null | null | bulletins/migrations/0001_initial.py | Kgermando/catsr | 52da79130a4bb7c15b59e3690c8df8d23573d728 | [
"Apache-2.0"
] | null | null | null | bulletins/migrations/0001_initial.py | Kgermando/catsr | 52da79130a4bb7c15b59e3690c8df8d23573d728 | [
"Apache-2.0"
] | null | null | null | # Generated by Django 3.1.2 on 2020-12-01 08:53
from django.db import migrations, models
class Migration(migrations.Migration):
initial = True
dependencies = [
]
operations = [
migrations.CreateModel(
name='Bulletin',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('titre_bulletin', models.CharField(max_length=200)),
('slug', models.SlugField(blank=True, help_text='Laissez ce champ vide', unique=True)),
('img_bulletin', models.ImageField(upload_to='bulletin_img/')),
('content_bulletin', models.TextField()),
('numero_bulletin', models.IntegerField()),
('total_enfant_scolarise', models.IntegerField(blank=True)),
('enfants_scolarise_filles', models.IntegerField(blank=True)),
('enfants_formation_professionnelle', models.IntegerField(blank=True, help_text='orientés et accompagnés en formation professionnelle ', null=True)),
('enfants_formation_professionnelle_filles', models.IntegerField(blank=True, help_text='orientés et accompagnés en formation professionnelle ', null=True)),
('enfants_jeunes_adultes', models.IntegerField(blank=True, help_text='enfants, jeunes et adultes contactés lors de la descente sur le terrain ', null=True)),
('nombre_enfants_identifies', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_contacts', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_entretiens_realises', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_suivi_familial', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_enquetes_realises', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_suivi_scolarise', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_suivi_formations_professionnelles', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_animations_rue', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('nombre_reinsertions_familiales', models.IntegerField(blank=True, help_text='Uniquement les chiffres', null=True)),
('editeur', models.CharField(max_length=200)),
('created', models.DateTimeField()),
],
),
]
# --- File: modules/identity.py | repo: tranduytrung/image-3dmodel-embedding | license: Apache-2.0 ---
import torch
class Identity(torch.jit.ScriptModule):
r"""A placeholder identity operator that is argument-insensitive.
Args:
args: any argument (unused)
kwargs: any keyword argument (unused)
Examples::
        >>> m = nn.Identity(54, unused_argument1=0.1, unused_argument2=False)
>>> input = torch.randn(128, 20)
>>> output = m(input)
>>> print(output.size())
torch.Size([128, 20])
"""
def __init__(self, *args, **kwargs):
super(Identity, self).__init__()
@torch.jit.script_method
def forward(self, input):
        return input
# --- File: var/spack/repos/builtin/packages/r-tidyverse/package.py | repo: player1537-forks/spack | licenses: ECL-2.0, Apache-2.0, MIT-0, MIT ---
# Copyright 2013-2022 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack import *
class RTidyverse(RPackage):
"""Easily Install and Load the 'Tidyverse'.
The 'tidyverse' is a set of packages that work in harmony because they
share common data representations and 'API' design. This package is
designed to make it easy to install and load multiple 'tidyverse' packages
in a single step. Learn more about the 'tidyverse' at
<https://tidyverse.org>."""
cran = "tidyverse"
version('1.3.1', sha256='83cf95109d4606236274f5a8ec2693855bf75d3a1b3bc1ab4426dcc275ed6632')
version('1.3.0', sha256='6d8acb81e994f9bef5e4dcf908bcea3786d108adcf982628235b6c8c80f6fe09')
version('1.2.1', sha256='ad67a27bb4e89417a15338fe1a40251a7b5dedba60e9b72637963d3de574c37b')
depends_on('r+X', type=('build', 'run'))
depends_on('r@3.2:', type=('build', 'run'), when='@1.3.0:')
depends_on('r@3.3:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-broom@0.4.2:', type=('build', 'run'))
depends_on('r-broom@0.5.2:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-broom@0.7.6:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-cli@1.0.0:', type=('build', 'run'))
depends_on('r-cli@1.1.0:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-cli@2.4.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-crayon@1.3.4:', type=('build', 'run'))
depends_on('r-crayon@1.4.1:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-dbplyr@1.1.0:', type=('build', 'run'))
depends_on('r-dbplyr@1.4.2:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-dbplyr@2.1.1:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-dplyr@0.7.4:', type=('build', 'run'))
depends_on('r-dplyr@0.8.3:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-dplyr@1.0.5:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-dtplyr@1.1.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-forcats@0.2.0:', type=('build', 'run'))
depends_on('r-forcats@0.4.0:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-forcats@0.5.1:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-googledrive@1.0.1:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-googlesheets4@0.3.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-ggplot2@2.2.1:', type=('build', 'run'))
depends_on('r-ggplot2@3.2.1:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-ggplot2@3.3.3:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-haven@1.1.0:', type=('build', 'run'))
depends_on('r-haven@2.2.0:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-haven@2.3.1:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-hms@0.3:', type=('build', 'run'))
depends_on('r-hms@0.5.2:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-hms@1.0.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-httr@1.3.1:', type=('build', 'run'))
depends_on('r-httr@1.4.1:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-httr@1.4.2:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-jsonlite@1.5:', type=('build', 'run'))
depends_on('r-jsonlite@1.6:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-jsonlite@1.7.2:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-lubridate@1.7.1:', type=('build', 'run'))
depends_on('r-lubridate@1.7.4:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-lubridate@1.7.10:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-magrittr@1.5:', type=('build', 'run'))
depends_on('r-magrittr@2.0.1:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-modelr@0.1.1:', type=('build', 'run'))
depends_on('r-modelr@0.1.5:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-modelr@0.1.8:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-pillar@1.4.2:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-pillar@1.6.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-purrr@0.2.4:', type=('build', 'run'))
depends_on('r-purrr@0.3.3:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-purrr@0.3.4:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-readr@1.1.1:', type=('build', 'run'))
depends_on('r-readr@1.3.1:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-readr@1.4.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-readxl@1.0.0:', type=('build', 'run'))
depends_on('r-readxl@1.3.1:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-reprex@0.1.1:', type=('build', 'run'))
depends_on('r-reprex@0.3.0:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-reprex@2.0.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-rlang@0.1.4:', type=('build', 'run'))
depends_on('r-rlang@0.4.1:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-rlang@0.4.10:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-rstudioapi@0.7:', type=('build', 'run'))
depends_on('r-rstudioapi@0.10:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-rstudioapi@0.13:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-rvest@0.3.2:', type=('build', 'run'))
depends_on('r-rvest@0.3.5:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-rvest@1.0.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-stringr@1.2.0:', type=('build', 'run'))
depends_on('r-stringr@1.4.0:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-tibble@1.3.4:', type=('build', 'run'))
depends_on('r-tibble@2.1.3:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-tibble@3.1.0:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-tidyr@0.7.2:', type=('build', 'run'))
depends_on('r-tidyr@1.0.0:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-tidyr@1.1.3:', type=('build', 'run'), when='@1.3.1:')
depends_on('r-xml2@1.1.1:', type=('build', 'run'))
depends_on('r-xml2@1.2.2:', type=('build', 'run'), when='@1.3.0:')
depends_on('r-xml2@1.3.2:', type=('build', 'run'), when='@1.3.1:')
# --- File: test/utils/utils.py | repo: maxlambrecht/py-spiffe | license: Apache-2.0 ---
import grpc
_TEST_JWKS_PATH = 'test/bundle/jwt_bundle/jwks/{}'
def read_file_bytes(filename):
with open(filename, 'rb') as file:
return file.read()
JWKS_1_EC_KEY = read_file_bytes(_TEST_JWKS_PATH.format('jwks_1_ec_key.json'))
JWKS_2_EC_1_RSA_KEYS = read_file_bytes(_TEST_JWKS_PATH.format('jwks_3_keys.json'))
JWKS_MISSING_KEY_ID = read_file_bytes(_TEST_JWKS_PATH.format('jwks_missing_kid.json'))
JWKS_MISSING_X = read_file_bytes(_TEST_JWKS_PATH.format('jwks_ec_missing_x.json'))
class FakeCall(grpc.Call, grpc.RpcError):
def __init__(self):
self._code = grpc.StatusCode.UNKNOWN
self._details = 'Error details from Workload API'
def code(self):
return self._code
def details(self):
return self._details
# --- File: documental/snippet.py | repo: DennisMerkus/documental | license: MIT ---
from .document import RootDocument
class Snippet(RootDocument):
def __init__(self):
super().__init__()
self.type = "Snippet"
def __str__(self):
return super().__str__()
| 17.083333 | 34 | 0.629268 | 21 | 205 | 5.380952 | 0.619048 | 0.141593 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.253659 | 205 | 11 | 35 | 18.636364 | 0.738562 | 0 | 0 | 0 | 0 | 0 | 0.034146 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0 | 0.142857 | 0.142857 | 0.714286 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
# --- File: main.py | repo: Diegomastro/materias-correlatividades | license: MIT ---
import sys
from PyQt5.QtWidgets import QApplication
from ventana_main import VentanaMain
if __name__ == '__main__':
app = QApplication(sys.argv)
    main_win = VentanaMain()
    main_win.show()  # ensure the main window is actually displayed
    sys.exit(app.exec())
# --- File: lucchi.py | repo: mental689/connectomics | license: MIT ---
import torch
from torch.utils.data import Dataset
import cv2
import os
from glob import glob
import numpy as np
class LucchiPPDataset(Dataset):
def __init__(self, train=True, transforms=None):
self.subset = 'Train' if train else 'Test'
self.files = glob('dataset/Lucchi++/'+self.subset+'_In/*.png')
self.files.sort(key=lambda x:int(os.path.basename(x).split('.')[0].replace('mask','')))
self.labels = glob('dataset/Lucchi++/'+self.subset+'_Out/*.png')
self.labels.sort(key=lambda x:int(os.path.basename(x).split('.')[0]))
assert len(self.files) == len(self.labels)
assert len(self.files) > 0
self.transforms = transforms
def __len__(self):
return len(self.files)
def __getitem__(self, item):
x = cv2.imread(self.files[item])
y = cv2.imread(self.labels[item])
if self.transforms is not None:
x = self.transforms(x)
y = self.transforms(y)
return torch.from_numpy(x.astype(np.float32).transpose(2,0,1)), torch.from_numpy(y.astype(np.float32).transpose(2,0,1))
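The `.transpose(2, 0, 1)` in `__getitem__` converts the `(height, width, channels)` layout that `cv2.imread` returns into the `(channels, height, width)` layout PyTorch expects. A minimal NumPy sketch of that shape change (the sizes below are arbitrary examples):

```python
import numpy as np

# cv2.imread returns images as H x W x C; PyTorch conv layers expect C x H x W.
hwc = np.zeros((256, 256, 3), dtype=np.uint8)      # H x W x C, as loaded by cv2
chw = hwc.astype(np.float32).transpose(2, 0, 1)    # C x H x W, as returned above

print(hwc.shape, chw.shape)  # (256, 256, 3) (3, 256, 256)
```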
class Resize(object):
def __init__(self, size):
self.size = size
def __call__(self, img):
return cv2.resize(img, self.size)
def __repr__(self):
return self.__class__.__name__ + '(size={0})'.format(self.size)
class Scale(object):
def __init__(self):
pass
def __call__(self, img):
return img / 255.
def __repr__(self):
return self.__class__.__name__
| 29.264151 | 127 | 0.632495 | 215 | 1,551 | 4.302326 | 0.316279 | 0.058378 | 0.035676 | 0.045405 | 0.307027 | 0.205405 | 0.205405 | 0.082162 | 0.082162 | 0.082162 | 0 | 0.017384 | 0.221148 | 1,551 | 52 | 128 | 29.826923 | 0.748344 | 0 | 0 | 0.1 | 0 | 0 | 0.05029 | 0 | 0 | 0 | 0 | 0 | 0.05 | 1 | 0.225 | false | 0.025 | 0.175 | 0.125 | 0.625 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
# --- File: ABC/C/0176.py | repo: taro-masuda/AtCoder | license: MIT ---
N = int(input())
A = list(map(int, input().split()))
ans = 0
max_tall = 0
for i in range(0, N):
if max_tall > A[i]:
ans += max_tall - A[i]
if A[i] > max_tall:
max_tall = A[i]
print(ans)
# --- File: Ex3.py | repo: LuizHps18/Exercicios_LP_1B | license: MIT ---
dia = input("Digite seu dia de nascimento ")
mes = input("Digite seu mês de nascimento ")
ano = input("Digite seu ano de nascimento ")
print("Sua data de nascimento é {}/{}/{}".format(dia, mes, ano))
| 40.2 | 65 | 0.676617 | 31 | 201 | 4.387097 | 0.451613 | 0.352941 | 0.308824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.164179 | 201 | 4 | 66 | 50.25 | 0.809524 | 0 | 0 | 0 | 0 | 0 | 0.597015 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.25 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
# --- File: 99_misc/decorator.py | repo: zzz0072/Python_Exercises | license: BSD-2-Clause ---
#!/usr/bin/env python
def my_func1(callback):
def func_wrapper(x):
print("my_func1: {0} ".format(callback(x)))
return func_wrapper
@my_func1
def my_func2(x):
return x
# Actual call sequence is similar to:
# deco = my_func1(my_func2)
# deco("test") => func_wrapper("test")
my_func2("test")
#-------------------------------------------
# Test decorator with parameter
def dec_param(param):
def my_func3(callback):
def func_wrapper(x):
print("my_func3: {0} {1} ".format(param, callback(x)))
return func_wrapper
return my_func3
@dec_param("tag")
def my_func4(x):
return x
# Actual call sequence is similar to:
# deco = dec_param("tag")(my_func4)
# deco("test") => func_wrapper("test")
my_func4("test")
# --- File: api/core/serializers.py | repo: yarsanich/django-nuxt-test-app | license: MIT ---
# core/serializers.py
from rest_framework import serializers
from .models import Recipe
class RecipeSerializer(serializers.ModelSerializer):
class Meta:
model = Recipe
        fields = ("id", "name", "ingredients", "picture", "difficulty", "prep_time", "prep_guide")
# --- File: mars/dataframe/groupby/tests/test_groupby_execution.py | repo: ConanoutlooklvTBS/mars | license: Apache-2.0 ---
# Copyright 1999-2021 Alibaba Group Holding Ltd.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from collections import OrderedDict
import numpy as np
import pandas as pd
import pytest
try:
import pyarrow as pa
except ImportError: # pragma: no cover
pa = None
from .... import dataframe as md
from ....core.operand import OperandStage
from ....tests.core import assert_groupby_equal, require_cudf
from ....utils import arrow_array_to_objects
from ..aggregation import DataFrameGroupByAgg
class MockReduction1(md.CustomReduction):
def agg(self, v1):
return v1.sum()
class MockReduction2(md.CustomReduction):
def pre(self, value):
return value + 1, value * 2
def agg(self, v1, v2):
return v1.sum(), v2.min()
def post(self, v1, v2):
return v1 + v2
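Reading the hooks of `MockReduction2` above: `pre` maps each value to the pair `(value + 1, value * 2)`, `agg` reduces the two streams with `sum` and `min`, and `post` combines them as `v1 + v2`. A plain-pandas sketch of that equivalent computation (my reading of the code, not mars API):

```python
import pandas as pd

# Plain-pandas equivalent of MockReduction2's pre/agg/post pipeline:
# pre: value -> (value + 1, value * 2); agg: (sum, min); post: v1 + v2.
def mock_reduction2_equivalent(series: pd.Series):
    v1 = (series + 1).sum()   # pre: value + 1, agg: sum
    v2 = (series * 2).min()   # pre: value * 2, agg: min
    return v1 + v2            # post

s = pd.Series([1, 2, 3])
print(mock_reduction2_equivalent(s))  # (2+3+4) + min(2, 4, 6) = 11
```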
def test_groupby(setup):
rs = np.random.RandomState(0)
data_size = 100
data_dict = {'a': rs.randint(0, 10, size=(data_size,)),
'b': rs.randint(0, 10, size=(data_size,)),
'c': rs.choice(list('abcd'), size=(data_size,))}
# test groupby with DataFrames and RangeIndex
df1 = pd.DataFrame(data_dict)
mdf = md.DataFrame(df1, chunk_size=13)
grouped = mdf.groupby('b')
assert_groupby_equal(grouped.execute().fetch(),
df1.groupby('b'))
# test groupby with string index with duplications
df2 = pd.DataFrame(data_dict, index=['i' + str(i % 3) for i in range(data_size)])
mdf = md.DataFrame(df2, chunk_size=13)
grouped = mdf.groupby('b')
assert_groupby_equal(grouped.execute().fetch(),
df2.groupby('b'))
# test groupby with DataFrames by series
grouped = mdf.groupby(mdf['b'])
assert_groupby_equal(grouped.execute().fetch(),
df2.groupby(df2['b']))
# test groupby with DataFrames by multiple series
grouped = mdf.groupby(by=[mdf['b'], mdf['c']])
assert_groupby_equal(grouped.execute().fetch(),
df2.groupby(by=[df2['b'], df2['c']]))
# test groupby with DataFrames with MultiIndex
df3 = pd.DataFrame(data_dict,
index=pd.MultiIndex.from_tuples(
[(i % 3, 'i' + str(i)) for i in range(data_size)]))
mdf = md.DataFrame(df3, chunk_size=13)
grouped = mdf.groupby(level=0)
assert_groupby_equal(grouped.execute().fetch(),
df3.groupby(level=0))
# test groupby with DataFrames by integer columns
df4 = pd.DataFrame(list(data_dict.values())).T
mdf = md.DataFrame(df4, chunk_size=13)
grouped = mdf.groupby(0)
assert_groupby_equal(grouped.execute().fetch(),
df4.groupby(0))
series1 = pd.Series(data_dict['a'])
ms1 = md.Series(series1, chunk_size=13)
grouped = ms1.groupby(lambda x: x % 3)
assert_groupby_equal(grouped.execute().fetch(),
series1.groupby(lambda x: x % 3))
# test groupby series
grouped = ms1.groupby(ms1)
assert_groupby_equal(grouped.execute().fetch(),
series1.groupby(series1))
series2 = pd.Series(data_dict['a'],
index=['i' + str(i) for i in range(data_size)])
ms2 = md.Series(series2, chunk_size=13)
grouped = ms2.groupby(lambda x: int(x[1:]) % 3)
assert_groupby_equal(grouped.execute().fetch(),
series2.groupby(lambda x: int(x[1:]) % 3))
def test_groupby_getitem(setup):
rs = np.random.RandomState(0)
data_size = 100
raw = pd.DataFrame({'a': rs.randint(0, 10, size=(data_size,)),
'b': rs.randint(0, 10, size=(data_size,)),
'c': rs.choice(list('abcd'), size=(data_size,))},
index=pd.MultiIndex.from_tuples([(i % 3, 'i' + str(i)) for i in range(data_size)]))
mdf = md.DataFrame(raw, chunk_size=13)
r = mdf.groupby(level=0)[['a', 'b']]
assert_groupby_equal(r.execute().fetch(),
raw.groupby(level=0)[['a', 'b']], with_selection=True)
for method in ('tree', 'shuffle'):
r = mdf.groupby(level=0)[['a', 'b']].sum(method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby(level=0)[['a', 'b']].sum().sort_index())
r = mdf.groupby(level=0)[['a', 'b']].apply(lambda x: x + 1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby(level=0)[['a', 'b']].apply(lambda x: x + 1).sort_index())
r = mdf.groupby('b')[['a', 'b']]
assert_groupby_equal(r.execute().fetch(),
raw.groupby('b')[['a', 'b']], with_selection=True)
r = mdf.groupby('b')[['a', 'c']]
assert_groupby_equal(r.execute().fetch(),
raw.groupby('b')[['a', 'c']], with_selection=True)
for method in ('tree', 'shuffle'):
r = mdf.groupby('b')[['a', 'b']].sum(method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('b')[['a', 'b']].sum().sort_index())
r = mdf.groupby('b')[['a', 'b']].agg(['sum', 'count'], method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('b')[['a', 'b']].agg(['sum', 'count']).sort_index())
r = mdf.groupby('b')[['a', 'c']].agg(['sum', 'count'], method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('b')[['a', 'c']].agg(['sum', 'count']).sort_index())
r = mdf.groupby('b')[['a', 'b']].apply(lambda x: x + 1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('b')[['a', 'b']].apply(lambda x: x + 1).sort_index())
r = mdf.groupby('b')[['a', 'b']].transform(lambda x: x + 1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('b')[['a', 'b']].transform(lambda x: x + 1).sort_index())
r = mdf.groupby('b')[['a', 'b']].cumsum()
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('b')[['a', 'b']].cumsum().sort_index())
r = mdf.groupby('b').a
assert_groupby_equal(r.execute().fetch(),
raw.groupby('b').a, with_selection=True)
for method in ('shuffle', 'tree'):
r = mdf.groupby('b').a.sum(method=method)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
raw.groupby('b').a.sum().sort_index())
r = mdf.groupby('b').a.agg(['sum', 'mean', 'var'], method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('b').a.agg(['sum', 'mean', 'var']).sort_index())
r = mdf.groupby('b', as_index=False).a.sum(method=method)
pd.testing.assert_frame_equal(
r.execute().fetch().sort_values('b', ignore_index=True),
raw.groupby('b', as_index=False).a.sum().sort_values('b', ignore_index=True))
r = mdf.groupby('b', as_index=False).b.count(method=method)
pd.testing.assert_frame_equal(
r.execute().fetch().sort_values('b', ignore_index=True),
raw.groupby('b', as_index=False).b.count().sort_values('b', ignore_index=True))
r = mdf.groupby('b', as_index=False).b.agg({'cnt': 'count'}, method=method)
pd.testing.assert_frame_equal(
r.execute().fetch().sort_values('b', ignore_index=True),
raw.groupby('b', as_index=False).b.agg({'cnt': 'count'}).sort_values('b', ignore_index=True))
r = mdf.groupby('b').a.apply(lambda x: x + 1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
raw.groupby('b').a.apply(lambda x: x + 1).sort_index())
r = mdf.groupby('b').a.transform(lambda x: x + 1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
raw.groupby('b').a.transform(lambda x: x + 1).sort_index())
r = mdf.groupby('b').a.cumsum()
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
raw.groupby('b').a.cumsum().sort_index())
# special test for selection key == 0
raw = pd.DataFrame(rs.rand(data_size, 10))
raw[0] = 0
mdf = md.DataFrame(raw, chunk_size=13)
r = mdf.groupby(0, as_index=False)[0].agg({'cnt': 'count'}, method='tree')
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby(0, as_index=False)[0].agg({'cnt': 'count'}))
def test_dataframe_groupby_agg(setup):
agg_funs = ['std', 'mean', 'var', 'max', 'count', 'size', 'all', 'any', 'skew', 'kurt', 'sem']
rs = np.random.RandomState(0)
raw = pd.DataFrame({'c1': np.arange(100).astype(np.int64),
'c2': rs.choice(['a', 'b', 'c'], (100,)),
'c3': rs.rand(100)})
mdf = md.DataFrame(raw, chunk_size=13)
for method in ['tree', 'shuffle']:
r = mdf.groupby('c2').agg('size', method=method)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg('size').sort_index())
for agg_fun in agg_funs:
if agg_fun == 'size':
continue
r = mdf.groupby('c2').agg(agg_fun, method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg(agg_fun).sort_index())
r = mdf.groupby('c2').agg(agg_funs, method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg(agg_funs).sort_index())
agg = OrderedDict([('c1', ['min', 'mean']), ('c3', 'std')])
r = mdf.groupby('c2').agg(agg, method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg(agg).sort_index())
agg = OrderedDict([('c1', 'min'), ('c3', 'sum')])
r = mdf.groupby('c2').agg(agg, method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg(agg).sort_index())
r = mdf.groupby('c2').agg({'c1': 'min', 'c3': 'min'}, method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg({'c1': 'min', 'c3': 'min'}).sort_index())
r = mdf.groupby('c2').agg({'c1': 'min'}, method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg({'c1': 'min'}).sort_index())
# test groupby series
r = mdf.groupby(mdf['c2']).sum(method=method)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby(raw['c2']).sum().sort_index())
r = mdf.groupby('c2').size(method='tree')
pd.testing.assert_series_equal(r.execute().fetch(),
raw.groupby('c2').size())
# test inserted kurt method
r = mdf.groupby('c2').kurtosis(method='tree')
pd.testing.assert_frame_equal(r.execute().fetch(),
raw.groupby('c2').kurtosis())
for agg_fun in agg_funs:
if agg_fun == 'size' or callable(agg_fun):
continue
r = getattr(mdf.groupby('c2'), agg_fun)(method='tree')
pd.testing.assert_frame_equal(r.execute().fetch(),
getattr(raw.groupby('c2'), agg_fun)())
# test as_index=False
for method in ['tree', 'shuffle']:
r = mdf.groupby('c2', as_index=False).agg('mean', method=method)
pd.testing.assert_frame_equal(
r.execute().fetch().sort_values('c2', ignore_index=True),
raw.groupby('c2', as_index=False).agg('mean').sort_values('c2', ignore_index=True))
assert r.op.groupby_params['as_index'] is False
r = mdf.groupby(['c1', 'c2'], as_index=False).agg('mean', method=method)
pd.testing.assert_frame_equal(
r.execute().fetch().sort_values(['c1', 'c2'], ignore_index=True),
raw.groupby(['c1', 'c2'], as_index=False).agg('mean').sort_values(['c1', 'c2'], ignore_index=True))
assert r.op.groupby_params['as_index'] is False
# test as_index=False takes no effect
r = mdf.groupby(['c1', 'c2'], as_index=False).agg(['mean', 'count'])
pd.testing.assert_frame_equal(r.execute().fetch(),
raw.groupby(['c1', 'c2'], as_index=False).agg(['mean', 'count']))
assert r.op.groupby_params['as_index'] is True
r = mdf.groupby('c2').agg(['cumsum', 'cumcount'])
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
raw.groupby('c2').agg(['cumsum', 'cumcount']).sort_index())
r = mdf[['c1', 'c3']].groupby(mdf['c2']).agg(MockReduction2())
pd.testing.assert_frame_equal(r.execute().fetch(),
raw[['c1', 'c3']].groupby(raw['c2']).agg(MockReduction2()))
r = mdf.groupby('c2').agg(sum_c1=md.NamedAgg('c1', 'sum'), min_c1=md.NamedAgg('c1', 'min'),
mean_c3=md.NamedAgg('c3', 'mean'), method='tree')
pd.testing.assert_frame_equal(r.execute().fetch(),
raw.groupby('c2').agg(sum_c1=md.NamedAgg('c1', 'sum'),
min_c1=md.NamedAgg('c1', 'min'),
mean_c3=md.NamedAgg('c3', 'mean')))
def test_series_groupby_agg(setup):
rs = np.random.RandomState(0)
series1 = pd.Series(rs.rand(10))
ms1 = md.Series(series1, chunk_size=3)
agg_funs = ['std', 'mean', 'var', 'max', 'count', 'size', 'all', 'any', 'skew', 'kurt', 'sem']
for method in ['tree', 'shuffle']:
for agg_fun in agg_funs:
r = ms1.groupby(lambda x: x % 2).agg(agg_fun, method=method)
pd.testing.assert_series_equal(r.execute().fetch(),
series1.groupby(lambda x: x % 2).agg(agg_fun))
r = ms1.groupby(lambda x: x % 2).agg(agg_funs, method=method)
pd.testing.assert_frame_equal(r.execute().fetch(),
series1.groupby(lambda x: x % 2).agg(agg_funs))
# test groupby series
r = ms1.groupby(ms1).sum(method=method)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
series1.groupby(series1).sum().sort_index())
# test inserted kurt method
r = ms1.groupby(ms1).kurtosis(method='tree')
pd.testing.assert_series_equal(r.execute().fetch(),
series1.groupby(series1).kurtosis())
for agg_fun in agg_funs:
r = getattr(ms1.groupby(lambda x: x % 2), agg_fun)(method='tree')
pd.testing.assert_series_equal(r.execute().fetch(),
getattr(series1.groupby(lambda x: x % 2), agg_fun)())
r = ms1.groupby(lambda x: x % 2).agg(['cumsum', 'cumcount'], method='tree')
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 2).agg(['cumsum', 'cumcount']).sort_index())
r = ms1.groupby(lambda x: x % 2).agg(MockReduction2(name='custom_r'), method='tree')
pd.testing.assert_series_equal(r.execute().fetch(),
series1.groupby(lambda x: x % 2).agg(MockReduction2(name='custom_r')))
r = ms1.groupby(lambda x: x % 2).agg(col_var='var', col_skew='skew', method='tree')
pd.testing.assert_frame_equal(r.execute().fetch(),
series1.groupby(lambda x: x % 2).agg(col_var='var', col_skew='skew'))
def test_groupby_agg_auto_method(setup):
rs = np.random.RandomState(0)
raw = pd.DataFrame({'c1': rs.randint(20, size=100),
'c2': rs.choice(['a', 'b', 'c'], (100,)),
'c3': rs.rand(100)})
mdf = md.DataFrame(raw, chunk_size=20)
def _disallow_reduce(ctx, op):
assert op.stage != OperandStage.reduce
op.execute(ctx, op)
r = mdf.groupby('c2').agg('sum')
operand_executors = {DataFrameGroupByAgg: _disallow_reduce}
result = r.execute(extra_config={'operand_executors': operand_executors,
'check_all': False}).fetch()
pd.testing.assert_frame_equal(result.sort_index(),
raw.groupby('c2').agg('sum'))
def _disallow_combine_and_agg(ctx, op):
assert op.stage not in (OperandStage.combine, OperandStage.agg)
op.execute(ctx, op)
r = mdf.groupby('c1').agg('sum')
operand_executors = {DataFrameGroupByAgg: _disallow_combine_and_agg}
result = r.execute(extra_config={'operand_executors': operand_executors,
'check_all': False}).fetch()
pd.testing.assert_frame_equal(result.sort_index(),
raw.groupby('c1').agg('sum'))
def test_groupby_agg_str_cat(setup):
agg_fun = lambda x: x.str.cat(sep='_', na_rep='NA')
rs = np.random.RandomState(0)
raw_df = pd.DataFrame({'a': rs.choice(['A', 'B', 'C'], size=(100,)),
'b': rs.choice([None, 'alfa', 'bravo', 'charlie'], size=(100,))})
mdf = md.DataFrame(raw_df, chunk_size=13)
r = mdf.groupby('a').agg(agg_fun, method='tree')
pd.testing.assert_frame_equal(r.execute().fetch(),
raw_df.groupby('a').agg(agg_fun))
raw_series = pd.Series(rs.choice([None, 'alfa', 'bravo', 'charlie'], size=(100,)))
ms = md.Series(raw_series, chunk_size=13)
r = ms.groupby(lambda x: x % 2).agg(agg_fun, method='tree')
pd.testing.assert_series_equal(r.execute().fetch(),
raw_series.groupby(lambda x: x % 2).agg(agg_fun))
@require_cudf
def test_gpu_groupby_agg(setup_gpu):
rs = np.random.RandomState(0)
df1 = pd.DataFrame({'a': rs.choice([2, 3, 4], size=(100,)),
'b': rs.choice([2, 3, 4], size=(100,))})
mdf = md.DataFrame(df1, chunk_size=13).to_gpu()
r = mdf.groupby('a').sum()
pd.testing.assert_frame_equal(r.execute().fetch().to_pandas(),
df1.groupby('a').sum())
r = mdf.groupby('a').kurt()
pd.testing.assert_frame_equal(r.execute().fetch().to_pandas(),
df1.groupby('a').kurt())
r = mdf.groupby('a').agg(['sum', 'var'])
pd.testing.assert_frame_equal(r.execute().fetch().to_pandas(),
df1.groupby('a').agg(['sum', 'var']))
rs = np.random.RandomState(0)
idx = pd.Index(np.where(rs.rand(10) > 0.5, 'A', 'B'))
series1 = pd.Series(rs.rand(10), index=idx)
ms = md.Series(series1, index=idx, chunk_size=3).to_gpu().to_gpu()
r = ms.groupby(level=0).sum()
pd.testing.assert_series_equal(r.execute().fetch().to_pandas(),
series1.groupby(level=0).sum())
r = ms.groupby(level=0).kurt()
pd.testing.assert_series_equal(r.execute().fetch().to_pandas(),
series1.groupby(level=0).kurt())
r = ms.groupby(level=0).agg(['sum', 'var'])
pd.testing.assert_frame_equal(r.execute().fetch().to_pandas(),
series1.groupby(level=0).agg(['sum', 'var']))
def test_groupby_apply(setup):
df1 = pd.DataFrame({'a': [3, 4, 5, 3, 5, 4, 1, 2, 3],
'b': [1, 3, 4, 5, 6, 5, 4, 4, 4],
'c': list('aabaaddce')})
def apply_df(df, ret_series=False):
df = df.sort_index()
df.a += df.b
if len(df.index) > 0:
if not ret_series:
df = df.iloc[:-1, :]
else:
df = df.iloc[-1, :]
return df
def apply_series(s, truncate=True):
s = s.sort_index()
if truncate and len(s.index) > 0:
s = s.iloc[:-1]
return s
mdf = md.DataFrame(df1, chunk_size=3)
applied = mdf.groupby('b').apply(lambda df: None)
pd.testing.assert_frame_equal(applied.execute().fetch(),
df1.groupby('b').apply(lambda df: None))
applied = mdf.groupby('b').apply(apply_df)
pd.testing.assert_frame_equal(applied.execute().fetch().sort_index(),
df1.groupby('b').apply(apply_df).sort_index())
applied = mdf.groupby('b').apply(apply_df, ret_series=True)
pd.testing.assert_frame_equal(applied.execute().fetch().sort_index(),
df1.groupby('b').apply(apply_df, ret_series=True).sort_index())
applied = mdf.groupby('b').apply(lambda df: df.a, output_type='series')
pd.testing.assert_series_equal(applied.execute().fetch().sort_index(),
df1.groupby('b').apply(lambda df: df.a).sort_index())
applied = mdf.groupby('b').apply(lambda df: df.a.sum())
pd.testing.assert_series_equal(applied.execute().fetch().sort_index(),
df1.groupby('b').apply(lambda df: df.a.sum()).sort_index())
series1 = pd.Series([3, 4, 5, 3, 5, 4, 1, 2, 3])
ms1 = md.Series(series1, chunk_size=3)
applied = ms1.groupby(lambda x: x % 3).apply(lambda df: None)
pd.testing.assert_series_equal(applied.execute().fetch(),
series1.groupby(lambda x: x % 3).apply(lambda df: None))
applied = ms1.groupby(lambda x: x % 3).apply(apply_series)
pd.testing.assert_series_equal(applied.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 3).apply(apply_series).sort_index())
sindex2 = pd.MultiIndex.from_arrays([list(range(9)), list('ABCDEFGHI')])
series2 = pd.Series(list('CDECEDABC'), index=sindex2)
ms2 = md.Series(series2, chunk_size=3)
applied = ms2.groupby(lambda x: x[0] % 3).apply(apply_series)
pd.testing.assert_series_equal(applied.execute().fetch().sort_index(),
series2.groupby(lambda x: x[0] % 3).apply(apply_series).sort_index())
def test_groupby_transform(setup):
df1 = pd.DataFrame({
'a': [3, 4, 5, 3, 5, 4, 1, 2, 3],
'b': [1, 3, 4, 5, 6, 5, 4, 4, 4],
'c': list('aabaaddce'),
'd': [3, 4, 5, 3, 5, 4, 1, 2, 3],
'e': [1, 3, 4, 5, 6, 5, 4, 4, 4],
'f': list('aabaaddce'),
})
def transform_series(s, truncate=True):
s = s.sort_index()
if truncate and len(s.index) > 1:
s = s.iloc[:-1].reset_index(drop=True)
return s
mdf = md.DataFrame(df1, chunk_size=3)
r = mdf.groupby('b').transform(transform_series, truncate=False)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').transform(transform_series, truncate=False).sort_index())
if pd.__version__ != '1.1.0':
r = mdf.groupby('b').transform(['cummax', 'cumsum'], _call_agg=True)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').agg(['cummax', 'cumsum']).sort_index())
agg_list = ['cummax', 'cumsum']
r = mdf.groupby('b').transform(agg_list, _call_agg=True)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').agg(agg_list).sort_index())
agg_dict = OrderedDict([('d', 'cummax'), ('b', 'cumsum')])
r = mdf.groupby('b').transform(agg_dict, _call_agg=True)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').agg(agg_dict).sort_index())
agg_list = ['sum', lambda s: s.sum()]
r = mdf.groupby('b').transform(agg_list, _call_agg=True)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').agg(agg_list).sort_index())
series1 = pd.Series([3, 4, 5, 3, 5, 4, 1, 2, 3])
ms1 = md.Series(series1, chunk_size=3)
r = ms1.groupby(lambda x: x % 3).transform(lambda x: x + 1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 3).transform(lambda x: x + 1).sort_index())
r = ms1.groupby(lambda x: x % 3).transform('cummax', _call_agg=True)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 3).agg('cummax').sort_index())
agg_list = ['cummax', 'cumcount']
r = ms1.groupby(lambda x: x % 3).transform(agg_list, _call_agg=True)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 3).agg(agg_list).sort_index())
def test_groupby_cum(setup):
df1 = pd.DataFrame({'a': [3, 5, 2, 7, 1, 2, 4, 6, 2, 4],
'b': [8, 3, 4, 1, 8, 2, 2, 2, 2, 3],
'c': [1, 8, 8, 5, 3, 5, 0, 0, 5, 4]})
mdf = md.DataFrame(df1, chunk_size=3)
for fun in ['cummin', 'cummax', 'cumprod', 'cumsum']:
r1 = getattr(mdf.groupby('b'), fun)()
pd.testing.assert_frame_equal(r1.execute().fetch().sort_index(),
getattr(df1.groupby('b'), fun)().sort_index())
r2 = getattr(mdf.groupby('b'), fun)(axis=1)
pd.testing.assert_frame_equal(r2.execute().fetch().sort_index(),
getattr(df1.groupby('b'), fun)(axis=1).sort_index())
r3 = mdf.groupby('b').cumcount()
pd.testing.assert_series_equal(r3.execute().fetch().sort_index(),
df1.groupby('b').cumcount().sort_index())
series1 = pd.Series([3, 4, 5, 3, 5, 4, 1, 2, 3])
ms1 = md.Series(series1, chunk_size=3)
for fun in ['cummin', 'cummax', 'cumprod', 'cumsum', 'cumcount']:
r1 = getattr(ms1.groupby(lambda x: x % 2), fun)()
pd.testing.assert_series_equal(r1.execute().fetch().sort_index(),
getattr(series1.groupby(lambda x: x % 2), fun)().sort_index())
def test_groupby_head(setup):
df1 = pd.DataFrame({'a': [3, 5, 2, 7, 1, 2, 4, 6, 2, 4],
'b': [8, 3, 4, 1, 8, 2, 2, 2, 2, 3],
'c': [1, 8, 8, 5, 3, 5, 0, 0, 5, 4],
'd': [9, 7, 6, 3, 6, 3, 2, 1, 5, 8]})
# test single chunk
mdf = md.DataFrame(df1)
r = mdf.groupby('b').head(1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').head(1))
r = mdf.groupby('b').head(-1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').head(-1))
r = mdf.groupby('b')['a', 'c'].head(1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b')['a', 'c'].head(1))
# test multiple chunks
mdf = md.DataFrame(df1, chunk_size=3)
r = mdf.groupby('b').head(1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b').head(1))
# test head with selection
r = mdf.groupby('b')['a', 'd'].head(1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b')['a', 'd'].head(1))
r = mdf.groupby('b')['c', 'a', 'd'].head(1)
pd.testing.assert_frame_equal(r.execute().fetch().sort_index(),
df1.groupby('b')['c', 'a', 'd'].head(1))
r = mdf.groupby('b')['c'].head(1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
df1.groupby('b')['c'].head(1))
# test single chunk
series1 = pd.Series([3, 4, 5, 3, 5, 4, 1, 2, 3])
ms = md.Series(series1)
r = ms.groupby(lambda x: x % 2).head(1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 2).head(1))
r = ms.groupby(lambda x: x % 2).head(-1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 2).head(-1))
# test multiple chunk
ms = md.Series(series1, chunk_size=3)
r = ms.groupby(lambda x: x % 2).head(1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 2).head(1))
# test with special index
series1 = pd.Series([3, 4, 5, 3, 5, 4, 1, 2, 3], index=[4, 1, 2, 3, 5, 8, 6, 7, 9])
ms = md.Series(series1, chunk_size=3)
r = ms.groupby(lambda x: x % 2).head(1)
pd.testing.assert_series_equal(r.execute().fetch().sort_index(),
series1.groupby(lambda x: x % 2).head(1).sort_index())
def test_groupby_sample(setup):
rs = np.random.RandomState(0)
sample_count = 10
src_data_list = []
for b in range(5):
data_count = int(rs.randint(20, 100))
src_data_list.append(pd.DataFrame({
'a': rs.randint(0, 100, size=data_count),
'b': np.array([b] * data_count),
'c': rs.randint(0, 100, size=data_count),
'd': rs.randint(0, 100, size=data_count),
}))
df1 = pd.concat(src_data_list)
shuffle_idx = np.arange(len(df1))
rs.shuffle(shuffle_idx)
df1 = df1.iloc[shuffle_idx].reset_index(drop=True)
# test single chunk
mdf = md.DataFrame(df1)
r1 = mdf.groupby('b').sample(sample_count, random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b').sample(sample_count, random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_frame_equal(result1, result2)
assert not (result1.groupby('b').count() - sample_count).any()[0]
r1 = mdf.groupby('b').sample(sample_count, weights=df1['c'] / df1['c'].sum(), random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b').sample(sample_count, weights=df1['c'] / df1['c'].sum(), random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_frame_equal(result1, result2)
assert not (result1.groupby('b').count() - sample_count).any()[0]
r1 = mdf.groupby('b')[['b', 'c']].sample(sample_count, random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b')[['b', 'c']].sample(sample_count, random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_frame_equal(result1, result2)
assert len(result1.columns) == 2
assert not (result1.groupby('b').count() - sample_count).any()[0]
r1 = mdf.groupby('b').c.sample(sample_count, random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b').c.sample(sample_count, random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_series_equal(result1, result2)
r1 = mdf.groupby('b').c.sample(len(df1), random_state=rs)
result1 = r1.execute().fetch()
assert len(result1) == len(df1)
with pytest.raises(ValueError):
r1 = mdf.groupby('b').c.sample(len(df1), random_state=rs, errors='raises')
r1.execute().fetch()
# test multiple chunks
mdf = md.DataFrame(df1, chunk_size=47)
r1 = mdf.groupby('b').sample(sample_count, random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b').sample(sample_count, random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_frame_equal(result1, result2)
assert not (result1.groupby('b').count() - sample_count).any()[0]
r1 = mdf.groupby('b').sample(sample_count, weights=df1['c'] / df1['c'].sum(), random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b').sample(sample_count, weights=df1['c'] / df1['c'].sum(), random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_frame_equal(result1, result2)
assert not (result1.groupby('b').count() - sample_count).any()[0]
r1 = mdf.groupby('b')[['b', 'c']].sample(sample_count, random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b')[['b', 'c']].sample(sample_count, random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_frame_equal(result1, result2)
assert len(result1.columns) == 2
assert not (result1.groupby('b').count() - sample_count).any()[0]
r1 = mdf.groupby('b').c.sample(sample_count, random_state=rs)
result1 = r1.execute().fetch()
r2 = mdf.groupby('b').c.sample(sample_count, random_state=rs)
result2 = r2.execute().fetch()
pd.testing.assert_series_equal(result1, result2)
r1 = mdf.groupby('b').c.sample(len(df1), random_state=rs)
result1 = r1.execute().fetch()
assert len(result1) == len(df1)
with pytest.raises(ValueError):
r1 = mdf.groupby('b').c.sample(len(df1), random_state=rs, errors='raises')
r1.execute().fetch()
@pytest.mark.skipif(pa is None, reason='pyarrow not installed')
def test_groupby_agg_with_arrow_dtype(setup):
df1 = pd.DataFrame({'a': [1, 2, 1],
'b': ['a', 'b', 'a']})
mdf = md.DataFrame(df1)
mdf['b'] = mdf['b'].astype('Arrow[string]')
r = mdf.groupby('a').count()
result = r.execute().fetch()
expected = df1.groupby('a').count()
pd.testing.assert_frame_equal(result, expected)
r = mdf.groupby('b').count()
result = r.execute().fetch()
expected = df1.groupby('b').count()
pd.testing.assert_frame_equal(result, expected)
series1 = df1['b']
mseries = md.Series(series1).astype('Arrow[string]')
r = mseries.groupby(mseries).count()
result = r.execute().fetch()
expected = series1.groupby(series1).count()
pd.testing.assert_series_equal(result, expected)
series2 = series1.copy()
series2.index = pd.MultiIndex.from_tuples([(0, 1), (2, 3), (4, 5)])
mseries = md.Series(series2).astype('Arrow[string]')
r = mseries.groupby(mseries).count()
result = r.execute().fetch()
expected = series2.groupby(series2).count()
pd.testing.assert_series_equal(result, expected)
@pytest.mark.skipif(pa is None, reason='pyarrow not installed')
def test_groupby_apply_with_arrow_dtype(setup):
df1 = pd.DataFrame({'a': [1, 2, 1],
'b': ['a', 'b', 'a']})
mdf = md.DataFrame(df1)
mdf['b'] = mdf['b'].astype('Arrow[string]')
applied = mdf.groupby('b').apply(lambda df: df.a.sum())
result = applied.execute().fetch()
expected = df1.groupby('b').apply(lambda df: df.a.sum())
pd.testing.assert_series_equal(result, expected)
series1 = df1['b']
mseries = md.Series(series1).astype('Arrow[string]')
applied = mseries.groupby(mseries).apply(lambda s: s)
result = applied.execute().fetch()
expected = series1.groupby(series1).apply(lambda s: s)
pd.testing.assert_series_equal(arrow_array_to_objects(result), expected)
72e220879112a4764c0b49f5cab93b48a033c5b8 | 617 | py | Python | keycloak_api_client/factories.py | masterplandev/python-keycloak-api-client | 60406dace35a4c9b2a0fd149823c09fd993d5b8b | ["MIT"] | 1 | 2021-07-16T11:40:03.000Z | 2021-07-16T11:40:03.000Z | keycloak_api_client/factories.py | masterplandev/python-keycloak-api-client | 60406dace35a4c9b2a0fd149823c09fd993d5b8b | ["MIT"] | 2 | 2021-07-29T14:42:15.000Z | 2021-11-25T16:35:39.000Z | keycloak_api_client/factories.py | masterplandev/python-keycloak-api-client | 60406dace35a4c9b2a0fd149823c09fd993d5b8b | ["MIT"] | null | null | null
from keycloak_api_client.data_classes import ReadKeycloakUser
def read_keycloak_user_factory(user_endpoint_data: dict) -> ReadKeycloakUser:
return ReadKeycloakUser(
keycloak_id=UUID(user_endpoint_data.get('id')),
username=user_endpoint_data.get('username'),
first_name=user_endpoint_data.get('firstName'),
last_name=user_endpoint_data.get('lastName'),
email=user_endpoint_data.get('email'),
enabled=user_endpoint_data.get('enabled'),
email_verified=user_endpoint_data.get('emailVerified'),
raw_data=user_endpoint_data
)
| 36.294118 | 77 | 0.740681 | 76 | 617 | 5.631579 | 0.381579 | 0.252336 | 0.336449 | 0.310748 | 0.107477 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.160454 | 617 | 16 | 78 | 38.5625 | 0.826255 | 0 | 0 | 0 | 0 | 0 | 0.084279 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.076923 | false | 0 | 0.153846 | 0.076923 | 0.307692 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
72e26587ba2fbf03ac2f85f3f88633ca9b9d6978 | 217 | py | Python | app/routes/main.py | curefatih/griffin | 8560841038303e3e314b2d6e68c3a8ddaa0432e1 | ["Apache-2.0"] | 16 | 2021-01-02T19:14:11.000Z | 2022-01-19T19:30:16.000Z | app/routes/main.py | curefatih/griffin | 8560841038303e3e314b2d6e68c3a8ddaa0432e1 | ["Apache-2.0"] | null | null | null | app/routes/main.py | curefatih/griffin | 8560841038303e3e314b2d6e68c3a8ddaa0432e1 | ["Apache-2.0"] | 2 | 2021-03-02T02:33:28.000Z | 2021-06-09T06:15:39.000Z
main = Blueprint('main', __name__, url_prefix='/')
@main.route('/')
@main.route('/index')
def index():
return render_template('views/main/index.html', title='Home')
| 21.7 | 65 | 0.705069 | 28 | 217 | 5.214286 | 0.642857 | 0.191781 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.110599 | 217 | 9 | 66 | 24.111111 | 0.756477 | 0 | 0 | 0 | 0 | 0 | 0.170507 | 0.096774 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.166667 | 0.166667 | 0.5 | 0.333333 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
72e52fd9e37a346c31e8622a904c0b0d63f9b300 | 11933 | py | Python | sphinx_automodapi/tests/test_automodapi.py | astropy/sphinx-automodapi | 3c9c885da71f682a28ea60e2d07350c6c7829ef2 | ["BSD-3-Clause"] | 47 | 2016-11-06T23:23:37.000Z | 2021-12-30T09:11:50.000Z | sphinx_automodapi/tests/test_automodapi.py | astropy/sphinx-automodapi | 3c9c885da71f682a28ea60e2d07350c6c7829ef2 | ["BSD-3-Clause"] | 121 | 2016-11-10T14:35:04.000Z | 2022-03-29T20:45:40.000Z | sphinx_automodapi/tests/test_automodapi.py | astropy/sphinx-automodapi | 3c9c885da71f682a28ea60e2d07350c6c7829ef2 | ["BSD-3-Clause"] | 39 | 2016-11-10T10:27:42.000Z | 2022-03-02T09:17:00.000Z
# Licensed under a 3-clause BSD style license - see LICENSE.rst
import sys
from copy import copy
import pytest
from docutils.parsers.rst import directives, roles
from . import cython_testpackage # noqa
from .helpers import run_sphinx_in_tmpdir
if sys.version_info[0] == 2:
from io import open as io_open
else:
io_open = open
def setup_function(func):
# This can be replaced with the docutils_namespace context manager once
# it is in a stable release of Sphinx
func._directives = copy(directives._directives)
func._roles = copy(roles._roles)
def teardown_function(func):
directives._directives = func._directives
roles._roles = func._roles
am_replacer_str = """
This comes before
.. automodapi:: sphinx_automodapi.tests.example_module.mixed
{options}
This comes after
"""
am_replacer_basic_expected = """
This comes before
sphinx_automodapi.tests.example_module.mixed Module
---------------------------------------------------
.. automodule:: sphinx_automodapi.tests.example_module.mixed
Functions
^^^^^^^^^
.. automodsumm:: sphinx_automodapi.tests.example_module.mixed
:functions-only:
:toctree: api
Classes
^^^^^^^
.. automodsumm:: sphinx_automodapi.tests.example_module.mixed
:classes-only:
:toctree: api
Class Inheritance Diagram
^^^^^^^^^^^^^^^^^^^^^^^^^
.. automod-diagram:: sphinx_automodapi.tests.example_module.mixed
:private-bases:
:parts: 1
This comes after
"""
def test_am_replacer_basic(tmpdir):
"""
Tests replacing an ".. automodapi::" with the automodapi no-option
template
"""
with open(tmpdir.join('index.rst').strpath, 'w') as f:
f.write(am_replacer_str.format(options=''))
run_sphinx_in_tmpdir(tmpdir)
with open(tmpdir.join('index.rst.automodapi').strpath) as f:
result = f.read()
assert result == am_replacer_basic_expected
am_replacer_repr_str = u"""
This comes before with spéciàl çhars
.. automodapi:: sphinx_automodapi.tests.example_module.mixed
{options}
This comes after
"""
@pytest.mark.parametrize('writereprocessed', [False, True])
def test_am_replacer_writereprocessed(tmpdir, writereprocessed):
"""
Tests the automodapi_writereprocessed option
"""
with io_open(tmpdir.join('index.rst').strpath, 'w', encoding='utf-8') as f:
f.write(am_replacer_repr_str.format(options=''))
run_sphinx_in_tmpdir(tmpdir, additional_conf={'automodapi_writereprocessed': writereprocessed})
assert tmpdir.join('index.rst.automodapi').isfile() is writereprocessed
am_replacer_noinh_expected = """
This comes before
sphinx_automodapi.tests.example_module.mixed Module
---------------------------------------------------
.. automodule:: sphinx_automodapi.tests.example_module.mixed
Functions
^^^^^^^^^
.. automodsumm:: sphinx_automodapi.tests.example_module.mixed
:functions-only:
:toctree: api
Classes
^^^^^^^
.. automodsumm:: sphinx_automodapi.tests.example_module.mixed
:classes-only:
:toctree: api
This comes after
""".format(empty='')
def test_am_replacer_noinh(tmpdir):
"""
Tests replacing an ".. automodapi::" with no-inheritance-diagram
option
"""
ops = ['', ':no-inheritance-diagram:']
ostr = '\n '.join(ops)
with open(tmpdir.join('index.rst').strpath, 'w') as f:
f.write(am_replacer_str.format(options=ostr))
run_sphinx_in_tmpdir(tmpdir)
with open(tmpdir.join('index.rst.automodapi').strpath) as f:
result = f.read()
assert result == am_replacer_noinh_expected
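The `ostr = '\n    '.join(ops)` idiom used in these tests builds an indented option block that slots into the `{options}` placeholder of the template string. A standalone sketch of the expansion, with the template trimmed to the relevant part:

```python
am_template = """\
.. automodapi:: sphinx_automodapi.tests.example_module.mixed
    {options}
"""

# An empty first element makes the join emit a leading newline-plus-indent,
# so every option line ends up indented under the directive as reST requires.
ops = ['', ':no-inheritance-diagram:', ':no-main-docstr:']
ostr = '\n    '.join(ops)
rendered = am_template.format(options=ostr)
print(rendered)
```

Each additional entry in `ops` simply becomes one more indented option line under the directive.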
am_replacer_titleandhdrs_expected = """
This comes before
sphinx_automodapi.tests.example_module.mixed Module
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
.. automodule:: sphinx_automodapi.tests.example_module.mixed
Functions
*********
.. automodsumm:: sphinx_automodapi.tests.example_module.mixed
:functions-only:
:toctree: api
Classes
*******
.. automodsumm:: sphinx_automodapi.tests.example_module.mixed
:classes-only:
:toctree: api
Class Inheritance Diagram
*************************
.. automod-diagram:: sphinx_automodapi.tests.example_module.mixed
:private-bases:
:parts: 1
This comes after
"""
def test_am_replacer_titleandhdrs(tmpdir):
"""
Tests replacing an ".. automodapi::" entry with title-setting and header
character options.
"""
ops = ['', ':headings: &*']
ostr = '\n '.join(ops)
with open(tmpdir.join('index.rst').strpath, 'w') as f:
f.write(am_replacer_str.format(options=ostr))
run_sphinx_in_tmpdir(tmpdir)
with open(tmpdir.join('index.rst.automodapi').strpath) as f:
result = f.read()
assert result == am_replacer_titleandhdrs_expected
def test_am_replacer_titleandhdrs_invalid(tmpdir, capsys):
"""
Tests replacing an ".. automodapi::" entry with title-setting and header
character options.
"""
ops = ['', ':headings: &']
ostr = '\n '.join(ops)
with open(tmpdir.join('index.rst').strpath, 'w') as f:
f.write(am_replacer_str.format(options=ostr))
run_sphinx_in_tmpdir(tmpdir, expect_error=True)
stdout, stderr = capsys.readouterr()
assert "Not enough headings (got 1, need 2), using default -^" in stderr
am_replacer_nomain_str = """
This comes before

.. automodapi:: sphinx_automodapi.tests.example_module.functions
    :no-main-docstr:

This comes after
"""

am_replacer_nomain_expected = """
This comes before

sphinx_automodapi.tests.example_module.functions Module
-------------------------------------------------------

Functions
^^^^^^^^^

.. automodsumm:: sphinx_automodapi.tests.example_module.functions
    :functions-only:
    :toctree: api

This comes after
""".format(empty='')


def test_am_replacer_nomain(tmpdir):
    """
    Tests replacing an ".. automodapi::" with the ":no-main-docstr:" option.
    """
    with open(tmpdir.join('index.rst').strpath, 'w') as f:
        f.write(am_replacer_nomain_str.format(options=''))

    run_sphinx_in_tmpdir(tmpdir)

    with open(tmpdir.join('index.rst.automodapi').strpath) as f:
        result = f.read()
    assert result == am_replacer_nomain_expected
am_replacer_skip_str = """
This comes before

.. automodapi:: sphinx_automodapi.tests.example_module.functions
    :skip: add
    :skip: subtract

This comes after
"""

am_replacer_skip_expected = """
This comes before

sphinx_automodapi.tests.example_module.functions Module
-------------------------------------------------------

.. automodule:: sphinx_automodapi.tests.example_module.functions

Functions
^^^^^^^^^

.. automodsumm:: sphinx_automodapi.tests.example_module.functions
    :functions-only:
    :toctree: api
    :skip: add,subtract

This comes after
""".format(empty='')


def test_am_replacer_skip(tmpdir):
    """
    Tests using the ":skip:" option in an ".. automodapi::".
    """
    with open(tmpdir.join('index.rst').strpath, 'w') as f:
        f.write(am_replacer_skip_str.format(options=''))

    run_sphinx_in_tmpdir(tmpdir)

    with open(tmpdir.join('index.rst.automodapi').strpath) as f:
        result = f.read()
    assert result == am_replacer_skip_expected
am_replacer_skip_stdlib_str = """
This comes before

.. automodapi:: sphinx_automodapi.tests.example_module.stdlib
    :skip: time
    :skip: Path

This comes after
"""

am_replacer_skip_stdlib_expected = """
This comes before

sphinx_automodapi.tests.example_module.stdlib Module
----------------------------------------------------

.. automodule:: sphinx_automodapi.tests.example_module.stdlib

Functions
^^^^^^^^^

.. automodsumm:: sphinx_automodapi.tests.example_module.stdlib
    :functions-only:
    :toctree: api
    :skip: time,Path

This comes after
""".format(empty='')


def test_am_replacer_skip_stdlib(tmpdir):
    """
    Tests using the ":skip:" option in an ".. automodapi::"
    that skips objects imported from the standard library.
    This is a regression test for issue #141.
    """
    with open(tmpdir.join('index.rst').strpath, 'w') as f:
        f.write(am_replacer_skip_stdlib_str.format(options=''))

    run_sphinx_in_tmpdir(tmpdir)

    with open(tmpdir.join('index.rst.automodapi').strpath) as f:
        result = f.read()
    assert result == am_replacer_skip_stdlib_expected
am_replacer_include_stdlib_str = """
This comes before

.. automodapi:: sphinx_automodapi.tests.example_module.stdlib
    :include: add
    :allowed-package-names: pathlib, datetime, sphinx_automodapi

This comes after
"""

am_replacer_include_stdlib_expected = """
This comes before

sphinx_automodapi.tests.example_module.stdlib Module
----------------------------------------------------

.. automodule:: sphinx_automodapi.tests.example_module.stdlib

Functions
^^^^^^^^^

.. automodsumm:: sphinx_automodapi.tests.example_module.stdlib
    :functions-only:
    :toctree: api
    :skip: Path,time
    :allowed-package-names: pathlib,datetime,sphinx_automodapi

This comes after
""".format(empty='')


def test_am_replacer_include_stdlib(tmpdir):
    """
    Tests using the ":include:" option in an ".. automodapi::"
    in the presence of objects imported from the standard library.
    """
    with open(tmpdir.join('index.rst').strpath, 'w') as f:
        f.write(am_replacer_include_stdlib_str.format(options=''))

    run_sphinx_in_tmpdir(tmpdir)

    with open(tmpdir.join('index.rst.automodapi').strpath) as f:
        result = f.read()
    assert result == am_replacer_include_stdlib_expected
am_replacer_include_str = """
This comes before

.. automodapi:: sphinx_automodapi.tests.example_module.functions
    :include: add
    :include: subtract

This comes after
"""

am_replacer_include_expected = """
This comes before

sphinx_automodapi.tests.example_module.functions Module
-------------------------------------------------------

.. automodule:: sphinx_automodapi.tests.example_module.functions

Functions
^^^^^^^^^

.. automodsumm:: sphinx_automodapi.tests.example_module.functions
    :functions-only:
    :toctree: api
    :skip: multiply

This comes after
""".format(empty='')


def test_am_replacer_include(tmpdir):
    """
    Tests using the ":include:" option in an ".. automodapi::".
    """
    with open(tmpdir.join('index.rst').strpath, 'w') as f:
        f.write(am_replacer_include_str.format(options=''))

    run_sphinx_in_tmpdir(tmpdir)

    with open(tmpdir.join('index.rst.automodapi').strpath) as f:
        result = f.read()
    assert result == am_replacer_include_expected
am_replacer_invalidop_str = """
This comes before

.. automodapi:: sphinx_automodapi.tests.example_module.functions
    :invalid-option:

This comes after
"""


def test_am_replacer_invalidop(tmpdir, capsys):
    """
    Tests that a sphinx warning is produced with an invalid option.
    """
    with open(tmpdir.join('index.rst').strpath, 'w') as f:
        f.write(am_replacer_invalidop_str.format(options=''))

    run_sphinx_in_tmpdir(tmpdir, expect_error=True)

    stdout, stderr = capsys.readouterr()
    assert "Found additional options invalid-option in automodapi." in stderr
am_replacer_cython_str = """
This comes before

.. automodapi:: apyhtest_eva.unit02
    {options}

This comes after
"""

am_replacer_cython_expected = """
This comes before

apyhtest_eva.unit02 Module
--------------------------

.. automodule:: apyhtest_eva.unit02

Functions
^^^^^^^^^

.. automodsumm:: apyhtest_eva.unit02
    :functions-only:
    :toctree: api

This comes after
""".format(empty='')


def test_am_replacer_cython(tmpdir, cython_testpackage):  # noqa
    """
    Tests replacing an ".. automodapi::" for a Cython module.
    """
    with open(tmpdir.join('index.rst').strpath, 'w') as f:
        f.write(am_replacer_cython_str.format(options=''))

    run_sphinx_in_tmpdir(tmpdir)

    with open(tmpdir.join('index.rst.automodapi').strpath) as f:
        result = f.read()
    assert result == am_replacer_cython_expected
# ---------------------------------------------------------------------------
# Source: aiotdlib/api/types/formatted_text.py from jraylan/aiotdlib (MIT)
# ---------------------------------------------------------------------------
# =============================================================================== #
# #
# This file has been generated automatically!! Do not change this manually! #
# #
# =============================================================================== #
from __future__ import annotations
from pydantic import Field
from .text_entity import TextEntity
from ..base_object import BaseObject
class FormattedText(BaseObject):
    """
    A text with some entities

    :param text: The text
    :type text: :class:`str`

    :param entities: Entities contained in the text. Entities can be nested, but must not mutually intersect with each other. Pre, Code and PreCode entities can't contain other entities. Bold, Italic, Underline and Strikethrough entities can contain and to be contained in all other entities. All other entities can't contain each other
    :type entities: :class:`list[TextEntity]`
    """

    ID: str = Field("formattedText", alias="@type")
    text: str
    entities: list[TextEntity]

    @staticmethod
    def read(q: dict) -> FormattedText:
        return FormattedText.construct(**q)
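The `read` pattern above (construct a typed object from a `@type`-tagged dict) can be sketched with the standard library alone. `SimpleFormattedText` below is a hypothetical dataclass stand-in for the pydantic model, not part of aiotdlib:

```python
# Hypothetical stdlib-only sketch of reading a "@type"-tagged dict into a
# typed object, mirroring FormattedText.read above.
from dataclasses import dataclass, field


@dataclass
class SimpleFormattedText:
    text: str
    entities: list = field(default_factory=list)

    @staticmethod
    def read(q: dict) -> "SimpleFormattedText":
        # drop transport-level keys such as "@type" before constructing
        data = {k: v for k, v in q.items() if not k.startswith("@")}
        return SimpleFormattedText(**data)


msg = SimpleFormattedText.read({"@type": "formattedText", "text": "hi", "entities": []})
print(msg.text)  # hi
```

The real model additionally gets pydantic's validation; `construct(**q)` skips validation for speed, which this sketch mirrors by constructing directly.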
# ---------------------------------------------------------------------------
# Source: registrations/models.py from lhggomes/-meteorological-control (MIT)
# ---------------------------------------------------------------------------
from django.db import models
class Target(models.Model):
    name = models.CharField(max_length=200)
    latitude = models.CharField(max_length=50)
    longitude = models.CharField(max_length=50)
    exp_date = models.DateField()

    def __str__(self):
        return f'{self.name} - @{self.longitude}, {self.latitude}z'


# Relationship 1-N
class Map(models.Model):
    name = models.CharField(max_length=200)
    # on_delete is required for ForeignKey since Django 2.0
    location = models.ForeignKey(Target, on_delete=models.CASCADE)


# Relationship N-N
class Address(models.Model):
    name = models.CharField(max_length=200)
    maps = models.ManyToManyField(Map)


# Relationship 1-1
class Person(models.Model):
    name = models.CharField(max_length=200)
    map = models.OneToOneField(
        Map,
        on_delete=models.CASCADE,
        primary_key=True,
    )
# ---------------------------------------------------------------------------
# Source: release/stubs.min/Autodesk/Revit/DB/__init___parts/ConnectionValidationInfo.py
# from htlcnn/ironpython-stubs (MIT)
# ---------------------------------------------------------------------------
class ConnectionValidationInfo(object, IDisposable):
    """
    This object contains information about fabrication connection validations.

    ConnectionValidationInfo()
    """

    def Dispose(self):
        """ Dispose(self: ConnectionValidationInfo) """
        pass

    def GetWarning(self, index):
        """
        GetWarning(self: ConnectionValidationInfo,index: int) -> ConnectionValidationWarning

        Access specific warning number of warnings generated by reload.
        """
        pass

    def IsValidWarningIndex(self, index):
        """
        IsValidWarningIndex(self: ConnectionValidationInfo,index: int) -> bool

        Validate warning index.
        """
        pass

    def ManyWarnings(self):
        """
        ManyWarnings(self: ConnectionValidationInfo) -> int

        Returns number of warnings generated by reload.
        """
        pass

    def ReleaseUnmanagedResources(self, *args):
        """ ReleaseUnmanagedResources(self: ConnectionValidationInfo,disposing: bool) """
        pass

    def __enter__(self, *args):
        """ __enter__(self: IDisposable) -> object """
        pass

    def __exit__(self, *args):
        """ __exit__(self: IDisposable,exc_type: object,exc_value: object,exc_back: object) """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    def __repr__(self, *args):
        """ __repr__(self: object) -> str """
        pass

    IsValidObject = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Specifies whether the .NET object represents a valid Revit entity.

    Get: IsValidObject(self: ConnectionValidationInfo) -> bool
    """
# ---------------------------------------------------------------------------
# Source: src/Utilities/deleteAllLinks.py from boydhont/Dynamo-BIM-Revit-API (MIT)
# ---------------------------------------------------------------------------
import clr
clr.AddReference("RevitAPI")
from Autodesk.Revit.DB import *
clr.AddReference("RevitServices")
from RevitServices.Persistence import DocumentManager
clr.AddReference("RevitNodes")
import Revit
clr.ImportExtensions(Revit.Elements)
def getAllRevitLinkInstance():
    filter = ElementClassFilter(RevitLinkInstance)
    collector = FilteredElementCollector(DocumentManager.Instance.CurrentDBDocument)
    return collector.WherePasses(filter).WhereElementIsNotElementType().ToElements()


def getAllRevitLinkType():
    filter = ElementClassFilter(RevitLinkType)
    collector = FilteredElementCollector(DocumentManager.Instance.CurrentDBDocument)
    return collector.WherePasses(filter).WhereElementIsNotElementType().ToElements()


def getAllImportInstance():
    filter = ElementClassFilter(ImportInstance)
    collector = FilteredElementCollector(DocumentManager.Instance.CurrentDBDocument)
    return collector.WherePasses(filter).WhereElementIsNotElementType().ToElements()


def getAllRasterImages():
    filter = ElementCategoryFilter(BuiltInCategory.OST_RasterImages)
    collector = FilteredElementCollector(DocumentManager.Instance.CurrentDBDocument)
    return collector.WherePasses(filter).WhereElementIsNotElementType().ToElements()


def getAllImageInstance():
    filter = ElementClassFilter(ImageInstance)
    collector = FilteredElementCollector(DocumentManager.Instance.CurrentDBDocument)
    return collector.WherePasses(filter).WhereElementIsNotElementType().ToElements()


def getAllImageType():
    filter = ElementClassFilter(ImageType)
    collector = FilteredElementCollector(DocumentManager.Instance.CurrentDBDocument)
    return collector.WherePasses(filter).WhereElementIsNotElementType().ToElements()


def getElementsToDelete(elements):
    elementsToDelete = []
    for element in elements:
        elementsToDelete.append(element.Id)
    return elementsToDelete


def flatten(t):
    return [item for sublist in t for item in sublist]


def deleteElements(elements):
    trans = Transaction(DocumentManager.Instance.CurrentDBDocument)
    trans.Start("Delete links and images")
    for id in elements:
        try:
            DocumentManager.Instance.CurrentDBDocument.Delete(id)
        except:
            continue
    trans.Commit()
    return str(elements.Count)
views = flatten([getAllRevitLinkInstance(), getAllRevitLinkType(), getAllImportInstance(), getAllRasterImages(), getAllImageInstance(), getAllImageType()])
OUT = deleteElements(getElementsToDelete(views))

# ---------------------------------------------------------------------------
# Source: networking_bagpipe/bagpipe_bgp/engine/ipvpn.py
# from oferby/networking-bagpipe (Apache-2.0)
# ---------------------------------------------------------------------------
# vim: tabstop=4 shiftwidth=4 softtabstop=4
# encoding: utf-8
# Copyright 2014 Orange
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from networking_bagpipe.bagpipe_bgp.engine import exa
def prefix_to_packed_ip_mask(prefix):
    ip_string, mask = prefix.split("/")
    return (exa.IP.pton(ip_string), int(mask))


@exa.NLRI.register(exa.AFI.ipv4, exa.SAFI.mpls_vpn, force=True)
@exa.NLRI.register(exa.AFI.ipv6, exa.SAFI.mpls_vpn, force=True)
class IPVPN(exa.IPVPN):
    # two NLRIs with same RD and prefix, but different labels need to
    # be equal and have the same hash
    def __eq__(self, other):
        return self.rd == other.rd and self.cidr == other.cidr

    def __hash__(self):
        return hash((self.rd, self.cidr._packed))


def IPVPNRouteFactory(afi, prefix, label, rd, nexthop):
    packed_prefix, mask = prefix_to_packed_ip_mask(prefix)

    return IPVPN.new(afi, exa.SAFI(exa.SAFI.mpls_vpn), packed_prefix, mask,
                     exa.Labels([label], True), rd, nexthop)
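`prefix_to_packed_ip_mask` relies on exabgp's `IP.pton`; the same split-and-pack step can be shown with the standard library alone. The demo function below is illustrative (IPv4 only), not part of bagpipe-bgp:

```python
import socket


def prefix_to_packed_ip_mask_demo(prefix):
    # same shape as the helper above, but using socket.inet_aton
    # instead of exabgp's IP.pton
    ip_string, mask = prefix.split("/")
    return socket.inet_aton(ip_string), int(mask)


packed, mask = prefix_to_packed_ip_mask_demo("10.1.2.0/24")
print(packed, mask)  # b'\n\x01\x02\x00' 24
```

The packed form is the big-endian byte representation of the network address, which is what the NLRI encoding ultimately carries on the wire alongside the integer mask length.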
# ---------------------------------------------------------------------------
# Source: phonon/client.py from buzzfeed/Phonon (MIT)
# ---------------------------------------------------------------------------
import redis
import zlib
from phonon import get_logger
logger = get_logger(__name__)
connection = None
class ShardedClient(object):

    def __init__(self, hosts=None, port=6379, db=0):
        self.hosts = sorted(hosts)
        self.clients = [redis.StrictRedis(host=host, port=port, db=db)
                        for host in self.hosts]

    def route(self, key):
        return self.clients[(zlib.crc32(key) & 0xffffffff) % len(self.clients)]

    def __flushall(self):
        return all([client.flushall() for client in self.clients])

    def __flushdb(self):
        return all([client.flushdb() for client in self.clients])

    def __getattr__(self, method):
        if method == 'flushall':
            return self.__flushall
        if method == 'flushdb':
            return self.__flushdb

        def wrap(*args, **kwargs):
            if not args:
                return [getattr(client, method)(*args, **kwargs)
                        for client in self.clients]
            client = self.route(args[0])
            return getattr(client, method)(*args, **kwargs)
        return wrap

    def using_key(self, key):
        return self.route(key)
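The routing rule in `route` can be exercised without any Redis connection. The helper below is a standalone mirror of that one line, not part of phonon:

```python
import zlib


def shard_index(key: bytes, n_shards: int) -> int:
    # mirror of ShardedClient.route: CRC32 masked to an unsigned 32-bit
    # value, then modulo the number of shards
    return (zlib.crc32(key) & 0xffffffff) % n_shards


# the same key always routes to the same shard
print(shard_index(b"user:42", 3) == shard_index(b"user:42", 3))  # True
```

The `& 0xffffffff` mask is what made this stable across Python 2, where `zlib.crc32` could return a signed (negative) value; masking first guarantees the modulo always sees a non-negative number.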
# ---------------------------------------------------------------------------
# Source: release/stubs.min/Grasshopper/Kernel/Undo/__init__.py
# from YKato521/ironpython-stubs (MIT)
# ---------------------------------------------------------------------------
# encoding: utf-8
# module Grasshopper.Kernel.Undo calls itself Undo
# from Grasshopper,Version=1.0.0.20,Culture=neutral,PublicKeyToken=dda4f5ec2cd80803
# by generator 1.145
""" NamespaceTracker represent a CLS namespace. """
# no imports
# no functions
# classes
class GH_UndoAction(object, IGH_UndoAction, GH_ISerializable):
    # no doc
    def Internal_Redo(self, *args):
        """ Internal_Redo(self: GH_UndoAction,doc: GH_Document) """
        pass

    def Internal_Undo(self, *args):
        """ Internal_Undo(self: GH_UndoAction,doc: GH_Document) """
        pass

    def Read(self, reader):
        """ Read(self: GH_UndoAction,reader: GH_IReader) -> bool """
        pass

    def Redo(self, doc):
        """ Redo(self: GH_UndoAction,doc: GH_Document) """
        pass

    def Undo(self, doc):
        """ Undo(self: GH_UndoAction,doc: GH_Document) """
        pass

    def Write(self, writer):
        """ Write(self: GH_UndoAction,writer: GH_IWriter) -> bool """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    def __repr__(self, *args):
        """ __repr__(self: object) -> str """
        pass

    ExpiresDisplay = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ExpiresDisplay(self: GH_UndoAction) -> bool
    """

    ExpiresSolution = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ExpiresSolution(self: GH_UndoAction) -> bool
    """

    State = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: State(self: GH_UndoAction) -> GH_UndoState
    """
class GH_ArchivedUndoAction(GH_UndoAction, IGH_UndoAction, GH_ISerializable):
    # no doc
    def Deserialize(self, *args):
        """ Deserialize(self: GH_ArchivedUndoAction,obj: GH_ISerializable) """
        pass

    def Internal_Redo(self, *args):
        """ Internal_Redo(self: GH_UndoAction,doc: GH_Document) """
        pass

    def Internal_Undo(self, *args):
        """ Internal_Undo(self: GH_UndoAction,doc: GH_Document) """
        pass

    def Read(self, reader):
        """ Read(self: GH_ArchivedUndoAction,reader: GH_IReader) -> bool """
        pass

    def Serialize(self, *args):
        """ Serialize(self: GH_ArchivedUndoAction,obj: GH_ISerializable) """
        pass

    def SerializeToByteArray(self, *args):
        """ SerializeToByteArray(self: GH_ArchivedUndoAction,obj: GH_ISerializable) -> Array[Byte] """
        pass

    def Write(self, writer):
        """ Write(self: GH_ArchivedUndoAction,writer: GH_IWriter) -> bool """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    m_data = None
class GH_ObjectUndoAction(GH_UndoAction, IGH_UndoAction, GH_ISerializable):
    # no doc
    def Internal_Redo(self, *args):
        """ Internal_Redo(self: GH_ObjectUndoAction,doc: GH_Document) """
        pass

    def Internal_Undo(self, *args):
        """ Internal_Undo(self: GH_ObjectUndoAction,doc: GH_Document) """
        pass

    def Object_Redo(self, *args):
        """ Object_Redo(self: GH_ObjectUndoAction,doc: GH_Document,obj: IGH_DocumentObject) """
        pass

    def Object_Undo(self, *args):
        """ Object_Undo(self: GH_ObjectUndoAction,doc: GH_Document,obj: IGH_DocumentObject) """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    @staticmethod
    def __new__(self, *args):  # cannot find CLR constructor
        """ __new__(cls: type,object_id: Guid) """
        pass
class GH_UndoException(Exception, ISerializable, _Exception):
    """
    GH_UndoException(message: str)
    GH_UndoException(message: str,*args: Array[object])
    """

    def add_SerializeObjectState(self, *args):
        """ add_SerializeObjectState(self: Exception,value: EventHandler[SafeSerializationEventArgs]) """
        pass

    def remove_SerializeObjectState(self, *args):
        """ remove_SerializeObjectState(self: Exception,value: EventHandler[SafeSerializationEventArgs]) """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    @staticmethod
    def __new__(self, message, args=None):
        """
        __new__(cls: type,message: str)
        __new__(cls: type,message: str,*args: Array[object])
        """
        pass

    def __str__(self, *args):
        pass
class GH_UndoRecord(object):
    """
    GH_UndoRecord()
    GH_UndoRecord(name: str)
    GH_UndoRecord(name: str,action: IGH_UndoAction)
    GH_UndoRecord(name: str,*actions: Array[IGH_UndoAction])
    GH_UndoRecord(name: str,actions: IEnumerable[IGH_UndoAction])
    """

    def AddAction(self, action):
        """ AddAction(self: GH_UndoRecord,action: IGH_UndoAction) """
        pass

    def Redo(self, doc):
        """ Redo(self: GH_UndoRecord,doc: GH_Document) """
        pass

    def Undo(self, doc):
        """ Undo(self: GH_UndoRecord,doc: GH_Document) """
        pass

    @staticmethod
    def __new__(self, name=None, *__args):
        """
        __new__(cls: type)
        __new__(cls: type,name: str)
        __new__(cls: type,name: str,action: IGH_UndoAction)
        __new__(cls: type,name: str,*actions: Array[IGH_UndoAction])
        __new__(cls: type,name: str,actions: IEnumerable[IGH_UndoAction])
        """
        pass

    ActionCount = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ActionCount(self: GH_UndoRecord) -> int
    """

    Actions = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Actions(self: GH_UndoRecord) -> IList[IGH_UndoAction]
    """

    ExpiresDisplay = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ExpiresDisplay(self: GH_UndoRecord) -> bool
    """

    ExpiresSolution = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ExpiresSolution(self: GH_UndoRecord) -> bool
    """

    Guid = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Guid(self: GH_UndoRecord) -> Guid
    """

    Name = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Name(self: GH_UndoRecord) -> str

    Set: Name(self: GH_UndoRecord)=value
    """

    State = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: State(self: GH_UndoRecord) -> GH_UndoState
    """

    Time = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: Time(self: GH_UndoRecord) -> DateTime
    """
class GH_UndoServer(object, IGH_DebugDescription):
    """ GH_UndoServer(nOwner: GH_Document) """

    def AppendToDebugLog(self, writer):
        """ AppendToDebugLog(self: GH_UndoServer,writer: GH_DebugDescriptionWriter) """
        pass

    def Clear(self):
        """ Clear(self: GH_UndoServer) """
        pass

    def ClearRedo(self):
        """ ClearRedo(self: GH_UndoServer) """
        pass

    def ClearUndo(self):
        """ ClearUndo(self: GH_UndoServer) """
        pass

    def PerformRedo(self):
        """ PerformRedo(self: GH_UndoServer) """
        pass

    def PerformUndo(self):
        """ PerformUndo(self: GH_UndoServer) """
        pass

    def PushUndoRecord(self, *__args):
        """
        PushUndoRecord(self: GH_UndoServer,name: str,action: GH_UndoAction) -> Guid
        PushUndoRecord(self: GH_UndoServer,record: GH_UndoRecord)
        """
        pass

    def RemoveRecord(self, id):
        """ RemoveRecord(self: GH_UndoServer,id: Guid) -> bool """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    @staticmethod
    def __new__(self, nOwner):
        """ __new__(cls: type,nOwner: GH_Document) """
        pass

    def __repr__(self, *args):
        """ __repr__(self: object) -> str """
        pass

    FirstRedoName = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: FirstRedoName(self: GH_UndoServer) -> str
    """

    FirstUndoName = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: FirstUndoName(self: GH_UndoServer) -> str
    """

    MaxRecords = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: MaxRecords(self: GH_UndoServer) -> int

    Set: MaxRecords(self: GH_UndoServer)=value
    """

    RedoCount = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: RedoCount(self: GH_UndoServer) -> int
    """

    RedoGuids = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: RedoGuids(self: GH_UndoServer) -> List[Guid]
    """

    RedoNames = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: RedoNames(self: GH_UndoServer) -> List[str]
    """

    UndoCount = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: UndoCount(self: GH_UndoServer) -> int
    """

    UndoGuids = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: UndoGuids(self: GH_UndoServer) -> List[Guid]
    """

    UndoNames = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: UndoNames(self: GH_UndoServer) -> List[str]
    """
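The stack semantics this stub surface implies (`PushUndoRecord`, `PerformUndo`/`PerformRedo`, the undo/redo counts, a `MaxRecords` cap) can be illustrated with a plain-Python toy. `MiniUndoServer` below is a hypothetical sketch of those semantics, not Grasshopper's implementation:

```python
# Toy undo/redo server: pushing a record clears the redo stack;
# undo moves the newest record onto it, redo moves it back.
class MiniUndoServer:
    def __init__(self, max_records=10):
        self.undo_stack, self.redo_stack = [], []
        self.max_records = max_records

    def push(self, name):
        self.undo_stack.append(name)
        # keep only the newest max_records entries
        del self.undo_stack[:-self.max_records]
        self.redo_stack.clear()

    def undo(self):
        if self.undo_stack:
            self.redo_stack.append(self.undo_stack.pop())

    def redo(self):
        if self.redo_stack:
            self.undo_stack.append(self.redo_stack.pop())


srv = MiniUndoServer()
srv.push("move")
srv.push("delete")
srv.undo()
print(srv.undo_stack, srv.redo_stack)  # ['move'] ['delete']
```

Clearing the redo stack on every push matches the usual editor behavior: once a new action is recorded after an undo, the previously undone branch is no longer reachable.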
class GH_UndoState(Enum, IComparable, IFormattable, IConvertible):
    """ enum GH_UndoState,values: redo (1),undo (0) """

    def __eq__(self, *args):
        """ x.__eq__(y) <==> x==y """
        pass

    def __format__(self, *args):
        """ __format__(formattable: IFormattable,format: str) -> str """
        pass

    def __ge__(self, *args):
        pass

    def __gt__(self, *args):
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    def __le__(self, *args):
        pass

    def __lt__(self, *args):
        pass

    def __ne__(self, *args):
        pass

    def __reduce_ex__(self, *args):
        pass

    def __str__(self, *args):
        pass

    redo = None
    undo = None
    value__ = None
class IGH_UndoAction(GH_ISerializable):
    # no doc
    def Redo(self, doc):
        """ Redo(self: IGH_UndoAction,doc: GH_Document) """
        pass

    def Undo(self, doc):
        """ Undo(self: IGH_UndoAction,doc: GH_Document) """
        pass

    def __init__(self, *args):
        """ x.__init__(...) initializes x; see x.__class__.__doc__ for signature """
        pass

    ExpiresDisplay = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ExpiresDisplay(self: IGH_UndoAction) -> bool
    """

    ExpiresSolution = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: ExpiresSolution(self: IGH_UndoAction) -> bool
    """

    State = property(lambda self: object(), lambda self, v: None, lambda self: None)
    """Get: State(self: IGH_UndoAction) -> GH_UndoState
    """
# variables with complex values
| 26.46875 | 221 | 0.609681 | 1,403 | 12,705 | 5.133286 | 0.109052 | 0.095807 | 0.057484 | 0.076645 | 0.677312 | 0.62677 | 0.598445 | 0.574146 | 0.502222 | 0.488337 | 0 | 0.002118 | 0.256749 | 12,705 | 479 | 222 | 26.524008 | 0.760563 | 0.35608 | 0 | 0.621302 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.325444 | false | 0.325444 | 0 | 0 | 0.532544 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
f4716969344974c31f9aa0564ffafe5913d549a9 | 2,873 | py | Python | pywick/functions/activations_jit.py | achaiah/pywick | 9d663faf0c1660a9b8359a6472c164f658dfc8cb | [
"MIT"
] | 408 | 2019-05-16T16:12:41.000Z | 2022-03-26T17:27:12.000Z | pywick/functions/activations_jit.py | achaiah/pywick | 9d663faf0c1660a9b8359a6472c164f658dfc8cb | [
"MIT"
] | 13 | 2019-05-17T05:47:06.000Z | 2021-06-21T19:02:30.000Z | pywick/functions/activations_jit.py | achaiah/pywick | 9d663faf0c1660a9b8359a6472c164f658dfc8cb | [
"MIT"
] | 42 | 2019-05-16T19:57:12.000Z | 2022-03-06T15:23:18.000Z | # Source: https://github.com/rwightman/gen-efficientnet-pytorch/blob/master/geffnet/activations/activations_jit.py (Apache 2.0)
import torch
from torch import nn
from torch.nn import functional as F
__all__ = ['swish_jit', 'SwishJit', 'mish_jit', 'MishJit']
#'hard_swish_jit', 'HardSwishJit', 'hard_sigmoid_jit', 'HardSigmoidJit']
@torch.jit.script
def swish_jit_fwd(x):
return x.mul(torch.sigmoid(x))
@torch.jit.script
def swish_jit_bwd(x, grad_output):
x_sigmoid = torch.sigmoid(x)
return grad_output * (x_sigmoid * (1 + x * (1 - x_sigmoid)))
class SwishJitAutoFn(torch.autograd.Function):
""" torch.jit.script optimised Swish
    Inspired by conversation btw Jeremy Howard & Adam Paszke
https://twitter.com/jeremyphoward/status/1188251041835315200
"""
@staticmethod
def forward(ctx, x):
ctx.save_for_backward(x)
return swish_jit_fwd(x)
@staticmethod
def backward(ctx, grad_output):
x = ctx.saved_tensors[0]
return swish_jit_bwd(x, grad_output)
def swish_jit(x, inplace=False):
# inplace ignored
return SwishJitAutoFn.apply(x)
class SwishJit(nn.Module):
def __init__(self, inplace: bool = False):
super(SwishJit, self).__init__()
self.inplace = inplace
@staticmethod
def forward(x):
return SwishJitAutoFn.apply(x)
@torch.jit.script
def mish_jit_fwd(x):
return x.mul(torch.tanh(F.softplus(x)))
@torch.jit.script
def mish_jit_bwd(x, grad_output):
x_sigmoid = torch.sigmoid(x)
x_tanh_sp = F.softplus(x).tanh()
return grad_output.mul(x_tanh_sp + x * x_sigmoid * (1 - x_tanh_sp * x_tanh_sp))
class MishJitAutoFn(torch.autograd.Function):
@staticmethod
def forward(ctx, x):
ctx.save_for_backward(x)
return mish_jit_fwd(x)
@staticmethod
def backward(ctx, grad_output):
x = ctx.saved_tensors[0]
return mish_jit_bwd(x, grad_output)
def mish_jit(x, inplace=False):
# inplace ignored
return MishJitAutoFn.apply(x)
class MishJit(nn.Module):
def __init__(self, inplace: bool = False):
super(MishJit, self).__init__()
self.inplace = inplace
@staticmethod
def forward(x):
return MishJitAutoFn.apply(x)
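The closed-form derivatives hard-coded in `swish_jit_bwd` and `mish_jit_bwd` can be sanity-checked against a central finite difference without importing torch; the sketch below re-implements both activations in plain `math` (the helper names are mine, not part of this module) and compares each analytic gradient to its numeric estimate:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def swish(x):
    return x * sigmoid(x)

def swish_grad(x):
    # same expression as swish_jit_bwd (with grad_output == 1)
    s = sigmoid(x)
    return s * (1 + x * (1 - s))

def mish(x):
    return x * math.tanh(math.log1p(math.exp(x)))

def mish_grad(x):
    # same expression as mish_jit_bwd (with grad_output == 1)
    t = math.tanh(math.log1p(math.exp(x)))
    return t + x * sigmoid(x) * (1 - t * t)

eps = 1e-6
for x in (-2.0, -0.5, 0.0, 1.0, 3.0):
    # central finite difference agrees with the closed form
    assert abs((swish(x + eps) - swish(x - eps)) / (2 * eps) - swish_grad(x)) < 1e-5
    assert abs((mish(x + eps) - mish(x - eps)) / (2 * eps) - mish_grad(x)) < 1e-5
```

Passing both checks confirms the algebra that the hand-written `backward` passes rely on.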
# @torch.jit.script
# def hard_swish_jit(x, inplace: bool = False):
# return x.mul(F.relu6(x + 3.).mul_(1./6.))
#
#
# class HardSwishJit(nn.Module):
# def __init__(self, inplace: bool = False):
# super(HardSwishJit, self).__init__()
#
# def forward(self, x):
# return hard_swish_jit(x)
#
#
# @torch.jit.script
# def hard_sigmoid_jit(x, inplace: bool = False):
# return F.relu6(x + 3.).mul(1./6.)
#
#
# class HardSigmoidJit(nn.Module):
# def __init__(self, inplace: bool = False):
# super(HardSigmoidJit, self).__init__()
#
# def forward(self, x):
# return hard_sigmoid_jit(x)
| 24.555556 | 127 | 0.671076 | 395 | 2,873 | 4.635443 | 0.21519 | 0.039323 | 0.053523 | 0.055707 | 0.540688 | 0.540142 | 0.459858 | 0.372474 | 0.316767 | 0.229383 | 0 | 0.014776 | 0.199095 | 2,873 | 116 | 128 | 24.767241 | 0.780965 | 0.33519 | 0 | 0.545455 | 0 | 0 | 0.017177 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.254545 | false | 0 | 0.054545 | 0.109091 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
f4734ff928d7a2ab000c1a6c8ee85768a4612d29 | 88 | py | Python | pysrc/labstreaminglayer_ros/Converter__init__.py | guthom/labstreaminglayer_ros | e987cdb85c8e860dbca3ab075054e080d8e2906c | [
"MIT"
] | null | null | null | pysrc/labstreaminglayer_ros/Converter__init__.py | guthom/labstreaminglayer_ros | e987cdb85c8e860dbca3ab075054e080d8e2906c | [
"MIT"
] | null | null | null | pysrc/labstreaminglayer_ros/Converter__init__.py | guthom/labstreaminglayer_ros | e987cdb85c8e860dbca3ab075054e080d8e2906c | [
"MIT"
] | null | null | null | __all__ = ['Transform', 'Float32', 'TransformStamped', 'ConverterBase', 'Bool', 'Image'] | 88 | 88 | 0.693182 | 7 | 88 | 8.142857 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.024691 | 0.079545 | 88 | 1 | 88 | 88 | 0.679012 | 0 | 0 | 0 | 0 | 0 | 0.606742 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
f47963e92f5f289a387006ec4f57b71920577a89 | 721 | py | Python | flatdata-py/generator/tree/nodes/resources/archive.py | tianchengli/flatdata | 03ccf3669254ddef171d23a287c643dcd28650d2 | [
"Apache-2.0"
] | null | null | null | flatdata-py/generator/tree/nodes/resources/archive.py | tianchengli/flatdata | 03ccf3669254ddef171d23a287c643dcd28650d2 | [
"Apache-2.0"
] | null | null | null | flatdata-py/generator/tree/nodes/resources/archive.py | tianchengli/flatdata | 03ccf3669254ddef171d23a287c643dcd28650d2 | [
"Apache-2.0"
] | null | null | null | from generator.tree.nodes.references import ArchiveReference
from .base import ResourceBase
class Archive(ResourceBase):
def __init__(self, name, properties=None, target=None):
super(Archive, self).__init__(name=name, properties=properties)
self._target = target
@staticmethod
def create(properties):
return Archive(name=properties.name,
properties=properties,
target=properties.type.archive.name)
@property
def target(self):
targets = self.children_like(ArchiveReference)
assert len(targets) == 1
return targets[0]
def create_references(self):
return [ArchiveReference(name=self._target)]
| 30.041667 | 71 | 0.668516 | 74 | 721 | 6.351351 | 0.418919 | 0.119149 | 0.102128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003663 | 0.242718 | 721 | 23 | 72 | 31.347826 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.055556 | 1 | 0.222222 | false | 0 | 0.111111 | 0.111111 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
f47968cf39246f0addadaaec9d9e73515321f3c0 | 1,832 | py | Python | backend/tokens/schema.py | karadalex/crypto-app | 2f1396200f0ea334fd0cea82c45d0df375665477 | [
"MIT"
] | null | null | null | backend/tokens/schema.py | karadalex/crypto-app | 2f1396200f0ea334fd0cea82c45d0df375665477 | [
"MIT"
] | null | null | null | backend/tokens/schema.py | karadalex/crypto-app | 2f1396200f0ea334fd0cea82c45d0df375665477 | [
"MIT"
] | null | null | null | import graphene
from graphene_django.types import DjangoObjectType
from .models import CryptoCurrency, FiatCurrency, PriceRecord
from .types import CryptoCurrencyType, FiatCurrencyType, PriceRecordType
class Query(graphene.ObjectType):
crypto_currencies = graphene.List(CryptoCurrencyType)
crypto_currency = graphene.Field(CryptoCurrencyType, symbol=graphene.String())
fiat_currencies = graphene.List(FiatCurrencyType)
fiat_currency = graphene.Field(FiatCurrencyType, symbol=graphene.String())
price_records = graphene.List(PriceRecordType, from_symbol=graphene.String(), to_symbol=graphene.String())
latest_price_record = graphene.Field(PriceRecordType, from_symbol=graphene.String(), to_symbol=graphene.String())
def resolve_crypto_currencies(self, info, **kwargs):
return CryptoCurrency.objects.all()
def resolve_crypto_currency(self, info, symbol):
return CryptoCurrency.objects.get(symbol=symbol.upper())
def resolve_fiat_currencies(self, info, **kwargs):
return FiatCurrency.objects.all()
def resolve_fiat_currency(self, info, symbol):
return FiatCurrency.objects.get(symbol=symbol.upper())
def resolve_price_records(self, info, from_symbol, to_symbol):
from_currency = FiatCurrency.objects.get(symbol=from_symbol.upper())
to_currency = CryptoCurrency.objects.get(symbol=to_symbol.upper())
return PriceRecord.objects.filter(from_id=from_currency.id, to_id=to_currency.id).all()
def resolve_latest_price_record(self, info, from_symbol, to_symbol):
from_currency = FiatCurrency.objects.get(symbol=from_symbol.upper())
to_currency = CryptoCurrency.objects.get(symbol=to_symbol.upper())
return PriceRecord.objects.filter(from_id=from_currency.id, to_id=to_currency.id).order_by('-updated_at').first()
# class Mutation(graphene.ObjectType):
# pass
schema = graphene.Schema(query=Query)
| 42.604651 | 115 | 0.808406 | 228 | 1,832 | 6.289474 | 0.214912 | 0.058577 | 0.083682 | 0.062762 | 0.496513 | 0.415621 | 0.415621 | 0.364017 | 0.364017 | 0.27894 | 0 | 0 | 0.081332 | 1,832 | 42 | 116 | 43.619048 | 0.85205 | 0.022926 | 0 | 0.142857 | 0 | 0 | 0.006156 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0 | 0.142857 | 0.142857 | 0.821429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
f47b743088089de5457cc2aadeaed34cf8754d91 | 117 | py | Python | redirect_file.py | DahlitzFlorian/capture-stdout-output-video-snippets | 0b0cbbe8238c77fdf62f26cf631be054213a42b9 | [
"MIT"
] | null | null | null | redirect_file.py | DahlitzFlorian/capture-stdout-output-video-snippets | 0b0cbbe8238c77fdf62f26cf631be054213a42b9 | [
"MIT"
] | null | null | null | redirect_file.py | DahlitzFlorian/capture-stdout-output-video-snippets | 0b0cbbe8238c77fdf62f26cf631be054213a42b9 | [
"MIT"
] | null | null | null | import contextlib
with open("help_output.txt", "w") as f:
with contextlib.redirect_stdout(f):
help(pow)
| 19.5 | 39 | 0.675214 | 17 | 117 | 4.529412 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.196581 | 117 | 5 | 40 | 23.4 | 0.819149 | 0 | 0 | 0 | 0 | 0 | 0.136752 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.25 | 0 | 0.25 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
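The same `contextlib.redirect_stdout` trick works with any writable file-like object, not just an open file; capturing into an in-memory buffer is a small companion sketch (not part of the original snippet) that is handy in tests:

```python
import contextlib
import io

buffer = io.StringIO()
with contextlib.redirect_stdout(buffer):
    print("captured")  # goes into the buffer, not the terminal

assert buffer.getvalue() == "captured\n"
```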
f482e5fec95a7d9fc49897148dfad44e3db0ad07 | 3,581 | py | Python | src/adafruit_blinka/microcontroller/snapdragon/apq8016/pin.py | tekktrik/Adafruit_Blinka | d094997a471301a2147d58f1693aa1d3a49df6cd | [
"MIT"
] | null | null | null | src/adafruit_blinka/microcontroller/snapdragon/apq8016/pin.py | tekktrik/Adafruit_Blinka | d094997a471301a2147d58f1693aa1d3a49df6cd | [
"MIT"
] | null | null | null | src/adafruit_blinka/microcontroller/snapdragon/apq8016/pin.py | tekktrik/Adafruit_Blinka | d094997a471301a2147d58f1693aa1d3a49df6cd | [
"MIT"
] | null | null | null | # SPDX-FileCopyrightText: 2021 Melissa LeBlanc-Williams for Adafruit Industries
#
# SPDX-License-Identifier: MIT
"""SnapDragon APQ8016 pin names"""
from adafruit_blinka.microcontroller.generic_linux.libgpiod_pin import Pin
GPIO_0 = Pin((0, 0))
GPIO_1 = Pin((0, 1))
GPIO_2 = Pin((0, 2))
GPIO_3 = Pin((0, 3))
GPIO_4 = Pin((0, 4))
GPIO_5 = Pin((0, 5))
GPIO_6 = Pin((0, 6))
GPIO_7 = Pin((0, 7))
GPIO_8 = Pin((0, 8))
GPIO_9 = Pin((0, 9))
GPIO_10 = Pin((0, 10))
GPIO_11 = Pin((0, 11))
GPIO_12 = Pin((0, 12))
GPIO_13 = Pin((0, 13))
GPIO_14 = Pin((0, 14))
GPIO_15 = Pin((0, 15))
GPIO_16 = Pin((0, 16))
GPIO_17 = Pin((0, 17))
GPIO_18 = Pin((0, 18))
GPIO_19 = Pin((0, 19))
GPIO_20 = Pin((0, 20))
GPIO_22 = Pin((0, 22))
GPIO_23 = Pin((0, 23))
GPIO_24 = Pin((0, 24))
GPIO_25 = Pin((0, 25))
GPIO_26 = Pin((0, 26))
GPIO_27 = Pin((0, 27))
GPIO_28 = Pin((0, 28))
GPIO_29 = Pin((0, 29))
GPIO_30 = Pin((0, 30))
GPIO_33 = Pin((0, 33))
GPIO_34 = Pin((0, 34))
GPIO_35 = Pin((0, 35))
GPIO_36 = Pin((0, 36))
GPIO_37 = Pin((0, 37))
GPIO_39 = Pin((0, 39))
GPIO_40 = Pin((0, 40))
GPIO_41 = Pin((0, 41))
GPIO_42 = Pin((0, 42))
GPIO_43 = Pin((0, 43))
GPIO_44 = Pin((0, 44))
GPIO_45 = Pin((0, 45))
GPIO_46 = Pin((0, 46))
GPIO_47 = Pin((0, 47))
GPIO_48 = Pin((0, 48))
GPIO_49 = Pin((0, 49))
GPIO_50 = Pin((0, 50))
GPIO_51 = Pin((0, 51))
GPIO_52 = Pin((0, 52))
GPIO_53 = Pin((0, 53))
GPIO_54 = Pin((0, 54))
GPIO_55 = Pin((0, 55))
GPIO_56 = Pin((0, 56))
GPIO_57 = Pin((0, 57))
GPIO_58 = Pin((0, 58))
GPIO_59 = Pin((0, 59))
GPIO_60 = Pin((0, 60))
GPIO_61 = Pin((0, 61))
GPIO_62 = Pin((0, 62))
GPIO_63 = Pin((0, 63))
GPIO_64 = Pin((0, 64))
GPIO_65 = Pin((0, 65))
GPIO_66 = Pin((0, 66))
GPIO_67 = Pin((0, 67))
GPIO_68 = Pin((0, 68))
GPIO_69 = Pin((0, 69))
GPIO_70 = Pin((0, 70))
GPIO_71 = Pin((0, 71))
GPIO_72 = Pin((0, 72))
GPIO_73 = Pin((0, 73))
GPIO_74 = Pin((0, 74))
GPIO_75 = Pin((0, 75))
GPIO_76 = Pin((0, 76))
GPIO_77 = Pin((0, 77))
GPIO_78 = Pin((0, 78))
GPIO_79 = Pin((0, 79))
GPIO_80 = Pin((0, 80))
GPIO_81 = Pin((0, 81))
GPIO_82 = Pin((0, 82))
GPIO_83 = Pin((0, 83))
GPIO_84 = Pin((0, 84))
GPIO_85 = Pin((0, 85))
GPIO_86 = Pin((0, 86))
GPIO_87 = Pin((0, 87))
GPIO_88 = Pin((0, 88))
GPIO_89 = Pin((0, 89))
GPIO_90 = Pin((0, 90))
GPIO_91 = Pin((0, 91))
GPIO_92 = Pin((0, 92))
GPIO_93 = Pin((0, 93))
GPIO_94 = Pin((0, 94))
GPIO_95 = Pin((0, 95))
GPIO_96 = Pin((0, 96))
GPIO_97 = Pin((0, 97))
GPIO_98 = Pin((0, 98))
GPIO_99 = Pin((0, 99))
GPIO_100 = Pin((0, 100))
GPIO_101 = Pin((0, 101))
GPIO_102 = Pin((0, 102))
GPIO_103 = Pin((0, 103))
GPIO_104 = Pin((0, 104))
GPIO_105 = Pin((0, 105))
GPIO_106 = Pin((0, 106))
GPIO_108 = Pin((0, 108))
GPIO_109 = Pin((0, 109))
GPIO_110 = Pin((0, 110))
GPIO_111 = Pin((0, 111))
GPIO_112 = Pin((0, 112))
GPIO_113 = Pin((0, 113))
GPIO_114 = Pin((0, 114))
GPIO_115 = Pin((0, 115))
GPIO_116 = Pin((0, 116))
GPIO_117 = Pin((0, 117))
GPIO_118 = Pin((0, 118))
GPIO_119 = Pin((0, 119))
PM_GPIO_0 = Pin((1, 0))
PM_GPIO_1 = Pin((1, 1))
PM_GPIO_2 = Pin((1, 2))
PM_GPIO_3 = Pin((1, 3))
PM_MPP_1 = Pin((2, 0))
PM_MPP_2 = Pin((2, 1))
PM_MPP_3 = Pin((2, 2))
PM_MPP_4 = Pin((2, 3))
I2C0_SDA = GPIO_6
I2C0_SCL = GPIO_7
I2C1_SDA = GPIO_22
I2C1_SCL = GPIO_23
UART0_TX = GPIO_0
UART0_RX = GPIO_1
UART1_TX = GPIO_4
UART1_RX = GPIO_5
SPI0_SCLK = GPIO_19
SPI0_MISO = GPIO_17
SPI0_MOSI = GPIO_16
SPI0_CS = GPIO_18
i2cPorts = (
(0, I2C0_SCL, I2C0_SDA),
(1, I2C1_SCL, I2C1_SDA),
)
# ordered as spiId, sckId, mosiId, misoId
spiPorts = ((0, SPI0_SCLK, SPI0_MOSI, SPI0_MISO),)
# ordered as uartId, txId, rxId
uartPorts = (
(0, UART0_TX, UART0_RX),
(1, UART1_TX, UART1_RX),
)
| 21.969325 | 79 | 0.602346 | 739 | 3,581 | 2.690122 | 0.224628 | 0.231388 | 0.008048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.229188 | 0.17146 | 3,581 | 162 | 80 | 22.104938 | 0.440849 | 0.057526 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.006897 | 0 | 0.006897 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
be2f71ab5c38ea0703ad73fc6fff8a0a2ce65914 | 94 | py | Python | data/studio21_generated/interview/0191/starter_code.py | vijaykumawat256/Prompt-Summarization | 614f5911e2acd2933440d909de2b4f86653dc214 | [
"Apache-2.0"
] | null | null | null | data/studio21_generated/interview/0191/starter_code.py | vijaykumawat256/Prompt-Summarization | 614f5911e2acd2933440d909de2b4f86653dc214 | [
"Apache-2.0"
] | null | null | null | data/studio21_generated/interview/0191/starter_code.py | vijaykumawat256/Prompt-Summarization | 614f5911e2acd2933440d909de2b4f86653dc214 | [
"Apache-2.0"
] | null | null | null | from typing import List
class Solution:
    def atMostNGivenDigitSet(self, digits: List[str], n: int) -> int:
        pass
| 31.333333 | 69 | 0.638298 | 11 | 94 | 5.454545 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.234043 | 94 | 3 | 70 | 31.333333 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
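For reference, one standard counting approach to this problem (LeetCode 902) compares `n` with candidate numbers digit by digit; the standalone function below is my own sketch, independent of the `Solution` starter class:

```python
def at_most_n_given_digit_set(digits, n):
    s = str(n)
    k, d = len(s), len(digits)
    # numbers with fewer digits than n: d choices per position
    total = sum(d ** i for i in range(1, k))
    # k-digit numbers, fixing an ever longer prefix of n
    for i, ch in enumerate(s):
        total += sum(c < ch for c in digits) * d ** (k - 1 - i)
        if ch not in digits:
            return total
    return total + 1  # every digit of n is available, so n itself counts

assert at_most_n_given_digit_set(["1", "3", "5", "7"], 100) == 20
```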
be3df20a41162c1fcd6bc6649e735fe27424e3ba | 1,917 | py | Python | app/sync/events.py | travcunn/kissync-python | a1c2a4d0619bcb93658007c2f240119b11964b9b | [
"MIT"
] | 3 | 2016-09-06T21:06:48.000Z | 2018-02-28T12:15:48.000Z | app/sync/events.py | travcunn/kissync-python | a1c2a4d0619bcb93658007c2f240119b11964b9b | [
"MIT"
] | null | null | null | app/sync/events.py | travcunn/kissync-python | a1c2a4d0619bcb93658007c2f240119b11964b9b | [
"MIT"
] | null | null | null | import datetime
class BaseEvent(object):
def __init__(self, path):
self.path = path
self.__timestamp = datetime.datetime.now()
@property
def timestamp(self):
""" Return a timestamp of when the event was created. """
return self.__timestamp
def __hash__(self):
return hash(self.__class__.__name__)
class LocalMovedEvent(BaseEvent):
def __init__(self, src, path):
self._src = src
super(LocalMovedEvent, self).__init__(path)
@property
def src(self):
return self._src
class LocalCreatedEvent(BaseEvent):
def __init__(self, path, isDir):
self._isDir = isDir
super(LocalCreatedEvent, self).__init__(path)
@property
def isDir(self):
return self._isDir
class LocalDeletedEvent(BaseEvent):
def __init__(self, path, isDir=False):
self._isDir = isDir
super(LocalDeletedEvent, self).__init__(path)
@property
def isDir(self):
return self._isDir
class LocalModifiedEvent(BaseEvent):
def __init__(self, path):
super(LocalModifiedEvent, self).__init__(path)
class RemoteMovedEvent(BaseEvent):
def __init__(self, src, path):
self._src = src
super(RemoteMovedEvent, self).__init__(path)
@property
def src(self):
return self._src
class RemoteCreatedEvent(BaseEvent):
def __init__(self, path, isDir):
self._isDir = isDir
super(RemoteCreatedEvent, self).__init__(path)
@property
def isDir(self):
return self._isDir
class RemoteDeletedEvent(BaseEvent):
def __init__(self, path, isDir=False):
self._isDir = isDir
super(RemoteDeletedEvent, self).__init__(path)
@property
def isDir(self):
return self._isDir
class RemoteModifiedEvent(BaseEvent):
def __init__(self, path):
super(RemoteModifiedEvent, self).__init__(path)
| 22.290698 | 65 | 0.659364 | 205 | 1,917 | 5.697561 | 0.165854 | 0.053938 | 0.08476 | 0.136986 | 0.554795 | 0.554795 | 0.505137 | 0.505137 | 0.505137 | 0.505137 | 0 | 0 | 0.242567 | 1,917 | 85 | 66 | 22.552941 | 0.804408 | 0.025561 | 0 | 0.586207 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.293103 | false | 0 | 0.017241 | 0.12069 | 0.603448 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
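One consequence of `BaseEvent.__hash__` hashing only the class name is that every instance of a given event class hashes identically, which matters if events are ever deduplicated in sets or dicts. The trimmed re-declaration below (a demonstration sketch, outside the original module) makes that visible:

```python
import datetime

class BaseEvent(object):
    """Trimmed copy of the class above, for demonstration only."""
    def __init__(self, path):
        self.path = path
        self.__timestamp = datetime.datetime.now()

    @property
    def timestamp(self):
        return self.__timestamp

    def __hash__(self):
        return hash(self.__class__.__name__)

a, b = BaseEvent("/tmp/a"), BaseEvent("/tmp/b")
assert hash(a) == hash(b)          # hash depends only on the class name
assert a.timestamp <= b.timestamp  # creation order is preserved
```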
be45cf1a32d732f84bafe0c04ea7cc2603f6b7eb | 1,901 | py | Python | ICM-20948_0_constants.py | bopopescu/ICM-20948-py2 | d23c5f754e00a4d9dad029d82565098b11745f87 | [
"MIT"
] | 1 | 2020-03-18T01:35:36.000Z | 2020-03-18T01:35:36.000Z | ICM-20948_0_constants.py | junhuanchen/ICM-20948-py2 | d23c5f754e00a4d9dad029d82565098b11745f87 | [
"MIT"
] | null | null | null | ICM-20948_0_constants.py | junhuanchen/ICM-20948-py2 | d23c5f754e00a4d9dad029d82565098b11745f87 | [
"MIT"
] | 2 | 2019-07-25T14:17:43.000Z | 2020-07-22T03:56:04.000Z | #!/usr/bin/env python2
# -*- coding: utf-8 -*-
"""ICM-20948: Low power 9-axis MotionTracking device that is ideally suited for Smartphones, Tablets, Wearable Sensors, and IoT applications."""
__author__ = "ChISL"
__copyright__ = "TBD"
__credits__ = ["TDK Invensense"]
__license__ = "TBD"
__version__ = "Version 0.1"
__maintainer__ = "https://chisl.io"
__email__ = "info@chisl.io"
__status__ = "Test"
#
# THIS FILE IS AUTOMATICALLY CREATED
# D O N O T M O D I F Y !
#
class REG:
WHO_AM_I = 0
USER_CTRL = 3
LP_CONFIG = 5
PWR_MGMT_1 = 6
PWR_MGMT_2 = 7
INT_PIN_CFG = 15
INT_ENABLE = 16
INT_ENABLE_1 = 17
INT_ENABLE_2 = 18
INT_ENABLE_3 = 19
I2C_MST_STATUS = 23
INT_STATUS = 25
INT_STATUS_1 = 26
INT_STATUS_2 = 27
INT_STATUS_3 = 28
DELAY_TIMEH = 40
DELAY_TIMEL = 41
ACCEL_XOUT_H = 45
ACCEL_XOUT_L = 46
ACCEL_YOUT_H = 47
ACCEL_YOUT_L = 48
ACCEL_ZOUT_H = 49
ACCEL_ZOUT_L = 50
GYRO_XOUT_H = 51
GYRO_XOUT_L = 52
GYRO_YOUT_H = 53
GYRO_YOUT_L = 54
GYRO_ZOUT_H = 55
GYRO_ZOUT_L = 56
TEMP_OUT_H = 57
TEMP_OUT_L = 58
EXT_SLV_SENS_DATA_00 = 59
EXT_SLV_SENS_DATA_01 = 60
EXT_SLV_SENS_DATA_02 = 61
EXT_SLV_SENS_DATA_03 = 62
EXT_SLV_SENS_DATA_04 = 63
EXT_SLV_SENS_DATA_05 = 64
EXT_SLV_SENS_DATA_06 = 65
EXT_SLV_SENS_DATA_07 = 66
EXT_SLV_SENS_DATA_08 = 67
EXT_SLV_SENS_DATA_09 = 68
EXT_SLV_SENS_DATA_10 = 69
EXT_SLV_SENS_DATA_11 = 70
EXT_SLV_SENS_DATA_12 = 71
EXT_SLV_SENS_DATA_13 = 72
EXT_SLV_SENS_DATA_14 = 73
EXT_SLV_SENS_DATA_15 = 74
EXT_SLV_SENS_DATA_16 = 75
EXT_SLV_SENS_DATA_17 = 76
EXT_SLV_SENS_DATA_18 = 77
EXT_SLV_SENS_DATA_19 = 78
EXT_SLV_SENS_DATA_20 = 79
EXT_SLV_SENS_DATA_21 = 80
EXT_SLV_SENS_DATA_22 = 81
EXT_SLV_SENS_DATA_23 = 82
FIFO_EN_1 = 102
FIFO_EN_2 = 103
FIFO_RST = 104
FIFO_MODE = 105
FIFO_COUNTH = 112
FIFO_COUNTL = 113
FIFO_R_W = 114
DATA_RDY_STATUS = 116
FIFO_CFG = 118
REG_BANK_SEL = 127
| 22.364706 | 144 | 0.73435 | 365 | 1,901 | 3.279452 | 0.50137 | 0.120301 | 0.200501 | 0.280702 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.133595 | 0.196739 | 1,901 | 84 | 145 | 22.630952 | 0.650295 | 0.13414 | 0 | 0 | 0 | 0 | 0.042202 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0.891892 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
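Register pairs like `FIFO_COUNTH`/`FIFO_COUNTL` or `ACCEL_XOUT_H`/`ACCEL_XOUT_L` follow the usual high-byte/low-byte split; assuming that convention (consult the datasheet for the exact bit width of the high byte), a 16-bit value read from such a pair is reassembled like this:

```python
def word_from_pair(high, low):
    # combine a high/low register pair into one unsigned 16-bit value
    return ((high & 0xFF) << 8) | (low & 0xFF)

assert word_from_pair(0x01, 0x2C) == 300  # 0x012C
```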
be8063414fdff16c86b1e57151c5cc56e66ef924 | 551 | py | Python | boa3_test/test_sc/interop_test/contract/CallFlagsUsage.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 25 | 2020-07-22T19:37:43.000Z | 2022-03-08T03:23:55.000Z | boa3_test/test_sc/interop_test/contract/CallFlagsUsage.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 419 | 2020-04-23T17:48:14.000Z | 2022-03-31T13:17:45.000Z | boa3_test/test_sc/interop_test/contract/CallFlagsUsage.py | hal0x2328/neo3-boa | 6825a3533384cb01660773050719402a9703065b | [
"Apache-2.0"
] | 15 | 2020-05-21T21:54:24.000Z | 2021-11-18T06:17:24.000Z | from typing import Any
from boa3.builtin import public
from boa3.builtin.interop.contract import NEO, call_contract
from boa3.builtin.interop.runtime import executing_script_hash, notify
from boa3.builtin.interop.storage import get, put
@public
def call_another_contract() -> Any:
return call_contract(NEO, 'balanceOf', [executing_script_hash])
@public
def notify_user():
notify('Notify was called')
@public
def put_value(key: str, value: int):
put(key, value)
@public
def get_value(key: str) -> int:
return get(key).to_int()
| 20.407407 | 70 | 0.753176 | 81 | 551 | 4.975309 | 0.382716 | 0.079404 | 0.148883 | 0.163772 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008475 | 0.143376 | 551 | 26 | 71 | 21.192308 | 0.845339 | 0 | 0 | 0.235294 | 0 | 0 | 0.047187 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.235294 | false | 0 | 0.294118 | 0.117647 | 0.647059 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 3 |
be8e59c25d2dbedf798b4e399ff15aa27d5eeef8 | 432 | py | Python | tests/test_sphinxcontrib-rdflib.py | westurner/sphinxcontrib-rdf | 8a3f5abe185ff837984cf3efb768702ad9a231db | [
"BSD-3-Clause"
] | null | null | null | tests/test_sphinxcontrib-rdflib.py | westurner/sphinxcontrib-rdf | 8a3f5abe185ff837984cf3efb768702ad9a231db | [
"BSD-3-Clause"
] | 1 | 2020-02-11T18:27:57.000Z | 2020-02-11T18:27:57.000Z | tests/test_sphinxcontrib-rdflib.py | westurner/sphinxcontrib-rdf | 8a3f5abe185ff837984cf3efb768702ad9a231db | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
test_sphinxcontrib-rdf
----------------------------------
Tests for `sphinxcontrib-rdf` module.
"""
import unittest
from sphinxcontrib_rdf import sphinxcontrib_rdf
class TestSphinxcontribRdf(unittest.TestCase):
def setUp(self):
pass
def test_something(self):
pass
def tearDown(self):
pass
if __name__ == '__main__':
unittest.main() | 15.428571 | 47 | 0.608796 | 46 | 432 | 5.5 | 0.608696 | 0.252964 | 0.086957 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.00289 | 0.199074 | 432 | 28 | 48 | 15.428571 | 0.728324 | 0.097222 | 0 | 0.272727 | 0 | 0 | 0.02807 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.272727 | 0.181818 | null | null | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
bea563f6bd3dbbc67e2d3867d3f459d1fc99930b | 1,353 | py | Python | tests/test_isbns.py | SimonGreenhill/pyglottolog | 1e0aa0cdc5ae35906c763f9219c6db9b976f8d38 | [
"Apache-2.0"
] | 7 | 2019-07-28T16:09:05.000Z | 2021-09-12T20:21:55.000Z | tests/test_isbns.py | d97hah/pyglottolog | fe4c2a52d54cdcf0804b4f889598dbb9b8698dbd | [
"Apache-2.0"
] | 52 | 2019-06-18T05:16:38.000Z | 2022-02-21T11:20:02.000Z | tests/test_isbns.py | d97hah/pyglottolog | fe4c2a52d54cdcf0804b4f889598dbb9b8698dbd | [
"Apache-2.0"
] | 6 | 2019-07-26T17:40:25.000Z | 2021-12-08T00:59:38.000Z | import pytest
from pyglottolog.references import Isbns, Isbn
def test_Isbns():
assert Isbns.from_field('9783866801929, 3866801920') == \
[Isbn('9783866801929')]
assert Isbns.from_field('978-3-86680-192-9 3-86680-192-0') == \
[Isbn('9783866801929')]
with pytest.raises(ValueError, match=r'pattern'):
Isbns.from_field('9783866801929 spam, 3866801920')
with pytest.raises(ValueError, match=r'delimiter'):
Isbns.from_field('9783866801929: 3866801920')
assert Isbns.from_field('9780199593569, 9780191739385').to_string() == \
'9780199593569, 9780191739385'
def test_Isbn():
with pytest.raises(ValueError, match='length'):
Isbn('978-3-86680-192-9')
with pytest.raises(ValueError, match=r'length'):
Isbn('03-86680-192-0')
with pytest.raises(ValueError, match=r'0 instead of 9'):
Isbn('9783866801920')
with pytest.raises(ValueError, match=r'9 instead of 0'):
Isbn('3866801929')
assert Isbn('9783866801929').digits == '9783866801929'
assert Isbn('3866801920').digits == '9783866801929'
l, r = twins = Isbn('9783866801929'), Isbn('9783866801929')
assert l == r and not l != r
assert len(set(twins)) == 1
assert repr(Isbn('9783866801929')) in \
["Isbn(u'9783866801929')", "Isbn('9783866801929')"]
| 30.066667 | 76 | 0.654102 | 159 | 1,353 | 5.515723 | 0.301887 | 0.13569 | 0.109464 | 0.177879 | 0.331813 | 0.18244 | 0 | 0 | 0 | 0 | 0 | 0.30854 | 0.195122 | 1,353 | 44 | 77 | 30.75 | 0.496786 | 0 | 0 | 0.068966 | 0 | 0 | 0.320769 | 0.031781 | 0 | 0 | 0 | 0 | 0.275862 | 1 | 0.068966 | false | 0 | 0.068966 | 0 | 0.137931 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
beb289094aa3e39d2f660f32b1c40766154d62fe | 351 | py | Python | v2/bindings/python/swisnap_plugin_lib_py/runner.py | solarwinds/snap-plugin-lib-go | 33805aab6d093b08d360520255e684126c5314db | [
"Apache-2.0"
] | 3 | 2020-10-05T01:48:25.000Z | 2021-09-06T09:24:35.000Z | v2/bindings/python/swisnap_plugin_lib_py/runner.py | solarwinds/snap-plugin-lib-go | 33805aab6d093b08d360520255e684126c5314db | [
"Apache-2.0"
] | 8 | 2020-10-05T09:25:47.000Z | 2021-11-23T17:59:10.000Z | v2/bindings/python/swisnap_plugin_lib_py/runner.py | solarwinds/snap-plugin-lib-go | 33805aab6d093b08d360520255e684126c5314db | [
"Apache-2.0"
] | null | null | null | from .c_bridge import start_c_collector, start_c_publisher, start_c_streaming_collector
# Starting collector
def start_collector(collector):
start_c_collector(collector)
def start_streaming_collector(collector):
start_c_streaming_collector(collector)
# Starting publisher
def start_publisher(publisher):
start_c_publisher(publisher)
| 21.9375 | 87 | 0.831909 | 44 | 351 | 6.204545 | 0.227273 | 0.131868 | 0.164835 | 0.175824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 351 | 15 | 88 | 23.4 | 0.875 | 0.105413 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.428571 | false | 0 | 0.142857 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
bebd75ce068352f0bd04291463754e9fe03c68f1 | 295 | py | Python | backend/routes/__init__.py | PasitC/audino | efe58d079012819bb71b1c1f05fc708223d58223 | [
"MIT"
] | 900 | 2020-05-30T20:58:01.000Z | 2022-03-30T08:49:52.000Z | backend/routes/__init__.py | xbsdsongnan/audino | 10d26c47b14ae45cdd154f544ecf6b170a524f0a | [
"MIT"
] | 45 | 2020-05-31T09:17:51.000Z | 2022-02-27T05:54:22.000Z | backend/routes/__init__.py | xbsdsongnan/audino | 10d26c47b14ae45cdd154f544ecf6b170a524f0a | [
"MIT"
] | 107 | 2020-06-11T02:03:14.000Z | 2022-03-29T14:53:09.000Z | from flask import Blueprint
auth = Blueprint("auth", __name__, url_prefix="/auth")
api = Blueprint("api", __name__, url_prefix="/api")
from .login import *
from .users import *
from .projects import *
from .labels import *
from .current_user import *
from .data import *
from .audios import *
| 22.692308 | 54 | 0.728814 | 40 | 295 | 5.1 | 0.425 | 0.294118 | 0.127451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.149153 | 295 | 12 | 55 | 24.583333 | 0.812749 | 0 | 0 | 0 | 0 | 0 | 0.054237 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.8 | 0 | 0.8 | 0.3 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
fe201d3e9817f86d0c3dada214c5a4604c625668 | 373 | py | Python | test/comment_aggregate.py | MineSelf2016/SummerResearch | a02202935ca92ecf34e736e7b54cad6721f20fd4 | [
"MIT"
] | null | null | null | test/comment_aggregate.py | MineSelf2016/SummerResearch | a02202935ca92ecf34e736e7b54cad6721f20fd4 | [
"MIT"
] | null | null | null | test/comment_aggregate.py | MineSelf2016/SummerResearch | a02202935ca92ecf34e736e7b54cad6721f20fd4 | [
"MIT"
] | null | null | null | # %%
# Process the comment data and aggregate it by Weibo post time
# Jan 20
# 1. Read the list of Weibo posts for Jan 20;
# 2. Aggregate the Weibo comment data according to the post list
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
# %%
dataset_comments_1_20 = pd.read_csv("test/dataset_comment/1.20_comments.csv")
dataset_comments_1_20.info()
# %%
len(dataset_comments_1_20["pub_time"])
# %%
len(dataset_comments_1_20["pub_time"].unique())
# %%
| 16.954545 | 77 | 0.726542 | 58 | 373 | 4.37931 | 0.5 | 0.070866 | 0.251969 | 0.283465 | 0.220472 | 0.220472 | 0.220472 | 0 | 0 | 0 | 0 | 0.070769 | 0.128686 | 373 | 21 | 78 | 17.761905 | 0.710769 | 0.217158 | 0 | 0 | 0 | 0 | 0.191489 | 0.134752 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.428571 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
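The aggregation by post time sketched in the comments boils down to a `groupby` on `pub_time`; with a toy frame (the real CSV is not included here) it looks like:

```python
import pandas as pd

comments = pd.DataFrame({
    "pub_time": ["2020-01-20 10:00", "2020-01-20 10:00", "2020-01-20 11:00"],
    "text": ["a", "b", "c"],
})
# number of comments per post time, sorted by the group key
per_post_time = comments.groupby("pub_time")["text"].count()
assert per_post_time.tolist() == [2, 1]
```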
fe31cf4dd357ddff48e95597e2b1feb2d14a5558 | 2,823 | py | Python | django_settings/moduleregistry.py | gruy/django-settings | da8d98a7e75adcc021b2865fecb70d6551ce6a13 | [
"BSD-3-Clause"
] | 44 | 2015-01-13T17:30:45.000Z | 2021-04-06T17:25:55.000Z | django_settings/moduleregistry.py | gruy/django-settings | da8d98a7e75adcc021b2865fecb70d6551ce6a13 | [
"BSD-3-Clause"
] | 12 | 2015-02-02T21:56:42.000Z | 2021-06-10T17:31:08.000Z | django_settings/moduleregistry.py | gruy/django-settings | da8d98a7e75adcc021b2865fecb70d6551ce6a13 | [
"BSD-3-Clause"
] | 21 | 2015-01-06T14:08:16.000Z | 2019-04-02T15:51:18.000Z | # -*- coding: utf-8 -*-
"""
moduleregistry
--------------
allows extending a given module by simply registering classes using the "register" method.
Usage::
>>> import moduleregistry
>>>
>>> import mymodule
>>> import othermodule
>>>
>>> assert not hasattr(mymodule, 'OtherClass') # True
>>>
>>> registry = moduleregistry.new_registry(mymodule)
>>> registry.register(othermodule.OtherClass)
>>>
>>> assert hasattr(mymodule, 'OtherClass') # True, BUT! - it's not the same class
>>> assert mymodule.OtherClass != othermodule.OtherClass # True!
>>> # Subclassing takes place on register invocation
>>>
>>> registry.names() == ['OtherClass'] # True
>>> 'OtherClass' in registry # True
>>> registry['OtherClass'] # returns the class
>>>
>>>
>>> registry.register(othermodule.OtherClass)
>>> # This will raise moduleregistry.RegisterError
>>>
"""
# system
import sys
import types
def subclass(class_=None, module=None):
attrs = {
'__module__': module.__name__,
}
return type(class_.__name__, (class_, ), attrs)
class RegisterError(Exception):
pass
class ModuleRegistry(object):
def __init__(self, module):
self.elements = {}
self.module = module
def _subclass(self, model_class):
return subclass(model_class, self.module)
def register(self, thing, subclass=True):
name = thing.__name__
if name in self.elements:
raise RegisterError('"%s" model already registered on "%s"' % (
name, self.module.__name__
))
new = thing if not subclass else self._subclass(thing)
self.elements[name] = new
setattr(self.module, name, new)
def unregister(self, name):
del self.elements[name]
delattr(self.module, name)
def unregister_all(self):
for name in list(self.elements.keys()):  # copy keys: unregister mutates the dict
self.unregister(name)
def __str__(self):
return "<%s object %s>" % (self.__class__.__name__, self.elements)
def __unicode__(self):
return unicode(str(self))
def __repr__(self):
return str(self)
def __contains__(self, model_class_name):
return model_class_name in self.elements
def __iter__(self):
return iter(self.elements.values())
def __getitem__(self, name):
return self.elements[name]
def __call__(self, model_class):
return self.register(model_class)
def names(self):
return self.elements.keys()
def values(self):
return self.elements.values()
def new_registry(module_or_name):
if isinstance(module_or_name, types.ModuleType):
module = module_or_name
else:
module = sys.modules[module_or_name]
return ModuleRegistry(module)
| 25.205357 | 85 | 0.629472 | 306 | 2,823 | 5.519608 | 0.281046 | 0.078153 | 0.042629 | 0.031972 | 0.036708 | 0.036708 | 0 | 0 | 0 | 0 | 0 | 0.000471 | 0.247255 | 2,823 | 111 | 86 | 25.432432 | 0.794353 | 0.328728 | 0 | 0.037037 | 0 | 0 | 0.032344 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.296296 | false | 0.018519 | 0.037037 | 0.185185 | 0.592593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
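The mechanism behind `register` can be sketched standalone: `subclass` re-creates the class via `type` with `__module__` pointed at the target module, and `register` then binds it there with `setattr`. A minimal sketch using a synthetic module (names are illustrative):

```python
import types

# A synthetic target module; real use would pass an imported module
target = types.ModuleType("mymodule")

class OtherClass:
    pass

# Re-create the class with __module__ rebound, then attach it to the
# target module -- the core of subclass() plus ModuleRegistry.register()
attrs = {"__module__": target.__name__}
registered = type(OtherClass.__name__, (OtherClass,), attrs)
setattr(target, OtherClass.__name__, registered)
```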
fe40ef772f3b99e4fd867172c0e398554a14a2d4 | 5,559 | py | Python | starkplot/_axes/_old/_axes.py | nstarman/starkplot | e629abfa96448281cf590911107362f77c11fb4a | [
"BSD-3-Clause"
] | 2 | 2019-03-22T22:07:27.000Z | 2020-11-20T18:27:09.000Z | starkplot/_axes/_old/_axes.py | nstarman/starkplot | e629abfa96448281cf590911107362f77c11fb4a | [
"BSD-3-Clause"
] | 3 | 2020-09-25T06:58:28.000Z | 2021-12-20T00:16:21.000Z | starkplot/_axes/_old/_axes.py | nstarman/starkplot | e629abfa96448281cf590911107362f77c11fb4a | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python
# -*- coding: utf-8 -*-
# ----------------------------------------------------------------------------
#
# TITLE : _axes
# PROJECT : starkplot
#
# ----------------------------------------------------------------------------
### Docstring and Metadata
"""**DOCSTRING**
TODO oldax and return to current ax
"""
__author__ = "Nathaniel Starkman"
__credits__ = ["matplotlib"]
# __all__ = ['figure', ]
##############################################################################
### IMPORTS
# matplotlib
from matplotlib.pyplot import figure
from matplotlib import pyplot # for usage here
# import decorator
# from .decorators import mpl_decorator, docstring
# from .decorators.docstring import wrap_func_keep_orig_sign
# # custom imports
# from ._util import _parseoptsdict, _parsestrandopts, _parselatexstrandopts,\
# _stripprefix, _latexstr,\
# axisLabels, axisScales
# from ._info import _pltypes, _annotations, _customadded, _other
# try:
# from astropy.utils.decorators import wraps # TODO move to internal file
# except Exception as e:
# print('cannot import preferred wrapper')
# from functools import wraps
#############################################################################
# Info
__author__ = "Nathaniel Starkman"
__copyright__ = "Copyright 2018, "
__credits__ = ["Jo Bovy", "The Matplotlib Team"]
__license__ = "MIT"
__version__ = "1.0.0"
__maintainer__ = "Nathaniel Starkman"
__email__ = "n.starkman@mail.utoronto.ca"
__status__ = "Production"
# __all__ = [
# 'axes',
# ]
###############################################################################
def _gca(ax=None):
return ax if ax is not None else pyplot.gca()
# /def
###############################################################################
# Axes
# @decorator.contextmanager
# def protem_axes(*args, **kw):
# """`with` enabled figure
# when used in `with` statement:
# stores current figure & creates a pro-tem figure
# executes with statment
# restores old figure
# """
# # BEFORE
# # store old axes
# oldax = pyplot.gca()
# # make new axes
# pyplot.axes(*args, **kw)
# yield # DURING
# # AFTER
# # restore old figure
# pyplot.sca(oldax)
# # /def
# NEW DEFAULT PYPLOT FIGURE
# imported into _plot to overwrite
# axes = wrap_func_keep_orig_sign(
# protem_axes, pyplot.axes,
# doc_post='\n\nThis function has been modified to be `with` enabled')
axes = pyplot.axes
# @mpl_decorator
# def axes(*args, **kw): # TODO improve,
# return axes(*args, **kw)
###############################################################################
# title
# TODO with decorator
def get_title(ax=None):
ax = _gca(ax=ax)
return ax.get_title()
# /def
# TODO with decorator
# TODO with _parsexkwandopts
def set_title(label, fontdict=None, loc="center", pad=None, ax=None, **kwargs):
ax = _gca(ax=ax)
ax.set_title(label, fontdict=fontdict, loc=loc, pad=pad, **kwargs)
# /def
###############################################################################
# xlabel
def get_xlabel(ax=None):
ax = _gca(ax=ax)
return ax.get_xlabel()
# /def
###############################################################################
# ylabel
def get_ylabel(ax=None):
ax = _gca(ax=ax)
return ax.get_ylabel()
# /def
###############################################################################
# zlabel
def get_zlabel(ax=None):
ax = _gca(ax=ax)
return ax.get_zlabel()
# /def
###############################################################################
# axes labels
def get_axes_labels(ax=None):
return [get_xlabel(ax=ax), get_ylabel(ax=ax), get_zlabel(ax=ax)]
# /def
###############################################################################
# xlim
def get_xlim(ax=None):
ax = _gca(ax=ax)
return ax.get_xlim()
# /def
###############################################################################
# ylim
def get_ylim(ax=None):
ax = _gca(ax=ax)
return ax.get_ylim()
# /def
###############################################################################
# zlim
def get_zlim(ax=None):
ax = _gca(ax=ax)
return ax.get_zlim()
# /def
###############################################################################
# axes limits
def get_axes_lims(ax=None):
ax = _gca(ax=ax)
return [ax.get_xlim(), ax.get_ylim(), ax.get_zlim()]
# /def
###############################################################################
# invert_xaxis
###############################################################################
# invert_yaxis
###############################################################################
# invert_zaxis
###############################################################################
# invert_axis
###############################################################################
# invert axes
###############################################################################
# xscale
###############################################################################
# yscale
###############################################################################
# zscale
###############################################################################
# axes scales
###############################################################################
# xlabel
###############################################################################
# ylabel
###############################################################################
# zlabel
| 21.056818 | 79 | 0.400612 | 451 | 5,559 | 4.687361 | 0.370288 | 0.024598 | 0.029801 | 0.038316 | 0.123936 | 0.105014 | 0.105014 | 0.105014 | 0.105014 | 0.03122 | 0 | 0.001672 | 0.139234 | 5,559 | 263 | 80 | 21.136882 | 0.440125 | 0.347185 | 0 | 0.295455 | 0 | 0 | 0.101618 | 0.017476 | 0 | 0 | 0 | 0.007605 | 0 | 1 | 0.25 | false | 0 | 0.045455 | 0.045455 | 0.522727 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
fe44283d0a52301c091e231f00f58950061abacd | 241 | py | Python | solver/debug.py | ukc-co663/dependency-solver-2019-callumforrester | 23d9bcd548de570fb1e0b72e640e450b5c824d64 | [
"MIT"
] | null | null | null | solver/debug.py | ukc-co663/dependency-solver-2019-callumforrester | 23d9bcd548de570fb1e0b72e640e450b5c824d64 | [
"MIT"
] | null | null | null | solver/debug.py | ukc-co663/dependency-solver-2019-callumforrester | 23d9bcd548de570fb1e0b72e640e450b5c824d64 | [
"MIT"
] | 1 | 2019-06-20T16:31:01.000Z | 2019-06-20T16:31:01.000Z | import logging
from tqdm import tqdm
from typing import Iterable
def logging_tqdm(it: Iterable, min_level: int = logging.INFO) -> tqdm:
should_disable = logging.getLogger().level > min_level
return tqdm(it, disable=should_disable)
| 26.777778 | 70 | 0.763485 | 34 | 241 | 5.264706 | 0.470588 | 0.067039 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.153527 | 241 | 8 | 71 | 30.125 | 0.877451 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0 | 0.5 | 0 | 0.833333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
fe5ff4ed20dd3f7a0b06217e098d6ece2f7bd346 | 333 | py | Python | code/parrot.py | palexjo/pokey_talon | b143314850271e185968b12f6e224df1cbb4611c | [
"MIT"
] | 4 | 2021-06-28T23:44:54.000Z | 2022-02-08T06:47:47.000Z | code/parrot.py | palexjo/pokey_talon | b143314850271e185968b12f6e224df1cbb4611c | [
"MIT"
] | 1 | 2022-03-01T11:24:38.000Z | 2022-03-01T11:24:38.000Z | code/parrot.py | palexjo/pokey_talon | b143314850271e185968b12f6e224df1cbb4611c | [
"MIT"
] | 1 | 2022-02-23T12:43:22.000Z | 2022-02-23T12:43:22.000Z | from talon import Module, Context, actions, app
mod = Module()
@mod.action_class
class Actions:
def dental_click():
"""Responds to an alveolar click"""
pass
# app.notify("Dental click")
def postalveolar_click():
"""Responds to an postalveolar click"""
actions.core.repeat_phrase(1) | 22.2 | 47 | 0.63964 | 40 | 333 | 5.225 | 0.6 | 0.105263 | 0.143541 | 0.162679 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004 | 0.249249 | 333 | 15 | 48 | 22.2 | 0.832 | 0.273273 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.125 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
fe6451d0451bbd8553a931da45650d20001c62ef | 2,097 | py | Python | src/cuckoofilter/bucket.py | heming621/postgraduate- | e459a361ce09954acbdc386a2dcd02e26962c8ff | [
"MIT"
] | 13 | 2016-11-24T03:01:33.000Z | 2022-02-08T04:14:23.000Z | src/cuckoofilter/bucket.py | heming621/postgraduate- | e459a361ce09954acbdc386a2dcd02e26962c8ff | [
"MIT"
] | null | null | null | src/cuckoofilter/bucket.py | heming621/postgraduate- | e459a361ce09954acbdc386a2dcd02e26962c8ff | [
"MIT"
] | 5 | 2016-11-24T03:01:33.000Z | 2021-12-23T13:04:17.000Z | import random
class Bucket:
'''Bucket class for storing fingerprints.'''
def __init__(self, size=4):
'''
Initialize bucket.
size : the maximum nr. of fingerprints the bucket can store
Default size is 4, which closely approaches the best size for FPP between 0.00001 and 0.002 (see Fan et al.).
If your targeted FPP is greater than 0.002, a bucket size of 2 is more space efficient.
'''
self.size = size
self.b = []
def insert(self, fingerprint):
'''
Insert a fingerprint into the bucket.
The insertion of duplicate entries is allowed.
'''
if not self.is_full():
self.b.append(fingerprint)
return True
return False
def contains(self, fingerprint):
return fingerprint in self.b
def delete(self, fingerprint):
'''
Delete a fingerprint from the bucket.
Returns True if the fingerprint was present in the bucket.
This is useful for keeping track of how many items are present in the filter.
'''
try:
del self.b[self.b.index(fingerprint)]
return True
except ValueError:
# This error is explicitly silenced.
# It simply means the fingerprint was never present in the bucket.
return False
def swap(self, fingerprint):
'''
Swap a fingerprint with a randomly chosen fingerprint from the bucket.
The given fingerprint is stored in the bucket.
The swapped fingerprint is returned.
'''
bucket_index = random.randrange(len(self.b))
fingerprint, self.b[bucket_index] = self.b[bucket_index], fingerprint
return fingerprint
def is_full(self):
return len(self.b) >= self.size
def __contains__(self, fingerprint):
return self.contains(fingerprint)
def __repr__(self):
return '<Bucket: ' + str(self.b) + '>'
def __sizeof__(self):
return super().__sizeof__() + self.b.__sizeof__() | 31.298507 | 121 | 0.60515 | 258 | 2,097 | 4.806202 | 0.395349 | 0.044355 | 0.019355 | 0.041935 | 0.051613 | 0 | 0 | 0 | 0 | 0 | 0 | 0.011863 | 0.316643 | 2,097 | 67 | 122 | 31.298507 | 0.853454 | 0.401526 | 0 | 0.133333 | 0 | 0 | 0.009285 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.033333 | 0.166667 | 0.7 | 0.366667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
fe6d48c2ff41cc96a1c4ed55c0a4242c3f97801e | 33 | py | Python | python/litex/__init__.py | IgiArdiyanto/litehub | 54a43690d80e57f16bea1efc698e7d30f06b9d4f | [
"MIT"
] | null | null | null | python/litex/__init__.py | IgiArdiyanto/litehub | 54a43690d80e57f16bea1efc698e7d30f06b9d4f | [
"MIT"
] | null | null | null | python/litex/__init__.py | IgiArdiyanto/litehub | 54a43690d80e57f16bea1efc698e7d30f06b9d4f | [
"MIT"
] | null | null | null | # Versions
__version__ = '0.0.1'
| 11 | 21 | 0.666667 | 5 | 33 | 3.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107143 | 0.151515 | 33 | 2 | 22 | 16.5 | 0.535714 | 0.242424 | 0 | 0 | 0 | 0 | 0.217391 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
fe6e257177b63538c13060f87ca2e0164af72b10 | 972 | py | Python | wolframclient/tests/evaluation/__init__.py | WolframResearch/WolframClientForPython | 27cffef560eea8d16c02fe4086f42363604284b6 | [
"MIT"
] | 358 | 2018-10-18T13:39:48.000Z | 2022-03-26T09:42:53.000Z | wolframclient/tests/evaluation/__init__.py | WolframResearch/WolframClientForPython | 27cffef560eea8d16c02fe4086f42363604284b6 | [
"MIT"
] | 29 | 2018-10-20T09:04:12.000Z | 2022-03-06T18:36:19.000Z | wolframclient/tests/evaluation/__init__.py | LaudateCorpus1/WolframClientForPython | 26f7fa3d81691ba2a63d3eadcd9734b261130b7c | [
"MIT"
] | 38 | 2018-10-19T21:52:14.000Z | 2021-11-21T13:07:04.000Z | from __future__ import absolute_import, print_function, unicode_literals
import fnmatch
import os
import unittest
from wolframclient.utils import six
# The evaluation modules is only supported on python 3.5+, because of asyncio
# We need to prevent the test suite from loading this module containing coroutines.
if six.PY_35:
from wolframclient.tests.evaluation import test_async_cloud as m4
from wolframclient.tests.evaluation import test_cloud as m1
from wolframclient.tests.evaluation import test_coroutine as m3
from wolframclient.tests.evaluation import test_kernel as m2
test_modules = (m1, m2, m3, m4)
else:
test_modules = ()
__all__ = ["load_tests"]
def load_tests(loader, tests, pattern):
suite = unittest.TestSuite()
for module in test_modules:
if fnmatch.fnmatch(os.path.basename(module.__file__), pattern):
tests = loader.loadTestsFromModule(module)
suite.addTests(tests)
return suite
| 31.354839 | 83 | 0.755144 | 131 | 972 | 5.40458 | 0.503817 | 0.120057 | 0.124294 | 0.180791 | 0.237288 | 0.237288 | 0 | 0 | 0 | 0 | 0 | 0.015094 | 0.182099 | 972 | 30 | 84 | 32.4 | 0.875472 | 0.161523 | 0 | 0 | 0 | 0 | 0.012315 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.428571 | 0 | 0.52381 | 0.047619 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
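The filtering done by `load_tests` is the stdlib `load_tests` protocol plus an `fnmatch` check on module file names; a self-contained sketch with a synthetic test module (names are illustrative):

```python
import fnmatch
import os
import types
import unittest

class DummyCase(unittest.TestCase):
    def test_ok(self):
        self.assertTrue(True)

# A synthetic test module; real code imports e.g. test_cloud instead
mod = types.ModuleType("test_cloud")
mod.__file__ = "test_cloud.py"
mod.DummyCase = DummyCase

def load_matching(loader, modules, pattern):
    # same filtering idea as load_tests above: only modules whose file
    # name matches the pattern contribute tests to the suite
    suite = unittest.TestSuite()
    for module in modules:
        if fnmatch.fnmatch(os.path.basename(module.__file__), pattern):
            suite.addTests(loader.loadTestsFromModule(module))
    return suite

matched = load_matching(unittest.defaultTestLoader, [mod], "test_*")
skipped = load_matching(unittest.defaultTestLoader, [mod], "other_*")
```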
fe7149d13473207b2c14ec3b51325f9e95bc4536 | 160 | py | Python | superinvoke/collections/misc.py | Neoxelox/superinvoke | 032dde6d31ecd4fa94b3fd75ca2f09ccf799e9ba | [
"MIT"
] | 1 | 2021-11-15T01:25:00.000Z | 2021-11-15T01:25:00.000Z | superinvoke/collections/misc.py | Neoxelox/superinvoke | 032dde6d31ecd4fa94b3fd75ca2f09ccf799e9ba | [
"MIT"
] | null | null | null | superinvoke/collections/misc.py | Neoxelox/superinvoke | 032dde6d31ecd4fa94b3fd75ca2f09ccf799e9ba | [
"MIT"
] | null | null | null | from invoke import task
@task(variadic=True)
def help(context, collection):
"""Show available commands."""
context.run(f"invoke --list {collection}")
| 20 | 46 | 0.7 | 20 | 160 | 5.6 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 160 | 7 | 47 | 22.857143 | 0.823529 | 0.15 | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |