hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
86dddcddfe77c2b734a7063e01e9782d1701a2c0 | 1,511 | py | Python | great_expectations/core/data_context_key.py | joshuataylor/great_expectations | 19dcead43aef9a833b3aa894a1226714a80ab840 | [
"Apache-2.0"
] | 1 | 2021-06-06T00:44:10.000Z | 2021-06-06T00:44:10.000Z | great_expectations/core/data_context_key.py | joshuataylor/great_expectations | 19dcead43aef9a833b3aa894a1226714a80ab840 | [
"Apache-2.0"
] | 47 | 2020-07-15T06:32:50.000Z | 2022-03-29T12:03:23.000Z | great_expectations/core/data_context_key.py | joshuataylor/great_expectations | 19dcead43aef9a833b3aa894a1226714a80ab840 | [
"Apache-2.0"
] | null | null | null | from abc import ABCMeta, abstractmethod


class DataContextKey(object, metaclass=ABCMeta):
    """DataContextKey objects are used to uniquely identify resources used by the DataContext.
    A DataContextKey is designed to support clear naming with multiple representations including a hashable
    version making it suitable for use as the key in a dictionary.
    """

    @abstractmethod
    def to_tuple(self):
        pass

    @classmethod
    def from_tuple(cls, tuple_):
        return cls(*tuple_)

    def to_fixed_length_tuple(self):
        raise NotImplementedError

    @classmethod
    def from_fixed_length_tuple(cls, tuple_):
        raise NotImplementedError

    def __eq__(self, other):
        if not isinstance(other, self.__class__):
            # Delegate comparison to the other instance's __eq__.
            return NotImplemented
        return self.to_tuple() == other.to_tuple()

    def __ne__(self, other):
        return not self == other

    def __hash__(self):
        return hash(self.to_tuple())

    def __repr__(self):
        return self.__class__.__name__ + "::" + "/".join(self.to_tuple())


class StringKey(DataContextKey):
    """A simple DataContextKey with just a single string value"""

    def __init__(self, key):
        self._key = key

    def to_tuple(self):
        return (self._key,)

    def to_fixed_length_tuple(self):
        return self.to_tuple()

    @classmethod
    def from_fixed_length_tuple(cls, tuple_):
        return cls.from_tuple(tuple_)
| 26.508772 | 107 | 0.674388 | 184 | 1,511 | 5.190217 | 0.396739 | 0.051309 | 0.067016 | 0.029319 | 0.172775 | 0.140314 | 0.087958 | 0.087958 | 0 | 0 | 0 | 0 | 0.243547 | 1,511 | 56 | 108 | 26.982143 | 0.835521 | 0.2409 | 0 | 0.333333 | 0 | 0 | 0.002671 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.363636 | false | 0.030303 | 0.030303 | 0.212121 | 0.727273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
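The `data_context_key.py` content in the row above hinges on `__eq__` and `__hash__` both delegating to `to_tuple()`, which is what makes these keys usable as dictionary keys. A minimal self-contained sketch (the classes are re-declared in trimmed form here rather than imported from `great_expectations`, and the `"expectation_suite"` key value is an illustrative assumption):

```python
from abc import ABCMeta, abstractmethod


class DataContextKey(metaclass=ABCMeta):
    # Trimmed copy of the snippet in the row above, for demonstration only.
    @abstractmethod
    def to_tuple(self):
        pass

    @classmethod
    def from_tuple(cls, tuple_):
        return cls(*tuple_)

    def __eq__(self, other):
        if not isinstance(other, self.__class__):
            return NotImplemented
        return self.to_tuple() == other.to_tuple()

    def __hash__(self):
        return hash(self.to_tuple())


class StringKey(DataContextKey):
    def __init__(self, key):
        self._key = key

    def to_tuple(self):
        return (self._key,)


# Two distinct instances built from the same string compare and hash
# equal, so either one retrieves the same dict entry.
a = StringKey("expectation_suite")
b = StringKey.from_tuple(("expectation_suite",))
store = {a: "value"}
assert a == b and hash(a) == hash(b) and store[b] == "value"
```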
8103d4e966869ddca57b6f41e402964e10986396 | 232 | py | Python | util/converter.py | brenns10/love | 1a2a60c510327d91eff9caf32a252141ae00a9eb | [
"MIT"
] | 4 | 2017-02-16T02:18:39.000Z | 2018-01-14T01:56:21.000Z | util/converter.py | brenns10/love | 1a2a60c510327d91eff9caf32a252141ae00a9eb | [
"MIT"
] | 17 | 2017-02-16T17:19:53.000Z | 2018-01-08T01:43:05.000Z | util/converter.py | brenns10/love | 1a2a60c510327d91eff9caf32a252141ae00a9eb | [
"MIT"
] | 4 | 2017-02-16T18:48:18.000Z | 2018-01-08T02:34:07.000Z | # -*- coding: utf-8 -*-
from werkzeug.routing import BaseConverter


class RegexConverter(BaseConverter):
    def __init__(self, url_map, *items):
        super(RegexConverter, self).__init__(url_map)
        self.regex = items[0]
| 23.2 | 53 | 0.689655 | 27 | 232 | 5.555556 | 0.703704 | 0.08 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010582 | 0.185345 | 232 | 9 | 54 | 25.777778 | 0.783069 | 0.090517 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.2 | false | 0 | 0.2 | 0 | 0.6 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
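The `RegexConverter` in the row above is the common Werkzeug/Flask recipe for regex-constrained URL parameters: the first converter argument given in a rule becomes that rule's matching pattern. A sketch of how it is registered (the `Map`/`Rule` wiring, the route, and the `'[a-z]{3}'` pattern are illustrative assumptions, not part of the row):

```python
from werkzeug.routing import BaseConverter, Map, Rule


class RegexConverter(BaseConverter):
    def __init__(self, url_map, *items):
        super(RegexConverter, self).__init__(url_map)
        # The argument written in the rule, e.g. regex('[a-z]{3}'),
        # replaces the converter's default pattern.
        self.regex = items[0]


url_map = Map(
    [Rule("/user/<regex('[a-z]{3}'):code>", endpoint="user")],
    converters={"regex": RegexConverter},
)
urls = url_map.bind("example.com")
assert urls.match("/user/abc") == ("user", {"code": "abc"})
```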
81171023cd51f3019c4093c40f5efea0036507e0 | 255 | py | Python | web2py/applications/rip/modules/Analytics.py | 2spmohanty/vcenter-automation | 1d10b765ef335087902b0194ed12a61e53807987 | [
"Apache-2.0"
] | 1 | 2019-10-02T13:25:03.000Z | 2019-10-02T13:25:03.000Z | web2py/applications/rip/modules/Analytics.py | 2spmohanty/vcenter-automation | 1d10b765ef335087902b0194ed12a61e53807987 | [
"Apache-2.0"
] | null | null | null | web2py/applications/rip/modules/Analytics.py | 2spmohanty/vcenter-automation | 1d10b765ef335087902b0194ed12a61e53807987 | [
"Apache-2.0"
] | 1 | 2021-11-05T09:51:02.000Z | 2021-11-05T09:51:02.000Z | __author__ = 'smrutim'

import requests
import json


def PostVpxData():
    URI = "https://vcsa.vmware.com/ph-stg/api/hyper/send?_c=cpbu_vcst_vac_staging.v0&_i=RIP_STAGING_DATA"


def PostHeapAnalysisDate():
    pass


def PostMemoryLeakData():
    pass | 15 | 105 | 0.737255 | 34 | 255 | 5.205882 | 0.852941 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.004587 | 0.145098 | 255 | 17 | 106 | 15 | 0.807339 | 0 | 0 | 0.222222 | 0 | 0.111111 | 0.390625 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0.222222 | 0.222222 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
812df00886c4450e2c7dee800417b0d60fefdffe | 538 | py | Python | deprecated/InitializeAgentStates.py | PellelNitram/corona_contact_tracing | df5a6ba18b84397b721893fb5eb89889dc82ab2f | [
"MIT"
] | 6 | 2020-03-21T20:44:54.000Z | 2020-05-14T05:32:49.000Z | deprecated/InitializeAgentStates.py | PellelNitram/corona_contact_tracing | df5a6ba18b84397b721893fb5eb89889dc82ab2f | [
"MIT"
] | null | null | null | deprecated/InitializeAgentStates.py | PellelNitram/corona_contact_tracing | df5a6ba18b84397b721893fb5eb89889dc82ab2f | [
"MIT"
] | 2 | 2020-03-22T15:37:41.000Z | 2020-03-31T10:11:24.000Z | # Generated with SMOP 0.41
from libsmop import *
# InitializeAgentStates.m


@function
def InitializeAgentStates(numberOfAgents=None, initialInfected=None, *args, **kwargs):
    varargin = InitializeAgentStates.varargin
    nargin = InitializeAgentStates.nargin

    agentState = ones(numberOfAgents, 1)
    # InitializeAgentStates.m:3
    perm = randperm(numberOfAgents)
    # InitializeAgentStates.m:5
    agentState[perm < initialInfected + 1] = 2
    # InitializeAgentStates.m:7
    return agentState


if __name__ == '__main__':
    pass
| 24.454545 | 83 | 0.745353 | 52 | 538 | 7.557692 | 0.634615 | 0.223919 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.020089 | 0.167286 | 538 | 22 | 84 | 24.454545 | 0.857143 | 0.236059 | 0 | 0 | 1 | 0 | 0.019704 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0.090909 | 0.090909 | 0 | 0.272727 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
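The SMOP-translated MATLAB in the row above marks `initialInfected` randomly chosen agents as infected (state 2) and leaves the rest susceptible (state 1): `randperm(n) < k + 1` selects exactly the `k` positions holding the permuted values `1..k`. An idiomatic NumPy equivalent (function and variable names here are my own, not from the row) may be easier to read:

```python
import numpy as np


def initialize_agent_states(number_of_agents, initial_infected, rng=None):
    # Every agent starts susceptible (state 1); the first
    # `initial_infected` indices of a random permutation are
    # infected (state 2), mirroring MATLAB's randperm selection.
    if rng is None:
        rng = np.random.default_rng()
    states = np.ones(number_of_agents, dtype=int)
    states[rng.permutation(number_of_agents)[:initial_infected]] = 2
    return states


states = initialize_agent_states(100, 5)
assert len(states) == 100 and int((states == 2).sum()) == 5
```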
813102b707b7bbdc28fdec720e2668bdf2505236 | 429 | py | Python | django_lazifier/apps/data_table/demo_app/urls.py | hotmit/django-lazifier | a2914b7ced955fa91b1961025e1d3ccc7ac3702a | [
"MIT"
] | 1 | 2017-04-27T19:25:34.000Z | 2017-04-27T19:25:34.000Z | django_lazifier/apps/data_table/demo_app/urls.py | hotmit/django-lazifier | a2914b7ced955fa91b1961025e1d3ccc7ac3702a | [
"MIT"
] | null | null | null | django_lazifier/apps/data_table/demo_app/urls.py | hotmit/django-lazifier | a2914b7ced955fa91b1961025e1d3ccc7ac3702a | [
"MIT"
] | null | null | null | from django.conf.urls import url

from . import views
from django_lazifier.apps.data_table.demo_app.demo_data import generate_sample_data

urlpatterns = [
    url(r'^$', views.view_demo_app, name='view_demo_app'),
    url(r'^manage/$', views.semi_auto_manual_param_override__manage_demo_app, name='manage_demo_app'),
    # ignore url below
    url(r'^gen-data/$', generate_sample_data, name='generate_sample_data'),
]
| 35.75 | 103 | 0.74359 | 64 | 429 | 4.609375 | 0.4375 | 0.118644 | 0.183051 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135198 | 429 | 11 | 104 | 39 | 0.795148 | 0.037296 | 0 | 0 | 1 | 0 | 0.175 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.375 | 0 | 0.375 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
d48c980ed9edbc0a4244af5a3b8eaad19b6a9d1b | 64,668 | py | Python | ps_typer/type_test/texts.py | Rolv-Apneseth/rolvs-typing-speedtest | 136a5ff4df34ef368074e9f3b187053c53af3660 | [
"MIT"
] | 1 | 2020-10-12T22:41:51.000Z | 2020-10-12T22:41:51.000Z | ps_typer/type_test/texts.py | Rolv-Apneseth/rolvs-typing-speedtest | 136a5ff4df34ef368074e9f3b187053c53af3660 | [
"MIT"
] | 32 | 2020-10-28T16:03:49.000Z | 2021-08-22T12:31:24.000Z | ps_typer/type_test/texts.py | Rolv-Apneseth/ps-typer | 136a5ff4df34ef368074e9f3b187053c53af3660 | [
"MIT"
] | null | null | null | from pathlib import Path
from typing import Generator
import random
# PATHS
PROJECT_FOLDER = Path(__file__).absolute().parents[1]
ASSETS_FOLDER = PROJECT_FOLDER / "assets"
TEXTS_FOLDER = ASSETS_FOLDER / "texts"
BROWN_TEXT = TEXTS_FOLDER / "brown.txt"
WEBTEXT_TEXT = TEXTS_FOLDER / "webtext.txt"
GUTENBERG_TEXT = TEXTS_FOLDER / "gutenberg.txt"
# CONSTANTS
RANDOM_TEXT_SENTENCES = 3 # Length in sentences for random text
# TEXTS
COMMON_PHRASES = [
"A bird in the hand is worth two in the bush.",
"A penny for your thoughts.",
"A penny saved is a penny earned.",
"A picture is worth 1000 words.",
"Actions speak louder than words.",
"Barking up the wrong tree.",
"Birds of a feather flock together.",
"By the skin of your teeth.",
"Comparing apples to oranges.",
"Do unto others as you would have them do unto you.",
"Don't count your chickens before they hatch.",
"Don't cry over spilt milk.",
"Don't give up your day job.",
"Don't put all your eggs in one basket.",
"Every cloud has a silver lining.",
"Get a taste of your own medicine.",
"Good things come to those who wait.",
"He has bigger fish to fry.",
"He's a chip off the old block.",
"Hit the nail on the head.",
"It ain't over till the fat lady sings.",
"It takes one to know one.",
"It's raining cats and dogs.",
"Kill two birds with one stone.",
"Let the cat out of the bag.",
"Look before you leap.",
"Saving for a rainy day.",
"Slow and steady wins the race.",
"Take it with a grain of salt.",
"The ball is in your court.",
"The best thing since sliced bread.",
"The devil is in the details.",
"The early bird gets the worm.",
"The elephant in the room.",
"The whole nine yards.",
"There are other fish in the sea.",
"There's a method to his madness.",
"There's no such thing as a free lunch.",
"Throw caution to the wind.",
"You can't have your cake and eat it too.",
"You can't judge a book by its cover.",
"A little learning is a dangerous thing.",
"A snowball's chance in hell.",
"A stitch in time saves nine.",
"An apple a day keeps the doctor away.",
"An ounce of prevention is worth a pound of cure.",
"Bolt from the blue.",
"Calm before the storm.",
"Curiosity killed the cat.",
"Don't beat a dead horse.",
"Every dog has his day.",
"Familiarity breeds contempt.",
"Fortune favours the bold.",
"Haste makes waste.",
"He who laughs last laughs loudest.",
"He's not playing with a full deck.",
"He's sitting on the fence.",
"It is a poor workman who blames his tools.",
"It is always darkest before the dawn.",
"It takes two to tango.",
"Know which way the wind is blowing.",
"Leave no stone unturned.",
"Let sleeping dogs lie.",
"Like riding a bicycle.",
"Like two peas in a pod.",
"Make hay while the sun shines.",
"Once bitten, twice shy.",
"Out of the frying pan and into the fire.",
"Shape up or ship out.",
"The pot calling the kettle black.",
"There are clouds on the horizon.",
"Those who live in glass houses shouldn't throw stones.",
"Through thick and thin.",
"Waste not, want not.",
"Well begun is half done.",
"When it rains it pours.",
"You can't make an omelet without breaking some eggs.",
]
FACTS = [
"There are more possible iterations of a game of chess than there are atoms in the known universe. There are also more ways to arrange a deck of cards than atoms in the known universe.",
"Cleopatra lived closer in time to the Moon landing than to the construction of the Great Pyramid of Giza.",
"It can take a photon 100,000 years to travel from the core of the sun to the surface, but only 8 minutes to travel the rest of the way to earth. This is due to the extreme density of the core of the Sun (150 times that of water).",
"It would take roughly 1.1 million mosquitoes, all sucking at once, to completely drain the average human of blood. The average human has 1.75 square metres of skin, so this would mean about 63 mosquitoes per square centimeter of your skin.",
"Written language was developed independently by the Egyptians, Mesopotamians (the Sumer language), Chinese, and Mesoamericans (such as the Olmecs and Zapotecs). There is also evidence of possible forms of writing developed in other regions, such as Polynesia.",
"Atoms are made up of mostly empty space. If the nucleus of an atom was the size of a football, the nearest electron would be 0.8km away. That means even the most solid-looking objects we see are predominantly nothingness. Put another way, if you were to remove all the empty space in the atoms that make up a human being, they would be a lot smaller than a grain of salt. In fact, you would be able to fit 6 billion of us inside a single apple.",
"Honey does not spoil. You could feasibly eat 3,000-year-old honey. It does, however, crystallise with time, but all you have to do is put it in water and warm it until it's back to its original state and you can eat it. This has been observed by archaeologists excavating Egyptian tombs, finding pots of honey thousands of years old yet still preserved.",
"If you somehow found a way to extract all of the gold from the bubbling core of our lovely little planet, it is estimated that there would be enough to cover the surface of the planet in 13 inches.",
"To know when to mate, a male giraffe will continuously headbutt the female in the bladder until she urinates. The male then tastes the pee and that helps it determine whether the female is ovulating.",
"The largest known living organism by mass is a clonal colony of quaking aspen trees. Pando (Latin for I spread out) is a group of genetically identical quaking aspens in Utah with an interconnected root system. It's an estimated 80,000 years old and takes up more than 400000 square metres. The largest organism by area is a colony of honey fungus in Oregon, which has almost as much biomass as Pando.",
"The blue whale is the largest animal to have ever lived, reaching a confirmed max length of 29.9 meters and weight of 173 tonnes, dwarfing even the biggest dinosaurs and our biggest megaladon estimates. Their hearts are the size of a small car and have blood vessels so big that a small child could swim through. Our first ever recording of a blue whale's heartbeat showed their heartbeat staying within a range of 2-37bpm. For reference, the average human's resting heartrate is between 60-100bpm.",
"Four times more people speak English as a second language than as a native one. It's the most widely spoken tongue in the world, with nearly two billion people learning it as a second language and only around 460 million people speaking it natively. As of 2012, India claims to have the world's second-largest English-speaking population at 125 million people, second only to the USA (330 million).",
"About 400-500 grapes go into one bottle of wine. That's approximately 2kg per bottle.",
"Once, a Texas man was hospitalized when a bullet he shot at an armadillo ricocheted off the animal and hit him in the jaw. Despite several reports saying bullets ricocheted off of armadillos, these creatures are not bulletproof. Their shells are made of bony plates called osteoderms that grow in the skin. They're loosely connected for flexibility and are covered by a layer of keratin, the protein that makes up hair, nails, and horns. The shell protects the armadillos from thorny shrubs, under which they can hide from predators.",
"Chess is called the game of kings. The history of chess can be traced back nearly 1500 years, although the earliest origins are uncertain. The earliest predecessor of the game probably originated in India, before the 6th century AD. From India, the game spread to Persia. When the Arabs conquered Persia, chess was taken up by the Muslim world and subsequently spread to Southern Europe. In Europe, chess evolved into roughly its current form in the 15th century.",
"It might seem safe to assume that the Canary Islands were named after canary birds, but the location was actually named after dogs. Although it's off the coast of northwestern Africa, the archipelago is actually part of Spain. In Spanish, the area's name is Islas Canarias, which comes from the Latin phrase 'Canariae Insulae' which means 'island of dogs.'",
"When 174 world leaders signed the Paris Agreement on Earth Day in 2016 at the United Nations (UN) headquarters in New York, it was the largest number of countries ever to come together to sign anything on a single day, according to the UN. The agreement aimed to combat climate change and accelerate and intensify the actions and investments needed to strengthen the global climate effort.",
"Earthquakes can range from minor tremors that are barely noticeable to building-toppling ground-shakers that cause massive destruction. But it's an inevitable part of life for those who live in countries such as China, Indonesia, Iran, and Turkey, which are some of the most earthquake-prone places on the planet. However, according to the U.S. Geological Survey, Japan records the most earthquakes in the world, but other countries such as Indonesia, Tonga or Fiji likely have the most earthquakes per unit area.",
"According to the Population Reference Bureau, since the time 'modern' Homo sapiens first hit the scene 50,000 years ago, more than 108 billion members of our species have been born. And a large chunk of that number is alive right now. According to the bureau, the number of people alive today represents a whopping seven percent of the total number of humans who have ever lived.",
"Not everyone lives in a booming city or sprawling suburb. Many people still make their homes outside of bustling locations-especially in India, which has the largest number of people living in rural areas (approximately 737 million people live outside of the city). China also has an impressively large rural population, with 545 million living outside of urban areas.",
"While modern nation states known as countries are relatively new, many nations can trace back their history hundreds or thousands of years (for example, Greece). But South Sudan in North Africa just gained its independence from Sudan in 2011, which currently makes it the youngest country in the world (with widespread recognition). It gained independence from the Republic of Sudan after a decades long civil war which ended in 2005. As of 2019, South Sudan unfortunately ranks third-lowest in the UN World Happiness Report, and second-lowest on the Global Peace Index.",
"The British royal family may be the most famous royal family on the planet, but there are still plenty of other nobles out there. In total, there are 26 royal families, and a total of 44 sovereign states around the world with a monarch as their Head of State. Examples include Japan, Spain, Swaziland, Bhutan, Thailand, Monaco, Sweden and the Netherlands",
"Panda diplomacy is the practice of sending giant pandas from China to other countries as a tool of diplomacy. While the practice has been recorded as far back as the Tang dynasty, when Empress Wu Zetian sent a pair of giant pandas to Emperor Tenmu of Japan in 685CE, the term only came into popular use during the Cold War. The People's Republic of China began to use panda diplomacy more prominently in the 1950s, and has continued the practice into the present day. However, in 1984 they adopted a loan policy which meant that subsequent pandas would be loaned, not gifted, so almost all giant pandas worldwide belong to China.",
"During his lifetime between 1162 and 1227, Genghis Khan fathered countless children. When Mongol armies attacked, the most beautiful women were reserved for Genghis. One thirteenth century Persian historian claimed that within a century of Khan's birth, his enthusiastic mating habits had created a lineage of more than 20,000 individuals. And while we may never know exactly how many offspring the leader of the Mongol Empire had, an international team of geneticists found that around 1 in every 200 men (around 16 million people) are direct descendants of his, according to a 2003 historical genetics paper.",
"Tokyo is a booming city-not only by Japanese standards, but also compared to cities around the world. With around 37 million people living in Tokyo, it's the world's largest city when it comes to population size, according to Reuters. The next largest city is Delhi, India (population 29 million) and Shanghai, China (population 26 million).",
"Canadians say 'sorry' so much that a law was passed in 2009 called the Apology Act declaring that an apology can't be used as evidence of admission to guilt.",
"Scientists previously thought that the moon's volcanic activity died down a billion years ago. But new data from NASA's Lunar Reconnaissance Orbiter, or LRO, hints that lunar lava flowed much more recently, perhaps less than 100 million years ago. This would mean that there could still have been volcanic activity on the moon back when dinosaurs were still around.",
"There were two AI chatbots created by Facebook to talk to each other, but they were shut down after they started communicating in a gibberish language they made for themselves. These AIs were made to trade with each other, and started speaking in this gibberish because no language enforcement was set for them, and their only goal was to trade, so English became irrelevant. The code and documentation for these AIs is publicly available and you can run them yourself if you want to.",
"In 2009, Stephen Hawking held a reception for time travelers, but didn't publicize it until after. This way, only those who could time travel would be able to attend. Nobody else attended.",
"In World War II, Germany tried to collapse the British economy by dropping millions of counterfeit bills over London. This was known as Operation Bernhard and estimates on the value of forged bills dropped varies from £132.6 million up to £300 million. This unit responsible for forging the bills also managed to perfect the art for US dollars, and forged bills were used to finance German intelligence operations.",
"Birds are the closest living relatives of crocodilians, as well as the descendants of extinct dinosaurs with feathers. This means that birds are thought to be the only direct descendants of dinosaurs still living today.",
"Cold showers have more health benefits than hot or warm showers. These include improving circulation, stimulating weight loss by improving metabolism, and easing depression by acting as a kind of light electroshock therapy. Cold showers can also increase your resistance to common illnesses.",
"During the first live iPhone presentation, Steve Jobs had to frequently switch phones behind his desk. Otherwise, it would run out of RAM and crash. The 100 or so iPhones in existence at the time were also riddled with bugs, meaning the development team had to come up with a 'golden path', a series of specific tasks performed in a specific order that would be least likely to cause the phone to crash.",
"Movie theaters make roughly 85 percent of their profit off concession stands. This is because most of the money earned from ticket revenue goes to the movie distributors, and things like popcorn and fizzy drinks can be sold at profit margins of around 90%.",
"If you ate nothing but rabbit meat, you would die from protein poisoning. This would be a mixture of too much protein and an absence of fat in the diet (fat is essential to human nutrition), and is the origin of the term rabbit starvation. Similarly, any diet made up entirely of lean meats would also lead to protein poisoning.",
"Italy built an entire courthouse to prosecute the Mafia back in 1986. Throughout and after the trial, several judges and magistrates were killed by the Mafia, including the two who led it--Giovanni Falcone and Paolo Borsellino. They indicted 475 members in a trial that lasted from 1986-1992. They convicted 338 people, sentenced to a total of 2,665 years, not including life sentences handed to 19 bosses. To date, it was the biggest trial in the world. In 2020, another courthouse was built, this time to prosecute the 'Ndrangheta, believed to currently be the richest crime syndicate in the world.",
"Pitbulls rank high among the most affectionate and least aggressive dogs. In general, they are not aggressive towards people but may be less tolerant of other dogs than other breeds. Pitbulls are only aggressive when forcibly trained/encouraged as such; usually because of irresponsible owners drawn to the dog's macho image who encourage aggression for fighting and protection.",
"When Blackbeard captured ships, many of the African slaves on board would go on to become pirates. When he died, nearly one-third of his total crew were former slaves. However, he was no abolitionist: reports recount that Blackbeard and his associates also returned slaves to the mainland to be sold at auction.",
"Cucumber can actually cure bad breath. A slice pressed to the roof of your mouth for 30 seconds with your tongue allows the phytochemicals to kill the problematic bacteria. Crunchy vegetables help remove plaque on teeth and gums, which bacteria can feed on, says Gregg Lituchy, a cosmetic dentist in New York City.",
"The King of Macedon, Philip, threatened Sparta with ' If once I enter into your territories, I will destroy ye all, never to rise again'. The Spartans replied: 'If'. Subsequently, neither Philip nor his son Alexander the Great attempted to capture the city. Philip is also recorded as approaching Sparta on another occasion and asking whether he should come as friend or foe; the reply was 'Neither'.",
"Einstein's brain went missing when he died in 1955. The pathologist on call, Thomas Harvey, who worked on his autopsy took it without permission. Einstein had left behind specific instructions regarding his remains: cremate them, and scatter the ashes secretly in order to discourage idolaters. Einstein's family was essentially strong-armed into agreeing to participate in research that Einstein explicitly did not want to participate in. Several studies were released about his brain many years later but none of them conclusively proved that there was anything special about his brain.",
"Mulan has the highest kill-count of any (pure, not MCU, Star Wars etc. in which case Thanos easily comes out on top) Disney character (except for maybe King Kashekim Nedakh from Atlantis), and was the first Disney Princess to be shown killing people on-screen. In one scene, she fires a cannon that causes an avalanche, crushing 2,000 Huns, of whom only 6 survive. She later kills the leader of the Huns, one of the survivors, bringing her kill count to 1,995.",
"One of the earliest depictions of dreadlocks dates back to 1600 BCE (roughly 3,600 years ago), to the Minoan civilization, one of Europe's earliest civilizations, which flourished in what is now Greece.",
"The reason artificial banana flavoring and artificially banana-flavored products don't taste like modern bananas is that the flavoring is based on a type of banana that was mostly wiped out by several fungal plagues (most notably Panama disease) in the 1950s.",
"Due to the humid and moist conditions that a sloth lives in, algae will sometimes grow in its fur, giving the animal a green tint. Sloths also move extremely slowly, with top speeds of 6 cm per second. Both of these traits allow this so-called lazy animal to be almost invisible to predators, giving them a major evolutionary advantage. They also have an extremely thorough digestive system, in which food can take many days to pass through. This allows them to extract every bit of energy and nutrition from the relatively small amount of food they consume.",
"American microbiologist Maurice Ralph Hilleman and his team are credited with developing 8 of the 14 routine vaccinations used in current American vaccine schedules: measles, mumps, hepatitis A, hepatitis B, chickenpox, meningitis (Neisseria meningitidis), pneumonia (Streptococcus pneumoniae) and Haemophilus influenzae. He developed over 40 vaccines, an unparalleled record of productivity. According to one estimate, his vaccines save nearly 8 million lives each year.",
"""One hypothesis for why night insects, such as moths, are attracted to lights is that they mistake them for the light of the moon, which they use to navigate in a process called transverse orientation. "Elements in their eyes are tuned to faint light, and act 'like miniature telescopes'. Thus when they're faced with powerful artificial illumination, it can act as a 'super-stimulant'," says Lynn Kimsey, professor of entomology at UC Davis.""",
"Film producer Jeffrey Katzenberg revived The Walt Disney Studios by producing some of its biggest hits: The Little Mermaid, The Lion King, Beauty and the Beast and Aladdin. He decided to quit after the chairman refused to promote him to the number-two spot. After he left, Disney withheld a bonus from him; he took the company to court for the $250 million he was owed and won. He went on to found DreamWorks Studios, and oversaw the production of such popular animated franchises as Shrek, Madagascar and Kung Fu Panda.",
"Through the use of optogenetics, which uses a pulse of light to activate or deactivate neurons, scientists were able to create a false memory within a mouse's brain. A mouse was put in a box with the smell of acetophenone on one side and the smell of carvone on the other, and went to the side with acetophenone even though it had never smelled it before. This was done by simultaneously activating the neurons that sense acetophenone and those associated with reward, creating the 'memory' that the smell of acetophenone leads to a reward.",
"The word 'quarantine' derives from the Venetian dialect of Italian and the words 'quaranta giorni', meaning 'forty days'. This is because when it was discovered that ships were infested with plague-carrying rats they were made to sit at anchor outside Venice's city walls for forty days before coming ashore.",
"In a survival situation, drinking seawater would rapidly dehydrate you and soon lead to your death. However, it is vastly less harmful to eat frozen seawater. This is because it contains about a tenth as much salt as its liquid form: the salt is separated from the water during freezing, as it does not fit into the crystalline structure of ice. If you are trying to make seawater drinkable manually, however, evaporation is still more efficient.",
"Due to the extremely warm weather in the summer of 2013, several nuclear power plants across the world, including ones in Japan, Israel and Scotland, were forced to shut down because of a sudden increase in the population of jellyfish, as well as a loss in efficiency due to warmer water. Large masses of jellyfish can sometimes clog the filters that draw seawater into the power plants to cool the reactors.",
"France has conducted 210 nuclear weapon tests, more than the United Kingdom, China, India, and North Korea combined! This is just over a fifth of the amount conducted by the United States, however, who have conducted roughly 1,032 tests.",
"Iran carries out the second-most gender-change operations in the world, after Thailand. Estimates suggest around 50,000 people living in Iran are transgender. Sex reassignment surgeries are partially financially supported by the state. However, the government of Iran is considered to be one of the most discriminatory towards homosexual people in the world, and hundreds have been executed due to their sexual orientation. Some homosexual individuals in Iran have been pressured to undergo sex reassignment surgery in order to avoid legal and social persecution.",
"There is an Australian man, James Harrison, whose unique blood plasma composition has been used to make a treatment for Rhesus disease, a hemolytic disease that affects newborn babies. He has made over 1,000 donations throughout his lifetime, and these donations are estimated to have saved over 2.4 million babies from the condition.",
"In Bordeaux, France, in 1940, Portuguese diplomat Aristides de Sousa Mendes issued an estimated 30,000 Portuguese travel visas to Jewish families so they could flee persecution by the Nazis. Once his superiors learned of his actions, he was ordered back to Portugal, dismissed from office and denied his pension benefits. Sousa Mendes died in 1954, impoverished and unsung.",
"Archeologists in London have found a Mesolithic tool-making factory that gives substantial proof human beings were living on the River Thames by 7000 BCE. That's over 9,000 years ago! This predates previous estimates of human habitation of the Thames, which had placed it at around 4000 BCE.",
"In 1995, strange 2-meter-wide circular patterns were discovered on the ocean floor. Dubbed 'underwater crop circles', these patterns remained a mystery until early 2011, when it was discovered that a previously unknown species of 12-centimeter-long pufferfish was the culprit. After studying these animals, scientists concluded that the meticulous creation and upkeep of the patterns by the male pufferfish serves both to attract the opposite sex and as a nest for the female's eggs.",
"In the bioengineering department of the University of Illinois, researchers have created small 'biobots', made partly of synthetic gel and partly of muscle cells, that can move on their own. While only a small scientific step, this brings mechanical engineering one step closer to autonomous biobots: tiny devices that could exist within the human body, freely detecting illness and administering medication.",
"Colombian drug lord Pablo Escobar kept four hippos on his estate before his death in 1993. Deemed too much hassle to move by authorities, the hippos were left there and have since bred and escaped, becoming an invasive species in Colombia. There are now an estimated 80-100 hippos living in the Magdalena River Basin area.",
"The world's biggest tire producer is LEGO. In 2011, LEGO manufactured over 318 million tires, while brands such as Bridgestone, Michelin and Goodyear each produced under 200 million. In Billund, Denmark, LEGO produces 870,000 tires every day. They may be tiny toy tires, but the fact still stands.",
"Research has found that a mid-day nap can make you more creative, focused, and fresh for the rest of the day. One study in 2007 found that naps can also reduce your risk of heart disease. Specifically, those who regularly nap were found to be 37 percent less likely to die from a heart attack or other coronary ailment than those who worked straight through the day. This is likely due to napping reducing stress and lowering blood pressure.",
"Orcas are the only predators that regularly kill and devour Pacific white-sided dolphins off the B.C. and Washington coasts. So researchers were surprised when drone footage showed such dolphins playing within a few fin-spans of killer whales' toothy jaws. As it turns out, the orcas they play with belong to a different, strictly fish-eating population, which does not eat dolphins because they are mammals. This still seems like a surprising risk to take, as the two orca populations are nearly identical to our eyes.",
"NASA answering President Kennedy's challenge and landing men on the Moon by 1969 required the most sudden burst of technological creativity, and the largest commitment of resources ($24 billion), ever made by any nation in peacetime. At its peak, the Apollo program employed 400,000 Americans and required the support of over 20,000 industrial firms and universities. In less than 8 years, NASA developed five different spacecraft: Mercury, Gemini, the Apollo command module, the Apollo service module and the Lunar Module. Ultimately 24 people flew to the Moon and 12 walked on it. In the 50 years since, no human has traveled more than a few hundred miles from Earth.",
"There are more life forms on human skin than there are people on our planet. There are about a trillion microbes on or in your skin, more than 100 times the total number of humans on the planet. In fact, the ratio of human cells to microbes in the human body is roughly 1:1.3.",
"The probability of dying on your way to buy a lottery ticket is higher than the probability of actually winning the lottery. You are also more likely to be struck by lightning, or be hit by a falling airplane part in your lifetime, than to win the lottery. This is because on average the chance of a ticket being the jackpot ticket is 1 in 13,983,816.",
"It is possible that pessimism is inherited genetically. People can be predisposed to see the world more darkly than others if they have a different variation of the ADRA2b gene. Neuroscientist Professor Rebecca Todd explains: 'A previously known genetic variation causes some individuals to perceive the world more vividly than others - and particularly negative aspects of the world...For example, people who have this variation might look out at a crowd of people and only see angry faces'.",
"In 1913, upon Edinburgh Zoo's opening, Norway gifted it its first king penguin. Since 1972, the Norwegian King's Guard has adopted three penguins at different times, and each was given a rank within the regiment. One of them was even knighted by King Harald V of Norway as Sir Nils Olav III.",
"While the Egyptians were building the pyramids, a surviving colony of woolly mammoths took up residence on a small island called Wrangel Island. Mammoths lived there until around 1650 BCE, nearly 1,000 years after the pyramids were built.",
"Jack Black is the son of rocket scientists. His parents, Thomas William Black and Judith Love Cohen were satellite engineers who worked on the Hubble Space Telescope. Jack Black joked about his academic parents in a 2003 interview with Newsweek, saying, \"I didn't inherit any of their brainpower. But I have the power to rock. They're rocket scientists. I'm a rock scientist.\"",
'Ethan Zuckerman invented popup ads in the late 90s while working for Tripod.com. He has since apologized, and thinks it is time online sites and services moved away from using ads altogether. "I have come to believe that advertising is the original sin of the web" he writes in an article for The Atlantic, going on to explain that everything from Facebook tracking us across sites to Google knowing just about everything about you has something to do with advertising.',
"Green is seen as a symbol of life, but scientists claim that the earliest life on Earth might have been purple. Early life-forms on Earth may have been able to generate metabolic energy from sunlight using a purple-pigmented molecule called retinal that possibly predates the evolution of chlorophyll and photosynthesis. If retinal has evolved on other worlds, it could create a distinctive biosignature, as it absorbs green light in the same way that vegetation on Earth absorbs red and blue light.",
"The Earth is not a perfect sphere: instead, it is closer to an oblate spheroid. It is pudgier towards the equator, mostly due to the centrifugal force caused by the Earth's rotation. However, it is not a perfect oblate spheroid either. The mass is distributed very unevenly throughout the planet, and the higher the concentration of mass at one location, the stronger the gravitational pull, creating 'bumps' around the globe. Other dynamic factors also influence the shape of the Earth, such as tides (shifting the distribution of water), movement of tectonic plates, mass shifting inside the planet and more.",
"Nowadays, e-commerce is a dominant market. Who wouldn't want to get anything at the click of a button? However, you might be surprised to learn that the earliest sales transaction on the internet was for weed. In 1972, long before eBay or Amazon, students from Stanford University in California and MIT in Massachusetts conducted the first ever online transaction. Using the Arpanet account at their artificial intelligence lab, the Stanford students sold their counterparts a tiny amount of marijuana.",
"Walmart once had over 23,000 applications for 600 jobs at a newly opened store in Washington, DC. With those numbers, the Walmart acceptance rate was 2.6%. That makes it twice as hard to get into as Harvard and over five times harder than Cornell.",
"According to the UN's World Happiness Report, Finland has been the world's happiest country for 3 consecutive years as of 2020. The data is based on citizens asked to rate their life from 1 to 10. Interestingly, Finland is closely followed by other European countries such as Denmark, Norway, Iceland, and the Netherlands.",
r"According to a 2014 study, 12 out of 15 smokers were able to quit through the use of magic mushrooms. Over three sessions, the chronic smokers were treated with psychedelic mushrooms. Surprisingly, the 80% success rate dwarfed the 35% success rate of leading treatment drugs.",
"Every year, the town of Lopburi holds a buffet for monkeys. During the Monkey Buffet Festival, the town serves 3000 kgs of fruits and vegetables to the local monkey population of 2,000 crab-eating macaques in Lopburi Province north of Bangkok. The festival was described as one of the strangest festivals by London's Guardian newspaper along with Spain's baby-jumping festival. During that festival, known as El Salto del Colacho (the devil jump), men dressed as the devil in red and yellow suits jump over babies born during the previous twelve months of the year who lie on mattresses in the street. The 'devils' hold whips and oversized castanets as they jump over the infant children.",
"Rowan Atkinson has made generations laugh at his goofy antics as Mr. Bean. However, the Englishman is actually quite the intellectual. What most people don't know is that Atkinson has a Master's degree in Electrical Engineering from Oxford up his sleeve. His MSc thesis considered the application of self-tuning control. Oxford also made Atkinson an Honorary Fellow in 2006.",
"Chlorine is in all of our bodily secretions and excretions. Our body's chlorine levels almost always parallel the levels of sodium (due to the makeup of salt, i.e. sodium chloride). There is roughly 95g of chlorine in the body, which is enough to disinfect about 8,000 litres of water.",
"Adrenaline, also known as epinephrine, is a hormone released by our body during stressful situations. This hormone gives us a temporary boost of strength, speed, or basically anything that can help us stay alive. In some cases, adrenaline also keeps us from feeling the pain of fatal wounds. It has even been found that adrenergic hormones, such as adrenaline, can produce retrograde enhancement of long-term memory in humans.",
r"The Sun accounts for 99.8% of the mass in our solar system with a mass of around 330,000 times that of Earth. The Sun is made up of mostly hydrogen (three quarters worth) with the rest of its mass attributed to helium. It is roughly 4.5 billion years old. Although massive, it is relatively tiny compared to some other stars, and is classified as a yellow dwarf star. For example, UY Scuti, which lies near the center of the Milky Way, is classified as a hypergiant and is 1,708 solar radii (compared to the Sun's 1).",
"The universe extends far beyond our own galaxy, The Milky Way, which is why scientists can only estimate how many stars are in space. However, scientists estimate the universe contains approximately 1 septillion (1 followed by 24 zeros!) stars. While no one can actually count every single grain of sand on the earth, the estimated total from researchers at the University of Hawaii, is somewhere around seven quintillion, so there are many more stars in the known universe than grains of sand on Earth.",
r"In Monopoly, when a player throws doubles (both dice land on the same number) he may take another turn. However, if he throws doubles three times in one turn, then he is considered to be 'speeding' and must go to jail. There is an approximately 0.46% chance of this happening on any given turn. However, a Monopoly game lasts about 20-25 turns, so according to Wolfram Alpha that's about a 7% chance of rolling three doubles at some point in the whole game.",
r"In terms of land area, the British Empire was the largest empire in recorded history, covering around 26% of the entire world's land surface. However, the Mongol Empire, which comes in at second with around 18% of the world's land surface, was the largest contiguous land empire, and was at its peak roughly 700 years before the British Empire.",
r"Without a doubt, the greatest conqueror of all time was Genghis Khan, founder of the Mongol Empire. It is estimated that he was responsible for the deaths of up to 11% of the world's population (40 million people). Originally known as Temüjin, this son of a Mongol chieftain was given the honorary title of Chinggis Khan when he assumed power, thought to mean 'the oceanic, universal ruler'. He went on to conquer more than double the land of the second greatest conqueror in history, Alexander the Great.",
r"According to estimates in the Food Waste Index Report 2021 by UNEP, 931 million tonnes of food was wasted globally in 2019, roughly 17% of food produced for human consumption. If food waste were a country, it would be the third-biggest source of greenhouse gas emissions, behind only China and the United States. Individual households were found to be responsible for around 61% of the total, meaning reducing food waste at home could be extremely beneficial for the environment.",
"Bones of primitive Homo sapiens first appear 300,000 years ago in Africa, with brains as large as or larger than ours. They're followed by anatomically modern Homo sapiens at least 200,000 years ago, and brain shape became essentially modern by at least 100,000 years ago. However, tools, artefacts and cave art suggest that complex technology and cultures, 'behavioural modernity', evolved more recently, about 65,000 years ago, and agriculture as we understand it today is believed to have been developed only 12,000 years ago.",
]
LITERATURE_EXCERPTS = [
"The Ministry of Truth, which concerned itself with news, entertainment, education and the fine arts. The Ministry of Peace, which concerned itself with war. The Ministry of Love, which maintained law and order. And the Ministry of Plenty, which was responsible for economic affairs. Their names, in Newspeak: Minitrue, Minipax, Miniluv and Miniplenty. - George Orwell, 1984",
"He found himself understanding the wearisomeness of this life, where every path was an improvisation and a considerable part of one's waking life was spent watching one's feet. - William Golding, Lord of the Flies",
"Vonnegut could not help looking back, despite the danger of being turned metaphorically into a pillar of salt, into an emblem of the death that comes to those who cannot let go of the past. - Kurt Vonnegut, Slaughterhouse-Five",
"But I remembered one thing: it wasn't me that started acting deaf; it was people that first started acting like I was too dumb to hear or see or say anything at all. - Ken Kesey, One Flew Over the Cuckoo's Nest",
"When today fails to offer the justification for hope, tomorrow becomes the only grail worth pursuing. - Arthur Miller, Death of a Salesman",
"It is far better to endure patiently a smart which nobody feels but yourself, than to commit a hasty action whose evil consequences will extend to all connected with you; and besides, the Bible bids us return good for evil. - Charlotte Bronte, Jane Eyre",
"As I took another breath, I saw the three stars again. They were not calling to me; they were letting me go, leaving me to the black universe I had wandered for so many lifetimes. I drifted into the black, and it got brighter and brighter. It wasn't black at all-it was blue. Warm, vibrant, brilliant blue... I floated into it with no fear at all. - Stephenie Meyer, The Host",
"Religion is like language or dress. We gravitate toward the practices with which we were raised. In the end, though, we are all proclaiming the same thing. That life has meaning. That we are grateful for the power that created us. - Dan Brown, Angels & Demons",
"First, let no one rule your mind or body. Take special care that your thoughts remain unfettered... Give men your ear, but not your heart. Show respect for those in power, but don't follow them blindly. Judge with logic and reason, but comment not. Consider none your superior whatever their rank or station in life. Treat all fairly, or they will seek revenge. Be careful with your money. Hold fast to your beliefs and others will listen. - Christopher Paolini, Eragon",
"Love, whether newly born or aroused from a deathlike slumber, must always create sunshine, filling the heart so full of radiance, that it overflows upon the outward world. - Nathaniel Hawthorne, The Scarlet Letter",
"You're both the fire and the water that extinguishes it. You're the narrator, the protagonist, and the sidekick. You're the storyteller and the story told. You are somebody's something, but you are also your you. - John Green, Turtles All the Way Down",
"When you cannot pinpoint a pain in your body, the whole world seems to throb with it. Trees in pain, lit windows in pain, Wednesday nights in pain. Pianos flaming with pain, and the scale sliding up into a cry. - Patricia Lockwood, Priestdaddy",
"At an early age, I learned that people make mistakes, and you have to decide if their mistakes are bigger than your love for them. - Angie Thomas, The Hate U Give",
"Grief was what you owed the dead for the necessary crime of living on without them. - Kamila Shamsie, Home Fire",
"Grief was the deal God struck with the angel of death, who wanted an unpassable river to separate the living from the dead; grief the bridge that would allow the dead to flit among the living, their footsteps overheard, their laughter around the corner, their posture recognizable in the bodies of strangers you would follow down the street, willing them to never turn around. - Kamila Shamsie, Home Fire",
"If you are one of those people who has the ability to make it down to the bottom of the ocean, the ability to swim the dark waters without fear, the astonishing ability to move through life's worst crucibles and not die, then you also have the ability to bring something back to the surface that helps others in a way that they cannot achieve themselves. - Lidia Yuknavitch, The Misfit's Manifesto",
"Her life is architected, elegant and angular, a beauty to behold, and mine is a stew, a juicy, sloppy mess of ingredients and feelings and emotions, too much salt and spice, too much anxiety, always a little dribbling down the front of my shirt. But have you tasted it? It's delicious. - Jami Attenberg, All Grown Up",
"I kept thinking about the uneven quality of time--the way it was almost always so empty, and then with no warning came a few days that felt so dense and alive and real that it seemed indisputable that that was what life was, that its real nature had finally been revealed. But then time passed and unthinkably grew dead again, and it turned out that that fullness had been an aberration and might never come back. - Elif Batuman, The Idiot",
"""My mother died today. Or maybe yesterday, I don't know. I received a telegram from the old people's home: "Mother deceased. Funeral tomorrow. Very sincerely yours." That doesn't mean anything. It might have been yesterday. - Albert Camus, The Stranger""",
"America was never innocent. We popped our cherry on the boat over and looked back with no regrets. You can't ascribe our fall from grace to any single event or set of circumstances. You can't lose what you lacked at conception. - James Ellroy, American Tabloid",
"The studio was filled with the rich odour of roses, and when the light summer wind stirred amidst the trees of the garden, there came through the open door the heavy scent of the lilac, or the more delicate perfume of the pink-flowering thorn. - Oscar Wilde, The Picture of Dorian Gray",
"It is a truth universally acknowledged, that a single man in possession of a good fortune must be in want of a wife. However little known the feelings or views of such a man may be on his first entering a neighbourhood, this truth is so well fixed in the minds of the surrounding families, that he is considered as the rightful property of some one or other of their daughters. - Jane Austen, Pride and Prejudice",
"I was 37 then, strapped in my seat as the huge 747 plunged through dense cloud cover on approach to Hamburg Airport. Cold November rains drenched the earth, lending everything the gloomy air of a Flemish landscape: the ground crew in waterproofs, a flag atop a squat building, a BMW billboard. So - Germany again. - Haruki Murakami, Norwegian Wood",
"But who can say what's best? That's why you need to grab whatever chance you have of happiness where you find it, and not worry about other people too much. My experience tells me that we get no more than two or three such chances in a life time, and if we let them go, we regret it for the rest of our lives. - Haruki Murakami, Norwegian Wood",
"If you only read the books that everyone else is reading, you can only think what everyone else is thinking. - Haruki Murakami, Norwegian Wood",
"No truth can cure the sorrow we feel from losing a loved one. No truth, no sincerity, no strength, no kindness can cure that sorrow. All we can do is see it through to the end and learn something from it, but what we learn will be no help in facing the next sorrow that comes to us without warning. - Haruki Murakami, Norwegian Wood",
"Nikki, the name we finally gave my younger daughter, is not an abbreviation; it was a compromise I reached with her father. For paradoxically it was he who wanted to give her a Japanese name and I - perhaps out of some selfish desire not to be reminded of the past - insisted on an English one. - Kazuo Ishiguro, A Pale View of Hills",
"Her first name was India - she was never able to get used to it. It seemed to her that her parents must have been thinking of someone else when they named her. Or were they hoping for another sort of daughter? As a child she was often on the point of inquiring, but time passed, and she never did. - Evan S. Connell, Mrs Bridge",
"For seven years I tried not to remember too much because there was too much to remember, and I didn't want to fall any further behind with the events of my life. I still don't have a vegetable garden. I still haven't been to France. I have gone to bed with enough people that they seem like actual people now, but while I was going to bed with them I thought I was catching up. I am sorry. I had lost what seemed like a lot of time. - Sarah Manguso, The Two Kinds of Decay",
"Nobody died that year. Nobody prospered. There were no births or marriages. Seventeen reverent satires were written - disrupting a cliche and, presumably, creating a genre. There was a dream, of course, but many of the most important things, I find, are the ones learned in your sleep. Speech, tennis, music, skiing, manners, love - you try them waking and perhaps balk at the jump, and then you're over. - Renata Adler, Speedboat",
"You don't know about me, without you have read a book by the name of The Adventures of Tom Sawyer, but that ain't no matter. That book was made by Mr Mark Twain, and he told the truth, mainly. There was things he stretched, but mainly he told the truth. That is nothing. - Mark Twain, The Adventures of Huckleberry Finn",
"About all I know is, I sort of miss everybody I told about. Even old Stradlater and Ackley, for instance. I think I even miss that goddam Maurice. It's funny. Don't ever tell anybody anything. If you do, you start missing everybody. - J.D. Salinger, The Catcher in the Rye",
""""But soon," he cried with sad and solemn enthusiasm, "I shall die, and what I now feel be no longer felt. Soon these burning miseries will be extinct. I shall ascend my funeral pile triumphantly and exult in the agony of the torturing flames. The light of that conflagration will fade away; my ashes will be swept into the sea by the winds. My spirit will sleep in peace, or if it thinks, it will not surely think thus. Farewell." He sprang from the cabin-window as he said this, upon the ice raft which lay close to the vessel. He was soon borne away by the waves, and lost in darkness and distance. - Mary Shelley, Frankenstein""",
"So we beat on, boats against the current, borne back ceaselessly into the past. - F. Scott Fitzgerald, The Great Gatsby",
"There was a white horse, on a quiet winter morning when snow covered the streets gently and was not deep, and the sky was swept with vibrant stars, except in the east, where dawn was beginning in a light blue flood. The air was motionless, but would soon start to move as the sun came up and winds from Canada came charging down the Hudson. - Mark Helprin, A New York Winter's Tale",
"I must not fear. Fear is the mind-killer. Fear is the little-death that brings total obliteration. I will face my fear. I will permit it to pass over me and through me. And when it has gone past I will turn the inner eye to see its path. Where the fear has gone there will be nothing. Only I will remain. - Frank Herbert, Dune",
"Stuff your eyes with wonder, he said, live as if you'd drop dead in ten seconds. See the world. It's more fantastic than any dream made or paid for in factories. - Ray Bradbury, Fahrenheit 451",
"We cannot tell the precise moment when friendship is formed. As in filling a vessel drop by drop, there is at last a drop which makes it run over; so in a series of kindnesses there is at last one which makes the heart run over. - Ray Bradbury, Fahrenheit 451",
"Don't ask for guarantees. And don't look to be saved in any one thing, person, machine, or library. Do your own bit of saving, and if you drown, at least die knowing you were heading for shore. - Ray Bradbury, Fahrenheit 451",
"You can tell yourself that you would be willing to lose everything you have in order to get something you want. But it's a catch-22: all of those things you're willing to lose are what make you recognizable. Lose them, and you've lost yourself. - Jodi Picoult, Handle with Care",
"You have brains in your head. You have feet in your shoes. You can steer yourself any direction you choose. You're on your own. And you know what you know. And YOU are the one who'll decide where to go... - Dr. Seuss, Oh, the Places You'll Go!",
"I had forgotten that time wasn't fixed like concrete but in fact was fluid as sand, or water. I had forgotten that even misery can end. - Joyce Carol Oates, I Am No One You Know",
"If you want to know what a man's like, take a good look at how he treats his inferiors, not his equals. - J.K. Rowling, Harry Potter and the Goblet of Fire",
"We are the music-makers, And we are the dreamers of dreams, Wandering by lone sea-breakers, And sitting by desolate streams. World-losers and world-forsakers, Upon whom the pale moon gleams; Yet we are the movers and shakers, Of the world forever, it seems. - Arthur O'Shaughnessy, Ode",
"The wide world is all about you: you can fence yourselves in, but you cannot for ever fence it out. - J.R.R. Tolkien, The Fellowship of the Ring",
"There are many Beths in the world, shy and quiet, sitting in corners till needed, and living for others so cheerfully that no one sees the sacrifices till the little cricket on the hearth stops chirping, and the sweet, sunshiny presence vanishes, leaving silence and shadow behind. - Louisa May Alcott, Little Women",
"But of course we can't take any credit for our talents. It's how we use them that counts. - Madeleine L'Engle, A Wrinkle in Time: With Related Readings",
"The rules of the Hunger Games are simple. In punishment for the uprising, each of the twelve districts must provide one girl and one boy, called tributes, to participate. The twenty-four tributes will be imprisoned in a vast outdoor arena that could hold anything from a burning desert to a frozen wasteland. Over a period of several weeks, the competitors must fight to the death. The last tribute standing wins. - Suzanne Collins, The Hunger Games",
"It does not do to dwell on dreams and forget to live, remember that. Now, why don't you put that admirable Cloak back on and get off to bed? - J.K. Rowling, Harry Potter and the Sorcerer's Stone",
"Of course it is happening inside your head, Harry, but why on earth should that mean that it is not real? - J.K. Rowling, Harry Potter and the Deathly Hallows",
]
QUOTES = [
"The greatest glory in living lies not in never falling, but in rising every time we fall. - Nelson Mandela",
"Your time is limited, so don't waste it living someone else's life. Don't be trapped by dogma - which is living with the results of other people's thinking. - Steve Jobs",
"If life were predictable it would cease to be life, and be without flavor. - Eleanor Roosevelt",
"If you look at what you have in life, you'll always have more. If you look at what you don't have in life, you'll never have enough. - Oprah Winfrey",
"Do not go where the path may lead, go instead where there is no path and leave a trail. - Ralph Waldo Emerson",
"Tell me and I forget. Teach me and I remember. Involve me and I learn. - Benjamin Franklin",
"You will face many defeats in life, but never let yourself be defeated. - Maya Angelou",
"Only a life lived for others is a life worthwhile. - Albert Einstein",
"Twenty years from now you will be more disappointed by the things that you didn't do than by the ones you did do. So, throw off the bowlines, sail away from safe harbor, catch the trade winds in your sails. Explore, Dream, Discover. - Mark Twain",
"The mediocre teacher tells. The good teacher explains. The superior teacher demonstrates. The great teacher inspires. - William Arthur Ward",
"If you can't communicate and talk to other people and get across your ideas, you're giving up your potential. - Warren Buffett",
"If you can't explain it simply, you don't understand it well enough. - Albert Einstein",
"A designer knows he or she has achieved perfection, not when there is nothing left to add, but when there is nothing left to take away. - Nolan Haims",
"Always do right. This will gratify some people and astonish the rest. - Mark Twain",
"Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has. - Margaret Mead",
"I'm sorry, but I don't want to be an emperor. That's not my business. I don't want to rule or conquer anyone. I should like to help everyone if possible; Jew, Gentile, black man, white. We all want to help one another. Human beings are like that. We want to live by each other's happiness, not by each other's misery. We don't want to hate and despise one another. In this world there is room for everyone, and the good earth is rich and can provide for everyone. - Charlie Chaplin",
"Remembering that I'll be dead soon is the most important tool I've ever encountered to help me make the big choices in life. Almost everything - all external expectations, all pride, all fear of embarrassment or failure - these things just fall away in the face of death, leaving only what is truly important. - Steve Jobs",
"No one wants to die. Even people who want to go to heaven don't want to die to get there. And yet, death is the destination we all share. No one has ever escaped it, and that is how it should be, because death is very likely the single best invention of life. It's life's change agent. It clears out the old to make way for the new. - Steve Jobs",
"We speak not only to tell other people what we think, but to tell ourselves what we think. Speech is a part of thought. - Oliver Sacks",
"We may not be able to stop evil in the world, but how we treat one another is entirely up to us. - Barack Obama",
"The quality of mercy is not strain'd. It droppeth as the gentle rain from heaven upon the place beneath. It is twice blest: it blesseth him that gives and him that takes. - William Shakespeare, The Merchant of Venice",
"I love you the more in that I believe you had liked me for my own sake and for nothing else. - John Keats",
"Let us sacrifice our today so that our children can have a better tomorrow. - A.P.J. Abdul Kalam",
"The most difficult thing is the decision to act, the rest is merely tenacity. The fears are paper tigers. You can do anything you decide to do. You can act to change and control your life; and the procedure, the process is its own reward. - Amelia Earhart",
"Do not mind anything that anyone tells you about anyone else. Judge everyone and everything for yourself. - Henry James",
"Good judgment comes from experience, and a lot of that comes from bad judgment. - Will Rogers",
"Think in the morning. Act in the noon. Eat in the evening. Sleep in the night. - William Blake",
"Work like you don't need the money. Love like you've never been hurt. Dance like nobody's watching. - Satchel Paige",
"If you know the enemy and know yourself, you need not fear the result of a hundred battles. - Sun Tzu, The Art of War",
"The supreme art of war is to subdue the enemy without fighting. - Sun Tzu, The Art of War",
"There is only one corner of the universe you can be certain of improving, and that's your own self. - Aldous Huxley",
"Wise men speak because they have something to say; Fools because they have to say something. - Plato",
"Always remember that you are absolutely unique. Just like everyone else. - Margaret Mead",
"The World is my country, all mankind are my brethren, and to do good is my religion. - Thomas Paine",
"The only true wisdom is in knowing you know nothing. - Socrates",
"As we express our gratitude, we must never forget that the highest appreciation is not to utter words, but to live by them. - John F. Kennedy",
"Education is the most powerful weapon which you can use to change the world. - Nelson Mandela",
"Today you are you! That is truer than true! There is no one alive who is you-er than you! - Dr. Seuss",
"The only thing necessary for the triumph of evil is for good men to do nothing. - Edmund Burke",
"Don't judge each day by the harvest you reap but by the seeds that you plant. - Robert Louis Stevenson",
"It is during our darkest moments that we must focus to see the light. - Aristotle",
"Where tillage begins, other arts follow. The farmers therefore are the founders of human civilization. - Daniel Webster",
"Those who cannot remember the past are condemned to repeat it. - George Santayana",
"The haft of the arrow had been feathered with one of the eagle's own plumes. We often give our enemies the means of our own destruction. - Aesop",
"The unleashed power of the atom has changed everything save our modes of thinking, and we thus drift toward unparalleled catastrophes. - Albert Einstein",
"If the brain were so simple we could understand it, we would be so simple we couldn't. - Lyall Watson",
"Wherever we look, the work of the chemist has raised the level of our civilization and has increased the productive capacity of our nation. - Calvin Coolidge",
"Better is bread with a happy heart than wealth with vexation. - Amenemope",
"As soon as men decide that all means are permitted to fight an evil, then their good becomes indistinguishable from the evil that they set out to destroy. - Christopher Dawson",
"Is it a fact - or have I dreamt it - that, by means of electricity, the world of matter has become a great nerve, vibrating thousands of miles in a breathless point of time? - Nathaniel Hawthorne",
"The day when two army corps can annihilate each other in one second, all civilized nations, it is to be hoped, will recoil from war and discharge their troops. - Alfred Nobel",
"A horse, a horse! My kingdom for a horse! - William Shakespeare, Richard III",
"The press is the best instrument for enlightening the mind of man, and improving him as a rational, moral and social being. - Thomas Jefferson",
"The speed of communication is wondrous to behold. It is also true that speed can multiply the distribution of information that we know to be untrue. - Edward R. Murrow",
"There never was a good knife made of bad steel. - Benjamin Franklin",
"And homeless near a thousand homes I stood, and near a thousand tables pined and wanted food. - William Wordsworth",
"1. A robot may not injure a human being or, through inaction, allow a human being to come to harm. 2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law. 3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. - Isaac Asimov",
"Pale Death beats equally at the poor man's gate and at the palaces of kings. - Horace",
"Most of us can, as we choose, make of the world either a palace or a prison. - John Lubbock",
"We only live to discover beauty. All else is a form of waiting. - Kahlil Gibran",
"Every genuine work of art has as much reason for being as the earth and the sun. - Ralph Waldo Emerson",
"Time crumbles things; everything grows old and is forgotten under the power of time. - Aristotle",
"The true test of civilization is, not the census, nor the size of cities, nor the crops - no, but the kind of man the country turns out. - Ralph Waldo Emerson",
"He who knows others is wise; He who knows himself is enlightened. - Lao Tzu",
"Whoever desires to found a state and give it laws, must start with assuming that all men are bad and ever ready to display their vicious nature, whenever they may find occasion for it. - Niccolo Machiavelli",
"In the country of the blind, the one-eyed man is king. - Erasmus",
"I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin but by the content of their character. I have a dream today. - Martin Luther King Jr.",
"Not everything that counts can be counted, and not everything that can be counted counts. - Albert Einstein",
]
# FUNCTIONS
def get_random_choice(lst: list) -> Generator:
    """
    Generator which shuffles a list of strings and yields one string at a time.
    Text is also stripped of trailing and leading whitespaces.
    """
    random.shuffle(lst)
    for text in lst:
        yield text.strip()
def get_random_text(text_filename: Path, num_sentences: int) -> Generator:
    """
    Generator which yields a string of a given number of sentences from a given
    text file (text_filename = Path object to .txt file).
    The text is also formatted slightly.
    """
    with open(text_filename) as corpus_text:
        lines = corpus_text.read().splitlines()
    while True:
        rand_int = int(random.random() * (len(lines) - num_sentences))
        rand_sentences = lines[rand_int : rand_int + num_sentences]
        raw_text = " ".join(rand_sentences)
        # Make sure first character is always capitalised if possible
        processed_text = f"{raw_text[0].upper()}{raw_text[1:]}"
        yield processed_text
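The index draw above, `int(random.random() * (len(lines) - num_sentences))`, never selects the final possible window and is more directly written with `random.randrange`. A minimal self-contained sketch of the same slice-sampling idea (the `lines` data below is invented for illustration, not taken from the corpus files):

```python
import random

def sample_sentences(lines, num_sentences):
    """Return `num_sentences` consecutive lines starting at a random offset."""
    # randrange with "+ 1" makes the last window reachable, unlike the original draw.
    start = random.randrange(len(lines) - num_sentences + 1)
    raw = " ".join(lines[start:start + num_sentences])
    # Capitalise the first character, mirroring get_random_text's formatting step.
    return f"{raw[0].upper()}{raw[1:]}" if raw else raw

lines = ["one sentence.", "two sentences.", "three sentences.", "four sentences."]
text = sample_sentences(lines, 2)
```

Unlike the generator above, this sketch returns a single sample per call; the infinite-generator form is just this body wrapped in `while True: yield ...`.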
_translate = {
    "Common Phrases": lambda: get_random_choice(COMMON_PHRASES),
    "Facts": lambda: get_random_choice(FACTS),
    "Famous Literature Excerpts": lambda: get_random_choice(LITERATURE_EXCERPTS),
    "Famous Quotes": lambda: get_random_choice(QUOTES),
    "Random Text: Brown": lambda: get_random_text(BROWN_TEXT, RANDOM_TEXT_SENTENCES),
    "Random Text: Gutenberg": lambda: get_random_text(
        GUTENBERG_TEXT, RANDOM_TEXT_SENTENCES
    ),
    "Random Text: Webtext": lambda: get_random_text(
        WEBTEXT_TEXT, RANDOM_TEXT_SENTENCES
    ),
}
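Each value in the `_translate` table is a zero-argument factory, so every lookup-and-call builds a fresh, freshly shuffled generator. A minimal self-contained sketch of that consumption pattern (the category table and phrase list here are stand-ins, and this version shuffles a copy rather than mutating the caller's list, unlike the original):

```python
import random

def get_random_choice(lst):
    """Shuffle a copy of the list and yield stripped strings one at a time."""
    items = list(lst)          # copy so the caller's list is left untouched
    random.shuffle(items)
    for text in items:
        yield text.strip()

# Stand-in for the module's category table; the real one maps more categories.
translate = {"Famous Quotes": lambda: get_random_choice(["  a  ", "b", " c"])}

gen = translate["Famous Quotes"]()   # calling the lambda builds a fresh generator
first = next(gen)
```

Storing factories rather than generators matters here: a bare generator would be exhausted after one pass, while the lambda lets the caller restart a category at will.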
| 179.135734 | 695 | 0.767737 | 11,000 | 64,668 | 4.507545 | 0.291818 | 0.006353 | 0.001815 | 0.002178 | 0.009681 | 0.005284 | 0.001573 | 0 | 0 | 0 | 0 | 0.010737 | 0.187744 | 64,668 | 360 | 696 | 179.633333 | 0.933177 | 0.006665 | 0 | 0.006116 | 0 | 0.385321 | 0.94047 | 0.000557 | 0 | 0 | 0 | 0 | 0 | 1 | 0.006116 | false | 0.018349 | 0.015291 | 0 | 0.021407 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d49190c65649ee0f1cd77cab3806caaeda2edd87 | 5,925 | py | Python | tests/unit/modules/nxos/nxos_n5k.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 19 | 2016-01-29T14:37:52.000Z | 2022-03-30T18:08:01.000Z | tests/unit/modules/nxos/nxos_n5k.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 223 | 2016-03-02T16:39:41.000Z | 2022-03-03T12:26:35.000Z | tests/unit/modules/nxos/nxos_n5k.py | Noah-Huppert/salt | 998c382f5f2c3b4cbf7d96aa6913ada6993909b3 | [
"Apache-2.0"
] | 64 | 2016-02-04T19:45:26.000Z | 2021-12-15T02:02:31.000Z | # -*- coding: utf-8 -*-
"""
:codeauthor: Thomas Stoner <tmstoner@cisco.com>
"""
# Copyright (c) 2018 Cisco and/or its affiliates.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from tests.unit.modules.nxos.nxos_platform import NXOSPlatform
class N5KPlatform(NXOSPlatform):
    """ Cisco Systems N5K Platform Unit Test Object """

    chassis = "cisco Nexus 5672UP 16G-FC Chassis"

    # Captured output from: show install all impact kickstart <kimage> system <image>
    show_install_all_impact = """
Verifying image bootflash:/$KIMAGE for boot variable "kickstart".
[####################] 100% -- SUCCESS
Verifying image bootflash:/$IMAGE for boot variable "system".
[####################] 100% -- SUCCESS
Verifying image type.
[####################] 100% -- SUCCESS
Extracting "system" version from image bootflash:/$IMAGE.
[####################] 100% -- SUCCESS
Extracting "kickstart" version from image bootflash:/$KIMAGE.
[####################] 100% -- SUCCESS
Extracting "bios" version from image bootflash:/$IMAGE.
[####################] 100% -- SUCCESS
Performing module support checks.
[####################] 100% -- SUCCESS
Compatibility check is done:
Module bootable Impact Install-type Reason
------ -------- -------------- ------------ ------
0 yes disruptive reset ISSD is not supported and switch will reset with ascii configuration
1 yes disruptive reset ISSD is not supported and switch will reset with ascii configuration
2 yes disruptive reset ISSD is not supported and switch will reset with ascii configuration
Images will be upgraded according to following table:
Module Image Running-Version New-Version Upg-Required
------ ---------------- ---------------------- ---------------------- ------------
0 system $CVER $NVER $REQ
0 kickstart $CVER $NVER $REQ
0 bios v0.1.9(03/09/2016) v0.1.6(12/03/2015) no
0 power-seq SF-uC:37, SF-FPGA:35 SF-uC:37, SF-FPGA:35 no
0 iofpga v0.0.0.39 v0.0.0.39 no
1 iofpga v0.0.0.18 v0.0.0.18 no
2 iofpga v0.0.0.18 v0.0.0.18 no
Warning : ISSD is not supported and switch will reset with ASCII configuration.
All incompatible configuration will be lost in the target release.
Please also refer the downgrade procedure documentation of the release for more details.
"""
    # Captured output from: install all kickstart <kimage> system <image>
    install_all_disruptive_success = """
Verifying image bootflash:/$KIMAGE for boot variable "kickstart".
[####################] 100% -- SUCCESS
Verifying image bootflash:/$IMAGE for boot variable "system".
[####################] 100% -- SUCCESS
Verifying image type.
[####################] 100% -- SUCCESS
Extracting "system" version from image bootflash:/$IMAGE.
[####################] 100% -- SUCCESS
Extracting "kickstart" version from image bootflash:/$KIMAGE.
[####################] 100% -- SUCCESS
Extracting "bios" version from image bootflash:/$IMAGE.
[####################] 100% -- SUCCESS
Performing module support checks.
[####################] 100% -- SUCCESS
Compatibility check is done:
Module bootable Impact Install-type Reason
------ -------- -------------- ------------ ------
0 yes disruptive reset ISSD is not supported and switch will reset with ascii configuration
1 yes disruptive reset ISSD is not supported and switch will reset with ascii configuration
2 yes disruptive reset ISSD is not supported and switch will reset with ascii configuration
Images will be upgraded according to following table:
Module Image Running-Version New-Version Upg-Required
------ ---------------- ---------------------- ---------------------- ------------
0 system $CVER $NVER $REQ
0 kickstart $CKVER $NKVER $KREQ
0 bios v0.1.9(03/09/2016) v0.1.6(12/03/2015) no
0 power-seq SF-uC:37, SF-FPGA:35 SF-uC:37, SF-FPGA:35 no
0 iofpga v0.0.0.39 v0.0.0.39 no
1 iofpga v0.0.0.18 v0.0.0.18 no
2 iofpga v0.0.0.18 v0.0.0.18 no
Warning : ISSD is not supported and switch will reset with ASCII configuration.
All incompatible configuration will be lost in the target release.
Please also refer the downgrade procedure documentation of the release for more details.
Install is in progress, please wait.
Performing runtime checks.
[####################] 100% -- SUCCESS
Setting boot variables.
[####################] 100% -- SUCCESS
Performing configuration copy.
[####################] 100% -- SUCCESS
Converting startup config.
[####################] 100% -- SUCCESS
Finishing the upgrade, switch will reboot in 10 seconds.
"""
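The captured transcripts above embed shell-style placeholders such as `$KIMAGE`, `$IMAGE`, and `$CVER`. A plausible way to render them is `string.Template`-style substitution; this is an assumption, since the actual rendering lives in the `NXOSPlatform` base class, and the kickstart filename below is invented:

```python
from string import Template

captured = 'Verifying image bootflash:/$KIMAGE for boot variable "kickstart".'
# safe_substitute fills known placeholders and leaves unknown ones ($CVER etc.)
# intact, which suits partially rendered transcripts.
rendered = Template(captured).safe_substitute(
    KIMAGE="n5000-uk9-kickstart.7.3.0.N1.1.bin"
)
```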
| 39.765101 | 116 | 0.555781 | 673 | 5,925 | 4.875186 | 0.28529 | 0.054861 | 0.01463 | 0.043889 | 0.669918 | 0.669918 | 0.669918 | 0.669918 | 0.669918 | 0.669918 | 0 | 0.049708 | 0.276793 | 5,925 | 148 | 117 | 40.033784 | 0.715986 | 0.141266 | 0 | 0.833333 | 0 | 0.047619 | 0.945598 | 0.095747 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.02381 | 0 | 0.071429 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
d49d6f7d3294dc1be8ddac769e650e95458367b6 | 6,541 | py | Python | btk20_src/square_root/square_root.py | mmahrous90/distant_speech_recognition | 083e663d7c1eb6e5fe89c40ba2b43a30bf9c65b5 | [
"MIT"
] | null | null | null | btk20_src/square_root/square_root.py | mmahrous90/distant_speech_recognition | 083e663d7c1eb6e5fe89c40ba2b43a30bf9c65b5 | [
"MIT"
] | null | null | null | btk20_src/square_root/square_root.py | mmahrous90/distant_speech_recognition | 083e663d7c1eb6e5fe89c40ba2b43a30bf9c65b5 | [
"MIT"
] | null | null | null | # This file was automatically generated by SWIG (http://www.swig.org).
# Version 3.0.12
#
# Do not make changes to this file unless you know what you are doing--modify
# the SWIG interface file instead.
from sys import version_info as _swig_python_version_info
if _swig_python_version_info >= (2, 7, 0):
    def swig_import_helper():
        import importlib
        pkg = __name__.rpartition('.')[0]
        mname = '.'.join((pkg, '_square_root')).lstrip('.')
        try:
            return importlib.import_module(mname)
        except ImportError:
            return importlib.import_module('_square_root')
    _square_root = swig_import_helper()
    del swig_import_helper
elif _swig_python_version_info >= (2, 6, 0):
    def swig_import_helper():
        from os.path import dirname
        import imp
        fp = None
        try:
            fp, pathname, description = imp.find_module('_square_root', [dirname(__file__)])
        except ImportError:
            import _square_root
            return _square_root
        try:
            _mod = imp.load_module('_square_root', fp, pathname, description)
        finally:
            if fp is not None:
                fp.close()
        return _mod
    _square_root = swig_import_helper()
    del swig_import_helper
else:
    import _square_root
del _swig_python_version_info
try:
    _swig_property = property
except NameError:
    pass  # Python < 2.2 doesn't have 'property'.

try:
    import builtins as __builtin__
except ImportError:
    import __builtin__
def _swig_setattr_nondynamic(self, class_type, name, value, static=1):
    if (name == "thisown"):
        return self.this.own(value)
    if (name == "this"):
        if type(value).__name__ == 'SwigPyObject':
            self.__dict__[name] = value
            return
    method = class_type.__swig_setmethods__.get(name, None)
    if method:
        return method(self, value)
    if (not static):
        if _newclass:
            object.__setattr__(self, name, value)
        else:
            self.__dict__[name] = value
    else:
        raise AttributeError("You cannot add attributes to %s" % self)


def _swig_setattr(self, class_type, name, value):
    return _swig_setattr_nondynamic(self, class_type, name, value, 0)


def _swig_getattr(self, class_type, name):
    if (name == "thisown"):
        return self.this.own()
    method = class_type.__swig_getmethods__.get(name, None)
    if method:
        return method(self)
    raise AttributeError("'%s' object has no attribute '%s'" % (class_type.__name__, name))


def _swig_repr(self):
    try:
        strthis = "proxy of " + self.this.__repr__()
    except __builtin__.Exception:
        strthis = ""
    return "<%s.%s; %s >" % (self.__class__.__module__, self.__class__.__name__, strthis,)


try:
    _object = object
    _newclass = 1
except __builtin__.Exception:
    class _object:
        pass
    _newclass = 0
def cholesky_backsub(A, x):
    return _square_root.cholesky_backsub(A, x)
cholesky_backsub = _square_root.cholesky_backsub

def vector_matrix_product(vec, mat, D):
    return _square_root.vector_matrix_product(vec, mat, D)
vector_matrix_product = _square_root.vector_matrix_product

def make_conjugate_symmetric(mat):
    return _square_root.make_conjugate_symmetric(mat)
make_conjugate_symmetric = _square_root.make_conjugate_symmetric

def cholesky_forwardsub(A, x):
    return _square_root.cholesky_forwardsub(A, x)
cholesky_forwardsub = _square_root.cholesky_forwardsub

def cholesky_forwardsub_complex(lt, rhs, lhs, conjugate=False):
    return _square_root.cholesky_forwardsub_complex(lt, rhs, lhs, conjugate)
cholesky_forwardsub_complex = _square_root.cholesky_forwardsub_complex

def cholesky_backsub_complex(lt, rhs, lhs, conjugate=False):
    return _square_root.cholesky_backsub_complex(lt, rhs, lhs, conjugate)
cholesky_backsub_complex = _square_root.cholesky_backsub_complex

def rank_one_update_cholesky_factor(A11, alpha_m, c_m):
    return _square_root.rank_one_update_cholesky_factor(A11, alpha_m, c_m)
rank_one_update_cholesky_factor = _square_root.rank_one_update_cholesky_factor

def propagate_covar_square_root_real(A11, A12, A21, A22, flag=False):
    return _square_root.propagate_covar_square_root_real(A11, A12, A21, A22, flag)
propagate_covar_square_root_real = _square_root.propagate_covar_square_root_real

def sweep_lower_triangular(A, B):
    return _square_root.sweep_lower_triangular(A, B)
sweep_lower_triangular = _square_root.sweep_lower_triangular

def propagate_covar_square_root_step1(A12, A22):
    return _square_root.propagate_covar_square_root_step1(A12, A22)
propagate_covar_square_root_step1 = _square_root.propagate_covar_square_root_step1

def propagate_covar_square_root_step2a(A11, A12, A21, A22):
    return _square_root.propagate_covar_square_root_step2a(A11, A12, A21, A22)
propagate_covar_square_root_step2a = _square_root.propagate_covar_square_root_step2a

def propagate_covar_square_root_step2b(A22):
    return _square_root.propagate_covar_square_root_step2b(A22)
propagate_covar_square_root_step2b = _square_root.propagate_covar_square_root_step2b

def propagate_covar_square_root(A11, A12, A21, A22):
    return _square_root.propagate_covar_square_root(A11, A12, A21, A22)
propagate_covar_square_root = _square_root.propagate_covar_square_root

def propagate_info_square_root(sqrt_Pm_inv, A12, a_21, a_22, rankOneA12=True):
    return _square_root.propagate_info_square_root(sqrt_Pm_inv, A12, a_21, a_22, rankOneA12)
propagate_info_square_root = _square_root.propagate_info_square_root

def propagate_info_square_root_step2_rls(sqrt_Pm_inv, a_12, a_21, a_22):
    return _square_root.propagate_info_square_root_step2_rls(sqrt_Pm_inv, a_12, a_21, a_22)
propagate_info_square_root_step2_rls = _square_root.propagate_info_square_root_step2_rls

def propagate_info_square_root_rls(sqrt_Pm_inv, a_12, a_21, a_22):
    return _square_root.propagate_info_square_root_rls(sqrt_Pm_inv, a_12, a_21, a_22)
propagate_info_square_root_rls = _square_root.propagate_info_square_root_rls

def add_diagonal_loading(sqrt_Pm_inv, dim, wght):
    return _square_root.add_diagonal_loading(sqrt_Pm_inv, dim, wght)
add_diagonal_loading = _square_root.add_diagonal_loading

def cholesky_diagonal(v, m):
    return _square_root.cholesky_diagonal(v, m)
cholesky_diagonal = _square_root.cholesky_diagonal

def square_diagonal(v, m):
    return _square_root.square_diagonal(v, m)
square_diagonal = _square_root.square_diagonal
| 36.954802 | 92 | 0.756765 | 919 | 6,541 | 4.875952 | 0.190424 | 0.1763 | 0.071413 | 0.107119 | 0.590047 | 0.464405 | 0.407052 | 0.293015 | 0.195938 | 0.165142 | 0 | 0.025023 | 0.162972 | 6,541 | 176 | 93 | 37.164773 | 0.793425 | 0.0451 | 0 | 0.226277 | 1 | 0 | 0.02662 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.182482 | false | 0.014599 | 0.138686 | 0.145985 | 0.547445 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
d4a0b5e88fa6bde4d08745995dc36746fc0a3909 | 6,907 | py | Python | Python_files/INRIX_data_preprocessing_20_flow_conservation_adjustment_LS_Apr_ext.py | jingzbu/InverseVITraffic | c0d33d91bdd3c014147d58866c1a2b99fb8a9608 | [
"MIT"
] | null | null | null | Python_files/INRIX_data_preprocessing_20_flow_conservation_adjustment_LS_Apr_ext.py | jingzbu/InverseVITraffic | c0d33d91bdd3c014147d58866c1a2b99fb8a9608 | [
"MIT"
] | null | null | null | Python_files/INRIX_data_preprocessing_20_flow_conservation_adjustment_LS_Apr_ext.py | jingzbu/InverseVITraffic | c0d33d91bdd3c014147d58866c1a2b99fb8a9608 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
__author__ = "Jing Zhang"
__email__ = "jingzbu@gmail.com"
__status__ = "Development"
from util import *
import numpy as np
from numpy.linalg import inv
# load the original link counts data
import json
with open('../temp_files/link_day_minute_Apr_dict_ext_JSON.json', 'r') as json_file:
    link_day_minute_Apr_dict_JSON = json.load(json_file)
AM_flow_list = []
MD_flow_list = []
PM_flow_list = []
NT_flow_list = []
AM_flow_minute_list = []
MD_flow_minute_list = []
PM_flow_minute_list = []
NT_flow_minute_list = []
for link_idx in range(64):
    for day in range(31)[1:]:
        key = 'link_' + str(link_idx) + '_' + str(day)
        AM_flow_list.append(link_day_minute_Apr_dict_JSON[key]['AM_flow'])
        MD_flow_list.append(link_day_minute_Apr_dict_JSON[key]['MD_flow'])
        PM_flow_list.append(link_day_minute_Apr_dict_JSON[key]['PM_flow'])
        NT_flow_list.append(link_day_minute_Apr_dict_JSON[key]['NT_flow'])
        for minute_idx in range(120):
            AM_flow_minute_list.append(link_day_minute_Apr_dict_JSON[key]['AM_flow_minute'][minute_idx])
            MD_flow_minute_list.append(link_day_minute_Apr_dict_JSON[key]['MD_flow_minute'][minute_idx])
            PM_flow_minute_list.append(link_day_minute_Apr_dict_JSON[key]['PM_flow_minute'][minute_idx])
            NT_flow_minute_list.append(link_day_minute_Apr_dict_JSON[key]['NT_flow_minute'][minute_idx])
x_AM_flow = np.matrix(AM_flow_list)
x_AM_flow = np.matrix.reshape(x_AM_flow, 64, 30)
x_AM_flow = np.nan_to_num(x_AM_flow)
x_MD_flow = np.matrix(MD_flow_list)
x_MD_flow = np.matrix.reshape(x_MD_flow, 64, 30)
x_MD_flow = np.nan_to_num(x_MD_flow)
x_PM_flow = np.matrix(PM_flow_list)
x_PM_flow = np.matrix.reshape(x_PM_flow, 64, 30)
x_PM_flow = np.nan_to_num(x_PM_flow)
x_NT_flow = np.matrix(NT_flow_list)
x_NT_flow = np.matrix.reshape(x_NT_flow, 64, 30)
x_NT_flow = np.nan_to_num(x_NT_flow)
x_AM_flow_minute = np.matrix(AM_flow_minute_list)
x_AM_flow_minute = np.matrix.reshape(x_AM_flow_minute, 64, 3600)
x_AM_flow_minute = np.nan_to_num(x_AM_flow_minute)
x_MD_flow_minute = np.matrix(MD_flow_minute_list)
x_MD_flow_minute = np.matrix.reshape(x_MD_flow_minute, 64, 3600)
x_MD_flow_minute = np.nan_to_num(x_MD_flow_minute)
x_PM_flow_minute = np.matrix(PM_flow_minute_list)
x_PM_flow_minute = np.matrix.reshape(x_PM_flow_minute, 64, 3600)
x_PM_flow_minute = np.nan_to_num(x_PM_flow_minute)
x_NT_flow_minute = np.matrix(NT_flow_minute_list)
x_NT_flow_minute = np.matrix.reshape(x_NT_flow_minute, 64, 3600)
x_NT_flow_minute = np.nan_to_num(x_NT_flow_minute)
print(link_day_minute_Apr_dict_JSON['link_0_1'] ['AM_flow_minute'][0])
y_AM_flow = []
y_MD_flow = []
y_PM_flow = []
y_NT_flow = []
for j in range(np.size(x_AM_flow, 1)):
    y_AM_flow_0 = x_AM_flow[:, j]  # initial flow vector
    y_MD_flow_0 = x_MD_flow[:, j]  # initial flow vector
    y_PM_flow_0 = x_PM_flow[:, j]  # initial flow vector
    y_NT_flow_0 = x_NT_flow[:, j]  # initial flow vector
    y_AM_flow.append(flow_conservation_adjustment_ext(y_AM_flow_0))
    y_MD_flow.append(flow_conservation_adjustment_ext(y_MD_flow_0))
    y_PM_flow.append(flow_conservation_adjustment_ext(y_PM_flow_0))
    y_NT_flow.append(flow_conservation_adjustment_ext(y_NT_flow_0))
y_AM_flow_minute = []
y_MD_flow_minute = []
y_PM_flow_minute = []
y_NT_flow_minute = []
for j in range(np.size(x_AM_flow_minute, 1)):
    y_AM_flow_minute_0 = x_AM_flow_minute[:, j]  # initial flow vector
    y_MD_flow_minute_0 = x_MD_flow_minute[:, j]  # initial flow vector
    y_PM_flow_minute_0 = x_PM_flow_minute[:, j]  # initial flow vector
    y_NT_flow_minute_0 = x_NT_flow_minute[:, j]  # initial flow vector
    y_AM_flow_minute.append(flow_conservation_adjustment_ext(y_AM_flow_minute_0))
    y_MD_flow_minute.append(flow_conservation_adjustment_ext(y_MD_flow_minute_0))
    y_PM_flow_minute.append(flow_conservation_adjustment_ext(y_PM_flow_minute_0))
    y_NT_flow_minute.append(flow_conservation_adjustment_ext(y_NT_flow_minute_0))
y_AM_flow = np.matrix(y_AM_flow)
y_AM_flow = np.matrix.transpose(y_AM_flow)
y_MD_flow = np.matrix(y_MD_flow)
y_MD_flow = np.matrix.transpose(y_MD_flow)
y_PM_flow = np.matrix(y_PM_flow)
y_PM_flow = np.matrix.transpose(y_PM_flow)
y_NT_flow = np.matrix(y_NT_flow)
y_NT_flow = np.matrix.transpose(y_NT_flow)
y_AM_flow_minute = np.matrix(y_AM_flow_minute)
y_AM_flow_minute = np.matrix.transpose(y_AM_flow_minute)
y_MD_flow_minute = np.matrix(y_MD_flow_minute)
y_MD_flow_minute = np.matrix.transpose(y_MD_flow_minute)
y_PM_flow_minute = np.matrix(y_PM_flow_minute)
y_PM_flow_minute = np.matrix.transpose(y_PM_flow_minute)
y_NT_flow_minute = np.matrix(y_NT_flow_minute)
y_NT_flow_minute = np.matrix.transpose(y_NT_flow_minute)
link_day_minute_Apr_dict_JSON_adjusted = {}
for link_idx in range(64):
    AM_flow = np.matrix.reshape(y_AM_flow[link_idx, :], 30, 1)
    MD_flow = np.matrix.reshape(y_MD_flow[link_idx, :], 30, 1)
    PM_flow = np.matrix.reshape(y_PM_flow[link_idx, :], 30, 1)
    NT_flow = np.matrix.reshape(y_NT_flow[link_idx, :], 30, 1)
    AM_flow_minute = np.matrix.reshape(y_AM_flow_minute[link_idx, :], 30, 120)
    MD_flow_minute = np.matrix.reshape(y_MD_flow_minute[link_idx, :], 30, 120)
    PM_flow_minute = np.matrix.reshape(y_PM_flow_minute[link_idx, :], 30, 120)
    NT_flow_minute = np.matrix.reshape(y_NT_flow_minute[link_idx, :], 30, 120)
    for day in range(31)[1:]:
        key = 'link_' + str(link_idx) + '_' + str(day)
        data = {'link_idx': link_idx, 'day': day,
                'init_node': link_day_minute_Apr_dict_JSON[key]['init_node'],
                'term_node': link_day_minute_Apr_dict_JSON[key]['term_node'],
                'AM_capac': link_day_minute_Apr_dict_JSON[key]['AM_capac'],
                'MD_capac': link_day_minute_Apr_dict_JSON[key]['MD_capac'],
                'PM_capac': link_day_minute_Apr_dict_JSON[key]['PM_capac'],
                'NT_capac': link_day_minute_Apr_dict_JSON[key]['NT_capac'],
                'free_flow_time': link_day_minute_Apr_dict_JSON[key]['free_flow_time'],
                'length': link_day_minute_Apr_dict_JSON[key]['length'],
                'AM_flow': np.array(AM_flow)[day - 1].tolist()[0],
                'MD_flow': np.array(MD_flow)[day - 1].tolist()[0],
                'PM_flow': np.array(PM_flow)[day - 1].tolist()[0],
                'NT_flow': np.array(NT_flow)[day - 1].tolist()[0],
                'AM_flow_minute': np.array(AM_flow_minute)[day - 1].tolist(),
                'MD_flow_minute': np.array(MD_flow_minute)[day - 1].tolist(),
                'PM_flow_minute': np.array(PM_flow_minute)[day - 1].tolist(),
                'NT_flow_minute': np.array(NT_flow_minute)[day - 1].tolist()}
        link_day_minute_Apr_dict_JSON_adjusted[key] = data
# Writing JSON data
with open('../temp_files/link_day_minute_Apr_dict_ext_JSON_adjusted.json', 'w') as json_file:
    json.dump(link_day_minute_Apr_dict_JSON_adjusted, json_file)
# File: afternoon_sessions/alive/boilerplate/util/network/__init__.py (repo: renewfrl/python2, license: CNRI-Python)
import ssl
import requests
from datetime import datetime


def check_server(url):
    start = datetime.now()
    res = requests.get(url)
    elapsed = datetime.now() - start
    if res.status_code == 200:
        print("server OK, responded in %.3fs" % elapsed.total_seconds())
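The elapsed-time measurement that `check_server` attempts can be sketched with plain `datetime` arithmetic; the `timed_call` helper and the summed workload below are illustrative, not part of the original module:

```python
from datetime import datetime


def timed_call(fn):
    # Wrap a call and measure its wall-clock duration.
    start = datetime.now()
    result = fn()
    elapsed = datetime.now() - start
    return result, elapsed


result, elapsed = timed_call(lambda: sum(range(1000)))
print(result)  # 499500
```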
# File: lib/gobbet/wordlist.py (repo: simoncozens/gobbet, license: MIT)
from collections import Counter, defaultdict
def pairwise(a):
    return zip(a, a[1:])


class Wordlist(Counter):
    def bigrams(self):
        bigrams = defaultdict(set)
        for word in self.keys():
            for a, b in pairwise(word):
                bigrams[a + b].add(word)
        return bigrams

    def filter_popularity(self, threshold=3):
        return Wordlist({x: count for x, count in self.items() if count >= threshold})

    def filter_length(self, threshold=3):
        return Wordlist({x: count for x, count in self.items() if len(x) >= threshold})

    def filter_unicodes(self, codepoint_ranges):
        def _included_letter(l):
            return any(ord(l) in r for r in codepoint_ranges)

        def _included(word):
            return all(_included_letter(l) for l in word)

        return Wordlist({x: count for x, count in self.items() if _included(x)})
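The bigram index that `Wordlist.bigrams` builds can be exercised standalone; the word counts below are made-up sample data:

```python
from collections import Counter, defaultdict


def pairwise(a):
    # Adjacent character pairs, as in the module above.
    return zip(a, a[1:])


words = Counter({"cat": 5, "cart": 2, "dog": 3})  # sample counts
bigrams = defaultdict(set)
for word in words:
    for a, b in pairwise(word):
        bigrams[a + b].add(word)

print(sorted(bigrams["ca"]))  # ['cart', 'cat']
```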
# File: models.py (repo: reedkavner/truthtk, license: MIT)
from google.appengine.ext import ndb
# venue, title, perfid, cast, date_time


class Tweet(ndb.Model):
    text = ndb.StringProperty()
    tid = ndb.IntegerProperty()
    donation = ndb.IntegerProperty()
    donation_successful = ndb.BooleanProperty()
    date_added = ndb.DateTimeProperty(auto_now_add=True)
# File: pygsp/optimization.py (repo: naspert/pygsp, license: BSD-3-Clause)
# -*- coding: utf-8 -*-
r"""
The :mod:`pygsp.optimization` module provides tools to solve convex
optimization problems on graphs.
"""

from pygsp import utils


logger = utils.build_logger(__name__)


def _import_pyunlocbox():
    try:
        from pyunlocbox import functions, solvers
    except Exception:
        raise ImportError('Cannot import pyunlocbox, which is needed to solve '
                          'this optimization problem. Try to install it with '
                          'pip (or conda) install pyunlocbox.')
    return functions, solvers
def prox_tv(x, gamma, G, A=None, At=None, nu=1, tol=10e-4, maxit=200, use_matrix=True):
    r"""
    Total Variation proximal operator for graphs.

    This function computes the TV proximal operator for graphs. The TV norm
    is the one norm of the gradient. The gradient is defined in the
    function :meth:`pygsp.graphs.Graph.grad`.
    This function requires the PyUNLocBoX to be executed.

    This function solves:

    :math:`sol = \min_{z} \frac{1}{2} \|x - z\|_2^2 + \gamma \|x\|_{TV}`

    Parameters
    ----------
    x: ndarray
        Input signal
    gamma: float
        Regularization parameter
    G: graph object
        Graphs structure
    A: lambda function
        Forward operator, this parameter allows to solve the following problem:
        :math:`sol = \min_{z} \frac{1}{2} \|x - z\|_2^2 + \gamma \| A x\|_{TV}`
        (default = Id)
    At: lambda function
        Adjoint operator. (default = Id)
    nu: float
        Bound on the norm of the operator (default = 1)
    tol: float
        Stop criterion for the loop. The algorithm will stop if:
        :math:`\frac{n(t) - n(t - 1)} {n(t)} < tol`
        where :math:`n(t) = f(x) + 0.5 \|x-y\|_2^2` is the objective function at iteration :math:`t`
        (default = :math:`10e-4`)
    maxit: int
        Maximum iteration. (default = 200)
    use_matrix: bool
        If a matrix should be used. (default = True)

    Returns
    -------
    sol: solution

    Examples
    --------

    """
    if A is None:
        def A(x):
            return x
    if At is None:
        def At(x):
            return x
    tight = 0
    l1_nu = 2 * G.lmax * nu
    if use_matrix:
        def l1_a(x):
            return G.Diff * A(x)

        def l1_at(x):
            # Adjoint of l1_a (assumes G.Diff is the intended difference
            # operator; the original referenced an undefined `D`).
            return At(G.Diff.T * x)
    else:
        def l1_a(x):
            return G.grad(A(x))

        def l1_at(x):
            return G.div(x)
    functions, _ = _import_pyunlocbox()
    functions.norm_l1(x, gamma, A=l1_a, At=l1_at, tight=tight, maxit=maxit, tol=tol)
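For intuition, the L1 term assembled for `norm_l1` above has a closed-form proximal operator, elementwise soft-thresholding. The stdlib-only sketch below is a scalar analogue for illustration, not pygsp or PyUNLocBoX code:

```python
def prox_l1(x, gamma):
    # Soft-thresholding: shrink each entry toward zero by gamma,
    # clipping at zero. This is prox of gamma * ||.||_1, elementwise.
    return [max(abs(v) - gamma, 0.0) * (1.0 if v >= 0 else -1.0) for v in x]


print(prox_l1([3.0, -0.5, 1.0], 1.0))
```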
# File: Basic's/If Statements/if statements2.py (repo: Fahad-Hafeez/Python-Learning, license: Apache-2.0)
price = 1000000
has_good_credit = True
if has_good_credit:
    down_payment = 0.1 * price
else:
    down_payment = 0.2 * price
print(f"Down Payment: £{down_payment}")
# File: python/paddle/nn/functional/__init__.py (repo: TochkaAI/Paddle, license: Apache-2.0)
# Copyright (c) 2020 PaddlePaddle Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# TODO: import all neural network related api under this directory,
# including layers, linear, conv, rnn etc.
from .activation import elu # noqa: F401
from .activation import elu_ # noqa: F401
from .activation import gelu # noqa: F401
from .activation import hardshrink # noqa: F401
from .activation import hardtanh # noqa: F401
from .activation import hardsigmoid # noqa: F401
from .activation import hardswish # noqa: F401
from .activation import leaky_relu # noqa: F401
from .activation import log_sigmoid # noqa: F401
from .activation import maxout # noqa: F401
from .activation import prelu # noqa: F401
from .activation import relu # noqa: F401
from .activation import relu_ # noqa: F401
from .activation import relu6 # noqa: F401
from .activation import selu # noqa: F401
from .activation import sigmoid # noqa: F401
from .activation import silu # noqa: F401
from .activation import softmax # noqa: F401
from .activation import softmax_ # noqa: F401
from .activation import softplus # noqa: F401
from .activation import softshrink # noqa: F401
from .activation import softsign # noqa: F401
from .activation import swish # noqa: F401
from .activation import tanh # noqa: F401
from .activation import tanh_ # noqa: F401
from .activation import tanhshrink # noqa: F401
from .activation import thresholded_relu # noqa: F401
from .activation import log_softmax # noqa: F401
from .activation import glu # noqa: F401
from .common import dropout # noqa: F401
from .common import dropout2d # noqa: F401
from .common import dropout3d # noqa: F401
from .common import alpha_dropout # noqa: F401
from .common import label_smooth # noqa: F401
from .common import pad # noqa: F401
from .common import cosine_similarity # noqa: F401
from .common import unfold # noqa: F401
from .common import interpolate # noqa: F401
from .common import upsample # noqa: F401
from .common import bilinear # noqa: F401
from .conv import conv1d # noqa: F401
from .conv import conv1d_transpose # noqa: F401
from .common import linear # noqa: F401
from .conv import conv2d # noqa: F401
from .conv import conv2d_transpose # noqa: F401
from .conv import conv3d # noqa: F401
from .conv import conv3d_transpose # noqa: F401
from .extension import diag_embed # noqa: F401
from .extension import sequence_mask  # noqa: F401
from .loss import binary_cross_entropy # noqa: F401
from .loss import binary_cross_entropy_with_logits # noqa: F401
from .loss import cross_entropy # noqa: F401
from .loss import dice_loss # noqa: F401
from .loss import hsigmoid_loss # noqa: F401
from .loss import kl_div # noqa: F401
from .loss import l1_loss # noqa: F401
from .loss import log_loss # noqa: F401
from .loss import margin_ranking_loss # noqa: F401
from .loss import mse_loss # noqa: F401
from .loss import nll_loss # noqa: F401
from .loss import npair_loss # noqa: F401
from .loss import sigmoid_focal_loss # noqa: F401
from .loss import smooth_l1_loss # noqa: F401
from .loss import softmax_with_cross_entropy # noqa: F401
from .loss import square_error_cost # noqa: F401
from .loss import ctc_loss # noqa: F401
from .norm import batch_norm # noqa: F401
from .norm import instance_norm # noqa: F401
from .norm import layer_norm # noqa: F401
from .norm import local_response_norm # noqa: F401
from .norm import normalize # noqa: F401
from .pooling import avg_pool1d # noqa: F401
from .pooling import avg_pool2d # noqa: F401
from .pooling import avg_pool3d # noqa: F401
from .pooling import max_pool1d # noqa: F401
from .pooling import max_pool2d # noqa: F401
from .pooling import max_pool3d # noqa: F401
from .pooling import adaptive_max_pool1d # noqa: F401
from .pooling import adaptive_max_pool2d # noqa: F401
from .pooling import adaptive_max_pool3d # noqa: F401
from .pooling import adaptive_avg_pool1d # noqa: F401
from .pooling import adaptive_avg_pool2d # noqa: F401
from .pooling import adaptive_avg_pool3d # noqa: F401
from .vision import affine_grid # noqa: F401
from .vision import grid_sample # noqa: F401
from .vision import pixel_shuffle # noqa: F401
from .input import one_hot # noqa: F401
from .input import embedding # noqa: F401
from ...fluid.layers import gather_tree # noqa: F401
from ...fluid.layers import temporal_shift # noqa: F401
__all__ = [ #noqa
'conv1d',
'conv1d_transpose',
'conv2d',
'conv2d_transpose',
'conv3d',
'conv3d_transpose',
'elu',
'elu_',
'gelu',
'hardshrink',
'hardtanh',
'hardsigmoid',
'hardswish',
'leaky_relu',
'log_sigmoid',
'maxout',
'prelu',
'relu',
'relu_',
'relu6',
'selu',
'softmax',
'softmax_',
'softplus',
'softshrink',
'softsign',
'sigmoid',
'silu',
'swish',
'tanh',
'tanh_',
'tanhshrink',
'thresholded_relu',
'log_softmax',
'glu',
'diag_embed',
'sequence_mask',
'dropout',
'dropout2d',
'dropout3d',
'alpha_dropout',
'label_smooth',
'linear',
'pad',
'unfold',
'interpolate',
'upsample',
'bilinear',
'cosine_similarity',
'avg_pool1d',
'avg_pool2d',
'avg_pool3d',
'max_pool1d',
'max_pool2d',
'max_pool3d',
'adaptive_avg_pool1d',
'adaptive_avg_pool2d',
'adaptive_avg_pool3d',
'adaptive_max_pool1d',
'adaptive_max_pool2d',
'adaptive_max_pool3d',
'binary_cross_entropy',
'binary_cross_entropy_with_logits',
'cross_entropy',
'dice_loss',
'hsigmoid_loss',
'kl_div',
'l1_loss',
'log_loss',
'mse_loss',
'margin_ranking_loss',
'nll_loss',
'npair_loss',
'sigmoid_focal_loss',
'smooth_l1_loss',
'softmax_with_cross_entropy',
'square_error_cost',
'ctc_loss',
'affine_grid',
'grid_sample',
'local_response_norm',
'pixel_shuffle',
'embedding',
'gather_tree',
'one_hot',
'normalize'
]
# File: packages/lrn/main.py (repo: mallegrini/learn, license: MIT)
#!/usr/bin/env python
# encoding: utf-8

from gnr.app.gnrdbo import GnrDboTable, GnrDboPackage


class Package(GnrDboPackage):
    def config_attributes(self):
        return dict(comment='lrn package', sqlschema='lrn', sqlprefix=True,
                    name_short='Lrn', name_long='Lrn', name_full='Lrn')

    def config_db(self, pkg):
        pass


class Table(GnrDboTable):
    pass
# File: hrm/project/validator.py (repo: jprsurendra/pocs, license: Apache-2.0)
from django.core.validators import RegexValidator
zip_validate = RegexValidator(r'^[0-9]*$', 'Please enter valid zip code.')
phone_validate = RegexValidator(r'^\s*\d{5}-\d{5}\s*$',
                                'Please enter valid phone number. Phone number is allowed in following '
                                'pattern 12345-67890.')
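The two validator patterns can be checked against Python's `re` module directly, without Django; the sample inputs below are illustrative:

```python
import re

# Same regular expressions the RegexValidator instances above use.
zip_pattern = re.compile(r'^[0-9]*$')
phone_pattern = re.compile(r'^\s*\d{5}-\d{5}\s*$')

print(bool(zip_pattern.match("110001")))         # True
print(bool(phone_pattern.match("12345-67890")))  # True
print(bool(phone_pattern.match("1234567890")))   # False: hyphen required
```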
# File: tracker/utils/query/__init__.py (repo: dti-research/tracker, license: BSD-3-Clause)
# Copyright 2017-2019 TensorHub, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from __future__ import absolute_import
from __future__ import division
from six.moves import shlex_quote as q
class ParseError(Exception):
    pass


class Select(object):

    def __init__(self, cols):
        self.cols = cols

    def __repr__(self):
        return "<guild.query.Select %s>" % [str(c) for c in self.cols]


class Col(object):

    named_as = None

    def __repr__(self):
        cls = self.__class__
        return "<%s.%s %s>" % (cls.__module__, cls.__name__, self)

    def __str__(self):
        raise NotImplementedError()

    def _as_suffix(self):
        return " as %s" % q(self.named_as) if self.named_as else ""

    @property
    def header(self):
        return self.named_as or str(self)


class Scalar(Col):

    def __init__(self, key, qualifier=None, step=False):
        self.key = key
        self.qualifier = qualifier
        self.step = step

    def __str__(self):
        qual = "%s " % self.qualifier if self.qualifier else ""
        step = " step" if self.step else ""
        return "scalar:%s%s%s%s" % (qual, self.key, step, self._as_suffix())

    @property
    def header(self):
        if self.named_as:
            return self.named_as
        key = self.key.replace("#", " ").strip()
        step = " step" if self.step else ""
        return "%s%s" % (key, step)

    def split_key(self):
        parts = self.key.split("#", 1)
        if len(parts) == 2:
            return parts
        return None, parts[0]


class Attr(Col):

    def __init__(self, name):
        self.name = name

    def __str__(self):
        return "attr:%s%s" % (self.name, self._as_suffix())

    @property
    def header(self):
        return self.named_as or self.name


class Flag(Col):

    def __init__(self, name):
        self.name = name

    def __str__(self):
        return "flag:%s%s" % (self.name, self._as_suffix())

    @property
    def header(self):
        return self.named_as or self.name


def parse(s):
    from . import qparse
    p = qparse.parser()
    return p.parse(s)


def parse_colspec(colspec):
    return parse("select %s" % colspec)
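The prefix handling in `Scalar.split_key` can be tried in isolation; this is a standalone copy of that method's logic, with a made-up key like `"train#loss"`:

```python
def split_key(key):
    # Split an optional "prefix#" off a scalar key; no prefix yields None.
    parts = key.split("#", 1)
    if len(parts) == 2:
        return parts
    return None, parts[0]


print(split_key("train#loss"))  # ['train', 'loss']
print(split_key("loss"))        # (None, 'loss')
```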
# File: script/ground_server.py (repo: ycpengpeng/pva_tracker, license: BSD-3-Clause)
#!/usr/bin/env python
import rospy
import dynamic_reconfigure.server
from pva_tracker.cfg import PVA_Ground_TrackerConfig


def severCallback(config, level):
    return config


if __name__ == '__main__':
    rospy.init_node('ground_reconfigure_server', anonymous=False)
    server = dynamic_reconfigure.server.Server(PVA_Ground_TrackerConfig, severCallback)
    rospy.spin()
be2039c5b86943a691572012167bc5cef7cc60c5 | 83 | py | Python | setup.py | sl1-1/weasyl | d4f6bf3e33b85a2289a451d95d5b90ff24f5d539 | [
"Apache-2.0"
] | 1 | 2019-02-15T04:21:48.000Z | 2019-02-15T04:21:48.000Z | setup.py | sl1-1/weasyl | d4f6bf3e33b85a2289a451d95d5b90ff24f5d539 | [
"Apache-2.0"
] | 254 | 2017-12-23T19:36:43.000Z | 2020-04-14T21:46:13.000Z | setup.py | sl1-1/weasyl | d4f6bf3e33b85a2289a451d95d5b90ff24f5d539 | [
"Apache-2.0"
] | 1 | 2017-12-23T18:42:16.000Z | 2017-12-23T18:42:16.000Z | from setuptools import setup
setup(
name='weasyl',
packages=['weasyl'],
)
| 11.857143 | 28 | 0.650602 | 9 | 83 | 6 | 0.777778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.204819 | 83 | 6 | 29 | 13.833333 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.144578 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.2 | 0 | 0.2 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
be27e7899c21618317de02ffd0df9c2af9c6c37a | 120 | py | Python | src/ck/upper_cheminp.py | gywukun09/GPS | ce474f4afbcb64d46e85f04675e63343d5b65b47 | [
"MIT"
] | 18 | 2017-08-08T16:46:21.000Z | 2021-11-24T06:43:08.000Z | src/ck/upper_cheminp.py | haoxy97/GPS | 3da6d3a7410b7b7e5340373f206a1833759d5acf | [
"MIT"
] | 1 | 2019-12-24T11:53:18.000Z | 2019-12-24T11:53:18.000Z | src/ck/upper_cheminp.py | haoxy97/GPS | 3da6d3a7410b7b7e5340373f206a1833759d5acf | [
"MIT"
] | 14 | 2017-07-08T03:17:37.000Z | 2022-01-10T12:33:27.000Z | name = 'therm_PRF.dat'
fin = open(name,'r')
fout = open(name+'_upper','w')
for line in fin:
fout.write(line.upper())
| 15 | 30 | 0.641667 | 21 | 120 | 3.571429 | 0.666667 | 0.213333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.141667 | 120 | 7 | 31 | 17.142857 | 0.728155 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
07715c25a85bba139951d60478ca8b68467c7702 | 22,654 | py | Python | lws_pb2.py | dabankio/rpc-sync | dd4cda16a70b9ea6d6270f1e326cedb35e414ebb | [
"MIT"
] | null | null | null | lws_pb2.py | dabankio/rpc-sync | dd4cda16a70b9ea6d6270f1e326cedb35e414ebb | [
"MIT"
] | null | null | null | lws_pb2.py | dabankio/rpc-sync | dd4cda16a70b9ea6d6270f1e326cedb35e414ebb | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: dbp/lws.proto
from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)
_sym_db = _symbol_database.Default()
DESCRIPTOR = _descriptor.FileDescriptor(
name='dbp/lws.proto',
package='lws',
syntax='proto3',
serialized_options=None,
serialized_pb=b'\n\rdbp/lws.proto\x12\x03lws\"\x15\n\x06\x46orkID\x12\x0b\n\x03ids\x18\x01 \x03(\t\",\n\x0cGetBlocksArg\x12\x0c\n\x04hash\x18\x01 \x01(\x0c\x12\x0e\n\x06number\x18\x02 \x01(\x05\"\x18\n\x08GetTxArg\x12\x0c\n\x04hash\x18\x01 \x01(\x0c\"\x19\n\tSendTxArg\x12\x0c\n\x04\x64\x61ta\x18\x01 \x01(\x0c\"\xb5\x03\n\x0bTransaction\x12\x10\n\x08nVersion\x18\x01 \x01(\r\x12\r\n\x05nType\x18\x02 \x01(\r\x12\x12\n\nnLockUntil\x18\x03 \x01(\r\x12\x12\n\nhashAnchor\x18\x04 \x01(\x0c\x12&\n\x06vInput\x18\x05 \x03(\x0b\x32\x16.lws.Transaction.CTxIn\x12\x33\n\x0c\x63\x44\x65stination\x18\x06 \x01(\x0b\x32\x1d.lws.Transaction.CDestination\x12\x0f\n\x07nAmount\x18\x07 \x01(\x03\x12\x0e\n\x06nTxFee\x18\x08 \x01(\x03\x12\x0f\n\x07vchData\x18\t \x01(\x0c\x12\x0e\n\x06vchSig\x18\n \x01(\x0c\x12\x0c\n\x04hash\x18\x0b \x01(\x0c\x12\x0f\n\x07nChange\x18\x0c \x01(\x03\x1a \n\x05\x43TxIn\x12\x0c\n\x04hash\x18\x01 \x01(\x0c\x12\t\n\x01n\x18\x02 \x01(\r\x1a}\n\x0c\x43\x44\x65stination\x12\x0e\n\x06prefix\x18\x01 \x01(\r\x12\x0c\n\x04\x64\x61ta\x18\x02 \x01(\x0c\x12\x0c\n\x04size\x18\x03 \x01(\r\"A\n\x06PREFIX\x12\x0f\n\x0bPREFIX_NULL\x10\x00\x12\x11\n\rPREFIX_PUBKEY\x10\x01\x12\x13\n\x0fPREFIX_TEMPLATE\x10\x02\"\xe4\x01\n\x05\x42lock\x12\x10\n\x08nVersion\x18\x01 \x01(\r\x12\r\n\x05nType\x18\x02 \x01(\r\x12\x12\n\nnTimeStamp\x18\x03 \x01(\r\x12\x10\n\x08hashPrev\x18\x04 \x01(\x0c\x12\x12\n\nhashMerkle\x18\x05 \x01(\x0c\x12\x10\n\x08vchProof\x18\x06 \x01(\x0c\x12 \n\x06txMint\x18\x07 \x01(\x0b\x32\x10.lws.Transaction\x12\x1d\n\x03vtx\x18\x08 \x03(\x0b\x32\x10.lws.Transaction\x12\x0e\n\x06vchSig\x18\t \x01(\x0c\x12\x0f\n\x07nHeight\x18\n \x01(\r\x12\x0c\n\x04hash\x18\x0b \x01(\x0c\"9\n\tSendTxRet\x12\x0c\n\x04hash\x18\x01 \x01(\x0c\x12\x0e\n\x06result\x18\x02 \x01(\t\x12\x0e\n\x06reason\x18\x03 \x01(\tb\x06proto3'
)
_TRANSACTION_CDESTINATION_PREFIX = _descriptor.EnumDescriptor(
name='PREFIX',
full_name='lws.Transaction.CDestination.PREFIX',
filename=None,
file=DESCRIPTOR,
values=[
_descriptor.EnumValueDescriptor(
name='PREFIX_NULL', index=0, number=0,
serialized_options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='PREFIX_PUBKEY', index=1, number=1,
serialized_options=None,
type=None),
_descriptor.EnumValueDescriptor(
name='PREFIX_TEMPLATE', index=2, number=2,
serialized_options=None,
type=None),
],
containing_type=None,
serialized_options=None,
serialized_start=517,
serialized_end=582,
)
_sym_db.RegisterEnumDescriptor(_TRANSACTION_CDESTINATION_PREFIX)
_FORKID = _descriptor.Descriptor(
name='ForkID',
full_name='lws.ForkID',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='ids', full_name='lws.ForkID.ids', index=0,
number=1, type=9, cpp_type=9, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=22,
serialized_end=43,
)
_GETBLOCKSARG = _descriptor.Descriptor(
name='GetBlocksArg',
full_name='lws.GetBlocksArg',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='hash', full_name='lws.GetBlocksArg.hash', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='number', full_name='lws.GetBlocksArg.number', index=1,
number=2, type=5, cpp_type=1, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=45,
serialized_end=89,
)
_GETTXARG = _descriptor.Descriptor(
name='GetTxArg',
full_name='lws.GetTxArg',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='hash', full_name='lws.GetTxArg.hash', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=91,
serialized_end=115,
)
_SENDTXARG = _descriptor.Descriptor(
name='SendTxArg',
full_name='lws.SendTxArg',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='data', full_name='lws.SendTxArg.data', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=117,
serialized_end=142,
)
_TRANSACTION_CTXIN = _descriptor.Descriptor(
name='CTxIn',
full_name='lws.Transaction.CTxIn',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='hash', full_name='lws.Transaction.CTxIn.hash', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='n', full_name='lws.Transaction.CTxIn.n', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=423,
serialized_end=455,
)
_TRANSACTION_CDESTINATION = _descriptor.Descriptor(
name='CDestination',
full_name='lws.Transaction.CDestination',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='prefix', full_name='lws.Transaction.CDestination.prefix', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='data', full_name='lws.Transaction.CDestination.data', index=1,
number=2, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='size', full_name='lws.Transaction.CDestination.size', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
_TRANSACTION_CDESTINATION_PREFIX,
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=457,
serialized_end=582,
)
_TRANSACTION = _descriptor.Descriptor(
name='Transaction',
full_name='lws.Transaction',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='nVersion', full_name='lws.Transaction.nVersion', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nType', full_name='lws.Transaction.nType', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nLockUntil', full_name='lws.Transaction.nLockUntil', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='hashAnchor', full_name='lws.Transaction.hashAnchor', index=3,
number=4, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='vInput', full_name='lws.Transaction.vInput', index=4,
number=5, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='cDestination', full_name='lws.Transaction.cDestination', index=5,
number=6, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nAmount', full_name='lws.Transaction.nAmount', index=6,
number=7, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nTxFee', full_name='lws.Transaction.nTxFee', index=7,
number=8, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='vchData', full_name='lws.Transaction.vchData', index=8,
number=9, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='vchSig', full_name='lws.Transaction.vchSig', index=9,
number=10, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='hash', full_name='lws.Transaction.hash', index=10,
number=11, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nChange', full_name='lws.Transaction.nChange', index=11,
number=12, type=3, cpp_type=2, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[_TRANSACTION_CTXIN, _TRANSACTION_CDESTINATION, ],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=145,
serialized_end=582,
)
_BLOCK = _descriptor.Descriptor(
name='Block',
full_name='lws.Block',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='nVersion', full_name='lws.Block.nVersion', index=0,
number=1, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nType', full_name='lws.Block.nType', index=1,
number=2, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nTimeStamp', full_name='lws.Block.nTimeStamp', index=2,
number=3, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='hashPrev', full_name='lws.Block.hashPrev', index=3,
number=4, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='hashMerkle', full_name='lws.Block.hashMerkle', index=4,
number=5, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='vchProof', full_name='lws.Block.vchProof', index=5,
number=6, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='txMint', full_name='lws.Block.txMint', index=6,
number=7, type=11, cpp_type=10, label=1,
has_default_value=False, default_value=None,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='vtx', full_name='lws.Block.vtx', index=7,
number=8, type=11, cpp_type=10, label=3,
has_default_value=False, default_value=[],
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='vchSig', full_name='lws.Block.vchSig', index=8,
number=9, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='nHeight', full_name='lws.Block.nHeight', index=9,
number=10, type=13, cpp_type=3, label=1,
has_default_value=False, default_value=0,
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='hash', full_name='lws.Block.hash', index=10,
number=11, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=585,
serialized_end=813,
)
_SENDTXRET = _descriptor.Descriptor(
name='SendTxRet',
full_name='lws.SendTxRet',
filename=None,
file=DESCRIPTOR,
containing_type=None,
fields=[
_descriptor.FieldDescriptor(
name='hash', full_name='lws.SendTxRet.hash', index=0,
number=1, type=12, cpp_type=9, label=1,
has_default_value=False, default_value=b"",
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='result', full_name='lws.SendTxRet.result', index=1,
number=2, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
_descriptor.FieldDescriptor(
name='reason', full_name='lws.SendTxRet.reason', index=2,
number=3, type=9, cpp_type=9, label=1,
has_default_value=False, default_value=b"".decode('utf-8'),
message_type=None, enum_type=None, containing_type=None,
is_extension=False, extension_scope=None,
serialized_options=None, file=DESCRIPTOR),
],
extensions=[
],
nested_types=[],
enum_types=[
],
serialized_options=None,
is_extendable=False,
syntax='proto3',
extension_ranges=[],
oneofs=[
],
serialized_start=815,
serialized_end=872,
)
_TRANSACTION_CTXIN.containing_type = _TRANSACTION
_TRANSACTION_CDESTINATION.containing_type = _TRANSACTION
_TRANSACTION_CDESTINATION_PREFIX.containing_type = _TRANSACTION_CDESTINATION
_TRANSACTION.fields_by_name['vInput'].message_type = _TRANSACTION_CTXIN
_TRANSACTION.fields_by_name['cDestination'].message_type = _TRANSACTION_CDESTINATION
_BLOCK.fields_by_name['txMint'].message_type = _TRANSACTION
_BLOCK.fields_by_name['vtx'].message_type = _TRANSACTION
DESCRIPTOR.message_types_by_name['ForkID'] = _FORKID
DESCRIPTOR.message_types_by_name['GetBlocksArg'] = _GETBLOCKSARG
DESCRIPTOR.message_types_by_name['GetTxArg'] = _GETTXARG
DESCRIPTOR.message_types_by_name['SendTxArg'] = _SENDTXARG
DESCRIPTOR.message_types_by_name['Transaction'] = _TRANSACTION
DESCRIPTOR.message_types_by_name['Block'] = _BLOCK
DESCRIPTOR.message_types_by_name['SendTxRet'] = _SENDTXRET
_sym_db.RegisterFileDescriptor(DESCRIPTOR)
ForkID = _reflection.GeneratedProtocolMessageType('ForkID', (_message.Message,), {
'DESCRIPTOR' : _FORKID,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.ForkID)
})
_sym_db.RegisterMessage(ForkID)
GetBlocksArg = _reflection.GeneratedProtocolMessageType('GetBlocksArg', (_message.Message,), {
'DESCRIPTOR' : _GETBLOCKSARG,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.GetBlocksArg)
})
_sym_db.RegisterMessage(GetBlocksArg)
GetTxArg = _reflection.GeneratedProtocolMessageType('GetTxArg', (_message.Message,), {
'DESCRIPTOR' : _GETTXARG,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.GetTxArg)
})
_sym_db.RegisterMessage(GetTxArg)
SendTxArg = _reflection.GeneratedProtocolMessageType('SendTxArg', (_message.Message,), {
'DESCRIPTOR' : _SENDTXARG,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.SendTxArg)
})
_sym_db.RegisterMessage(SendTxArg)
Transaction = _reflection.GeneratedProtocolMessageType('Transaction', (_message.Message,), {
'CTxIn' : _reflection.GeneratedProtocolMessageType('CTxIn', (_message.Message,), {
'DESCRIPTOR' : _TRANSACTION_CTXIN,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.Transaction.CTxIn)
})
,
'CDestination' : _reflection.GeneratedProtocolMessageType('CDestination', (_message.Message,), {
'DESCRIPTOR' : _TRANSACTION_CDESTINATION,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.Transaction.CDestination)
})
,
'DESCRIPTOR' : _TRANSACTION,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.Transaction)
})
_sym_db.RegisterMessage(Transaction)
_sym_db.RegisterMessage(Transaction.CTxIn)
_sym_db.RegisterMessage(Transaction.CDestination)
Block = _reflection.GeneratedProtocolMessageType('Block', (_message.Message,), {
'DESCRIPTOR' : _BLOCK,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.Block)
})
_sym_db.RegisterMessage(Block)
SendTxRet = _reflection.GeneratedProtocolMessageType('SendTxRet', (_message.Message,), {
'DESCRIPTOR' : _SENDTXRET,
'__module__' : 'dbp.lws_pb2'
# @@protoc_insertion_point(class_scope:lws.SendTxRet)
})
_sym_db.RegisterMessage(SendTxRet)
# @@protoc_insertion_point(module_scope)
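`GeneratedProtocolMessageType` builds each message class at import time from a class dict carrying the descriptor and module name (the `DESCRIPTOR` / `__module__` entries above). A rough stdlib analogue of that shape using the three-argument `type()` call; the plain dict here is a stand-in for the real `_descriptor.Descriptor` instance, not protobuf's actual API:

```python
def make_message_type(name, descriptor):
    # Analogous in shape to _reflection.GeneratedProtocolMessageType:
    # a class name, a bases tuple, and a namespace carrying the descriptor.
    namespace = {
        "DESCRIPTOR": descriptor,
        "__module__": "dbp.lws_pb2",
    }
    return type(name, (object,), namespace)


# Stand-in descriptor; the real one is a _descriptor.Descriptor instance.
ForkIDSketch = make_message_type("ForkID", {"full_name": "lws.ForkID"})
```

The real reflection machinery additionally wires field accessors and serialization from the descriptor; this sketch only mirrors the dynamic class construction.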
| 37.631229 | 1,828 | 0.727907 | 2,910 | 22,654 | 5.402406 | 0.076976 | 0.061574 | 0.06679 | 0.051778 | 0.711914 | 0.659691 | 0.650595 | 0.639908 | 0.627441 | 0.616945 | 0 | 0.042402 | 0.140108 | 22,654 | 601 | 1,829 | 37.693844 | 0.76463 | 0.029178 | 0 | 0.690909 | 1 | 0.001818 | 0.165347 | 0.102148 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.007273 | 0 | 0.007273 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
# services/director-v2/src/simcore_service_director_v2/api/routes/health.py
# (repo: colinRawlings/osparc-simcore, license: MIT)
from typing import Dict

from fastapi import APIRouter

router = APIRouter()


@router.get("/")
async def check_service_health() -> Dict[str, str]:
    return {"msg": "I am healthy :-)"}
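The route handler above is an ordinary coroutine returning a JSON-serializable dict, which FastAPI serializes for the response. A FastAPI-free sketch of that behavior using only the standard library (the function name mirrors the handler; the router decorator is omitted):

```python
import asyncio


async def check_service_health_sketch():
    # Same body as the FastAPI handler above, minus the @router.get decorator.
    return {"msg": "I am healthy :-)"}


# FastAPI awaits the coroutine for us; here we drive it with asyncio.run.
result = asyncio.run(check_service_health_sketch())
```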
07860a803b014dcbb39639d9c7817263428033f8 | 64 | py | Python | login.py | feidie666/42_02 | 02698b1a4d5759834bf17a5ffb0ba91cd5c0720a | [
"MIT"
] | null | null | null | login.py | feidie666/42_02 | 02698b1a4d5759834bf17a5ffb0ba91cd5c0720a | [
"MIT"
] | null | null | null | login.py | feidie666/42_02 | 02698b1a4d5759834bf17a5ffb0ba91cd5c0720a | [
"MIT"
] | null | null | null | num1 = 100 #经理
num2 = 200 #zhangsan
num3 = 300 #zhangsan
# ABC/A/0208.py (repo: taro-masuda/AtCoder, license: MIT)
A, B = map(int, input().split())
if 6 * A < B or A > B:
    print('No')
else:
    print('Yes')
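The submission answers "No" exactly when B falls outside the interval [A, 6A], which matches the feasibility check for "can A dice, each showing 1 to 6, sum to exactly B" (the problem statement itself is not in this file, so the dice interpretation is an assumption). A brute-force cross-check of the interval test:

```python
from itertools import product


def fast_feasible(a, b):
    # Mirrors the submission: infeasible iff 6*a < b or a > b.
    return not (6 * a < b or a > b)


def brute_feasible(a, b):
    # Can a dice, each showing 1..6, sum to exactly b?
    return any(sum(roll) == b for roll in product(range(1, 7), repeat=a))


checks = all(
    fast_feasible(a, b) == brute_feasible(a, b)
    for a in range(1, 4)
    for b in range(0, 25)
)
```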
# __init__.py (repo: perilib/perilib-python-robotis-dynamixel2, license: MIT)
"""
This module provides protocol and packet definitions for communicating with
newer Robotis Dynamixel servos that implement version 2 of the Dynamixel
protocol. In addition to the basic protocol, it also abstracts common sets of
transactions (such as writing new values to control table registers) into simple
servo class methods, so the application does not require knowledge or use of the
underlying protocol directly.
"""
# .py files
from .RobotisDynamixel2Device import *
from .RobotisDynamixel2Protocol import *
from .RobotisDynamixel2ParserGenerator import *
from .RobotisDynamixel2Packet import *
from .RobotisDynamixel2Servo import *
from .RobotisDynamixel2ControlTable import *
# sokoapp/sweepstake/tests/admin_tests.py (repo: Mercy-Nekesa/sokoapp, license: MIT)
from django.test import TestCase


class SampleTestThree(TestCase):
    def testThree(self):
        three = 3
        self.assertEqual(three, 3)
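`django.test.TestCase` extends `unittest.TestCase`, so the assertion API above is plain unittest. A Django-free version of the same test, run programmatically through a loader and a `TestResult` (names here are illustrative):

```python
import unittest


class SampleTestThreeStdlib(unittest.TestCase):
    # Same assertion as the Django test above, against unittest.TestCase directly.
    def test_three(self):
        three = 3
        self.assertEqual(three, 3)


# Discover the test_* methods and run them without a CLI runner.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(SampleTestThreeStdlib)
result = unittest.TestResult()
suite.run(result)
```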
# src/models/stochastic/bbb/quantized/__init__.py
# (repo: tjiagoM/quantised-bayesian-nets, license: BSD-3-Clause)
NOISE_SCALE = float(0.02362204724)
NOISE_ZERO_POINT = int(0)
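These constants look like an affine quantization scale/zero-point pair, where real = scale * (q - zero_point). With zero point 0, the int8 code 127 dequantizes to almost exactly 3.0, suggesting scale = 3/127 (that clipping-range reading is an inference, not stated in the file). A sketch of the standard affine mapping using the constants:

```python
NOISE_SCALE = float(0.02362204724)
NOISE_ZERO_POINT = int(0)


def dequantize(q, scale=NOISE_SCALE, zero_point=NOISE_ZERO_POINT):
    # Standard affine dequantization: real = scale * (q - zero_point).
    return scale * (q - zero_point)


def quantize(x, scale=NOISE_SCALE, zero_point=NOISE_ZERO_POINT, qmin=-128, qmax=127):
    # Round to the nearest integer code and clamp to the int8 range.
    q = round(x / scale) + zero_point
    return max(qmin, min(qmax, q))


top = dequantize(127)  # close to 3.0, consistent with scale = 3/127
```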
# testpkg/__init__.py (repo: iUnknwn/PythonImportDemo, license: MIT)
__all__ = ["submod"]
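`__all__` controls which names `from testpkg import *` pulls into the importer's namespace. The effect can be demonstrated on a throwaway module so nothing real is imported:

```python
import sys
import types

# Build a throwaway module so "from ... import *" can be demonstrated
# without touching the real testpkg package.
demo = types.ModuleType("demo_pkg")
demo.__all__ = ["visible"]
demo.visible = 1
demo.hidden = 2
sys.modules["demo_pkg"] = demo

namespace = {}
exec("from demo_pkg import *", namespace)
```

Only the names listed in `__all__` cross over; `hidden` stays behind even though it is a public-looking attribute.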
# zebra/models.py (repo: inhersight/django-zebra, license: MIT)
from builtins import object

from django.db import models

from zebra import mixins
from zebra.conf import options


class StripeCustomer(models.Model, mixins.StripeMixin, mixins.StripeCustomerMixin):
    stripe_customer_id = models.CharField(max_length=50, blank=True, null=True)

    class Meta(object):
        abstract = True

    def __str__(self):
        return str(self.stripe_customer_id)


class StripePlan(models.Model, mixins.StripeMixin, mixins.StripePlanMixin):
    stripe_plan_id = models.CharField(max_length=50, blank=True, null=True)

    class Meta(object):
        abstract = True

    def __str__(self):
        return str(self.stripe_plan_id)


class StripeSubscription(models.Model, mixins.StripeMixin, mixins.StripeSubscriptionMixin):
    """
    You need to provide a stripe_customer attribute. See zebra.models for an
    example implementation.
    """

    class Meta(object):
        abstract = True


# Non-abstract classes must be enabled in your project's settings.py
if options.ZEBRA_ENABLE_APP:

    class DatesModelBase(models.Model):
        date_created = models.DateTimeField(auto_now_add=True)
        date_modified = models.DateTimeField(auto_now=True)

        class Meta(object):
            abstract = True

    class Customer(DatesModelBase, StripeCustomer):
        pass

    class Plan(DatesModelBase, StripePlan):
        pass

    class Subscription(DatesModelBase, StripeSubscription):
        customer = models.ForeignKey(Customer, on_delete=models.CASCADE)
        plan = models.ForeignKey(Plan, on_delete=models.CASCADE)

        def __str__(self):
            return "{}: {}".format(self.customer, self.plan)

        @property
        def stripe_customer(self):
            return self.customer.stripe_customer
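Each concrete model above picks up Stripe behavior from a mixin through multiple inheritance: the mixin supplies methods while the model supplies the fields those methods read. A Django-free sketch of that composition (all class names here are hypothetical stand-ins, not the real zebra mixins):

```python
class StripePlanMixinSketch:
    # Behavior-only mixin; assumes the concrete class provides stripe_plan_id.
    def describe(self):
        return "stripe plan {}".format(self.stripe_plan_id)


class ModelBaseSketch:
    # Crude stand-in for models.Model: stores keyword fields as attributes.
    def __init__(self, **fields):
        self.__dict__.update(fields)


class PlanSketch(ModelBaseSketch, StripePlanMixinSketch):
    pass


plan = PlanSketch(stripe_plan_id="plan_123")
```

The MRO puts the base before the mixin, so model attributes win on name clashes while mixin-only methods remain reachable.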
# coolPro/app/module1/test_print_funs.py
# (repo: airwindow/Python-Standard-Project, license: Apache-2.0)
from ..module2.print_funs import print_sth


def test_funs_from_other_modules(s):
    print_sth(s)


if __name__ == "__main__":
    s = 'test in main'
    test_funs_from_other_modules(s)
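The `if __name__ == "__main__":` guard fires only when the file is executed as a script; under an import, `__name__` is the module's own name. The contrast can be shown with `runpy` (script-style execution) versus a plain module namespace (import-style execution):

```python
import os
import runpy
import tempfile
import types

# Script-style: runpy executes the file with __name__ set to "__main__",
# which is exactly what the guard above keys on.
with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
    f.write("captured_name = __name__\n")
    script_path = f.name
as_script = runpy.run_path(script_path, run_name="__main__")
os.unlink(script_path)

# Import-style: a module's namespace carries its own name instead.
mod = types.ModuleType("demo_mod")
exec("captured_name = __name__", mod.__dict__)
```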
# applications/cli/commands/model/model.py
# (repo: starcell/deepcell-ncluster, license: Apache-2.0)
#
# Copyright (c) 2019 Intel Corporation
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
import click
from commands.model.status import status
from commands.model import export
from commands.model.logs import logs
from util.logger import initialize_logger
from util.aliascmd import AliasGroup
from cli_text_consts import ModelCmdTexts as Texts
logger = initialize_logger(__name__)

@click.group(short_help=Texts.HELP, help=Texts.HELP, cls=AliasGroup, alias='mo',
             subcommand_metavar="COMMAND [options] [args]...")
def model():
    pass


model.add_command(export.export)
model.add_command(status)
model.add_command(logs)
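`cls=AliasGroup` with `alias='mo'` lets the short token dispatch to the same `model` group. A click-free sketch of the alias lookup such a group performs (all names here are hypothetical, not the real AliasGroup API):

```python
class AliasRegistrySketch:
    """Minimal stand-in for the alias resolution an AliasGroup performs."""

    def __init__(self):
        self._commands = {}
        self._aliases = {}

    def add_command(self, name, handler, alias=None):
        self._commands[name] = handler
        if alias is not None:
            self._aliases[alias] = name

    def resolve(self, token):
        # Map an alias back to its canonical name before dispatching.
        canonical = self._aliases.get(token, token)
        return self._commands[canonical]


registry = AliasRegistrySketch()
registry.add_command("model", lambda: "model invoked", alias="mo")
```

Both the canonical name and the alias resolve to the same handler, which is the behavior the CLI relies on.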
# 04_Selenium/framework/core/configuration.py
# (repo: twiindan/selenium_lessons, license: Apache-2.0)
webdriver_configuration = 'Firefox'
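A module-level string like this typically feeds a driver factory elsewhere in the framework. A sketch of such a lookup; the factory table and return values are hypothetical placeholders, where a real implementation would call `selenium.webdriver.Firefox()` and so on:

```python
webdriver_configuration = 'Firefox'

# Hypothetical factories; real ones would construct Selenium WebDriver objects.
_DRIVER_FACTORIES = {
    'Firefox': lambda: 'firefox-driver',
    'Chrome': lambda: 'chrome-driver',
}


def make_driver(name=webdriver_configuration):
    try:
        return _DRIVER_FACTORIES[name]()
    except KeyError:
        raise ValueError('unsupported browser: {}'.format(name))
```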
# python_venv/__main__.py (repo: jmknoble/python-venv, license: MIT)
"""Wrapper around `~python_venv.cli`:py:mod:."""
from __future__ import absolute_import
import sys
from . import cli
def main():
    """Provide a generic main entry point."""
    sys.exit(cli.main(*sys.argv))


if __name__ == "__main__":
    main()
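The `sys.exit(cli.main(...))` wrapper turns the CLI function's return value into the process exit status: `sys.exit` raises `SystemExit` carrying that status, which the interpreter converts to the exit code. A self-contained sketch (the CLI function is a hypothetical stand-in for `python_venv.cli.main`):

```python
import sys


def cli_main_sketch(argv):
    # Hypothetical stand-in for python_venv.cli.main: returns an exit status.
    return 0 if argv else 2


def main_sketch(argv=None):
    argv = sys.argv if argv is None else argv
    # sys.exit raises SystemExit carrying the status; the interpreter
    # turns that into the process exit code.
    sys.exit(cli_main_sketch(argv))


try:
    main_sketch(["prog"])
except SystemExit as exc:
    status = exc.code
```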
ed2d59ad7d37d81e7595e8cf9b466a618fec4182 | 5,079 | py | Python | Audio.py | ishine/GST_Tacotron | 0c3d8e51042dc5d49abc842b59a13ea70f927f9d | [
"MIT"
] | 21 | 2020-02-23T03:35:27.000Z | 2021-11-01T11:08:18.000Z | Audio.py | ishine/GST_Tacotron | 0c3d8e51042dc5d49abc842b59a13ea70f927f9d | [
"MIT"
] | 6 | 2020-03-14T15:43:38.000Z | 2021-07-06T09:06:57.000Z | Audio.py | ishine/GST_Tacotron | 0c3d8e51042dc5d49abc842b59a13ea70f927f9d | [
"MIT"
] | 7 | 2020-03-07T11:33:09.000Z | 2021-11-28T16:19:01.000Z | # https://github.com/keithito/tacotron/blob/master/util/audio.py
# https://github.com/carpedm20/multi-speaker-tacotron-tensorflow/blob/master/audio/__init__.py
# I only changed the hparams to usual parameters from the original code.
import numpy as np
from scipy import signal
import librosa.filters
import librosa
def preemphasis(x, preemphasis = 0.97):
return signal.lfilter([1, -preemphasis], [1], x)
def inv_preemphasis(x, preemphasis = 0.97):
return signal.lfilter([1], [1, -preemphasis], x)
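# Illustrative round-trip note (added, not in the original source): preemphasis
# applies the FIR difference y[n] = x[n] - p*x[n-1], and inv_preemphasis applies
# its exact IIR inverse y[n] = x[n] + p*y[n-1]. With lfilter's default zero
# initial conditions the pair is lossless:
#     x = np.linspace(-1.0, 1.0, 32)
#     np.allclose(inv_preemphasis(preemphasis(x)), x)  # -> True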
def spectrogram(y, num_freq, hop_length, win_length, sample_rate, ref_level_db = 20, max_abs_value = None, spectral_subtract= False):
M = _magnitude(y, num_freq, hop_length, win_length, sample_rate, spectral_subtract)
S = _amp_to_db(M) - ref_level_db
return _normalize(S) if max_abs_value is None else _symmetric_normalize(S, max_abs_value= max_abs_value)
def inv_spectrogram(spectrogram, num_freq, hop_length, win_length, sample_rate, ref_level_db = 20, power = 1.5, max_abs_value = None, griffin_lim_iters= 60):
'''Converts spectrogram to waveform using librosa'''
spectrogram = _denormalize(spectrogram) if max_abs_value is None else _symmetric_denormalize(spectrogram, max_abs_value= max_abs_value)
S = _db_to_amp(spectrogram + ref_level_db) # Convert back to linear
return inv_preemphasis(_griffin_lim(S ** power, num_freq, hop_length, win_length, sample_rate, griffin_lim_iters= griffin_lim_iters)) # Reconstruct phase
def melspectrogram(y, num_freq, hop_length, win_length, num_mels, sample_rate, max_abs_value = None, spectral_subtract= False):
M = _magnitude(y, num_freq, hop_length, win_length, sample_rate, spectral_subtract)
S = _amp_to_db(_linear_to_mel(M, num_freq, num_mels, sample_rate))
return _normalize(S) if max_abs_value is None else _symmetric_normalize(S, max_abs_value= max_abs_value)
def spectrogram_and_mel(y, num_freq, hop_length, win_length, sample_rate, spect_ref_level_db = 20, num_mels= 80, max_abs_mels = None, spectral_subtract= False):
M = _magnitude(y, num_freq, hop_length, win_length, sample_rate, spectral_subtract)
spect_S = _normalize(_amp_to_db(M) - spect_ref_level_db)
mel_S = _amp_to_db(_linear_to_mel(M, num_freq, num_mels, sample_rate))
mel_S = _normalize(mel_S) if max_abs_mels is None else _symmetric_normalize(mel_S, max_abs_value= max_abs_mels)
return spect_S, mel_S
def mfcc(y, num_freq, num_mfcc, hop_length, win_length, sample_rate, use_energy= False):
n_fft = (num_freq - 1) * 2
    mfcc_Array = librosa.feature.mfcc(y=y, sr= sample_rate, n_mfcc= num_mfcc + 1, n_fft= n_fft, hop_length= hop_length, win_length= win_length)
mfcc_Array = mfcc_Array[:-1] if use_energy else mfcc_Array[1:]
return mfcc_Array
def _magnitude(y, num_freq, hop_length, win_length, sample_rate, spectral_subtract= False):
D = _stft(preemphasis(y), num_freq, hop_length, win_length, sample_rate)
M = np.abs(D)
if spectral_subtract:
        M = np.clip(M - (np.mean(M, axis= 1, keepdims= True) / 10), a_min= 0.0, a_max= np.inf)
return M
def _griffin_lim(S, num_freq, hop_length, win_length, sample_rate, griffin_lim_iters = 60):
'''librosa implementation of Griffin-Lim
Based on https://github.com/librosa/librosa/issues/434
'''
angles = np.exp(2j * np.pi * np.random.rand(*S.shape))
    S_complex = np.abs(S).astype(complex)
y = _istft(S_complex * angles, num_freq, hop_length, win_length, sample_rate)
for _ in range(griffin_lim_iters):
angles = np.exp(1j * np.angle(_stft(y, num_freq, hop_length, win_length, sample_rate)))
y = _istft(S_complex * angles, num_freq, hop_length, win_length, sample_rate)
return y
def _stft(y, num_freq, hop_length, win_length, sample_rate):
n_fft = (num_freq - 1) * 2
return librosa.stft(y=y, n_fft=n_fft, hop_length=hop_length, win_length=win_length)
def _istft(stft_matrix, num_freq, hop_length, win_length, sample_rate):
    return librosa.istft(stft_matrix, hop_length=hop_length, win_length=win_length)
def _linear_to_mel(spectrogram, num_freq, num_mels, sample_rate):
_mel_basis = _build_mel_basis(num_freq, num_mels, sample_rate)
return np.dot(_mel_basis, spectrogram)
def _build_mel_basis(num_freq, num_mels, sample_rate):
n_fft = (num_freq - 1) * 2
    return librosa.filters.mel(sr=sample_rate, n_fft=n_fft, n_mels=num_mels)
def _amp_to_db(x):
return 20 * np.log10(np.maximum(1e-5, x))
def _db_to_amp(x):
return np.power(10.0, x * 0.05)
def _normalize(S, min_level_db = -100):
return np.clip((S - min_level_db) / -min_level_db, 0, 1)
def _symmetric_normalize(S, min_level_db = -100, max_abs_value = 4):
return np.clip((2 * max_abs_value) * ((S - min_level_db) / (-min_level_db)) - max_abs_value, -max_abs_value, max_abs_value)
def _denormalize(S, min_level_db = -100):
return (np.clip(S, 0, 1) * -min_level_db) + min_level_db
def _symmetric_denormalize(S, min_level_db = -100, max_abs_value = 4):
return ((np.clip(S, -max_abs_value, max_abs_value) + max_abs_value) / (2 * max_abs_value) * -min_level_db) + min_level_db | 49.794118 | 166 | 0.741681 | 848 | 5,079 | 4.058962 | 0.168632 | 0.045322 | 0.100232 | 0.10459 | 0.576119 | 0.567984 | 0.521209 | 0.490122 | 0.480825 | 0.353283 | 0 | 0.017812 | 0.148848 | 5,079 | 102 | 167 | 49.794118 | 0.778395 | 0.079543 | 0 | 0.147059 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.279412 | false | 0 | 0.058824 | 0.132353 | 0.617647 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
ed3056fafc3a40958bd9508ee71576e9021be55b | 80 | py | Python | CCF/CSP/2018/18033u.py | cnsteven/online-judge | 60ee841a97e2bc0dc9c7b23fe5daa186898ab8b7 | [
"MIT"
] | 1 | 2019-05-04T10:28:32.000Z | 2019-05-04T10:28:32.000Z | CCF/CSP/2018/18033u.py | cnsteven/online-judge | 60ee841a97e2bc0dc9c7b23fe5daa186898ab8b7 | [
"MIT"
] | null | null | null | CCF/CSP/2018/18033u.py | cnsteven/online-judge | 60ee841a97e2bc0dc9c7b23fe5daa186898ab8b7 | [
"MIT"
] | 3 | 2020-12-31T04:36:38.000Z | 2021-07-25T07:39:31.000Z | n, m = map(int, input().split())
for _ in range(n):
k, v = input().split()
| 16 | 32 | 0.525 | 14 | 80 | 2.928571 | 0.785714 | 0.487805 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225 | 80 | 4 | 33 | 20 | 0.66129 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
ed30848abdf4720c84665c28ca35bd77c87606fc | 80 | py | Python | wiki/__init__.py | qaisjp/cogs | e9b3094722980722419905aa8af07c77ebb0ea6c | [
"MIT"
] | 2 | 2020-06-04T11:16:39.000Z | 2020-06-27T08:31:57.000Z | wiki/__init__.py | qaisjp/cogs | e9b3094722980722419905aa8af07c77ebb0ea6c | [
"MIT"
] | 2 | 2020-06-04T11:07:02.000Z | 2020-08-03T16:54:36.000Z | wiki/__init__.py | qaisjp/cogs | e9b3094722980722419905aa8af07c77ebb0ea6c | [
"MIT"
] | 2 | 2020-07-08T18:28:08.000Z | 2021-02-06T22:18:05.000Z | from .wiki import wiki
def setup(bot):
cog = wiki(bot)
bot.add_cog(cog) | 16 | 22 | 0.65 | 14 | 80 | 3.642857 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.225 | 80 | 5 | 23 | 16 | 0.822581 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
ed43f931aba6d492b179198fb7f7ca3479b9a1c0 | 576 | py | Python | modules/Music/errors.py | master142009/Mbot-Nextgen | adfe0db5c9503c0a9274a22f05493431d3454c51 | [
"MIT"
] | null | null | null | modules/Music/errors.py | master142009/Mbot-Nextgen | adfe0db5c9503c0a9274a22f05493431d3454c51 | [
"MIT"
] | null | null | null | modules/Music/errors.py | master142009/Mbot-Nextgen | adfe0db5c9503c0a9274a22f05493431d3454c51 | [
"MIT"
] | null | null | null | from nextcord.ext.commands.errors import CheckFailure
class NotConnectedToVoice(CheckFailure):
"""User not connected to any voice channel"""
pass
class PlayerNotConnected(CheckFailure):
"""Player not connected"""
pass
class MustBeSameChannel(CheckFailure):
"""Player and user not in same channel"""
pass
class NothingIsPlaying(CheckFailure):
"""Nothing is playing"""
pass
class NotEnoughSong(CheckFailure):
"""Not enough songs in queue"""
pass
class InvalidLoopMode(CheckFailure):
"""Invalid loop mode"""
pass
| 15.157895 | 53 | 0.699653 | 59 | 576 | 6.830508 | 0.59322 | 0.111663 | 0.079404 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.201389 | 576 | 37 | 54 | 15.567568 | 0.876087 | 0.276042 | 0 | 0.461538 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0.461538 | 0.076923 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
ed46ef7c0ab105918603a68030e445566bd11b27 | 4,459 | py | Python | Locus/settings.py | matbut/Locus | 0a9fc61a27bf2ed06c8c87648f1160365d3d1422 | [
"MIT"
] | 1 | 2019-11-15T09:49:53.000Z | 2019-11-15T09:49:53.000Z | Locus/settings.py | matbut/Locus | 0a9fc61a27bf2ed06c8c87648f1160365d3d1422 | [
"MIT"
] | 11 | 2019-11-04T16:56:39.000Z | 2022-02-11T03:43:39.000Z | Locus/settings.py | matbut/Locus | 0a9fc61a27bf2ed06c8c87648f1160365d3d1422 | [
"MIT"
] | 1 | 2019-11-16T11:13:30.000Z | 2019-11-16T11:13:30.000Z | """
Django settings for Locus project.
Generated by 'django-admin startproject' using Django 2.2.
For more information on this file, see
https://docs.djangoproject.com/en/2.2/topics/settings/
For the full list of settings and their values, see
https://docs.djangoproject.com/en/2.2/ref/settings/
"""
import os
# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/2.2/howto/deployment/checklist/
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = 'ilq$1=zfns0$w5^5cq8dcs0eag%lj%$r+fhsgwa%8fr_-f(+e$'
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
ALLOWED_HOSTS = []
# Application definition
INSTALLED_APPS = [
'django.contrib.admin',
'django.contrib.auth',
'django.contrib.contenttypes',
'django.contrib.sessions',
'django.contrib.messages',
'django.contrib.staticfiles',
'search.apps.SearchConfig',
'twitter.apps.TweetCrawlerConfig',
'searchEngine.apps.GoogleCrawlerOfficialConfig',
'database.apps.DatabaseConfig',
'channels',
'rest_framework',
'compressor',
'compressor_toolkit',
]
MIDDLEWARE = [
'django.middleware.security.SecurityMiddleware',
'django.contrib.sessions.middleware.SessionMiddleware',
'django.middleware.common.CommonMiddleware',
'django.middleware.csrf.CsrfViewMiddleware',
'django.contrib.auth.middleware.AuthenticationMiddleware',
'django.contrib.messages.middleware.MessageMiddleware',
'django.middleware.clickjacking.XFrameOptionsMiddleware',
]
ROOT_URLCONF = 'Locus.urls'
TEMPLATES = [
{
'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [os.path.join(BASE_DIR, 'templates')],
'APP_DIRS': True,
'OPTIONS': {
'context_processors': [
'django.template.context_processors.debug',
'django.template.context_processors.request',
'django.contrib.auth.context_processors.auth',
'django.contrib.messages.context_processors.messages',
],
},
},
]
WSGI_APPLICATION = 'Locus.wsgi.application'
# Database
# https://docs.djangoproject.com/en/2.2/ref/settings/#databases
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'NAME': 'locus_db',
'USER': 'locus',
'PASSWORD': 'locus',
'HOST': 'localhost',
'PORT': '',
}
}
# Password validation
# https://docs.djangoproject.com/en/2.2/ref/settings/#auth-password-validators
AUTH_PASSWORD_VALIDATORS = [
{
'NAME': 'django.contrib.auth.password_validation.UserAttributeSimilarityValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.MinimumLengthValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.CommonPasswordValidator',
},
{
'NAME': 'django.contrib.auth.password_validation.NumericPasswordValidator',
},
]
# Internationalization
# https://docs.djangoproject.com/en/2.2/topics/i18n/
LANGUAGE_CODE = 'en-us'
TIME_ZONE = 'UTC'
USE_I18N = True
USE_L10N = True
USE_TZ = True
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.2/howto/static-files/
STATIC_ROOT = 'static'
STATIC_URL = '/static/'
MEDIA_ROOT = 'media'
MEDIA_URL = '/media/'
STATICFILES_DIRS = [
'/var/www/static/',
os.path.join(BASE_DIR, 'node_modules'),
]
STATICFILES_FINDERS = (
'django.contrib.staticfiles.finders.FileSystemFinder',
'django.contrib.staticfiles.finders.AppDirectoriesFinder',
'compressor.finders.CompressorFinder',
)
COMPRESS_CSS_FILTERS = [
'compressor.filters.css_default.CssAbsoluteFilter',
'compressor.filters.cssmin.CSSMinFilter',
'compressor.filters.template.TemplateFilter'
]
COMPRESS_JS_FILTERS = [
'compressor.filters.jsmin.JSMinFilter',
]
COMPRESS_PRECOMPILERS = (
('module', 'compressor_toolkit.precompilers.ES6Compiler'),
('css', 'compressor_toolkit.precompilers.SCSSCompiler'),
)
COMPRESS_ENABLED = True
CHANNEL_LAYERS = {
'default': {
'BACKEND': 'channels_redis.core.RedisChannelLayer',
'CONFIG': {
'hosts': [('localhost', 6379)]
}
}
}
ASGI_APPLICATION = 'search.routing.application' | 25.924419 | 91 | 0.690289 | 463 | 4,459 | 6.531317 | 0.419006 | 0.073082 | 0.050926 | 0.05787 | 0.167328 | 0.139881 | 0.088294 | 0.088294 | 0.039683 | 0 | 0 | 0.00948 | 0.172012 | 4,459 | 172 | 92 | 25.924419 | 0.809588 | 0.220453 | 0 | 0.017857 | 1 | 0.008929 | 0.565431 | 0.467863 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.053571 | 0.008929 | 0 | 0.008929 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
ed5f71ab4d52d47962b0bf855599d609ee976ee0 | 5,653 | py | Python | src/node.py | wuttinanhi/dumb-lang | 031679717d7ae8dbc5024a676d3c34c684aedc3c | [
"Apache-2.0"
] | null | null | null | src/node.py | wuttinanhi/dumb-lang | 031679717d7ae8dbc5024a676d3c34c684aedc3c | [
"Apache-2.0"
] | null | null | null | src/node.py | wuttinanhi/dumb-lang | 031679717d7ae8dbc5024a676d3c34c684aedc3c | [
"Apache-2.0"
] | null | null | null | from typing import List
class ENodeType:
MODULE = "MODULE"
NUMBER = "NUMBER"
STRING = "STRING"
BINOP = "BINOP"
PRINT = "PRINT"
ASSIGN = "ASSIGN"
PARENTHESES = "PARENTHESES"
IF = "IF"
CONDITION = "CONDITION"
BLOCK = "BLOCK"
class EMathOperation:
ADD = "+"
MINUS = "-"
MULTIPLY = "*"
DIVIDE = "/"
def operation_coverter(operation: str):
if operation == "+":
return EMathOperation.ADD
if operation == "-":
return EMathOperation.MINUS
if operation == "*":
return EMathOperation.MULTIPLY
if operation == "/":
return EMathOperation.DIVIDE
class Node:
def exec(self):
pass
class ModuleNode(Node):
    def __init__(self, body: List[Node] = None) -> None:
        # A fresh list per instance avoids the shared mutable-default pitfall.
        self.__type = ENodeType.MODULE
        self.__body = body if body is not None else []
def add_node(self, node: Node):
self.__body.append(node)
def exec(self):
for node in self.__body:
node.exec()
return
def __str__(self) -> str:
return f"ModuleNode(body=[{self.__body}])"
def __repr__(self) -> str:
return self.__str__()
class NumberNode(Node):
def __init__(self, value) -> None:
self.__type = ENodeType.NUMBER
if isinstance(value, Node):
value = value.exec()
        if str(value).lstrip("-").isdigit():
self.__value = int(value)
else:
self.__value = float(value)
def exec(self):
return self.__value
def __str__(self) -> str:
return f"NumberNode(value={self.__value})"
def __repr__(self) -> str:
return self.__str__()
class StringNode(Node):
def __init__(self, value) -> None:
self.__type = ENodeType.STRING
if isinstance(value, Node):
value = value.exec()
self.__value = value
def exec(self):
return self.__value
def __str__(self) -> str:
return f"StringNode(value={self.__value})"
def __repr__(self) -> str:
return self.__str__()
class BinOpNode(Node):
def __init__(self, operation: EMathOperation, left: Node, right: Node) -> None:
self.__type = ENodeType.BINOP
self.__operation = operation
self.__left = left
self.__right = right
self.__value = None
def exec(self):
left_value = self.__left.exec()
right_value = self.__right.exec()
if self.__operation == EMathOperation.ADD:
self.__value = left_value + right_value
if self.__operation == EMathOperation.MINUS:
self.__value = left_value - right_value
if self.__operation == EMathOperation.MULTIPLY:
self.__value = left_value * right_value
if self.__operation == EMathOperation.DIVIDE:
self.__value = left_value / right_value
return self.__value
def __str__(self) -> str:
return f"BinOpNode(operation={self.__operation}, left={self.__left}, right={self.__right})"
def __repr__(self) -> str:
return self.__str__()
class ParenthesesNode(Node):
def __init__(self, content: Node) -> None:
self.__type = ENodeType.PARENTHESES
self.__content = content
self.__value = None
def exec(self):
self.__value = self.__content.exec()
return self.__value
def __str__(self) -> str:
return f"ParenthesesNode(content={self.__content})"
def __repr__(self) -> str:
return self.__str__()
class BlockStatementNode(Node):
def __init__(self, body: List[Node]) -> None:
self.__type = ENodeType.BLOCK
self.__body = body
def exec(self):
for node in self.__body:
node.exec()
return
def __str__(self) -> str:
return f"BlockStatementNode(body=[{self.__body}])"
def __repr__(self) -> str:
return self.__str__()
class ConditionNode(Node):
def __init__(self, compare: str, left: Node, right: Node) -> None:
self.__type = ENodeType.CONDITION
self.__left = left
self.__right = right
self.__compare = compare
    def exec(self) -> bool:
        # Evaluate both operands and apply the comparison operator.
        comparators = {"==": lambda a, b: a == b, "!=": lambda a, b: a != b,
                       "<": lambda a, b: a < b, ">": lambda a, b: a > b,
                       "<=": lambda a, b: a <= b, ">=": lambda a, b: a >= b}
        return comparators[self.__compare](self.__left.exec(), self.__right.exec())
def __str__(self) -> str:
return f"ConditionNode(compare={self.__compare}, left={self.__left}, right={self.__right})"
def __repr__(self) -> str:
return self.__str__()
class IfNode(Node):
def __init__(self, condition: ConditionNode, body: List[Node]) -> None:
self.__type = ENodeType.IF
self.__condition = condition
self.__body = body
def exec(self):
        if self.__condition.exec():
for node in self.__body:
node.exec()
return
def __str__(self) -> str:
return f"IfNode(condition={self.__condition},body={self.__body})"
def __repr__(self) -> str:
return self.__str__()
class PrintNode(Node):
def __init__(self, value: Node) -> None:
self.__type = ENodeType.PRINT
self.__value = value
def exec(self):
return print(self.__value.exec())
def __str__(self) -> str:
return f"PrintNode(value={self.__value})"
def __repr__(self) -> str:
return self.__str__()
class AssignNode(Node):
def __init__(self, key: str, value: Node) -> None:
self.__type = ENodeType.ASSIGN
self.__key = key
self.__value = value
def exec(self):
# TO DO: assign variable to data store
return
def __str__(self) -> str:
return f"AssignNode(key={self.__key},value={self.__value})"
def __repr__(self) -> str:
return self.__str__()
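# Illustrative usage (added example, not part of the original file):
# evaluate and print (1 + 2) * 3 with the node classes above.
if __name__ == "__main__":
    demo = PrintNode(
        BinOpNode(
            EMathOperation.MULTIPLY,
            ParenthesesNode(BinOpNode(EMathOperation.ADD, NumberNode(1), NumberNode(2))),
            NumberNode(3),
        )
    )
    demo.exec()  # prints 9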
| 24.261803 | 99 | 0.597559 | 630 | 5,653 | 4.87619 | 0.109524 | 0.068359 | 0.084635 | 0.048828 | 0.563802 | 0.550781 | 0.47526 | 0.400716 | 0.36556 | 0.316081 | 0 | 0 | 0.284628 | 5,653 | 232 | 100 | 24.366379 | 0.759644 | 0.009906 | 0 | 0.460606 | 0 | 0 | 0.097962 | 0.077226 | 0 | 0 | 0 | 0 | 0 | 1 | 0.260606 | false | 0.012121 | 0.006061 | 0.145455 | 0.636364 | 0.006061 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
ed617876583e2976806a220709bcf7ed8be09d49 | 187 | py | Python | src/dpfinder/event.py | gr3yknigh1/dpfinder | aad5608f36464d689ad700df5986372b173a07a9 | [
"MIT"
] | null | null | null | src/dpfinder/event.py | gr3yknigh1/dpfinder | aad5608f36464d689ad700df5986372b173a07a9 | [
"MIT"
] | null | null | null | src/dpfinder/event.py | gr3yknigh1/dpfinder | aad5608f36464d689ad700df5986372b173a07a9 | [
"MIT"
] | null | null | null | class event(object):
def __init__(self):
self.__funcs = set()
def invoke(self, *args):
for f in self.__funcs:
f.__call__(*args)
def reg(self, func):
self.__funcs.add(func)
| 15.583333 | 25 | 0.663102 | 29 | 187 | 3.793103 | 0.586207 | 0.245455 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.181818 | 187 | 11 | 26 | 17 | 0.718954 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.375 | false | 0 | 0 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
ed72f43f2ee6200cf3dd4eaf05f8e123b6eb0966 | 56,006 | py | Python | tests/test_elb/test_elb.py | gtourkas/moto | 307104417b579d23d02f670ff55217a2d4a16bee | [
"Apache-2.0"
] | 5,460 | 2015-01-01T01:11:17.000Z | 2022-03-31T23:45:38.000Z | tests/test_elb/test_elb.py | gtourkas/moto | 307104417b579d23d02f670ff55217a2d4a16bee | [
"Apache-2.0"
] | 4,475 | 2015-01-05T19:37:30.000Z | 2022-03-31T13:55:12.000Z | tests/test_elb/test_elb.py | gtourkas/moto | 307104417b579d23d02f670ff55217a2d4a16bee | [
"Apache-2.0"
] | 1,831 | 2015-01-14T00:00:44.000Z | 2022-03-31T20:30:04.000Z | import boto3
import botocore
import boto
import boto.ec2.elb
from boto.ec2.elb import HealthCheck
from boto.ec2.elb.attributes import (
ConnectionSettingAttribute,
ConnectionDrainingAttribute,
AccessLogAttribute,
)
from botocore.exceptions import ClientError
from boto.exception import BotoServerError
import pytest
import sure # noqa # pylint: disable=unused-import
from moto import mock_acm, mock_elb, mock_ec2, mock_elb_deprecated, mock_ec2_deprecated
from tests import EXAMPLE_AMI_ID
from uuid import uuid4
# Has boto3 equivalent
@mock_elb_deprecated
@mock_ec2_deprecated
def test_create_load_balancer():
conn = boto.connect_elb()
ec2 = boto.ec2.connect_to_region("us-east-1")
security_group = ec2.create_security_group("sg-abc987", "description")
zones = ["us-east-1a", "us-east-1b"]
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
conn.create_load_balancer(
"my-lb", zones, ports, scheme="internal", security_groups=[security_group.id]
)
balancers = conn.get_all_load_balancers()
balancer = balancers[0]
balancer.name.should.equal("my-lb")
balancer.scheme.should.equal("internal")
list(balancer.security_groups).should.equal([security_group.id])
set(balancer.availability_zones).should.equal(set(["us-east-1a", "us-east-1b"]))
listener1 = balancer.listeners[0]
listener1.load_balancer_port.should.equal(80)
listener1.instance_port.should.equal(8080)
listener1.protocol.should.equal("HTTP")
listener2 = balancer.listeners[1]
listener2.load_balancer_port.should.equal(443)
listener2.instance_port.should.equal(8443)
listener2.protocol.should.equal("TCP")
@pytest.mark.parametrize("region_name", ["us-east-1", "ap-south-1"])
@pytest.mark.parametrize(
"zones",
[
["us-east-1a"],
["us-east-1a", "us-east-1b"],
["eu-north-1a", "eu-north-1b", "eu-north-1c"],
],
)
@mock_elb
@mock_ec2
def test_create_load_balancer_boto3(zones, region_name):
# Both regions and availability zones are parametrized
# This does not seem to have an effect on the DNS name
client = boto3.client("elb", region_name=region_name)
ec2 = boto3.resource("ec2", region_name=region_name)
security_group = ec2.create_security_group(
GroupName="sg01", Description="Test security group sg01"
)
lb = client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[
{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080},
{"Protocol": "http", "LoadBalancerPort": 81, "InstancePort": 9000},
],
AvailabilityZones=zones,
Scheme="internal",
SecurityGroups=[security_group.id],
)
lb.should.have.key("DNSName").equal("my-lb.us-east-1.elb.amazonaws.com")
describe = client.describe_load_balancers(LoadBalancerNames=["my-lb"])[
"LoadBalancerDescriptions"
][0]
describe.should.have.key("LoadBalancerName").equal("my-lb")
describe.should.have.key("DNSName").equal("my-lb.us-east-1.elb.amazonaws.com")
describe.should.have.key("CanonicalHostedZoneName").equal(
"my-lb.us-east-1.elb.amazonaws.com"
)
describe.should.have.key("AvailabilityZones").equal(zones)
describe.should.have.key("VPCId")
describe.should.have.key("SecurityGroups").equal([security_group.id])
describe.should.have.key("Scheme").equal("internal")
describe.should.have.key("ListenerDescriptions")
describe["ListenerDescriptions"].should.have.length_of(2)
tcp = [
l["Listener"]
for l in describe["ListenerDescriptions"]
if l["Listener"]["Protocol"] == "TCP"
][0]
http = [
l["Listener"]
for l in describe["ListenerDescriptions"]
if l["Listener"]["Protocol"] == "HTTP"
][0]
tcp.should.equal(
{
"Protocol": "TCP",
"LoadBalancerPort": 80,
"InstanceProtocol": "TCP",
"InstancePort": 8080,
"SSLCertificateId": "None",
}
)
http.should.equal(
{
"Protocol": "HTTP",
"LoadBalancerPort": 81,
"InstanceProtocol": "HTTP",
"InstancePort": 9000,
"SSLCertificateId": "None",
}
)
# Has boto3 equivalent
@mock_elb_deprecated
def test_getting_missing_elb():
conn = boto.connect_elb()
conn.get_all_load_balancers.when.called_with(
load_balancer_names="aaa"
).should.throw(BotoServerError)
@mock_elb
def test_get_missing_elb_boto3():
client = boto3.client("elb", region_name="us-west-2")
with pytest.raises(ClientError) as ex:
client.describe_load_balancers(LoadBalancerNames=["unknown-lb"])
err = ex.value.response["Error"]
err["Code"].should.equal("LoadBalancerNotFound")
err["Message"].should.equal(
"The specified load balancer does not exist: unknown-lb"
)
# Has boto3 equivalent
@mock_elb_deprecated
def test_create_elb_in_multiple_region():
zones = ["us-east-1a", "us-east-1b"]
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
west1_conn = boto.ec2.elb.connect_to_region("us-west-1")
west1_conn.create_load_balancer("my-lb", zones, ports)
west2_conn = boto.ec2.elb.connect_to_region("us-west-2")
west2_conn.create_load_balancer("my-lb", zones, ports)
list(west1_conn.get_all_load_balancers()).should.have.length_of(1)
list(west2_conn.get_all_load_balancers()).should.have.length_of(1)
@mock_elb
def test_create_elb_in_multiple_region_boto3():
client_east = boto3.client("elb", region_name="us-east-2")
client_west = boto3.client("elb", region_name="us-west-2")
name_east = str(uuid4())[0:6]
name_west = str(uuid4())[0:6]
client_east.create_load_balancer(
LoadBalancerName=name_east,
Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
client_west.create_load_balancer(
LoadBalancerName=name_west,
Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
east_names = [
lb["LoadBalancerName"]
for lb in client_east.describe_load_balancers()["LoadBalancerDescriptions"]
]
east_names.should.contain(name_east)
east_names.shouldnt.contain(name_west)
west_names = [
lb["LoadBalancerName"]
for lb in client_west.describe_load_balancers()["LoadBalancerDescriptions"]
]
west_names.should.contain(name_west)
west_names.shouldnt.contain(name_east)
@mock_acm
@mock_elb
def test_create_load_balancer_with_certificate_boto3():
acm_client = boto3.client("acm", region_name="us-east-2")
acm_request_response = acm_client.request_certificate(
DomainName="fake.domain.com",
DomainValidationOptions=[
{"DomainName": "fake.domain.com", "ValidationDomain": "domain.com"},
],
)
certificate_arn = acm_request_response["CertificateArn"]
client = boto3.client("elb", region_name="us-east-2")
name = str(uuid4())[0:6]
client.create_load_balancer(
LoadBalancerName=name,
Listeners=[
{
"Protocol": "https",
"LoadBalancerPort": 8443,
"InstancePort": 443,
"SSLCertificateId": certificate_arn,
}
],
AvailabilityZones=["us-east-1a"],
)
describe = client.describe_load_balancers(LoadBalancerNames=[name])[
"LoadBalancerDescriptions"
][0]
describe["Scheme"].should.equal("internet-facing")
listener = describe["ListenerDescriptions"][0]["Listener"]
listener.should.have.key("Protocol").equal("HTTPS")
listener.should.have.key("SSLCertificateId").equals(certificate_arn)
@mock_elb
def test_create_load_balancer_with_invalid_certificate():
client = boto3.client("elb", region_name="us-east-2")
name = str(uuid4())[0:6]
with pytest.raises(ClientError) as exc:
client.create_load_balancer(
LoadBalancerName=name,
Listeners=[
{
"Protocol": "https",
"LoadBalancerPort": 8443,
"InstancePort": 443,
"SSLCertificateId": "invalid_arn",
}
],
AvailabilityZones=["us-east-1a"],
)
err = exc.value.response["Error"]
err["Code"].should.equal("CertificateNotFoundException")
@mock_elb
def test_create_and_delete_boto3_support():
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
list(
client.describe_load_balancers()["LoadBalancerDescriptions"]
).should.have.length_of(1)
client.delete_load_balancer(LoadBalancerName="my-lb")
list(
client.describe_load_balancers()["LoadBalancerDescriptions"]
).should.have.length_of(0)
@mock_elb
def test_create_load_balancer_with_no_listeners_defined():
client = boto3.client("elb", region_name="us-east-1")
with pytest.raises(ClientError):
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
@mock_elb
def test_describe_paginated_balancers():
client = boto3.client("elb", region_name="us-east-1")
for i in range(51):
client.create_load_balancer(
LoadBalancerName="my-lb%d" % i,
Listeners=[
{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}
],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
resp = client.describe_load_balancers()
resp["LoadBalancerDescriptions"].should.have.length_of(50)
resp["NextMarker"].should.equal(
resp["LoadBalancerDescriptions"][-1]["LoadBalancerName"]
)
resp2 = client.describe_load_balancers(Marker=resp["NextMarker"])
resp2["LoadBalancerDescriptions"].should.have.length_of(1)
assert "NextToken" not in resp2.keys()
@mock_elb
@mock_ec2
def test_apply_security_groups_to_load_balancer():
client = boto3.client("elb", region_name="us-east-1")
ec2 = boto3.resource("ec2", region_name="us-east-1")
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
security_group = ec2.create_security_group(
GroupName="sg01", Description="Test security group sg01", VpcId=vpc.id
)
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
response = client.apply_security_groups_to_load_balancer(
LoadBalancerName="my-lb", SecurityGroups=[security_group.id]
)
assert response["SecurityGroups"] == [security_group.id]
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
assert balancer["SecurityGroups"] == [security_group.id]
    # Using a nonexistent security group raises an error
with pytest.raises(ClientError) as error:
response = client.apply_security_groups_to_load_balancer(
LoadBalancerName="my-lb", SecurityGroups=["not-really-a-security-group"]
)
assert "One or more of the specified security groups do not exist." in str(
error.value
)
# Has boto3 equivalent
@mock_elb_deprecated
def test_add_listener():
conn = boto.connect_elb()
zones = ["us-east-1a", "us-east-1b"]
ports = [(80, 8080, "http")]
conn.create_load_balancer("my-lb", zones, ports)
new_listener = (443, 8443, "tcp")
conn.create_load_balancer_listeners("my-lb", [new_listener])
balancers = conn.get_all_load_balancers()
balancer = balancers[0]
listener1 = balancer.listeners[0]
listener1.load_balancer_port.should.equal(80)
listener1.instance_port.should.equal(8080)
listener1.protocol.should.equal("HTTP")
listener2 = balancer.listeners[1]
listener2.load_balancer_port.should.equal(443)
listener2.instance_port.should.equal(8443)
listener2.protocol.should.equal("TCP")
# Has boto3 equivalent
@mock_elb_deprecated
def test_delete_listener():
conn = boto.connect_elb()
zones = ["us-east-1a", "us-east-1b"]
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
conn.create_load_balancer("my-lb", zones, ports)
conn.delete_load_balancer_listeners("my-lb", [443])
balancers = conn.get_all_load_balancers()
balancer = balancers[0]
listener1 = balancer.listeners[0]
listener1.load_balancer_port.should.equal(80)
listener1.instance_port.should.equal(8080)
listener1.protocol.should.equal("HTTP")
balancer.listeners.should.have.length_of(1)
@mock_elb
def test_create_and_delete_listener_boto3_support():
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
list(
client.describe_load_balancers()["LoadBalancerDescriptions"]
).should.have.length_of(1)
client.create_load_balancer_listeners(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 443, "InstancePort": 8443}],
)
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
list(balancer["ListenerDescriptions"]).should.have.length_of(2)
balancer["ListenerDescriptions"][0]["Listener"]["Protocol"].should.equal("HTTP")
balancer["ListenerDescriptions"][0]["Listener"]["LoadBalancerPort"].should.equal(80)
balancer["ListenerDescriptions"][0]["Listener"]["InstancePort"].should.equal(8080)
balancer["ListenerDescriptions"][1]["Listener"]["Protocol"].should.equal("TCP")
balancer["ListenerDescriptions"][1]["Listener"]["LoadBalancerPort"].should.equal(
443
)
balancer["ListenerDescriptions"][1]["Listener"]["InstancePort"].should.equal(8443)
# Creating a listener with a conflicting definition throws an error
with pytest.raises(ClientError):
client.create_load_balancer_listeners(
LoadBalancerName="my-lb",
Listeners=[
{"Protocol": "tcp", "LoadBalancerPort": 443, "InstancePort": 1234}
],
)
client.delete_load_balancer_listeners(
LoadBalancerName="my-lb", LoadBalancerPorts=[443]
)
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
list(balancer["ListenerDescriptions"]).should.have.length_of(1)
@mock_acm
@mock_elb
def test_create_lb_listener_with_ssl_certificate():
acm_client = boto3.client("acm", region_name="eu-west-1")
acm_request_response = acm_client.request_certificate(
DomainName="fake.domain.com",
DomainValidationOptions=[
{"DomainName": "fake.domain.com", "ValidationDomain": "domain.com"},
],
)
certificate_arn = acm_request_response["CertificateArn"]
client = boto3.client("elb", region_name="eu-west-1")
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
client.create_load_balancer_listeners(
LoadBalancerName="my-lb",
Listeners=[
{
"Protocol": "tcp",
"LoadBalancerPort": 443,
"InstancePort": 8443,
"SSLCertificateId": certificate_arn,
}
],
)
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
listeners = balancer["ListenerDescriptions"]
listeners.should.have.length_of(2)
listeners[0]["Listener"]["Protocol"].should.equal("HTTP")
listeners[0]["Listener"]["SSLCertificateId"].should.equal("None")
listeners[1]["Listener"]["Protocol"].should.equal("TCP")
listeners[1]["Listener"]["SSLCertificateId"].should.equal(certificate_arn)
@mock_acm
@mock_elb
def test_create_lb_listener_with_invalid_ssl_certificate():
client = boto3.client("elb", region_name="eu-west-1")
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
with pytest.raises(ClientError) as exc:
client.create_load_balancer_listeners(
LoadBalancerName="my-lb",
Listeners=[
{
"Protocol": "tcp",
"LoadBalancerPort": 443,
"InstancePort": 8443,
"SSLCertificateId": "unknownarn",
}
],
)
err = exc.value.response["Error"]
err["Code"].should.equal("CertificateNotFoundException")
@mock_acm
@mock_elb
def test_set_sslcertificate_boto3():
acm_client = boto3.client("acm", region_name="us-east-1")
acm_request_response = acm_client.request_certificate(
DomainName="fake.domain.com",
DomainValidationOptions=[
{"DomainName": "fake.domain.com", "ValidationDomain": "domain.com"},
],
)
certificate_arn = acm_request_response["CertificateArn"]
client = boto3.client("elb", region_name="us-east-1")
lb_name = str(uuid4())[0:6]
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[
{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080},
{"Protocol": "https", "LoadBalancerPort": 81, "InstancePort": 8081},
],
AvailabilityZones=["us-east-1a"],
)
client.set_load_balancer_listener_ssl_certificate(
LoadBalancerName=lb_name, LoadBalancerPort=81, SSLCertificateId=certificate_arn,
)
elb = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
listener = elb["ListenerDescriptions"][0]["Listener"]
listener.should.have.key("LoadBalancerPort").equals(80)
listener.should.have.key("SSLCertificateId").equals("None")
listener = elb["ListenerDescriptions"][1]["Listener"]
listener.should.have.key("LoadBalancerPort").equals(81)
listener.should.have.key("SSLCertificateId").equals(certificate_arn)
# Has boto3 equivalent
@mock_elb_deprecated
def test_get_load_balancers_by_name():
conn = boto.connect_elb()
zones = ["us-east-1a", "us-east-1b"]
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
conn.create_load_balancer("my-lb1", zones, ports)
conn.create_load_balancer("my-lb2", zones, ports)
conn.create_load_balancer("my-lb3", zones, ports)
conn.get_all_load_balancers().should.have.length_of(3)
conn.get_all_load_balancers(load_balancer_names=["my-lb1"]).should.have.length_of(1)
conn.get_all_load_balancers(
load_balancer_names=["my-lb1", "my-lb2"]
).should.have.length_of(2)
@mock_elb
def test_get_load_balancers_by_name_boto3():
client = boto3.client("elb", region_name="us-east-1")
lb_name1 = str(uuid4())[0:6]
lb_name2 = str(uuid4())[0:6]
client.create_load_balancer(
LoadBalancerName=lb_name1,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
client.create_load_balancer(
LoadBalancerName=lb_name2,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
lbs = client.describe_load_balancers(LoadBalancerNames=[lb_name1])
lbs["LoadBalancerDescriptions"].should.have.length_of(1)
lbs = client.describe_load_balancers(LoadBalancerNames=[lb_name2])
lbs["LoadBalancerDescriptions"].should.have.length_of(1)
lbs = client.describe_load_balancers(LoadBalancerNames=[lb_name1, lb_name2])
lbs["LoadBalancerDescriptions"].should.have.length_of(2)
with pytest.raises(ClientError) as ex:
client.describe_load_balancers(LoadBalancerNames=["unknownlb"])
err = ex.value.response["Error"]
err["Code"].should.equal("LoadBalancerNotFound")
err["Message"].should.equal(
f"The specified load balancer does not exist: unknownlb"
)
with pytest.raises(ClientError) as ex:
client.describe_load_balancers(LoadBalancerNames=[lb_name1, "unknownlb"])
err = ex.value.response["Error"]
err["Code"].should.equal("LoadBalancerNotFound")
# Bug - message sometimes shows the lb that does exist
err["Message"].should.match(f"The specified load balancer does not exist:")
# Has boto3 equivalent
@mock_elb_deprecated
def test_delete_load_balancer():
conn = boto.connect_elb()
zones = ["us-east-1a"]
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
conn.create_load_balancer("my-lb", zones, ports)
balancers = conn.get_all_load_balancers()
balancers.should.have.length_of(1)
conn.delete_load_balancer("my-lb")
balancers = conn.get_all_load_balancers()
balancers.should.have.length_of(0)
@mock_elb
def test_delete_load_balancer_boto3():
client = boto3.client("elb", region_name="us-east-1")
lb_name1 = str(uuid4())[0:6]
lb_name2 = str(uuid4())[0:6]
client.create_load_balancer(
LoadBalancerName=lb_name1,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
client.create_load_balancer(
LoadBalancerName=lb_name2,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
lbs = client.describe_load_balancers()["LoadBalancerDescriptions"]
lb_names = [lb["LoadBalancerName"] for lb in lbs]
lb_names.should.contain(lb_name1)
lb_names.should.contain(lb_name2)
client.delete_load_balancer(LoadBalancerName=lb_name1)
lbs = client.describe_load_balancers()["LoadBalancerDescriptions"]
lb_names = [lb["LoadBalancerName"] for lb in lbs]
lb_names.shouldnt.contain(lb_name1)
lb_names.should.contain(lb_name2)
# Has boto3 equivalent
@mock_elb_deprecated
def test_create_health_check():
conn = boto.connect_elb()
hc = HealthCheck(
interval=20,
healthy_threshold=3,
unhealthy_threshold=5,
target="HTTP:8080/health",
timeout=23,
)
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
lb.configure_health_check(hc)
balancer = conn.get_all_load_balancers()[0]
health_check = balancer.health_check
health_check.interval.should.equal(20)
health_check.healthy_threshold.should.equal(3)
health_check.unhealthy_threshold.should.equal(5)
health_check.target.should.equal("HTTP:8080/health")
health_check.timeout.should.equal(23)
@mock_elb
def test_create_health_check_boto3():
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
client.configure_health_check(
LoadBalancerName="my-lb",
HealthCheck={
"Target": "HTTP:8080/health",
"Interval": 20,
"Timeout": 23,
"HealthyThreshold": 3,
"UnhealthyThreshold": 5,
},
)
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
balancer["HealthCheck"]["Target"].should.equal("HTTP:8080/health")
balancer["HealthCheck"]["Interval"].should.equal(20)
balancer["HealthCheck"]["Timeout"].should.equal(23)
balancer["HealthCheck"]["HealthyThreshold"].should.equal(3)
balancer["HealthCheck"]["UnhealthyThreshold"].should.equal(5)
# Has boto3 equivalent
@mock_ec2_deprecated
@mock_elb_deprecated
def test_register_instances():
ec2_conn = boto.connect_ec2()
reservation = ec2_conn.run_instances(EXAMPLE_AMI_ID, 2)
instance_id1 = reservation.instances[0].id
instance_id2 = reservation.instances[1].id
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
lb.register_instances([instance_id1, instance_id2])
balancer = conn.get_all_load_balancers()[0]
instance_ids = [instance.id for instance in balancer.instances]
set(instance_ids).should.equal({instance_id1, instance_id2})
@mock_ec2
@mock_elb
def test_register_instances_boto3():
ec2 = boto3.resource("ec2", region_name="us-east-1")
response = ec2.create_instances(ImageId=EXAMPLE_AMI_ID, MinCount=2, MaxCount=2)
instance_id1 = response[0].id
instance_id2 = response[1].id
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
client.register_instances_with_load_balancer(
LoadBalancerName="my-lb",
Instances=[{"InstanceId": instance_id1}, {"InstanceId": instance_id2}],
)
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
instance_ids = [instance["InstanceId"] for instance in balancer["Instances"]]
set(instance_ids).should.equal({instance_id1, instance_id2})
# Has boto3 equivalent
@mock_ec2_deprecated
@mock_elb_deprecated
def test_deregister_instances():
ec2_conn = boto.connect_ec2()
reservation = ec2_conn.run_instances(EXAMPLE_AMI_ID, 2)
instance_id1 = reservation.instances[0].id
instance_id2 = reservation.instances[1].id
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
lb.register_instances([instance_id1, instance_id2])
balancer = conn.get_all_load_balancers()[0]
balancer.instances.should.have.length_of(2)
balancer.deregister_instances([instance_id1])
balancer.instances.should.have.length_of(1)
balancer.instances[0].id.should.equal(instance_id2)
@mock_ec2
@mock_elb
def test_deregister_instances_boto3():
ec2 = boto3.resource("ec2", region_name="us-east-1")
response = ec2.create_instances(ImageId=EXAMPLE_AMI_ID, MinCount=2, MaxCount=2)
instance_id1 = response[0].id
instance_id2 = response[1].id
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName="my-lb",
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a", "us-east-1b"],
)
client.register_instances_with_load_balancer(
LoadBalancerName="my-lb",
Instances=[{"InstanceId": instance_id1}, {"InstanceId": instance_id2}],
)
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
balancer["Instances"].should.have.length_of(2)
client.deregister_instances_from_load_balancer(
LoadBalancerName="my-lb", Instances=[{"InstanceId": instance_id1}]
)
balancer = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
balancer["Instances"].should.have.length_of(1)
balancer["Instances"][0]["InstanceId"].should.equal(instance_id2)
# Has boto3 equivalent
@mock_elb_deprecated
def test_default_attributes():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
attributes = lb.get_attributes()
attributes.cross_zone_load_balancing.enabled.should.be.false
attributes.connection_draining.enabled.should.be.false
attributes.access_log.enabled.should.be.false
attributes.connecting_settings.idle_timeout.should.equal(60)
@mock_elb
def test_default_attributes_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
attributes = client.describe_load_balancer_attributes(LoadBalancerName=lb_name)[
"LoadBalancerAttributes"
]
attributes.should.have.key("CrossZoneLoadBalancing").equal({"Enabled": False})
attributes.should.have.key("AccessLog").equal({"Enabled": False})
attributes.should.have.key("ConnectionDraining").equal({"Enabled": False})
attributes.should.have.key("ConnectionSettings").equal({"IdleTimeout": 60})
# Has boto3 equivalent
@mock_elb_deprecated
def test_cross_zone_load_balancing_attribute():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
conn.modify_lb_attribute("my-lb", "CrossZoneLoadBalancing", True)
attributes = lb.get_attributes(force=True)
attributes.cross_zone_load_balancing.enabled.should.be.true
conn.modify_lb_attribute("my-lb", "CrossZoneLoadBalancing", False)
attributes = lb.get_attributes(force=True)
attributes.cross_zone_load_balancing.enabled.should.be.false
@mock_elb
def test_cross_zone_load_balancing_attribute_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
client.modify_load_balancer_attributes(
LoadBalancerName=lb_name,
LoadBalancerAttributes={"CrossZoneLoadBalancing": {"Enabled": True}},
)
attributes = client.describe_load_balancer_attributes(LoadBalancerName=lb_name)[
"LoadBalancerAttributes"
]
# Bug: This property is not properly propagated
attributes.should.have.key("CrossZoneLoadBalancing").equal({"Enabled": False})
attributes.should.have.key("AccessLog").equal({"Enabled": False})
attributes.should.have.key("ConnectionDraining").equal({"Enabled": False})
attributes.should.have.key("ConnectionSettings").equal({"IdleTimeout": 60})
client.modify_load_balancer_attributes(
LoadBalancerName=lb_name,
LoadBalancerAttributes={"CrossZoneLoadBalancing": {"Enabled": False}},
)
attributes = client.describe_load_balancer_attributes(LoadBalancerName=lb_name)[
"LoadBalancerAttributes"
]
attributes.should.have.key("CrossZoneLoadBalancing").equal({"Enabled": False})
# Has boto3 equivalent
@mock_elb_deprecated
def test_connection_draining_attribute():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
connection_draining = ConnectionDrainingAttribute()
connection_draining.enabled = True
connection_draining.timeout = 60
conn.modify_lb_attribute("my-lb", "ConnectionDraining", connection_draining)
attributes = lb.get_attributes(force=True)
attributes.connection_draining.enabled.should.be.true
attributes.connection_draining.timeout.should.equal(60)
connection_draining.timeout = 30
conn.modify_lb_attribute("my-lb", "ConnectionDraining", connection_draining)
attributes = lb.get_attributes(force=True)
attributes.connection_draining.timeout.should.equal(30)
connection_draining.enabled = False
conn.modify_lb_attribute("my-lb", "ConnectionDraining", connection_draining)
attributes = lb.get_attributes(force=True)
attributes.connection_draining.enabled.should.be.false
@mock_elb
def test_connection_draining_attribute_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
client.modify_load_balancer_attributes(
LoadBalancerName=lb_name,
LoadBalancerAttributes={"ConnectionDraining": {"Enabled": True, "Timeout": 42}},
)
attributes = client.describe_load_balancer_attributes(LoadBalancerName=lb_name)[
"LoadBalancerAttributes"
]
attributes.should.have.key("ConnectionDraining").equal(
{"Enabled": True, "Timeout": 42}
)
client.modify_load_balancer_attributes(
LoadBalancerName=lb_name,
LoadBalancerAttributes={"ConnectionDraining": {"Enabled": False}},
)
attributes = client.describe_load_balancer_attributes(LoadBalancerName=lb_name)[
"LoadBalancerAttributes"
]
attributes.should.have.key("ConnectionDraining").equal({"Enabled": False})
# This does not work in Boto3, so we can't write an equivalent test
# Moto always looks for attribute 's3_bucket_name', but Boto3 sends 'S3BucketName'
# We'll need to rewrite this feature completely anyway, to get rid of the boto-objects
@mock_elb_deprecated
def test_access_log_attribute():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
access_log = AccessLogAttribute()
access_log.enabled = True
access_log.s3_bucket_name = "bucket"
access_log.s3_bucket_prefix = "prefix"
access_log.emit_interval = 60
conn.modify_lb_attribute("my-lb", "AccessLog", access_log)
attributes = lb.get_attributes(force=True)
attributes.access_log.enabled.should.be.true
attributes.access_log.s3_bucket_name.should.equal("bucket")
attributes.access_log.s3_bucket_prefix.should.equal("prefix")
attributes.access_log.emit_interval.should.equal(60)
access_log.enabled = False
conn.modify_lb_attribute("my-lb", "AccessLog", access_log)
attributes = lb.get_attributes(force=True)
attributes.access_log.enabled.should.be.false
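# A Boto3 equivalent would presumably take the shape below once moto maps the
# camel-case attribute keys. Hedged sketch only — kept commented out because,
# as noted above, moto currently looks for 's3_bucket_name' rather than
# 'S3BucketName':
#
# client.modify_load_balancer_attributes(
#     LoadBalancerName="my-lb",
#     LoadBalancerAttributes={
#         "AccessLog": {
#             "Enabled": True,
#             "S3BucketName": "bucket",
#             "S3BucketPrefix": "prefix",
#             "EmitInterval": 60,
#         }
#     },
# )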
# This does not work in Boto3, so we can't write an equivalent test
# Moto always looks for attribute 'idle_timeout', but Boto3 sends 'IdleTimeout'
# We'll need to rewrite this feature completely anyway, to get rid of the boto-objects
@mock_elb_deprecated
def test_connection_settings_attribute():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
connection_settings = ConnectionSettingAttribute(conn)
connection_settings.idle_timeout = 120
conn.modify_lb_attribute("my-lb", "ConnectingSettings", connection_settings)
attributes = lb.get_attributes(force=True)
attributes.connecting_settings.idle_timeout.should.equal(120)
connection_settings.idle_timeout = 60
conn.modify_lb_attribute("my-lb", "ConnectingSettings", connection_settings)
attributes = lb.get_attributes(force=True)
attributes.connecting_settings.idle_timeout.should.equal(60)
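# A Boto3 equivalent would presumably take the shape below once moto maps the
# camel-case attribute keys. Hedged sketch only — kept commented out because,
# as noted above, moto currently looks for 'idle_timeout' rather than
# 'IdleTimeout':
#
# client.modify_load_balancer_attributes(
#     LoadBalancerName="my-lb",
#     LoadBalancerAttributes={"ConnectionSettings": {"IdleTimeout": 120}},
# )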
# Has boto3 equivalent
@mock_elb_deprecated
def test_create_lb_cookie_stickiness_policy():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
cookie_expiration_period = 60
policy_name = "LBCookieStickinessPolicy"
lb.create_cookie_stickiness_policy(cookie_expiration_period, policy_name)
lb = conn.get_all_load_balancers()[0]
# There appears to be a quirk in boto whereby it returns a unicode
# string for cookie_expiration_period, despite the documentation
# stating that it is a long numeric.
#
# To work around that, the value is converted to an int before checking.
cookie_expiration_period_response_str = lb.policies.lb_cookie_stickiness_policies[
0
].cookie_expiration_period
int(cookie_expiration_period_response_str).should.equal(cookie_expiration_period)
lb.policies.lb_cookie_stickiness_policies[0].policy_name.should.equal(policy_name)
@mock_elb
def test_create_lb_cookie_stickiness_policy_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
lbc_policies = balancer["Policies"]["LBCookieStickinessPolicies"]
lbc_policies.should.have.length_of(0)
client.create_lb_cookie_stickiness_policy(
LoadBalancerName=lb_name, PolicyName="pname", CookieExpirationPeriod=42
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
policies = balancer["Policies"]
lbc_policies = policies["LBCookieStickinessPolicies"]
lbc_policies.should.have.length_of(1)
lbc_policies[0].should.equal({"PolicyName": "pname", "CookieExpirationPeriod": 42})
# Has boto3 equivalent
@mock_elb_deprecated
def test_create_lb_cookie_stickiness_policy_no_expiry():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
policy_name = "LBCookieStickinessPolicy"
lb.create_cookie_stickiness_policy(None, policy_name)
lb = conn.get_all_load_balancers()[0]
lb.policies.lb_cookie_stickiness_policies[0].cookie_expiration_period.should.be.none
lb.policies.lb_cookie_stickiness_policies[0].policy_name.should.equal(policy_name)
@mock_elb
def test_create_lb_cookie_stickiness_policy_no_expiry_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
lbc_policies = balancer["Policies"]["LBCookieStickinessPolicies"]
lbc_policies.should.have.length_of(0)
client.create_lb_cookie_stickiness_policy(
LoadBalancerName=lb_name, PolicyName="pname"
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
policies = balancer["Policies"]
lbc_policies = policies["LBCookieStickinessPolicies"]
lbc_policies.should.have.length_of(1)
lbc_policies[0].should.equal({"PolicyName": "pname"})
# Has boto3 equivalent
@mock_elb_deprecated
def test_create_app_cookie_stickiness_policy():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
cookie_name = "my-stickiness-policy"
policy_name = "AppCookieStickinessPolicy"
lb.create_app_cookie_stickiness_policy(cookie_name, policy_name)
lb = conn.get_all_load_balancers()[0]
lb.policies.app_cookie_stickiness_policies[0].cookie_name.should.equal(cookie_name)
lb.policies.app_cookie_stickiness_policies[0].policy_name.should.equal(policy_name)
@mock_elb
def test_create_app_cookie_stickiness_policy_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
lbc_policies = balancer["Policies"]["AppCookieStickinessPolicies"]
lbc_policies.should.have.length_of(0)
client.create_app_cookie_stickiness_policy(
LoadBalancerName=lb_name, PolicyName="pname", CookieName="cname"
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
policies = balancer["Policies"]
lbc_policies = policies["AppCookieStickinessPolicies"]
lbc_policies.should.have.length_of(1)
lbc_policies[0].should.equal({"CookieName": "cname", "PolicyName": "pname"})
# Has boto3 equivalent
@mock_elb_deprecated
def test_create_lb_policy():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
policy_name = "ProxyPolicy"
lb.create_lb_policy(policy_name, "ProxyProtocolPolicyType", {"ProxyProtocol": True})
lb = conn.get_all_load_balancers()[0]
lb.policies.other_policies[0].policy_name.should.equal(policy_name)
@mock_elb
def test_create_lb_policy_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080}],
AvailabilityZones=["us-east-1a"],
)
client.create_load_balancer_policy(
LoadBalancerName=lb_name,
PolicyName="ProxyPolicy",
PolicyTypeName="ProxyProtocolPolicyType",
PolicyAttributes=[
{"AttributeName": "ProxyProtocol", "AttributeValue": "true",},
],
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
policies = balancer["Policies"]
policies.should.have.key("OtherPolicies").equal(["ProxyPolicy"])
# Has boto3 equivalent
@mock_elb_deprecated
def test_set_policies_of_listener():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
listener_port = 80
policy_name = "my-stickiness-policy"
# boto docs currently state that zero or one policy may be associated
# with a given listener.
# In a real flow, it is necessary first to create a policy,
# then to attach that policy to the listener.
lb.create_cookie_stickiness_policy(None, policy_name)
lb.set_policies_of_listener(listener_port, [policy_name])
lb = conn.get_all_load_balancers()[0]
listener = lb.listeners[0]
listener.load_balancer_port.should.equal(listener_port)
# in contrast to a backend, a listener stores only policy name strings
listener.policy_names[0].should.equal(policy_name)
@mock_elb
def test_set_policies_of_listener_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[
{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080},
{"Protocol": "https", "LoadBalancerPort": 81, "InstancePort": 8081},
],
AvailabilityZones=["us-east-1a"],
)
client.create_app_cookie_stickiness_policy(
LoadBalancerName=lb_name, PolicyName="pname", CookieName="cname"
)
client.set_load_balancer_policies_of_listener(
LoadBalancerName=lb_name, LoadBalancerPort=81, PolicyNames=["pname"]
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
http_l = [
ld
for ld in balancer["ListenerDescriptions"]
if ld["Listener"]["Protocol"] == "HTTP"
][0]
http_l.should.have.key("PolicyNames").equal([])
https_l = [
ld
for ld in balancer["ListenerDescriptions"]
if ld["Listener"]["Protocol"] == "HTTPS"
][0]
https_l.should.have.key("PolicyNames").equal(["pname"])
# Has boto3 equivalent
@mock_elb_deprecated
def test_set_policies_of_backend_server():
conn = boto.connect_elb()
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", [], ports)
instance_port = 8080
policy_name = "ProxyPolicy"
# in a real flow, it is necessary first to create a policy,
# then to attach that policy to the backend
lb.create_lb_policy(policy_name, "ProxyProtocolPolicyType", {"ProxyProtocol": True})
lb.set_policies_of_backend_server(instance_port, [policy_name])
lb = conn.get_all_load_balancers()[0]
backend = lb.backends[0]
backend.instance_port.should.equal(instance_port)
# in contrast to a listener, a backend stores OtherPolicy objects
backend.policies[0].policy_name.should.equal(policy_name)
@mock_elb
def test_set_policies_of_backend_server_boto3():
lb_name = str(uuid4())[0:6]
client = boto3.client("elb", region_name="us-east-1")
client.create_load_balancer(
LoadBalancerName=lb_name,
Listeners=[
{"Protocol": "http", "LoadBalancerPort": 80, "InstancePort": 8080},
{"Protocol": "https", "LoadBalancerPort": 81, "InstancePort": 8081},
],
AvailabilityZones=["us-east-1a"],
)
client.create_app_cookie_stickiness_policy(
LoadBalancerName=lb_name, PolicyName="pname", CookieName="cname"
)
client.set_load_balancer_policies_for_backend_server(
LoadBalancerName=lb_name, InstancePort=8081, PolicyNames=["pname"]
)
balancer = client.describe_load_balancers(LoadBalancerNames=[lb_name])[
"LoadBalancerDescriptions"
][0]
balancer.should.have.key("BackendServerDescriptions")
desc = balancer["BackendServerDescriptions"]
desc.should.have.length_of(1)
desc[0].should.equal({"InstancePort": 8081, "PolicyNames": ["pname"]})
# Has boto3 equivalent
@mock_ec2_deprecated
@mock_elb_deprecated
def test_describe_instance_health():
ec2_conn = boto.connect_ec2()
reservation = ec2_conn.run_instances(EXAMPLE_AMI_ID, 2)
instance_id1 = reservation.instances[0].id
instance_id2 = reservation.instances[1].id
conn = boto.connect_elb()
zones = ["us-east-1a", "us-east-1b"]
ports = [(80, 8080, "http"), (443, 8443, "tcp")]
lb = conn.create_load_balancer("my-lb", zones, ports)
instances_health = conn.describe_instance_health("my-lb")
instances_health.should.be.empty
lb.register_instances([instance_id1, instance_id2])
instances_health = conn.describe_instance_health("my-lb")
instances_health.should.have.length_of(2)
for instance_health in instances_health:
instance_health.instance_id.should.be.within([instance_id1, instance_id2])
instance_health.state.should.equal("InService")
instances_health = conn.describe_instance_health("my-lb", [instance_id1])
instances_health.should.have.length_of(1)
instances_health[0].instance_id.should.equal(instance_id1)
instances_health[0].state.should.equal("InService")
@mock_ec2
@mock_elb
def test_describe_instance_health_boto3():
elb = boto3.client("elb", region_name="us-east-1")
ec2 = boto3.client("ec2", region_name="us-east-1")
instances = ec2.run_instances(ImageId=EXAMPLE_AMI_ID, MinCount=2, MaxCount=2)[
"Instances"
]
lb_name = "my_load_balancer"
elb.create_load_balancer(
Listeners=[{"InstancePort": 80, "LoadBalancerPort": 8080, "Protocol": "HTTP"}],
LoadBalancerName=lb_name,
)
elb.register_instances_with_load_balancer(
LoadBalancerName=lb_name, Instances=[{"InstanceId": instances[0]["InstanceId"]}]
)
instances_health = elb.describe_instance_health(
LoadBalancerName=lb_name,
Instances=[{"InstanceId": instance["InstanceId"]} for instance in instances],
)
instances_health["InstanceStates"].should.have.length_of(2)
instances_health["InstanceStates"][0]["InstanceId"].should.equal(
instances[0]["InstanceId"]
)
instances_health["InstanceStates"][0]["State"].should.equal("InService")
instances_health["InstanceStates"][1]["InstanceId"].should.equal(
instances[1]["InstanceId"]
)
instances_health["InstanceStates"][1]["State"].should.equal("Unknown")

@mock_elb
def test_add_remove_tags():
    client = boto3.client("elb", region_name="us-east-1")

    client.add_tags.when.called_with(
        LoadBalancerNames=["my-lb"], Tags=[{"Key": "a", "Value": "b"}]
    ).should.throw(botocore.exceptions.ClientError)

    client.create_load_balancer(
        LoadBalancerName="my-lb",
        Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
        AvailabilityZones=["us-east-1a", "us-east-1b"],
    )

    list(
        client.describe_load_balancers()["LoadBalancerDescriptions"]
    ).should.have.length_of(1)

    client.add_tags(LoadBalancerNames=["my-lb"], Tags=[{"Key": "a", "Value": "b"}])

    tags = dict(
        [
            (d["Key"], d["Value"])
            for d in client.describe_tags(LoadBalancerNames=["my-lb"])[
                "TagDescriptions"
            ][0]["Tags"]
        ]
    )
    tags.should.have.key("a").which.should.equal("b")

    client.add_tags(
        LoadBalancerNames=["my-lb"],
        Tags=[
            {"Key": "a", "Value": "b"},
            {"Key": "b", "Value": "b"},
            {"Key": "c", "Value": "b"},
            {"Key": "d", "Value": "b"},
            {"Key": "e", "Value": "b"},
            {"Key": "f", "Value": "b"},
            {"Key": "g", "Value": "b"},
            {"Key": "h", "Value": "b"},
            {"Key": "i", "Value": "b"},
            {"Key": "j", "Value": "b"},
        ],
    )

    client.add_tags.when.called_with(
        LoadBalancerNames=["my-lb"], Tags=[{"Key": "k", "Value": "b"}]
    ).should.throw(botocore.exceptions.ClientError)

    client.add_tags(LoadBalancerNames=["my-lb"], Tags=[{"Key": "j", "Value": "c"}])

    tags = dict(
        [
            (d["Key"], d["Value"])
            for d in client.describe_tags(LoadBalancerNames=["my-lb"])[
                "TagDescriptions"
            ][0]["Tags"]
        ]
    )

    tags.should.have.key("a").which.should.equal("b")
    tags.should.have.key("b").which.should.equal("b")
    tags.should.have.key("c").which.should.equal("b")
    tags.should.have.key("d").which.should.equal("b")
    tags.should.have.key("e").which.should.equal("b")
    tags.should.have.key("f").which.should.equal("b")
    tags.should.have.key("g").which.should.equal("b")
    tags.should.have.key("h").which.should.equal("b")
    tags.should.have.key("i").which.should.equal("b")
    tags.should.have.key("j").which.should.equal("c")
    tags.shouldnt.have.key("k")

    client.remove_tags(LoadBalancerNames=["my-lb"], Tags=[{"Key": "a"}])

    tags = dict(
        [
            (d["Key"], d["Value"])
            for d in client.describe_tags(LoadBalancerNames=["my-lb"])[
                "TagDescriptions"
            ][0]["Tags"]
        ]
    )

    tags.shouldnt.have.key("a")
    tags.should.have.key("b").which.should.equal("b")
    tags.should.have.key("c").which.should.equal("b")
    tags.should.have.key("d").which.should.equal("b")
    tags.should.have.key("e").which.should.equal("b")
    tags.should.have.key("f").which.should.equal("b")
    tags.should.have.key("g").which.should.equal("b")
    tags.should.have.key("h").which.should.equal("b")
    tags.should.have.key("i").which.should.equal("b")
    tags.should.have.key("j").which.should.equal("c")

    client.create_load_balancer(
        LoadBalancerName="other-lb",
        Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 433, "InstancePort": 8433}],
        AvailabilityZones=["us-east-1a", "us-east-1b"],
    )

    client.add_tags(
        LoadBalancerNames=["other-lb"], Tags=[{"Key": "other", "Value": "something"}]
    )

    lb_tags = dict(
        [
            (l["LoadBalancerName"], dict([(d["Key"], d["Value"]) for d in l["Tags"]]))
            for l in client.describe_tags(LoadBalancerNames=["my-lb", "other-lb"])[
                "TagDescriptions"
            ]
        ]
    )

    lb_tags.should.have.key("my-lb")
    lb_tags.should.have.key("other-lb")

    lb_tags["my-lb"].shouldnt.have.key("other")
    lb_tags["other-lb"].should.have.key("other").which.should.equal("something")

@mock_elb
def test_create_with_tags():
    client = boto3.client("elb", region_name="us-east-1")

    client.create_load_balancer(
        LoadBalancerName="my-lb",
        Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
        AvailabilityZones=["us-east-1a", "us-east-1b"],
        Tags=[{"Key": "k", "Value": "v"}],
    )

    tags = dict(
        (d["Key"], d["Value"])
        for d in client.describe_tags(LoadBalancerNames=["my-lb"])["TagDescriptions"][
            0
        ]["Tags"]
    )
    tags.should.have.key("k").which.should.equal("v")

@mock_elb
def test_modify_attributes():
    client = boto3.client("elb", region_name="us-east-1")

    client.create_load_balancer(
        LoadBalancerName="my-lb",
        Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
        AvailabilityZones=["us-east-1a", "us-east-1b"],
    )

    # Default ConnectionDraining timeout of 300 seconds
    client.modify_load_balancer_attributes(
        LoadBalancerName="my-lb",
        LoadBalancerAttributes={"ConnectionDraining": {"Enabled": True}},
    )
    lb_attrs = client.describe_load_balancer_attributes(LoadBalancerName="my-lb")
    lb_attrs["LoadBalancerAttributes"]["ConnectionDraining"]["Enabled"].should.equal(
        True
    )
    lb_attrs["LoadBalancerAttributes"]["ConnectionDraining"]["Timeout"].should.equal(
        300
    )

    # specify a custom ConnectionDraining timeout
    client.modify_load_balancer_attributes(
        LoadBalancerName="my-lb",
        LoadBalancerAttributes={"ConnectionDraining": {"Enabled": True, "Timeout": 45}},
    )
    lb_attrs = client.describe_load_balancer_attributes(LoadBalancerName="my-lb")
    lb_attrs["LoadBalancerAttributes"]["ConnectionDraining"]["Enabled"].should.equal(
        True
    )
    lb_attrs["LoadBalancerAttributes"]["ConnectionDraining"]["Timeout"].should.equal(45)

@mock_ec2
@mock_elb
def test_subnets():
    ec2 = boto3.resource("ec2", region_name="us-east-1")
    vpc = ec2.create_vpc(CidrBlock="172.28.7.0/24", InstanceTenancy="default")
    subnet = ec2.create_subnet(VpcId=vpc.id, CidrBlock="172.28.7.192/26")
    client = boto3.client("elb", region_name="us-east-1")
    client.create_load_balancer(
        LoadBalancerName="my-lb",
        Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
        Subnets=[subnet.id],
    )

    lb = client.describe_load_balancers()["LoadBalancerDescriptions"][0]
    lb.should.have.key("Subnets").which.should.have.length_of(1)
    lb["Subnets"][0].should.equal(subnet.id)

    lb.should.have.key("VPCId").which.should.equal(vpc.id)

@mock_elb_deprecated
def test_create_load_balancer_duplicate():
    conn = boto.connect_elb()
    ports = [(80, 8080, "http"), (443, 8443, "tcp")]
    conn.create_load_balancer("my-lb", [], ports)
    conn.create_load_balancer.when.called_with("my-lb", [], ports).should.throw(
        BotoServerError
    )

@mock_elb
def test_create_load_balancer_duplicate_boto3():
    lb_name = str(uuid4())[0:6]
    client = boto3.client("elb", region_name="us-east-1")
    client.create_load_balancer(
        LoadBalancerName=lb_name,
        Listeners=[{"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}],
    )

    with pytest.raises(ClientError) as ex:
        client.create_load_balancer(
            LoadBalancerName=lb_name,
            Listeners=[
                {"Protocol": "tcp", "LoadBalancerPort": 80, "InstancePort": 8080}
            ],
        )
    err = ex.value.response["Error"]
    err["Code"].should.equal("DuplicateLoadBalancerName")
    err["Message"].should.equal(
        f"The specified load balancer name already exists for this account: {lb_name}"
    )
| 35.157564 | 88 | 0.67932 | 6,507 | 56,006 | 5.635469 | 0.065929 | 0.039269 | 0.037306 | 0.025525 | 0.796864 | 0.753232 | 0.717889 | 0.677202 | 0.635888 | 0.604581 | 0 | 0.028624 | 0.177838 | 56,006 | 1,592 | 89 | 35.179648 | 0.767754 | 0.035353 | 0 | 0.553429 | 0 | 0 | 0.175949 | 0.033237 | 0 | 0 | 0 | 0 | 0.00319 | 1 | 0.044657 | false | 0 | 0.010367 | 0 | 0.055024 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
ed7f9248c5637fd780d0e0142872b5d31c037345 | 158 | py | Python | app/api/routes.py | jampzs/fastapi-base-project | ad2d0d8438af3d3a844266dade90fa58f0025098 | [
"MIT"
] | null | null | null | app/api/routes.py | jampzs/fastapi-base-project | ad2d0d8438af3d3a844266dade90fa58f0025098 | [
"MIT"
] | null | null | null | app/api/routes.py | jampzs/fastapi-base-project | ad2d0d8438af3d3a844266dade90fa58f0025098 | [
"MIT"
] | null | null | null | from fastapi import APIRouter
from app.api.endpoints import initial
# Routes for our app
api_router = APIRouter()
api_router.include_router(initial.router)
| 19.75 | 41 | 0.816456 | 23 | 158 | 5.478261 | 0.565217 | 0.095238 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.120253 | 158 | 7 | 42 | 22.571429 | 0.906475 | 0.113924 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
71e9bae776cb98760acf578c03a263dbd2203e26 | 2,386 | py | Python | src/encoding_jobs.py | ValentinMastro/av1_split_encode | 27fcec1cfc92701c3fa37b6d6470f18422f2819c | [
"MIT"
] | null | null | null | src/encoding_jobs.py | ValentinMastro/av1_split_encode | 27fcec1cfc92701c3fa37b6d6470f18422f2819c | [
"MIT"
] | 1 | 2022-01-21T06:35:49.000Z | 2022-01-25T12:29:37.000Z | src/encoding_jobs.py | ValentinMastro/av1_split_encode | 27fcec1cfc92701c3fa37b6d6470f18422f2819c | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
from os.path import join, split
from os import makedirs, system
from sys import maxsize
def check_sub_directories(list_of_fullpath):
for path in list_of_fullpath:
head, tail = split(path)
makedirs(head, exist_ok = True)
class Encoding_job:
def __init__(self, parameters, split = None):
self.parameters = parameters
self.split = split
def get_type(self):
return "Abstract"
def get_number_of_frames(self):
return 0
def generate_encode_command(self, pass_num):
return []
def run_job(self):
pass
class Aomenc_job(Encoding_job):
def __init__(self, parameters, split):
super().__init__(parameters, split)
self.first_pass_log_file = join(self.parameters.temp_folder,
'log', f"{split.number_filled()}.log")
self.second_pass_ivf_file = join(self.parameters.temp_folder,
'ivf', f"{split.number_filled()}.ivf")
check_sub_directories([self.first_pass_log_file, self.second_pass_ivf_file])
def get_type(self):
return "Video"
def get_number_of_frames(self):
return self.split.get_number_of_frames()
def generate_encode_command(self, pass_num):
options = [ '--lag-in-frames=35', '--bit-depth=10', '--frame-boost=1',
'--auto-alt-ref=1', '--enable-fwd-kf=1']
command = [ self.parameters.aomenc,
'-t', str(self.parameters.threads_per_split),
'--cpu-used=' + str(self.parameters.cpu_used),
'--end-usage=q', '--cq-level=' + str(self.parameters.cq_level),
'--passes=2', f'--pass={pass_num}'
] + options + [
'--fpf=' + self.first_pass_log_file,
'-o', self.second_pass_ivf_file, '-',
'2>', '/dev/null'
]
return command
def run_job(self):
for pass_num in (1,2):
pipe = self.split.generate_pipe()
encode_command = self.generate_encode_command(pass_num)
system(" ".join(pipe) + " | " + " ".join(encode_command))
return self.get_number_of_frames()
class Audio_job(Encoding_job):
def __init__(self, parameters):
super().__init__(parameters)
def get_type(self):
return "Audio"
def get_number_of_frames(self):
return maxsize
def run_job(self):
check_sub_directories([self.parameters.temp_audio])
command = [ self.parameters.ffmpeg, '-y', '-loglevel', 'quiet',
'-i', self.parameters.input_file, '-map', '0:a',
'-c:a', 'libopus', '-b:a', '192k',
self.parameters.temp_audio]
system(" ".join(command))
return 0
| 25.655914 | 78 | 0.686505 | 341 | 2,386 | 4.510264 | 0.31085 | 0.127438 | 0.035761 | 0.055267 | 0.33225 | 0.218466 | 0.176853 | 0 | 0 | 0 | 0 | 0.008955 | 0.157586 | 2,386 | 92 | 79 | 25.934783 | 0.756219 | 0.008801 | 0 | 0.19697 | 0 | 0 | 0.123519 | 0.022843 | 0 | 0 | 0 | 0 | 0 | 1 | 0.227273 | false | 0.166667 | 0.045455 | 0.106061 | 0.469697 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 3 |
71ec996f576782c1d512df7db69cad23c39e8d34 | 303 | py | Python | manage.py | igorshevk/oknoname | 0828504afb8fdae5f5e85040bec1d95cb27ce471 | [
"MIT"
] | 1 | 2022-01-09T19:53:55.000Z | 2022-01-09T19:53:55.000Z | manage.py | igorshevk/oknoname | 0828504afb8fdae5f5e85040bec1d95cb27ce471 | [
"MIT"
] | 5 | 2021-06-08T21:03:58.000Z | 2022-03-12T00:18:43.000Z | manage.py | BinaryTree0/fer3 | 85c3bbf2f328e69ad4d7c01b6e2c8d4ef1d9e0a3 | [
"MIT"
] | 1 | 2022-02-15T16:56:49.000Z | 2022-02-15T16:56:49.000Z | #!/usr/bin/env python
import logging
import os
import sys

if __name__ == "__main__":
    logging.captureWarnings(True)
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mysite.settings.dev")

    from django.core.management import execute_from_command_line

    execute_from_command_line(sys.argv)
| 21.642857 | 74 | 0.772277 | 40 | 303 | 5.45 | 0.675 | 0.100917 | 0.165138 | 0.201835 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.132013 | 303 | 13 | 75 | 23.307692 | 0.828897 | 0.066007 | 0 | 0 | 0 | 0 | 0.173759 | 0.078014 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
71f2b1623884a596cd2d8a37c83ceea2d8d9505d | 88 | py | Python | LargeApp/run.py | ckc6842/my-hot-spot | 7968cd5c45fac492da74087e364acbecc49f59f2 | [
"MIT"
] | null | null | null | LargeApp/run.py | ckc6842/my-hot-spot | 7968cd5c45fac492da74087e364acbecc49f59f2 | [
"MIT"
] | null | null | null | LargeApp/run.py | ckc6842/my-hot-spot | 7968cd5c45fac492da74087e364acbecc49f59f2 | [
"MIT"
] | null | null | null | __author__ = 'Yeob'
from app import app
app.run(host='127.0.0.1', port=5000, debug=True) | 29.333333 | 48 | 0.715909 | 17 | 88 | 3.470588 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.126582 | 0.102273 | 88 | 3 | 48 | 29.333333 | 0.620253 | 0 | 0 | 0 | 0 | 0 | 0.146067 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
71f643612bb6aba18fcab2d4169a9ef13ebe223b | 20 | py | Python | tvdordrecht/__init__.py | allcaps/tvdordrecht.nl | a2ff1b5ade88378f1a72a7ab36d51965b06509b9 | [
"MIT"
] | null | null | null | tvdordrecht/__init__.py | allcaps/tvdordrecht.nl | a2ff1b5ade88378f1a72a7ab36d51965b06509b9 | [
"MIT"
] | 1 | 2022-01-13T00:48:55.000Z | 2022-01-13T00:48:55.000Z | tvdordrecht/__init__.py | allcaps/tvdordrecht.nl | a2ff1b5ade88378f1a72a7ab36d51965b06509b9 | [
"MIT"
] | null | null | null | __author__ = 'coen'
| 10 | 19 | 0.7 | 2 | 20 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.15 | 20 | 1 | 20 | 20 | 0.588235 | 0 | 0 | 0 | 0 | 0 | 0.2 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
71fc2cb9eb1a0e29e3b2ae0cd9f5aa4d98554920 | 56 | py | Python | name.py | sufailps/jenkinsname | 0c088b8c805ba86b3718b866c05fadf4155a037a | [
"MIT"
] | null | null | null | name.py | sufailps/jenkinsname | 0c088b8c805ba86b3718b866c05fadf4155a037a | [
"MIT"
] | null | null | null | name.py | sufailps/jenkinsname | 0c088b8c805ba86b3718b866c05fadf4155a037a | [
"MIT"
] | null | null | null |
with open("name.txt", "w") as fd:
    fd.write("Sufail")
| 14 | 32 | 0.589286 | 10 | 56 | 3.3 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.160714 | 56 | 3 | 33 | 18.666667 | 0.702128 | 0 | 0 | 0 | 0 | 0 | 0.267857 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9c02ccd2e73d09db05c0979ea81b40b198cd176b | 199 | py | Python | Other Basic Stuff/jeyranFile.py | egmnklc/Sabanci | 31eaeedb01010879b5d62dee84130bd88e7e5bdf | [
"MIT"
] | null | null | null | Other Basic Stuff/jeyranFile.py | egmnklc/Sabanci | 31eaeedb01010879b5d62dee84130bd88e7e5bdf | [
"MIT"
] | null | null | null | Other Basic Stuff/jeyranFile.py | egmnklc/Sabanci | 31eaeedb01010879b5d62dee84130bd88e7e5bdf | [
"MIT"
] | null | null | null | words_splitted = input("Give me some words: ")
words = words_splitted.split(",")
counter = 0
for i in range(len(words)):
if words[i-1].isalpha() == True:
counter += 1
print(counter) | 28.428571 | 47 | 0.628141 | 29 | 199 | 4.241379 | 0.655172 | 0.211382 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019108 | 0.211055 | 199 | 7 | 48 | 28.428571 | 0.764331 | 0 | 0 | 0 | 0 | 0 | 0.108247 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.142857 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9c0784a10e27d3e9958c9cf4e18313ca0dbbadd3 | 169 | py | Python | fist_phase/0202_round.py | kapuni/exercise_py | b60ba8462d2545cae57483bcb0b3428b03c5d522 | [
"MIT"
] | null | null | null | fist_phase/0202_round.py | kapuni/exercise_py | b60ba8462d2545cae57483bcb0b3428b03c5d522 | [
"MIT"
] | null | null | null | fist_phase/0202_round.py | kapuni/exercise_py | b60ba8462d2545cae57483bcb0b3428b03c5d522 | [
"MIT"
] | null | null | null | import math
radius = float(input('Please enter the radius of the circle: '))
perimeter = 2 * math.pi * radius
area = math.pi * radius * radius
print('Perimeter: %.2f' % perimeter)
print('Area: %.2f' % area)
| 21.125 | 34 | 0.639053 | 24 | 169 | 4.5 | 0.583333 | 0.111111 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021429 | 0.171598 | 169 | 7 | 35 | 24.142857 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0.147929 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0.333333 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9c0bdadae487bd34237613d6caae4c9ca6eccc66 | 229 | py | Python | beetsplug/__init__.py | juanmeleiro/BeetsPluginXtractor | c58588b7ef9c6921e3d010fbd9f4d03cc3c4885f | [
"MIT"
] | 10 | 2020-04-08T05:20:59.000Z | 2022-01-22T12:57:46.000Z | beetsplug/__init__.py | juanmeleiro/BeetsPluginXtractor | c58588b7ef9c6921e3d010fbd9f4d03cc3c4885f | [
"MIT"
] | 7 | 2020-03-30T12:48:01.000Z | 2022-03-07T23:22:09.000Z | beetsplug/__init__.py | juanmeleiro/BeetsPluginXtractor | c58588b7ef9c6921e3d010fbd9f4d03cc3c4885f | [
"MIT"
] | 3 | 2021-10-03T02:45:44.000Z | 2022-01-10T21:05:30.000Z | # Copyright: Copyright (c) 2020., Adam Jakab
#
# Author: Adam Jakab <adam at jakab dot pro>
# Created: 3/12/20, 11:42 PM
# License: See LICENSE.txt
from pkgutil import extend_path
__path__ = extend_path(__path__, __name__)
| 22.9 | 45 | 0.720524 | 35 | 229 | 4.314286 | 0.714286 | 0.119205 | 0.18543 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.068783 | 0.174672 | 229 | 9 | 46 | 25.444444 | 0.730159 | 0.611354 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
9c13bca1aaab94bd49120e98534fe4d85d6d45de | 16,996 | py | Python | imblearn/under_sampling/_prototype_selection/_edited_nearest_neighbours.py | timgates42/imbalanced-learn | edd75224a3ecfa043f6971f81e2c7073cb08db30 | [
"MIT"
] | 2 | 2017-06-15T04:49:43.000Z | 2020-06-20T02:29:29.000Z | imblearn/under_sampling/_prototype_selection/_edited_nearest_neighbours.py | timgates42/imbalanced-learn | edd75224a3ecfa043f6971f81e2c7073cb08db30 | [
"MIT"
] | 3 | 2016-07-26T09:39:44.000Z | 2020-06-20T02:29:30.000Z | imblearn/under_sampling/_prototype_selection/_edited_nearest_neighbours.py | timgates42/imbalanced-learn | edd75224a3ecfa043f6971f81e2c7073cb08db30 | [
"MIT"
] | null | null | null | """Class to perform under-sampling based on the edited nearest neighbour
method."""
# Authors: Guillaume Lemaitre <g.lemaitre58@gmail.com>
#          Dayvid Oliveira
#          Christos Aridas
# License: MIT

from collections import Counter

import numpy as np
from scipy.stats import mode

from sklearn.utils import _safe_indexing

from ..base import BaseCleaningSampler
from ...utils import check_neighbors_object
from ...utils import Substitution
from ...utils._docstring import _n_jobs_docstring
from ...utils._validation import _deprecate_positional_args

SEL_KIND = ("all", "mode")

@Substitution(
    sampling_strategy=BaseCleaningSampler._sampling_strategy_docstring,
    n_jobs=_n_jobs_docstring,
)
class EditedNearestNeighbours(BaseCleaningSampler):
    """Undersample based on the edited nearest neighbour method.

    This method will clean the database by removing samples close to the
    decision boundary.

    Read more in the :ref:`User Guide <edited_nearest_neighbors>`.

    Parameters
    ----------
    {sampling_strategy}

    n_neighbors : int or object, default=3
        If ``int``, size of the neighbourhood to consider to compute the
        nearest neighbors. If object, an estimator that inherits from
        :class:`~sklearn.neighbors.base.KNeighborsMixin` that will be used to
        find the nearest-neighbors.

    kind_sel : {{'all', 'mode'}}, default='all'
        Strategy to use in order to exclude samples.

        - If ``'all'``, all neighbours will have to agree with the samples of
          interest to not be excluded.
        - If ``'mode'``, the majority vote of the neighbours will be used in
          order to exclude a sample.

        The strategy `"all"` will be less conservative than `'mode'`. Thus,
        more samples will be removed when `kind_sel="all"` generally.

    {n_jobs}

    Attributes
    ----------
    sample_indices_ : ndarray of shape (n_new_samples,)
        Indices of the samples selected.

        .. versionadded:: 0.4

    See Also
    --------
    CondensedNearestNeighbour : Undersample by condensing samples.

    RepeatedEditedNearestNeighbours : Undersample by repeating ENN algorithm.

    AllKNN : Undersample using ENN and various number of neighbours.

    Notes
    -----
    The method is based on [1]_.

    Supports multi-class resampling. A one-vs.-rest scheme is used when
    sampling a class as proposed in [1]_.

    References
    ----------
    .. [1] D. Wilson, "Asymptotic Properties of Nearest Neighbor Rules Using
       Edited Data," In IEEE Transactions on Systems, Man, and Cybernetics,
       vol. 2 (3), pp. 408-421, 1972.

    Examples
    --------

    >>> from collections import Counter
    >>> from sklearn.datasets import make_classification
    >>> from imblearn.under_sampling import \
EditedNearestNeighbours # doctest: +NORMALIZE_WHITESPACE
    >>> X, y = make_classification(n_classes=2, class_sep=2,
    ... weights=[0.1, 0.9], n_informative=3, n_redundant=1, flip_y=0,
    ... n_features=20, n_clusters_per_class=1, n_samples=1000, random_state=10)
    >>> print('Original dataset shape %s' % Counter(y))
    Original dataset shape Counter({{1: 900, 0: 100}})
    >>> enn = EditedNearestNeighbours()
    >>> X_res, y_res = enn.fit_resample(X, y)
    >>> print('Resampled dataset shape %s' % Counter(y_res))
    Resampled dataset shape Counter({{1: 887, 0: 100}})
    """

    @_deprecate_positional_args
    def __init__(
        self, *, sampling_strategy="auto", n_neighbors=3, kind_sel="all",
        n_jobs=None
    ):
        super().__init__(sampling_strategy=sampling_strategy)
        self.n_neighbors = n_neighbors
        self.kind_sel = kind_sel
        self.n_jobs = n_jobs

    def _validate_estimator(self):
        """Validate the estimator created in the ENN."""
        self.nn_ = check_neighbors_object(
            "n_neighbors", self.n_neighbors, additional_neighbor=1
        )
        self.nn_.set_params(**{"n_jobs": self.n_jobs})

        if self.kind_sel not in SEL_KIND:
            raise NotImplementedError

    def _fit_resample(self, X, y):
        self._validate_estimator()

        idx_under = np.empty((0,), dtype=int)

        self.nn_.fit(X)

        for target_class in np.unique(y):
            if target_class in self.sampling_strategy_.keys():
                target_class_indices = np.flatnonzero(y == target_class)
                X_class = _safe_indexing(X, target_class_indices)
                y_class = _safe_indexing(y, target_class_indices)
                nnhood_idx = self.nn_.kneighbors(
                    X_class, return_distance=False
                )[:, 1:]
                nnhood_label = y[nnhood_idx]
                if self.kind_sel == "mode":
                    nnhood_label, _ = mode(nnhood_label, axis=1)
                    nnhood_bool = np.ravel(nnhood_label) == y_class
                elif self.kind_sel == "all":
                    nnhood_label = nnhood_label == target_class
                    nnhood_bool = np.all(nnhood_label, axis=1)
                index_target_class = np.flatnonzero(nnhood_bool)
            else:
                index_target_class = slice(None)
            idx_under = np.concatenate(
                (
                    idx_under,
                    np.flatnonzero(y == target_class)[index_target_class],
                ),
                axis=0,
            )

        self.sample_indices_ = idx_under

        return _safe_indexing(X, idx_under), _safe_indexing(y, idx_under)

    def _more_tags(self):
        return {"sample_indices": True}

@Substitution(
    sampling_strategy=BaseCleaningSampler._sampling_strategy_docstring,
    n_jobs=_n_jobs_docstring,
)
class RepeatedEditedNearestNeighbours(BaseCleaningSampler):
    """Undersample based on the repeated edited nearest neighbour method.

    This method will repeat the ENN algorithm several times.

    Read more in the :ref:`User Guide <edited_nearest_neighbors>`.

    Parameters
    ----------
    {sampling_strategy}

    n_neighbors : int or object, default=3
        If ``int``, size of the neighbourhood to consider to compute the
        nearest neighbors. If object, an estimator that inherits from
        :class:`~sklearn.neighbors.base.KNeighborsMixin` that will be used to
        find the nearest-neighbors.

    max_iter : int, default=100
        Maximum number of iterations of the edited nearest neighbours
        algorithm for a single run.

    kind_sel : {{'all', 'mode'}}, default='all'
        Strategy to use in order to exclude samples.

        - If ``'all'``, all neighbours will have to agree with the samples of
          interest to not be excluded.
        - If ``'mode'``, the majority vote of the neighbours will be used in
          order to exclude a sample.

        The strategy `"all"` will be less conservative than `'mode'`. Thus,
        more samples will be removed when `kind_sel="all"` generally.

    {n_jobs}

    Attributes
    ----------
    sample_indices_ : ndarray of shape (n_new_samples,)
        Indices of the samples selected.

        .. versionadded:: 0.4

    n_iter_ : int
        Number of iterations run.

        .. versionadded:: 0.6

    See Also
    --------
    CondensedNearestNeighbour : Undersample by condensing samples.

    EditedNearestNeighbours : Undersample by editing samples.

    AllKNN : Undersample using ENN and various number of neighbours.

    Notes
    -----
    The method is based on [1]_. A one-vs.-rest scheme is used when
    sampling a class as proposed in [1]_.

    Supports multi-class resampling.

    References
    ----------
    .. [1] I. Tomek, "An Experiment with the Edited Nearest-Neighbor
       Rule," IEEE Transactions on Systems, Man, and Cybernetics, vol. 6(6),
       pp. 448-452, June 1976.

    Examples
    --------

    >>> from collections import Counter
    >>> from sklearn.datasets import make_classification
    >>> from imblearn.under_sampling import \
RepeatedEditedNearestNeighbours # doctest : +NORMALIZE_WHITESPACE
    >>> X, y = make_classification(n_classes=2, class_sep=2,
    ... weights=[0.1, 0.9], n_informative=3, n_redundant=1, flip_y=0,
    ... n_features=20, n_clusters_per_class=1, n_samples=1000, random_state=10)
    >>> print('Original dataset shape %s' % Counter(y))
    Original dataset shape Counter({{1: 900, 0: 100}})
    >>> renn = RepeatedEditedNearestNeighbours()
    >>> X_res, y_res = renn.fit_resample(X, y)
    >>> print('Resampled dataset shape %s' % Counter(y_res))
    Resampled dataset shape Counter({{1: 887, 0: 100}})
    """

    @_deprecate_positional_args
    def __init__(
        self,
        *,
        sampling_strategy="auto",
        n_neighbors=3,
        max_iter=100,
        kind_sel="all",
        n_jobs=None,
    ):
        super().__init__(sampling_strategy=sampling_strategy)
        self.n_neighbors = n_neighbors
        self.kind_sel = kind_sel
        self.n_jobs = n_jobs
        self.max_iter = max_iter

    def _validate_estimator(self):
        """Private function to create the NN estimator"""
        if self.max_iter < 2:
            raise ValueError(
                "max_iter must be greater than 1."
                " Got {} instead.".format(type(self.max_iter))
            )

        self.nn_ = check_neighbors_object(
            "n_neighbors", self.n_neighbors, additional_neighbor=1
        )

        self.enn_ = EditedNearestNeighbours(
            sampling_strategy=self.sampling_strategy,
            n_neighbors=self.nn_,
            kind_sel=self.kind_sel,
            n_jobs=self.n_jobs,
        )

    def _fit_resample(self, X, y):
        self._validate_estimator()

        X_, y_ = X, y
        self.sample_indices_ = np.arange(X.shape[0], dtype=int)
        target_stats = Counter(y)
        class_minority = min(target_stats, key=target_stats.get)

        for n_iter in range(self.max_iter):

            prev_len = y_.shape[0]
            X_enn, y_enn = self.enn_.fit_resample(X_, y_)

            # Check the stopping criterion
            # 1. If there are no changes in the vector y
            # 2. If the number of samples in the other class becomes inferior
            # to the number of samples in the majority class
            # 3. If one of the classes is disappearing

            # Case 1
            b_conv = prev_len == y_enn.shape[0]

            # Case 2
            stats_enn = Counter(y_enn)
            count_non_min = np.array(
                [
                    val
                    for val, key in zip(stats_enn.values(), stats_enn.keys())
                    if key != class_minority
                ]
            )
            b_min_bec_maj = np.any(
                count_non_min < target_stats[class_minority]
            )

            # Case 3
            b_remove_maj_class = len(stats_enn) < len(target_stats)

            X_, y_ = X_enn, y_enn
            self.sample_indices_ = self.sample_indices_[
                self.enn_.sample_indices_
            ]

            if b_conv or b_min_bec_maj or b_remove_maj_class:
                if b_conv:
                    X_, y_ = X_enn, y_enn
                    self.sample_indices_ = self.sample_indices_[
                        self.enn_.sample_indices_
                    ]
                break

        self.n_iter_ = n_iter + 1
        X_resampled, y_resampled = X_, y_

        return X_resampled, y_resampled

    def _more_tags(self):
        return {"sample_indices": True}
@Substitution(
sampling_strategy=BaseCleaningSampler._sampling_strategy_docstring,
n_jobs=_n_jobs_docstring,
)
class AllKNN(BaseCleaningSampler):
"""Undersample based on the AllKNN method.
This method will apply ENN several time and will vary the number of nearest
neighbours.
Read more in the :ref:`User Guide <edited_nearest_neighbors>`.
Parameters
----------
{sampling_strategy}
n_neighbors : int or estimator object, default=3
If ``int``, size of the neighbourhood to consider to compute the
nearest neighbors. If object, an estimator that inherits from
:class:`~sklearn.neighbors.base.KNeighborsMixin` that will be used to
find the nearest-neighbors. By default, it will be a 3-NN.
kind_sel : {{'all', 'mode'}}, default='all'
Strategy to use in order to exclude samples.
- If ``'all'``, all neighbours will have to agree with the samples of
interest to not be excluded.
- If ``'mode'``, the majority vote of the neighbours will be used in
order to exclude a sample.
The strategy `"all"` will be less conservative than `'mode'`. Thus,
more samples will be removed when `kind_sel="all"` generally.
allow_minority : bool, default=False
If ``True``, it allows the majority classes to become the minority
class without early stopping.
.. versionadded:: 0.3
{n_jobs}
Attributes
----------
sample_indices_ : ndarray of shape (n_new_samples,)
Indices of the samples selected.
.. versionadded:: 0.4
See Also
--------
CondensedNearestNeighbour: Under-sampling by condensing samples.
EditedNearestNeighbours: Under-sampling by editing samples.
RepeatedEditedNearestNeighbours: Under-sampling by repeating ENN.
Notes
-----
The method is based on [1]_.
Supports multi-class resampling. A one-vs.-rest scheme is used when
sampling a class as proposed in [1]_.
References
----------
.. [1] I. Tomek, "An Experiment with the Edited Nearest-Neighbor
Rule," IEEE Transactions on Systems, Man, and Cybernetics, vol. 6(6),
pp. 448-452, June 1976.
Examples
--------
>>> from collections import Counter
>>> from sklearn.datasets import make_classification
>>> from imblearn.under_sampling import \
AllKNN # doctest: +NORMALIZE_WHITESPACE
>>> X, y = make_classification(n_classes=2, class_sep=2,
... weights=[0.1, 0.9], n_informative=3, n_redundant=1, flip_y=0,
... n_features=20, n_clusters_per_class=1, n_samples=1000, random_state=10)
>>> print('Original dataset shape %s' % Counter(y))
Original dataset shape Counter({{1: 900, 0: 100}})
>>> allknn = AllKNN()
>>> X_res, y_res = allknn.fit_resample(X, y)
>>> print('Resampled dataset shape %s' % Counter(y_res))
Resampled dataset shape Counter({{1: 887, 0: 100}})
"""
@_deprecate_positional_args
def __init__(
self,
*,
sampling_strategy="auto",
n_neighbors=3,
kind_sel="all",
allow_minority=False,
n_jobs=None,
):
super().__init__(sampling_strategy=sampling_strategy)
self.n_neighbors = n_neighbors
self.kind_sel = kind_sel
self.allow_minority = allow_minority
self.n_jobs = n_jobs
def _validate_estimator(self):
"""Create objects required by AllKNN"""
if self.kind_sel not in SEL_KIND:
raise NotImplementedError
self.nn_ = check_neighbors_object(
"n_neighbors", self.n_neighbors, additional_neighbor=1
)
self.enn_ = EditedNearestNeighbours(
sampling_strategy=self.sampling_strategy,
n_neighbors=self.nn_,
kind_sel=self.kind_sel,
n_jobs=self.n_jobs,
)
def _fit_resample(self, X, y):
self._validate_estimator()
X_, y_ = X, y
target_stats = Counter(y)
class_minority = min(target_stats, key=target_stats.get)
self.sample_indices_ = np.arange(X.shape[0], dtype=int)
for curr_size_ngh in range(1, self.nn_.n_neighbors):
self.enn_.n_neighbors = curr_size_ngh
X_enn, y_enn = self.enn_.fit_resample(X_, y_)
            # Check the stopping criterion
            # 1. If the number of samples in one of the other classes drops
            #    below the original number of samples in the minority class
            # 2. If one of the classes disappears entirely
            # Case 1
stats_enn = Counter(y_enn)
count_non_min = np.array(
[
val
for val, key in zip(stats_enn.values(), stats_enn.keys())
if key != class_minority
]
)
b_min_bec_maj = np.any(
count_non_min < target_stats[class_minority]
)
if self.allow_minority:
# overwrite b_min_bec_maj
b_min_bec_maj = False
# Case 2
b_remove_maj_class = len(stats_enn) < len(target_stats)
            X_, y_ = X_enn, y_enn
self.sample_indices_ = self.sample_indices_[
self.enn_.sample_indices_
]
if b_min_bec_maj or b_remove_maj_class:
break
X_resampled, y_resampled = X_, y_
return X_resampled, y_resampled
def _more_tags(self):
return {"sample_indices": True}
| 32.621881 | 79 | 0.619911 | 2,102 | 16,996 | 4.777831 | 0.150809 | 0.012446 | 0.009957 | 0.005974 | 0.737628 | 0.717216 | 0.714129 | 0.691825 | 0.685253 | 0.66972 | 0 | 0.016312 | 0.285832 | 16,996 | 520 | 80 | 32.684615 | 0.811089 | 0.487644 | 0 | 0.584158 | 0 | 0 | 0.020632 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.059406 | false | 0 | 0.044554 | 0.014851 | 0.148515 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
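The stopping criterion in `_fit_resample` above can be checked in isolation. A minimal sketch with hypothetical post-ENN class counts (plain-Python `any` stands in for the NumPy calls):

```python
from collections import Counter

# original class distribution: class 0 is the minority with 100 samples
y = [0] * 100 + [1] * 900
target_stats = Counter(y)
class_minority = min(target_stats, key=target_stats.get)

# hypothetical class counts after one ENN pass
stats_enn = Counter({0: 100, 1: 90})

# case 1: a non-minority class dropped below the original minority count
count_non_min = [val for key, val in stats_enn.items() if key != class_minority]
b_min_bec_maj = any(c < target_stats[class_minority] for c in count_non_min)

# case 2: a class vanished entirely during editing
b_remove_maj_class = len(stats_enn) < len(target_stats)

print(b_min_bec_maj, b_remove_maj_class)  # True False -> the loop would break
```

Here the lone majority class fell to 90 samples, below the original minority count of 100, so case 1 fires and the editing loop would stop.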
9c1998dc801be68ae9d2e9206b331ca347c4a37a | 296 | py | Python | tests/util.py | marcodlk/bulkdata | 8bc65cea6045ae61d312ce8b4b0b3f332e84fc02 | [
"MIT"
] | 2 | 2020-06-05T15:09:14.000Z | 2021-04-16T18:16:27.000Z | tests/util.py | marcodlk/bulkdata | 8bc65cea6045ae61d312ce8b4b0b3f332e84fc02 | [
"MIT"
] | 3 | 2020-04-23T04:44:37.000Z | 2020-06-22T22:53:53.000Z | tests/util.py | marcodlk/bulkdata | 8bc65cea6045ae61d312ce8b4b0b3f332e84fc02 | [
"MIT"
] | null | null | null | from bulkdata.field import Field
class MockCard():
def __init__(self, name, fields):
self._name = name
self._fields = [Field(val) for val in fields]
@property
def name(self):
return self._name
@property
def fields(self):
return self._fields | 19.733333 | 53 | 0.621622 | 37 | 296 | 4.756757 | 0.432432 | 0.136364 | 0.159091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.287162 | 296 | 15 | 54 | 19.733333 | 0.834123 | 0 | 0 | 0.181818 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.272727 | false | 0 | 0.090909 | 0.181818 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
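A quick usage sketch of `MockCard`; `bulkdata.field.Field` is stubbed with a hypothetical stand-in so the snippet runs standalone:

```python
class Field:
    """Stand-in for bulkdata.field.Field (assumption: it wraps one value)."""

    def __init__(self, value):
        self.value = value


class MockCard:
    def __init__(self, name, fields):
        self._name = name
        self._fields = [Field(val) for val in fields]

    @property
    def name(self):
        return self._name

    @property
    def fields(self):
        return self._fields


card = MockCard("GRID", [1, 0.0, "abc"])
print(card.name, len(card.fields))  # GRID 3
```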
9c2178a963db0e08d2c2751e5a03cbd3073139d7 | 406 | py | Python | community/forms.py | darshanime/vesta | 203fd7aaded4d0198a8d219692ec530ab8988522 | [
"MIT"
] | null | null | null | community/forms.py | darshanime/vesta | 203fd7aaded4d0198a8d219692ec530ab8988522 | [
"MIT"
] | null | null | null | community/forms.py | darshanime/vesta | 203fd7aaded4d0198a8d219692ec530ab8988522 | [
"MIT"
] | null | null | null | from django.forms import ModelForm
from .models import Images, Details, Events
class PhotoForm(ModelForm):
class Meta:
model = Images
exclude = ('event',)
class RegisterForm(ModelForm):
class Meta:
model = Details
exclude = ('user',)
class EventsForm(ModelForm):
class Meta:
model = Events
exclude = ('creator','lat', 'lon', 'good', 'pincode') | 23.882353 | 61 | 0.623153 | 42 | 406 | 6.02381 | 0.547619 | 0.166008 | 0.213439 | 0.272727 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.258621 | 406 | 17 | 61 | 23.882353 | 0.840532 | 0 | 0 | 0.214286 | 0 | 0 | 0.081081 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.142857 | 0 | 0.571429 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
9c2201aed053ecf646e69189191fe1d9e36f3baa | 970 | py | Python | clean_architecture_example_application/presenter/__init__.py | mallycrip/clean-architecture-python | 3d54d5e092ddbbd45cc14b8c67571e65d2569f70 | [
"MIT"
] | 5 | 2021-02-19T03:14:32.000Z | 2021-12-08T06:09:25.000Z | clean_architecture_example_application/presenter/__init__.py | mallycrip/clean-architecture-python | 3d54d5e092ddbbd45cc14b8c67571e65d2569f70 | [
"MIT"
] | null | null | null | clean_architecture_example_application/presenter/__init__.py | mallycrip/clean-architecture-python | 3d54d5e092ddbbd45cc14b8c67571e65d2569f70 | [
"MIT"
] | null | null | null | from clean_architecture_example_application.presenter.di.controllers import mock_controller_object
class MockApplication:
def __init__(self):
self._address = str()
self._port = int()
def register_controller(self, controller, path: str):
# Register Controller
pass
def run(self, **configs):
self._address = configs["host"]
self._port = configs["port"]
if self._address and self._port:
print(f"*** Mock Application is run in {self._address}:{self._port}")
else:
print("*** Check an address and port")
def create_app():
_app = MockApplication()
register_controllers(_app)
register_extensions(_app)
return _app
def register_controllers(app: MockApplication):
app.register_controller(mock_controller_object, "/")
app.register_controller(mock_controller_object, "/index")
def register_extensions(app: MockApplication):
# TODO
pass
| 24.25 | 98 | 0.675258 | 107 | 970 | 5.813084 | 0.392523 | 0.07074 | 0.096463 | 0.080386 | 0.131833 | 0.131833 | 0 | 0 | 0 | 0 | 0 | 0 | 0.223711 | 970 | 39 | 99 | 24.871795 | 0.826029 | 0.024742 | 0 | 0.083333 | 0 | 0 | 0.109226 | 0.029692 | 0 | 0 | 0 | 0.025641 | 0 | 1 | 0.25 | false | 0.083333 | 0.041667 | 0 | 0.375 | 0.083333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
9c25adaac9c3276a82912bb3b5938c3f0ec17c9d | 2,427 | py | Python | ngrams.py | mattslight/fivec | f4e1bfb7c957f2d90676a5f20110ad0f386c4948 | [
"MIT"
] | null | null | null | ngrams.py | mattslight/fivec | f4e1bfb7c957f2d90676a5f20110ad0f386c4948 | [
"MIT"
] | null | null | null | ngrams.py | mattslight/fivec | f4e1bfb7c957f2d90676a5f20110ad0f386c4948 | [
"MIT"
] | null | null | null | import re
import math
import csv
import pprint
def score4(word):
score = 0.0
with open("./tables/4grams.tsv") as tsv:
pairs = []
for a,b in (r[0:2] for r in csv.reader(tsv, dialect="excel-tab")):
pairs.append([a, b])
for pair in pairs:
if pair[0] in word:
score += (math.log(float(pair[1]))/math.log(float(pairs[0][1])))/2
return score
def score2(word):
    score = 0.0
    word = word.upper()
    with open("./tables/2grams.tsv") as tsv:
        for row in csv.reader(tsv, dialect="excel-tab"):
            if row[0] in word:
                for match_index in (m.start() for m in re.finditer('(?=' + row[0] + ')', word)):
                    if match_index > 3:
                        continue
                    # columns 16-19 hold the bigram counts at word positions 0-3
                    column = 16 + match_index
                    if row[column] != '0':
                        score += math.log(float(row[column])) / math.log(float(row[5]))
                    else:
                        score -= 1
    return score
def score3(word):
    score = 0.0
    word = word.upper()
    with open("./tables/3grams.tsv") as tsv:
        for row in csv.reader(tsv, dialect="excel-tab"):
            if row[0] in word:
                for match_index in (m.start() for m in re.finditer('(?=' + row[0] + ')', word)):
                    if match_index > 2:
                        continue
                    # columns 12-14 hold the trigram counts at word positions 0-2
                    column = 12 + match_index
                    if row[column] != '0':
                        score += math.log(float(row[column])) / math.log(float(row[4]))  # TODO normalise score
    return score
def score(word):
return score2(word), score3(word), score4(word)
| 35.691176 | 101 | 0.465595 | 322 | 2,427 | 3.481366 | 0.18323 | 0.099911 | 0.171276 | 0.187333 | 0.702944 | 0.702944 | 0.550401 | 0.524532 | 0.471008 | 0.471008 | 0 | 0.051735 | 0.370828 | 2,427 | 67 | 102 | 36.223881 | 0.682384 | 0.024722 | 0 | 0.459016 | 0 | 0 | 0.041878 | 0 | 0 | 0 | 0 | 0.014925 | 0 | 1 | 0.065574 | false | 0 | 0.065574 | 0.016393 | 0.196721 | 0.016393 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
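In `score2` above, a bigram match at word index i (0-3) is scored from column 16+i of the TSV row against the total in column 5. A self-contained check with a fabricated row (all counts hypothetical, no TSV file required):

```python
import math
import re

# fabricated 2-gram row: row[0] = bigram, row[5] = total count,
# row[16..19] = counts at word positions 0..3
row = ["TH"] + [""] * 4 + ["1000"] + [""] * 10 + ["500", "0", "30", "10"]

word = "THXTH"  # "TH" matches at indices 0 and 3
score = 0.0
for m in re.finditer('(?=' + row[0] + ')', word):
    match_index = m.start()
    if match_index > 3:
        continue  # positions past 3 have no column
    column = 16 + match_index
    if row[column] != '0':
        score += math.log(float(row[column])) / math.log(float(row[5]))
    else:
        score -= 1

print(round(score, 3))  # 1.233 = log(500)/log(1000) + log(10)/log(1000)
```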
9c4a907adebee32a3ba5e63409bc6f012ebac519 | 514 | py | Python | test_sample.py | ashwani1218/OpenMEMEs | 3c4fc8b9c7abac33c4260f096e45db4bc132574b | [
"MIT"
] | 1 | 2019-09-07T11:39:58.000Z | 2019-09-07T11:39:58.000Z | test_sample.py | ashwani1218/OpenMEMEs | 3c4fc8b9c7abac33c4260f096e45db4bc132574b | [
"MIT"
] | 25 | 2019-09-07T12:12:21.000Z | 2019-09-15T19:37:37.000Z | test_sample.py | ashwani1218/OpenMEMEs | 3c4fc8b9c7abac33c4260f096e45db4bc132574b | [
"MIT"
] | 1 | 2019-09-08T05:41:17.000Z | 2019-09-08T05:41:17.000Z | import pytest
class TestClass:
def test_login(self, client):
response = client.get("/")
assert response.status_code == 200
def test_home(self, client):
response = client.get("/home")
assert response.status_code == 200
def test_registration(self, client):
response = client.get("/registration")
assert response.status_code == 200
def test_newpost(self, client):
response = client.get("/newpost")
assert response.status_code == 200
| 27.052632 | 46 | 0.640078 | 59 | 514 | 5.440678 | 0.305085 | 0.087227 | 0.224299 | 0.299065 | 0.738318 | 0.317757 | 0.317757 | 0 | 0 | 0 | 0 | 0.031088 | 0.249027 | 514 | 18 | 47 | 28.555556 | 0.800518 | 0 | 0 | 0.285714 | 0 | 0 | 0.052529 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 1 | 0.285714 | false | 0 | 0.071429 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
9c51b060d295ab3a51ce5d52c42431a4c763eba9 | 644 | py | Python | scrapy/commands/view.py | pyarnold/scrapy | 3e6631bb7a5d54561f72b82774a713f575451cb3 | [
"BSD-3-Clause"
] | 1 | 2020-12-18T01:08:02.000Z | 2020-12-18T01:08:02.000Z | scrapy/commands/view.py | pyarnold/scrapy | 3e6631bb7a5d54561f72b82774a713f575451cb3 | [
"BSD-3-Clause"
] | null | null | null | scrapy/commands/view.py | pyarnold/scrapy | 3e6631bb7a5d54561f72b82774a713f575451cb3 | [
"BSD-3-Clause"
] | null | null | null | from scrapy.command import ScrapyCommand
from scrapy.commands import fetch
from scrapy.utils.response import open_in_browser
class Command(fetch.Command):
def short_desc(self):
return "Open URL in browser, as seen by Scrapy"
def long_desc(self):
return "Fetch a URL using the Scrapy downloader and show its " \
"contents in a browser"
def add_options(self, parser):
ScrapyCommand.add_options(self, parser)
parser.add_option("--spider", dest="spider",
help="use this spider")
def _print_response(self, response, opts):
open_in_browser(response)
| 29.272727 | 72 | 0.67236 | 84 | 644 | 5.02381 | 0.5 | 0.07109 | 0.061611 | 0.094787 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.246894 | 644 | 21 | 73 | 30.666667 | 0.870103 | 0 | 0 | 0 | 0 | 0 | 0.218944 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.266667 | false | 0 | 0.2 | 0.133333 | 0.666667 | 0.066667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
9c54a6cceb66a77ede3684376c10fc92c36330e3 | 941 | py | Python | fn_netdevice/fn_netdevice/util/config.py | nickpartner-goahead/resilient-community-apps | 097c0dbefddbd221b31149d82af9809420498134 | [
"MIT"
] | 65 | 2017-12-04T13:58:32.000Z | 2022-03-24T18:33:17.000Z | fn_netdevice/fn_netdevice/util/config.py | nickpartner-goahead/resilient-community-apps | 097c0dbefddbd221b31149d82af9809420498134 | [
"MIT"
] | 48 | 2018-03-02T19:17:14.000Z | 2022-03-09T22:00:38.000Z | fn_netdevice/fn_netdevice/util/config.py | nickpartner-goahead/resilient-community-apps | 097c0dbefddbd221b31149d82af9809420498134 | [
"MIT"
] | 95 | 2018-01-11T16:23:39.000Z | 2022-03-21T11:34:29.000Z | # -*- coding: utf-8 -*-
"""Generate a default configuration-file section for fn_netdevice"""
from __future__ import print_function
def config_section_data():
"""Produce the default configuration section for app.config,
when called by `resilient-circuits config [-c|-u]`
"""
config_data = u"""[fn_netdevice]
# identify the device_name section names below for selftest, separated by commas
selftest=
# specify the directory if using textFSM templates
#template_dir=
# for each network device to communicate with, uniquely define it's section below to match the device_ids field in the function input parameter
#[unique_device_name]
#device_type=<see devices defined here https://github.com/ktbyers/netmiko/blob/master/netmiko/ssh_dispatcher.py>
#ip=
#username=
#password=
#port=22
#secret=<leave commented for default of no secret>
#verbose=False
#use_commit=False
"""
return config_data | 32.448276 | 144 | 0.740701 | 129 | 941 | 5.255814 | 0.713178 | 0.058997 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003827 | 0.166844 | 941 | 29 | 145 | 32.448276 | 0.860969 | 0.206164 | 0 | 0 | 1 | 0.105263 | 0.82066 | 0.030129 | 0 | 0 | 0 | 0 | 0 | 1 | 0.052632 | false | 0.052632 | 0.052632 | 0 | 0.157895 | 0.052632 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
9c615f81d08e578072d26cd75d2a3687c18889eb | 100 | py | Python | credentials_linkedin.py | victoirelinder/pi2_A5 | fd677e77b5b2fe911b4797da54516d4202044b4f | [
"MIT"
] | null | null | null | credentials_linkedin.py | victoirelinder/pi2_A5 | fd677e77b5b2fe911b4797da54516d4202044b4f | [
"MIT"
] | null | null | null | credentials_linkedin.py | victoirelinder/pi2_A5 | fd677e77b5b2fe911b4797da54516d4202044b4f | [
"MIT"
] | null | null | null | # this is done creating a LinkedIn app
USER_ID = 'piprojecta5@gmail.com'
USER_PASSWORD = 'p1carre5'
| 25 | 38 | 0.77 | 15 | 100 | 5 | 0.933333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034884 | 0.14 | 100 | 3 | 39 | 33.333333 | 0.837209 | 0.36 | 0 | 0 | 0 | 0 | 0.467742 | 0.33871 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.5 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 3 |
9c717c26e7af18ce95bd207226cf86025d4b9605 | 114 | py | Python | twitterdedupe/__main__.py | jonohill/twitter-dedupe | 1ec61304317721ec989729a03ec4d86325d0e058 | [
"MIT"
] | null | null | null | twitterdedupe/__main__.py | jonohill/twitter-dedupe | 1ec61304317721ec989729a03ec4d86325d0e058 | [
"MIT"
] | null | null | null | twitterdedupe/__main__.py | jonohill/twitter-dedupe | 1ec61304317721ec989729a03ec4d86325d0e058 | [
"MIT"
] | null | null | null | #!/usr/bin/env python
import os
from .daemons import ToggleDaemon
d = ToggleDaemon(os.environ)
d.run_forever()
| 12.666667 | 33 | 0.754386 | 17 | 114 | 5 | 0.764706 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.131579 | 114 | 8 | 34 | 14.25 | 0.858586 | 0.175439 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
9c789088f2fcb861a0575fae8424e27e27f4f347 | 8,189 | py | Python | qiskit/providers/ibmq/runtime/__init__.py | jwoehr/qiskit-ibmq-provider | 256a9963c152060523aee51d308c36d64c97517e | [
"Apache-2.0"
] | null | null | null | qiskit/providers/ibmq/runtime/__init__.py | jwoehr/qiskit-ibmq-provider | 256a9963c152060523aee51d308c36d64c97517e | [
"Apache-2.0"
] | null | null | null | qiskit/providers/ibmq/runtime/__init__.py | jwoehr/qiskit-ibmq-provider | 256a9963c152060523aee51d308c36d64c97517e | [
"Apache-2.0"
] | null | null | null | # This code is part of Qiskit.
#
# (C) Copyright IBM 2021.
#
# This code is licensed under the Apache License, Version 2.0. You may
# obtain a copy of this license in the LICENSE.txt file in the root directory
# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
#
# Any modifications or derivative works of this code must retain this
# copyright notice, and modified files need to carry a notice indicating
# that they have been altered from the originals.
"""
==============================================
Runtime (:mod:`qiskit.providers.ibmq.runtime`)
==============================================
.. currentmodule:: qiskit.providers.ibmq.runtime
Modules related to Qiskit Runtime Service.
.. note::
The Qiskit Runtime service is not available to all providers. To check if your provider
has access::
from qiskit import IBMQ
IBMQ.load_account()
provider = IBMQ.get_provider(...)
can_use_runtime = provider.has_service('runtime')
.. note::
Not all backends support Qiskit Runtime. Refer to documentation in
`Qiskit-Partners/qiskit-runtime
<https://github.com/Qiskit-Partners/qiskit-runtime>`_ for more information.
.. caution::
This package is currently provided in beta form and heavy modifications to
both functionality and API are likely to occur. Backward compatibility is not
always guaranteed.
Qiskit Runtime is a new architecture offered by IBM Quantum that
streamlines computations requiring many iterations. These experiments will
execute significantly faster within its improved hybrid quantum/classical process.
The Qiskit Runtime Service allows authorized users to upload their Qiskit quantum programs.
A Qiskit quantum program, also called a runtime program, is a piece of Python
code and its metadata that takes certain inputs, performs
quantum and maybe classical processing, and returns the results. The same or other
authorized users can invoke these quantum programs by simply passing in parameters.
`Qiskit-Partners/qiskit-runtime <https://github.com/Qiskit-Partners/qiskit-runtime>`_
contains detailed tutorials on how to use Qiskit Runtime.
Listing runtime programs
------------------------
To list all available runtime programs::
from qiskit import IBMQ
provider = IBMQ.load_account()
# List all available programs.
provider.runtime.pprint_programs()
# Get a single program.
program = provider.runtime.program('circuit-runner')
# Print program metadata.
print(program)
In the example above, ``provider.runtime`` points to the runtime service class
:class:`IBMRuntimeService`, which is the main entry
point for using this service. The example prints the program metadata of all
available runtime programs and of just the ``circuit-runner`` program. A program
metadata consists of the program's ID, name, description, input parameters,
return values, interim results, and other information that helps you to know
more about the program.
Invoking a runtime program
--------------------------
You can use the :meth:`IBMRuntimeService.run` method to invoke a runtime program.
For example::
from qiskit import IBMQ, QuantumCircuit
from qiskit.providers.ibmq import RunnerResult
provider = IBMQ.load_account()
backend = provider.backend.ibmq_qasm_simulator
# Create a circuit.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure_all()
# Set the "circuit-runner" program parameters
params = provider.runtime.program(program_id="circuit-runner").parameters()
params.circuits = qc
params.measurement_error_mitigation = True
# Configure backend options
options = {'backend_name': backend.name()}
# Execute the circuit using the "circuit-runner" program.
job = provider.runtime.run(program_id="circuit-runner",
options=options,
inputs=params)
# Get runtime job result.
result = job.result(decoder=RunnerResult)
The example above invokes the ``circuit-runner`` program,
which compiles, executes, and optionally applies measurement error mitigation to
the circuit result.
Runtime Jobs
------------
When you use the :meth:`IBMRuntimeService.run` method to invoke a runtime
program, a
:class:`RuntimeJob` instance is returned. This class has all the basic job
methods, such as :meth:`RuntimeJob.status`, :meth:`RuntimeJob.result`, and
:meth:`RuntimeJob.cancel`. Note that it does not have the same methods as regular
circuit jobs, which are instances of :class:`~qiskit.providers.ibmq.job.IBMQJob`.
Interim results
---------------
Some runtime programs provide interim results that inform you about program
progress. You can choose to stream the interim results when you run the
program by passing in the ``callback`` parameter, or at a later time using
the :meth:`RuntimeJob.stream_results` method. For example::
from qiskit import IBMQ, QuantumCircuit
provider = IBMQ.load_account()
backend = provider.backend.ibmq_qasm_simulator
def interim_result_callback(job_id, interim_result):
print(interim_result)
# Stream interim results as soon as the job starts running.
job = provider.runtime.run(program_id="circuit-runner",
options=options,
inputs=program_inputs,
callback=interim_result_callback)
Uploading a program
-------------------
.. note::
Only authorized accounts can upload programs. Having access to the
runtime service doesn't imply access to upload programs.
Each runtime program has both ``data`` and ``metadata``. Program data is
the Python code to be executed. Program metadata provides usage information,
such as program description, its inputs and outputs, and backend requirements.
A detailed program metadata helps the consumers of the program to know what is
needed to run the program.
Each program data needs to have a ``main(backend, user_messenger, **kwargs)``
method, which serves as the entry point to the program. The ``backend`` parameter
is a :class:`ProgramBackend` instance whose :meth:`ProgramBackend.run` method
can be used to submit circuits. The ``user_messenger`` is a :class:`UserMessenger`
instance whose :meth:`UserMessenger.publish` method can be used to publish interim and
final results.
See `qiskit/providers/ibmq/runtime/program/program_template.py` for a program data
template file.
Each program metadata must include at least the program name, description, and
maximum execution time. You can find description of each metadata field in
the :meth:`IBMRuntimeService.upload_program` method. Instead of passing in
the metadata fields individually, you can pass in a JSON file or a dictionary
to :meth:`IBMRuntimeService.upload_program` via the ``metadata`` parameter.
`qiskit/providers/ibmq/runtime/program/program_metadata_sample.json`
is a sample file of program metadata.
You can use the :meth:`IBMRuntimeService.upload_program` to upload a program.
For example::
from qiskit import IBMQ
provider = IBMQ.load_account()
program_id = provider.runtime.upload_program(
data="my_vqe.py",
metadata="my_vqe_metadata.json"
)
In the example above, the file ``my_vqe.py`` contains the program data, and
``my_vqe_metadata.json`` contains the program metadata.
Method :meth:`IBMRuntimeService.delete_program` allows you to delete a
program.
Files related to writing a runtime program are in the
``qiskit/providers/ibmq/runtime/program`` directory.
Classes
==========================
.. autosummary::
:toctree: ../stubs/
IBMRuntimeService
RuntimeJob
RuntimeProgram
UserMessenger
ProgramBackend
ResultDecoder
RuntimeEncoder
RuntimeDecoder
ParameterNamespace
"""
from .ibm_runtime_service import IBMRuntimeService
from .runtime_job import RuntimeJob
from .runtime_program import RuntimeProgram, ParameterNamespace
from .program.user_messenger import UserMessenger
from .program.program_backend import ProgramBackend
from .program.result_decoder import ResultDecoder
from .utils import RuntimeEncoder, RuntimeDecoder
| 35.297414 | 91 | 0.733667 | 1,075 | 8,189 | 5.536744 | 0.293953 | 0.028226 | 0.022345 | 0.021841 | 0.156418 | 0.136761 | 0.116767 | 0.099126 | 0.084677 | 0.084677 | 0 | 0.001922 | 0.174014 | 8,189 | 231 | 92 | 35.450216 | 0.878031 | 0.952986 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
9c7f8a18e43f08afec9565ef82efbaf8e732e5ab | 432 | py | Python | demo/stack/stack.py | istommao/DataStructureAndAlgorithm | 5542b97831b49771191eb126b38ca745be51c084 | [
"MIT"
] | 2 | 2018-01-08T06:08:44.000Z | 2019-05-29T04:22:21.000Z | demo/stack/stack.py | istommao/DataStructureAndAlgorithm | 5542b97831b49771191eb126b38ca745be51c084 | [
"MIT"
] | null | null | null | demo/stack/stack.py | istommao/DataStructureAndAlgorithm | 5542b97831b49771191eb126b38ca745be51c084 | [
"MIT"
] | null | null | null | """stack."""
class Stack(object):
def __init__(self, item):
self._data = [item]
def push(self, item):
self._data.append(item)
def pop(self):
if len(self._data) == 0:
raise ValueError('Your stack is empty')
return self._data.pop()
def getMin(self):
if len(self._data) == 0:
raise ValueError('Your stack is empty')
return min(self._data)
| 18.782609 | 51 | 0.550926 | 55 | 432 | 4.145455 | 0.4 | 0.210526 | 0.105263 | 0.140351 | 0.482456 | 0.482456 | 0.482456 | 0.482456 | 0.482456 | 0.482456 | 0 | 0.006734 | 0.3125 | 432 | 22 | 52 | 19.636364 | 0.760943 | 0.013889 | 0 | 0.307692 | 0 | 0 | 0.090476 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.307692 | false | 0 | 0 | 0 | 0.538462 | 0 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
92c00882051cab5ff1670bb33c616c6e1feac265 | 479 | py | Python | pycoin/services/env.py | GRS-Community/pycoin | 4a9b9722c91e2831519ddf9675fe8c70246432b7 | [
"MIT"
] | 1,210 | 2015-01-02T13:36:28.000Z | 2022-03-30T00:52:22.000Z | pycoin/services/env.py | GRS-Community/pycoin | 4a9b9722c91e2831519ddf9675fe8c70246432b7 | [
"MIT"
] | 280 | 2015-01-05T23:16:47.000Z | 2022-02-22T22:02:17.000Z | pycoin/services/env.py | GRS-Community/pycoin | 4a9b9722c91e2831519ddf9675fe8c70246432b7 | [
"MIT"
] | 459 | 2015-01-10T00:15:57.000Z | 2022-03-16T12:04:40.000Z | import os
def main_cache_dir():
p = os.getenv("PYCOIN_CACHE_DIR")
if p:
p = os.path.expanduser(p)
return p
def tx_read_cache_dirs():
return [p for p in os.getenv("PYCOIN_TX_DB_DIRS", "").split(":") if len(p) > 0]
def tx_writable_cache_dir():
p = main_cache_dir()
if p:
p = os.path.join(main_cache_dir(), "txs")
return p
def config_string_for_netcode_from_env(netcode):
return os.getenv("PYCOIN_%s_PROVIDERS" % netcode, "")
| 19.958333 | 83 | 0.645094 | 78 | 479 | 3.653846 | 0.410256 | 0.140351 | 0.126316 | 0.077193 | 0.126316 | 0.126316 | 0.126316 | 0 | 0 | 0 | 0 | 0.00266 | 0.215031 | 479 | 23 | 84 | 20.826087 | 0.755319 | 0 | 0 | 0.266667 | 0 | 0 | 0.11691 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.266667 | false | 0 | 0.066667 | 0.133333 | 0.6 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
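The colon-split pattern in `tx_read_cache_dirs` above can be checked in isolation; the `PYCOIN_TX_DB_DIRS` value below is made up for the demonstration:

```python
import os

os.environ["PYCOIN_TX_DB_DIRS"] = "/var/cache/a:/var/cache/b::"
dirs = [p for p in os.getenv("PYCOIN_TX_DB_DIRS", "").split(":") if len(p) > 0]
print(dirs)  # empty segments from doubled or trailing colons are dropped
```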
92ed3a61c1e46eb5e5dfcb95c3c79ec4e06b2b24 | 14,396 | py | Python | modules/report_generation/web_report/generate_web_report.py | Ck1998/SecurusAudire | fb6288a72d470f11f6114e0ea6ffb6fa63da80e6 | [
"MIT"
] | null | null | null | modules/report_generation/web_report/generate_web_report.py | Ck1998/SecurusAudire | fb6288a72d470f11f6114e0ea6ffb6fa63da80e6 | [
"MIT"
] | null | null | null | modules/report_generation/web_report/generate_web_report.py | Ck1998/SecurusAudire | fb6288a72d470f11f6114e0ea6ffb6fa63da80e6 | [
"MIT"
] | null | null | null | from modules.report_generation.report_generator_base import ReportGenBase
from os import makedirs
from config import CURR_SYSTEM_PLATFORM
class GenerateWebReport(ReportGenBase):
def __init__(self, full_report: dict, save_folder_location: str, timestamp):
super().__init__()
self.save_folder_location = save_folder_location
self.full_report = full_report
self.audit_result = self.full_report["Audit Results"]
self.timestamp = timestamp
self.file_content = """<!DOCTYPE html><html><head> <meta charset="utf-8"> <meta name="viewport" content="width=device-width, initial-scale=1.0"> <meta http-equiv="X-UA-Compatible" content="IE=edge"> <title>SecureAudire Web Report</title> <style>/* DEMO STYLE*/@import "https://fonts.googleapis.com/css?family=Poppins:300,400,500,600,700";body{font-family: "Poppins", sans-serif; background: #fafafa;}p{font-family: "Poppins", sans-serif; font-size: 1.1em; font-weight: 300; line-height: 1.7em; color: #999;}a,a:hover,a:focus{color: inherit; text-decoration: none; transition: all 0.3s;}.navbar{padding: 15px 10px; background: #fff; border: none; border-radius: 0; margin-bottom: 40px; box-shadow: 1px 1px 3px rgba(0, 0, 0, 0.1);}.navbar-btn{box-shadow: none; outline: none !important; border: none;}.line{width: 100%; height: 1px; border-bottom: 1px dashed #ddd; margin: 40px 0;}/* --------------------------------------------------- SIDEBAR STYLE----------------------------------------------------- */#sidebar{width: 250px; position: fixed; top: 0; left: -250px; height: 100vh; z-index: 999; background: #7386D5; color: #fff; transition: all 0.3s; overflow-y: scroll; box-shadow: 3px 3px 3px rgba(0, 0, 0, 0.2);}#sidebar.active{left: 0;}#dismiss{width: 35px; height: 35px; line-height: 35px; text-align: center; background: #7386D5; position: absolute; top: 10px; right: 10px; cursor: pointer; -webkit-transition: all 0.3s; -o-transition: all 0.3s; transition: all 0.3s;}#dismiss:hover{background: #fff; color: #7386D5;}.overlay{display: none; position: fixed; width: 100vw; height: 100vh; background: rgba(0, 0, 0, 0.7); z-index: 998; opacity: 0; transition: all 0.5s ease-in-out;}.overlay.active{display: block; opacity: 1;}#sidebar .sidebar-header{padding: 20px; background: #6d7fcc;}#sidebar ul.components{padding: 20px 0; border-bottom: 1px solid #47748b;}#sidebar ul p{color: #fff; padding: 10px;}#sidebar ul li a{padding: 10px; font-size: 1.1em; display: block;}#sidebar ul li 
a:hover{color: #7386D5; background: #fff;}#sidebar ul li.active>a,a[aria-expanded="true"]{color: #fff; background: #6d7fcc;}a[data-toggle="collapse"]{position: relative;}.dropdown-toggle::after{display: block; position: absolute; top: 50%; right: 20px; transform: translateY(-50%);}ul ul a{font-size: 0.9em !important; padding-left: 30px !important; background: #6d7fcc;}ul.CTAs{padding: 20px;}ul.CTAs a{text-align: center; font-size: 0.9em !important; display: block; border-radius: 5px; margin-bottom: 5px;}a.download{background: #fff; color: #7386D5;}a.article,a.article:hover{background: #6d7fcc !important; color: #fff !important;}/* --------------------------------------------------- CONTENT STYLE----------------------------------------------------- */#content{width: 100%; padding: 20px; min-height: 100vh; transition: all 0.3s; position: absolute; top: 0; right: 0;}</style> <link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/css/bootstrap.min.css" integrity="sha384-9gVQ4dYFwwWSjIDZnLEWnxCjeSWFphJiwGPXr1jddIhOegiu1FwO5qRGvFXOdJZ4" crossorigin="anonymous"><!-- Our Custom CSS <link rel="stylesheet" href="style5.css"> --> <script defer src="https://use.fontawesome.com/releases/v5.0.13/js/solid.js" integrity="sha384-tzzSw1/Vo+0N5UhStP3bvwWPq+uvzCMfrN1fEFe+xBmv1C/AtVX5K0uZtmcHitFZ" crossorigin="anonymous"></script> <script defer src="https://use.fontawesome.com/releases/v5.0.13/js/fontawesome.js" integrity="sha384-6OIrr52G08NpOFSZdxxz1xdNSndlD4vdcf/q2myIUVO0VsqaGHJsB0RaBE01VTOY" crossorigin="anonymous"></script> <script type="text/javascript"></script></head> <body> <div class="wrapper"> <nav id="sidebar" style="min-width: 250px; max-width:250px"> <div id="dismiss"> <i class="fas fa-arrow-left"></i> </div><div class="sidebar-header"> <h4>SecurusAudire</h4> </div><ul class="list-unstyled components">
<li>
<a class="link" href="#" id="Warnings" style="text-decoration:none; color: white;">Warnings</a>
</li>
<li>
<a class="link" href="#" id="Suggestions" style="text-decoration:none; color: white;">Suggestions</a>
</li>
<li>
<a class="link dropdown-toggle" href="#AuditSubmenu" style="text-decoration:none; color: white;" data-toggle="collapse" aria-expanded="false" id="Audit">Audit</a>
<ul class="collapse list-unstyled" id="AuditSubmenu">
"""
def generate_side_nav_bar(self):
"""
Format -
<li>
<a class="link dropdown-toggle" href="#homeSubmenu" data-toggle="collapse" aria-expanded="false" id="home">Home</a>
<ul class="collapse list-unstyled" id="homeSubmenu">
<li>
<a class="link" href="#" id="home_1">Home 1</a>
</li>
<li>
<a class="link" href="#" id="home_2">Home 2</a>
</li>
<li>
<a class="link" href="#" id="home_3">Home 3</a>
</li>
</ul>
</li>
"""
list_item = ""
for module_name in self.audit_result.keys():
sub_list = ""
for test_name in self.audit_result[module_name].keys():
temp_sub_list = f"""
<li>
<a class="link" style="text-decoration:none; color: white;" href="#" id="{module_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}_{test_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}">{test_name}</a>
</li>
"""
sub_list += temp_sub_list
temp_main_list = f"""
<li>
<a class="link dropdown-toggle" style="text-decoration:none; color: white;" href="#{module_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}SubMenu" data-toggle="collapse" aria-expanded="false" id="{module_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}">{module_name}</a>
<ul class="collapse list-unstyled" id="{module_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}SubMenu">{sub_list}
</ul>
</li>
"""
list_item += temp_main_list
list_item += '</ul></li></ul></nav>'
return list_item
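The repeated `.replace(' ', '').replace('.', '')...` chains used to build HTML ids above can be factored into a single helper; a minimal sketch (the `sanitize_html_id` name is hypothetical, not part of the original code):

```python
import re

def sanitize_html_id(name: str) -> str:
    """Drop spaces, dots and parentheses so a module/test name is safe as an HTML id."""
    return re.sub(r"[ .()]", "", name)

print(sanitize_html_id("Firewall (iptables) v1.2"))  # → Firewalliptablesv12
```

Calling `sanitize_html_id(module_name)` once per loop iteration would replace every four-call chain with one expression.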
def get_html_table_from_dict(self, dict_to_convert: dict):
# converting dictionary to json
json_data = self.util_obj.convert_dict_to_json(dict_convert=dict_to_convert)
# converting json to html table
converted_table = self.util_obj.convert_json_to_html_table(json_data=json_data)
return converted_table
def generate_main_content(self):
complete_table = self.get_html_table_from_dict(self.full_report)
main_table = self.get_html_table_from_dict(self.audit_result)
system_score = self.full_report["System Score"]
total_score_possible = self.full_report["Total Score Possible"]
audit_start_time = self.full_report["Audit Start Time"]
audit_end_time = self.full_report["Audit End Time"]
audit_duration = self.full_report["Audit Duration"]
try:
hardening_index = round((system_score / total_score_possible) * 100, 2)
except ZeroDivisionError:
hardening_index = 0.00
main_content = f""" <div id="content"> <nav class="navbar navbar-expand-lg navbar-light bg-light"> <div class="container-fluid"> <button type="button" id="sidebarCollapse" class="btn btn-info"> <i class="fas fa-align-left"></i> <span></span> </button> <button class="btn btn-dark d-inline-block d-lg-none ml-auto" type="button" data-toggle="collapse" data-target="#navbarSupportedContent" aria-controls="navbarSupportedContent" aria-expanded="false" aria-label="Toggle navigation"> <i class="fas fa-align-justify"></i> </button> <div class="collapse navbar-collapse" id="navbarSupportedContent"> <ul class="nav navbar-nav ml-auto"> <li class="nav-item active"> <h4>Web Report</h4> </li></ul> </div></div></nav><!--Score--><hr /><div id="score" class="row"><div class="col"><h5 style="margin: auto; text-align: center;">Total Score - {system_score}/{total_score_possible}</h5></div><div class="col"><h5 style="margin: auto; text-align: center;"> Hardening Index - {hardening_index}%</h5></div></div><hr /><div id="time" class="row"><div class="col"><h5 style="margin: auto; text-align: center;">Audit Start Time - {audit_start_time}</h5></div><div class="col"><h5 style="margin: auto; text-align: center;"> Audit End Time - {audit_end_time}</h5></div></div><hr /><div id="duration" class="row"><div class="col"><h5 style="margin: auto; text-align: center;"> Audit Duration - {audit_duration}</h5></div><div class="col"><h5 style="margin: auto; text-align: center;"> System Detected - {CURR_SYSTEM_PLATFORM}</h5></div></div>
<hr /><br /><hr /><div id="main_table"><h5>Full Report</h5><hr /><br />{complete_table}</div><div id="main_content"><div id="Audit_content"><h5><a class="go_back" href="#main_table">Full Report</a> > Audit Results</h5><hr /><br />{main_table}</div>"""
for module_name in self.audit_result.keys():
sub_table = self.get_html_table_from_dict(self.audit_result[module_name])
temp_sub_content = f"""
<div id="{module_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}_content"><h5><a class="go_back" href="#main_table">Full Report</a> > {module_name}</h5><hr /><br />{sub_table}</div>
"""
main_content += temp_sub_content
for test_name in self.audit_result[module_name].keys():
test_table = self.get_html_table_from_dict(self.audit_result[module_name][test_name])
temp_test_content = f"""
<div id="{module_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}_{test_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}_content"><h5><a class="go_back" href="#main_table">Full Report</a> > <a class="go_back" href="#{module_name.replace(' ', '').replace('.', '').replace('(', '').replace(')', '')}_content">{module_name}</a> > {test_name}</h5><hr /><br />{test_table}</div>
"""
main_content += temp_test_content
# adding warning table
warnings_table = self.get_html_table_from_dict(self.full_report["Warnings"])
        main_content += f'<div id="Warnings_content"><h5><a class="go_back" href="#main_table">Full Report</a> > Warnings</h5><hr /><br />{warnings_table}</div>'
# adding suggestion table
suggestions_table = self.get_html_table_from_dict(self.full_report["Suggestions"])
        main_content += f'<div id="Suggestions_content"><h5><a class="go_back" href="#main_table">Full Report</a> > Suggestions</h5><hr /><br />{suggestions_table}</div>'
main_content += '</div></div></div>'
return main_content
def parse_result(self):
side_nav_bar = self.generate_side_nav_bar()
self.file_content += side_nav_bar
main_content = self.generate_main_content()
self.file_content += main_content
self.file_content += """<div class="overlay"></div> <script src="https://code.jquery.com/jquery-3.3.1.slim.min.js" integrity="sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo" crossorigin="anonymous"></script> <script src="https://cdnjs.cloudflare.com/ajax/libs/popper.js/1.14.0/umd/popper.min.js" integrity="sha384-cs/chFZiN24E4KMATLdqdvsezGxaGsi4hLGOzlXwp5UZB1LY//20VyM2taTB4QvJ" crossorigin="anonymous"></script> <script src="https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/js/bootstrap.min.js" integrity="sha384-uefMccjFJAIv6A+rW+L4AHf99KvxDjWSu1z9VI8SKNVmz4sk7buKt/6v9KI65qnm" crossorigin="anonymous"></script> <script src="https://cdnjs.cloudflare.com/ajax/libs/malihu-custom-scrollbar-plugin/3.1.5/jquery.mCustomScrollbar.concat.min.js"></script> <script type="text/javascript">$(document).ready(function (){$("#sidebar").mCustomScrollbar({theme: "minimal"}); $('#dismiss, .overlay').on('click', function (){$('#sidebar').removeClass('active'); $('.overlay').removeClass('active');}); $('#sidebarCollapse').on('click', function (){$('#sidebar').addClass('active'); $('.overlay').addClass('active'); $('.collapse.in').toggleClass('in'); $('a[aria-expanded=true]').attr('aria-expanded', 'false');}); $("#main_content div").hide(); $(".link").click(function(){$("#main_content div").hide(); $("#main_table").hide(); $("#"+$(this).attr("id")+"_content").show();});}); $(".go_back").click(function(){$("#main_content div").hide(); $("#main_table").hide(); $($(this).attr("href")).show();});</script> </body></html>"""
self.create_file(self.file_content)
def create_file(self, file_content):
if CURR_SYSTEM_PLATFORM == "windows":
if not self.util_obj.check_file_exsists(self.save_folder_location + r"\SecurusAudire_Reports"):
makedirs(self.save_folder_location + r'\SecurusAudire_Reports')
complete_location = rf"{self.save_folder_location}\SecurusAudire_Reports\web_report-{self.timestamp}.html"
else:
if not self.util_obj.check_file_exsists(self.save_folder_location + r"/SecurusAudire_Reports"):
makedirs(self.save_folder_location + r'/SecurusAudire_Reports')
complete_location = f"{self.save_folder_location}/SecurusAudire_Reports/web_report-{self.timestamp}.html"
        with open(complete_location, 'w+') as write_file_obj:
            write_file_obj.write(file_content)
def generate_report(self):
self.parse_result()
| 94.710526 | 3,844 | 0.638927 | 1,819 | 14,396 | 4.908741 | 0.20011 | 0.042334 | 0.042334 | 0.012095 | 0.396461 | 0.327136 | 0.301602 | 0.266211 | 0.254676 | 0.240788 | 0 | 0.028325 | 0.163726 | 14,396 | 151 | 3,845 | 95.337748 | 0.713348 | 0.048138 | 0 | 0.183673 | 1 | 0.142857 | 0.738278 | 0.286641 | 0 | 0 | 0 | 0 | 0 | 1 | 0.071429 | false | 0 | 0.040816 | 0 | 0.153061 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
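The duplicated Windows/Linux branches in `create_file` can be collapsed with `pathlib`, which picks the right path separator automatically; a sketch under the assumption that only the path handling matters (the `report_path` helper is illustrative, not part of the original class):

```python
from pathlib import Path
import tempfile

def report_path(save_folder: str, timestamp: str) -> Path:
    """Build <save_folder>/SecurusAudire_Reports/web_report-<timestamp>.html, creating the folder if needed."""
    reports_dir = Path(save_folder) / "SecurusAudire_Reports"
    reports_dir.mkdir(parents=True, exist_ok=True)  # replaces the check_exists + makedirs pair
    return reports_dir / f"web_report-{timestamp}.html"

with tempfile.TemporaryDirectory() as tmp:
    path = report_path(tmp, "2024-01-01")
    path.write_text("<html></html>")
    written = path.read_text()
    name = path.name

print(name)  # → web_report-2024-01-01.html
```

`Path` renders with backslashes on Windows and forward slashes elsewhere, so the platform check disappears entirely.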
92f7e08e7926979b6bcac5bd72886e32592184dd | 171 | py | Python | src/ml/classification/canny.py | Saptarshi-SBU/Component-Counting | 659da5f1af79c23e00a10f4f2c879f342357a5ed | [
"MIT"
] | null | null | null | src/ml/classification/canny.py | Saptarshi-SBU/Component-Counting | 659da5f1af79c23e00a10f4f2c879f342357a5ed | [
"MIT"
] | null | null | null | src/ml/classification/canny.py | Saptarshi-SBU/Component-Counting | 659da5f1af79c23e00a10f4f2c879f342357a5ed | [
"MIT"
] | null | null | null | import cv2
import numpy as np
img = cv2.imread('download3.png', 0)  # 0 loads the image as grayscale
edges = cv2.Canny(img, 100, 200)  # hysteresis thresholds: low 100, high 200
cv2.imwrite('download4.png', edges)
| 21.375 | 36 | 0.754386 | 29 | 171 | 4.448276 | 0.689655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.086667 | 0.122807 | 171 | 7 | 37 | 24.428571 | 0.773333 | 0 | 0 | 0 | 0 | 0 | 0.152047 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 3 |
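Canny's first stage is a gradient-magnitude computation over the grayscale image. A dependency-free sketch of just that stage in NumPy (this is not the full Canny pipeline — no smoothing, non-maximum suppression, or hysteresis):

```python
import numpy as np

def gradient_magnitude(img: np.ndarray) -> np.ndarray:
    """Approximate edge strength via finite differences along each axis."""
    gy, gx = np.gradient(img.astype(float))  # np.gradient returns (d/drow, d/dcol)
    return np.hypot(gx, gy)

# A tiny image with a vertical step edge between columns 1 and 2.
img = np.array([[0, 0, 255, 255]] * 4)
mag = gradient_magnitude(img)
print(mag[0])  # strongest response at the two columns flanking the step
```

The real `cv2.Canny` call then thins these responses and keeps only pixels connected to strong edges between the two thresholds.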
92fd8a38db6bebd7b0b211783bff4ff67d91b95c | 504 | py | Python | discovery-infra/test_infra/helper_classes/config/base_infra_env_config.py | rollandf/assisted-test-infra | f2d3411ceb0838f3045e4ad88f2686bed516cf8f | [
"Apache-2.0"
] | null | null | null | discovery-infra/test_infra/helper_classes/config/base_infra_env_config.py | rollandf/assisted-test-infra | f2d3411ceb0838f3045e4ad88f2686bed516cf8f | [
"Apache-2.0"
] | 164 | 2020-11-02T07:02:58.000Z | 2022-03-28T16:03:34.000Z | discovery-infra/test_infra/helper_classes/config/base_infra_env_config.py | rollandf/assisted-test-infra | f2d3411ceb0838f3045e4ad88f2686bed516cf8f | [
"Apache-2.0"
] | null | null | null | from abc import ABC
from typing import List
from dataclasses import dataclass
from .base_entity_config import BaseEntityConfig
@dataclass
class BaseInfraEnvConfig(BaseEntityConfig, ABC):
"""
    Define all configuration variables that are needed for the cluster during its execution.
    All arguments must default to None and carry a type hint.
"""
infra_env_id: str = None
cluster_id: str = None
static_network_config: List[dict] = None
ignition_config_override: str = None
| 26.526316 | 89 | 0.759921 | 67 | 504 | 5.58209 | 0.656716 | 0.05615 | 0.048128 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.19246 | 504 | 18 | 90 | 28 | 0.918919 | 0.277778 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.9 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 3 |
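The pattern above — a dataclass whose fields all default to `None` so callers fill in only what they need — can be shown in miniature (class names here are illustrative stand-ins, not the project's real classes):

```python
from abc import ABC
from dataclasses import dataclass
from typing import List

@dataclass
class BaseEntityConfig(ABC):
    """Illustrative stand-in for the real base config class."""

@dataclass
class InfraEnvConfig(BaseEntityConfig):
    infra_env_id: str = None
    cluster_id: str = None
    static_network_config: List[dict] = None

# Only the fields you care about need to be supplied.
cfg = InfraEnvConfig(cluster_id="abc123")
print(cfg.infra_env_id, cfg.cluster_id)  # → None abc123
```

Because every field has a default, tests can construct a config with no arguments and override individual values per scenario.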
92ff9464733b9056b623f9cc471987f28c669248 | 299 | py | Python | utest/test.py | Zhehua-Hu/Enchain | 94bb21f8ff627fab7d28ca15b575ba01710fb579 | [
"Apache-2.0"
] | 12 | 2017-02-20T05:54:12.000Z | 2020-02-13T18:26:29.000Z | utest/test.py | Zhehua-Hu/-nchain | 94bb21f8ff627fab7d28ca15b575ba01710fb579 | [
"Apache-2.0"
] | null | null | null | utest/test.py | Zhehua-Hu/-nchain | 94bb21f8ff627fab7d28ca15b575ba01710fb579 | [
"Apache-2.0"
] | 3 | 2017-02-23T06:35:13.000Z | 2020-06-18T07:06:17.000Z | from unittest import TestCase
import sys
sys.path.append("../")
from Enchain import run_main
class TestMainWindow(TestCase):
app = None
mwin = None
def setUp(self):
self.app, self.mwin = run_main()
def tearDown(self):
self.mwin.close()
self.app.quit()
def test_noop(self):
pass
| 13 | 34 | 0.698997 | 44 | 299 | 4.681818 | 0.545455 | 0.067961 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.177258 | 299 | 22 | 35 | 13.590909 | 0.837398 | 0 | 0 | 0 | 0 | 0 | 0.010067 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.214286 | false | 0.071429 | 0.214286 | 0 | 0.642857 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 3 |
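The `setUp`/`tearDown` pairing above is the standard `unittest` fixture pattern: each test gets a fresh app/window and releases it afterwards. A self-contained sketch with a dummy resource standing in for the Qt window:

```python
import unittest

class FakeWindow:
    """Hypothetical stand-in for the real main window."""
    def __init__(self):
        self.open = True
    def close(self):
        self.open = False

class TestLifecycle(unittest.TestCase):
    def setUp(self):
        self.win = FakeWindow()   # runs before every test
    def tearDown(self):
        self.win.close()          # runs after every test, pass or fail
    def test_noop(self):
        self.assertTrue(self.win.open)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(TestLifecycle))
print(result.wasSuccessful())  # → True
```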
13025cf4d13b9d8186a9edf1cfb391ae0c85ee5e | 9,992 | py | Python | falcon/util/structures.py | RioAtHome/falcon | edd9352e630dbbb6272370281fc5fa6d792df057 | [
"Apache-2.0"
] | null | null | null | falcon/util/structures.py | RioAtHome/falcon | edd9352e630dbbb6272370281fc5fa6d792df057 | [
"Apache-2.0"
] | null | null | null | falcon/util/structures.py | RioAtHome/falcon | edd9352e630dbbb6272370281fc5fa6d792df057 | [
"Apache-2.0"
] | null | null | null | # Copied from the Requests library by Kenneth Reitz et al.
#
# Copyright 2013 Kenneth Reitz
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Data structures.
This module provides additional data structures not found in the
standard library. These classes are hoisted into the `falcon` module
for convenience::
import falcon
things = falcon.CaseInsensitiveDict()
"""
from collections.abc import Mapping
from collections.abc import MutableMapping
# TODO(kgriffs): If we ever diverge from what is upstream in Requests,
# then we will need to write tests and remove the "no cover" pragma.
class CaseInsensitiveDict(MutableMapping): # pragma: no cover
"""
A case-insensitive ``dict``-like object.
Implements all methods and operations of
``collections.abc.MutableMapping`` as well as dict's `copy`. Also
provides `lower_items`.
All keys are expected to be strings. The structure remembers the
case of the last key to be set, and ``iter(instance)``,
``keys()``, and ``items()``
will contain case-sensitive keys. However, querying and contains
testing is case insensitive:
cid = CaseInsensitiveDict()
cid['Accept'] = 'application/json'
cid['aCCEPT'] == 'application/json' # True
list(cid) == ['Accept'] # True
For example, ``headers['content-encoding']`` will return the
value of a ``'Content-Encoding'`` response header, regardless
of how the header name was originally stored.
If the constructor, ``.update``, or equality comparison
operations are given keys that have equal ``.lower()``s, the
behavior is undefined.
"""
def __init__(self, data=None, **kwargs):
self._store = dict()
if data is None:
data = {}
self.update(data, **kwargs)
def __setitem__(self, key, value):
# Use the lowercased key for lookups, but store the actual
# key alongside the value.
self._store[key.lower()] = (key, value)
def __getitem__(self, key):
return self._store[key.lower()][1]
def __delitem__(self, key):
del self._store[key.lower()]
def __iter__(self):
return (casedkey for casedkey, mappedvalue in self._store.values())
def __len__(self):
return len(self._store)
def lower_items(self):
"""Like iteritems(), but with all lowercase keys."""
return ((lowerkey, keyval[1]) for (lowerkey, keyval) in self._store.items())
def __eq__(self, other):
if isinstance(other, Mapping):
other = CaseInsensitiveDict(other)
else:
return NotImplemented
# Compare insensitively
return dict(self.lower_items()) == dict(other.lower_items())
# Copy is required
def copy(self):
return CaseInsensitiveDict(self._store.values())
def __repr__(self):
return '%s(%r)' % (self.__class__.__name__, dict(self.items()))
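The storage scheme above — index by the lowercased key, but remember the original casing alongside the value — can be shown in miniature (a sketch, not the full `MutableMapping` implementation):

```python
class MiniCI:
    """Minimal demo of case-insensitive lookup with case-preserving keys."""
    def __init__(self):
        self._store = {}
    def __setitem__(self, key, value):
        self._store[key.lower()] = (key, value)  # keep the caller's casing
    def __getitem__(self, key):
        return self._store[key.lower()][1]
    def keys(self):
        return [k for k, _ in self._store.values()]

h = MiniCI()
h['Accept'] = 'application/json'
print(h['aCCEPT'], h.keys())  # → application/json ['Accept']
```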
# NOTE(vytas): Although Context is effectively implementing the MutableMapping
# interface, we choose not to subclass MutableMapping to stress the fact that
# Context is, by design, a bare class, and the mapping interface may be
# removed in a future Falcon release.
class Context:
"""
Convenience class to hold contextual information in its attributes.
This class is used as the default :class:`~.Request` and :class:`~Response`
context type (see
:attr:`Request.context_type <falcon.Request.context_type>` and
:attr:`Response.context_type <falcon.Response.context_type>`,
respectively).
In Falcon versions prior to 2.0, the default context type was ``dict``. To
ease the migration to attribute-based context object approach, this class
also implements the mapping interface; that is, object attributes are
linked to dictionary items, and vice versa. For instance:
>>> context = falcon.Context()
>>> context.cache_strategy = 'lru'
>>> context.get('cache_strategy')
'lru'
>>> 'cache_strategy' in context
True
"""
def __contains__(self, key):
return self.__dict__.__contains__(key)
def __getitem__(self, key):
# PERF(vytas): On CPython, using this mapping interface (instead of a
# standard dict) to get, set and delete items incurs overhead
# approximately comparable to that of two function calls
# (per get/set/delete operation, that is).
return self.__dict__.__getitem__(key)
def __setitem__(self, key, value):
return self.__dict__.__setitem__(key, value)
def __delitem__(self, key):
self.__dict__.__delitem__(key)
def __iter__(self):
return self.__dict__.__iter__()
def __len__(self):
return self.__dict__.__len__()
def __eq__(self, other):
if isinstance(other, type(self)):
return self.__dict__.__eq__(other.__dict__)
return self.__dict__.__eq__(other)
def __ne__(self, other):
if isinstance(other, type(self)):
return self.__dict__.__ne__(other.__dict__)
return self.__dict__.__ne__(other)
def __hash__(self):
return hash(self.__dict__)
def __repr__(self):
return '{}({})'.format(type(self).__name__, self.__dict__.__repr__())
def __str__(self):
return '{}({})'.format(type(self).__name__, self.__dict__.__str__())
def clear(self):
return self.__dict__.clear()
def copy(self):
ctx = type(self)()
ctx.update(self.__dict__)
return ctx
def get(self, key, default=None):
return self.__dict__.get(key, default)
def items(self):
return self.__dict__.items()
def keys(self):
return self.__dict__.keys()
def pop(self, key, default=None):
return self.__dict__.pop(key, default)
def popitem(self):
return self.__dict__.popitem()
def setdefault(self, key, default_value=None):
return self.__dict__.setdefault(key, default_value)
def update(self, items):
self.__dict__.update(items)
def values(self):
return self.__dict__.values()
class ETag(str):
"""Convenience class to represent a parsed HTTP entity-tag.
This class is simply a subclass of ``str`` with a few helper methods and
an extra attribute to indicate whether the entity-tag is weak or strong. The
value of the string is equivalent to what RFC 7232 calls an "opaque-tag",
i.e. an entity-tag sans quotes and the weakness indicator.
Note:
Given that a weak entity-tag comparison can be performed by
using the ``==`` operator (per the example below), only a
:meth:`~.strong_compare` method is provided.
Here is an example ``on_get()`` method that demonstrates how to use instances
of this class::
def on_get(self, req, resp):
content_etag = self._get_content_etag()
for etag in (req.if_none_match or []):
if etag == '*' or etag == content_etag:
resp.status = falcon.HTTP_304
return
# -- snip --
resp.etag = content_etag
resp.status = falcon.HTTP_200
(See also: RFC 7232)
Attributes:
is_weak (bool): ``True`` if the entity-tag is weak, otherwise ``False``.
"""
is_weak = False
def strong_compare(self, other):
"""Perform a strong entity-tag comparison.
Two entity-tags are equivalent if both are not weak and their
opaque-tags match character-by-character.
(See also: RFC 7232, Section 2.3.2)
Arguments:
other (ETag): The other :class:`~.ETag` to which you are comparing
this one.
Returns:
bool: ``True`` if the two entity-tags match, otherwise ``False``.
"""
return self == other and not (self.is_weak or other.is_weak)
def dumps(self):
"""Serialize the ETag to a string suitable for use in a precondition header.
(See also: RFC 7232, Section 2.3)
Returns:
str: An opaque quoted string, possibly prefixed by a weakness
indicator ``W/``.
"""
if self.is_weak:
# PERF(kgriffs): Simple concatenation like this is slightly faster
# than %s string formatting.
return 'W/"' + self + '"'
return '"' + self + '"'
@classmethod
def loads(cls, etag_str):
"""Deserialize a single entity-tag string from a precondition header.
Note:
This method is meant to be used only for parsing a single
entity-tag. It can not be used to parse a comma-separated list of
values.
(See also: RFC 7232, Section 2.3)
Arguments:
etag_str (str): An ASCII string representing a single entity-tag,
as defined by RFC 7232.
Returns:
ETag: An instance of `~.ETag` representing the parsed entity-tag.
"""
value = etag_str
is_weak = False
if value.startswith(('W/', 'w/')):
is_weak = True
value = value[2:]
# NOTE(kgriffs): We allow for an unquoted entity-tag just in case,
# although it has been non-standard to do so since at least 1999
# with the advent of RFC 2616.
if value[:1] == value[-1:] == '"':
value = value[1:-1]
t = cls(value)
t.is_weak = is_weak
return t
| 31.225 | 84 | 0.635709 | 1,284 | 9,992 | 4.72352 | 0.285826 | 0.030338 | 0.039242 | 0.026711 | 0.090849 | 0.067106 | 0.067106 | 0.0277 | 0.015829 | 0.015829 | 0 | 0.008446 | 0.265312 | 9,992 | 319 | 85 | 31.322884 | 0.817736 | 0.563751 | 0 | 0.20202 | 0 | 0 | 0.00757 | 0 | 0 | 0 | 0 | 0.003135 | 0 | 1 | 0.343434 | false | 0 | 0.020202 | 0.212121 | 0.737374 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
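The weak-vs-strong comparison semantics above can be exercised with a trimmed-down copy of the class (only the pieces needed for the demo are reproduced):

```python
class ETag(str):
    is_weak = False

    def strong_compare(self, other):
        # Strong comparison fails if either side is weak.
        return self == other and not (self.is_weak or other.is_weak)

    @classmethod
    def loads(cls, etag_str):
        value, is_weak = etag_str, False
        if value.startswith(('W/', 'w/')):
            is_weak, value = True, value[2:]
        if value[:1] == value[-1:] == '"':
            value = value[1:-1]
        t = cls(value)
        t.is_weak = is_weak
        return t

weak = ETag.loads('W/"abc"')
strong = ETag.loads('"abc"')
print(weak == strong, weak.strong_compare(strong))  # → True False
```

`==` on the `str` subclass compares only the opaque-tags (a weak match), while `strong_compare` additionally requires both tags to be strong, matching RFC 7232's two comparison functions.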
130ed5411ac52c0981154d28a5a27f4d4f72ebd8 | 206 | py | Python | src/checkBeforeRun/databaseConnectionCheck.py | zw-g/Funny-Nation | bcb72e802e0ff46b4a409c5d51fc8b10e0987463 | [
"MIT"
] | 126 | 2022-01-15T02:29:07.000Z | 2022-03-30T09:57:40.000Z | src/checkBeforeRun/databaseConnectionCheck.py | zw-g/Funny-Nation | bcb72e802e0ff46b4a409c5d51fc8b10e0987463 | [
"MIT"
] | 18 | 2022-01-11T22:24:35.000Z | 2022-03-16T00:13:01.000Z | src/checkBeforeRun/databaseConnectionCheck.py | zw-g/Funny-Nation | bcb72e802e0ff46b4a409c5d51fc8b10e0987463 | [
"MIT"
] | 25 | 2022-01-22T15:06:27.000Z | 2022-03-01T04:34:19.000Z | from src.model.makeDatabaseConnection import makeDatabaseConnection
def databaseConnectionCheck():
db = makeDatabaseConnection()
if db is None:
return False
db.close()
return True
| 20.6 | 67 | 0.728155 | 20 | 206 | 7.5 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.213592 | 206 | 9 | 68 | 22.888889 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0 | 0.571429 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
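The check above is a simple "can we open and close a connection" probe. The same pattern with the connection factory injected, so it can be tested without a real database (names here are illustrative):

```python
def connection_ok(factory):
    """Return True when factory() yields a live connection that closes cleanly."""
    conn = factory()
    if conn is None:
        return False
    conn.close()
    return True

class FakeConn:
    def close(self):
        pass

print(connection_ok(lambda: FakeConn()), connection_ok(lambda: None))  # → True False
```

Passing the factory in (rather than importing it) keeps the probe unit-testable; the real code would pass `makeDatabaseConnection`.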
13147d16a2352cbb525eaad952419672d8b5b45a | 632 | py | Python | RL/src/common/train_utils.py | Seonwhee-Finance/Dynamic_Programming | 3982d42138554f1d53e2d74eb9a93fa3abe2a74a | [
"MIT"
] | null | null | null | RL/src/common/train_utils.py | Seonwhee-Finance/Dynamic_Programming | 3982d42138554f1d53e2d74eb9a93fa3abe2a74a | [
"MIT"
] | null | null | null | RL/src/common/train_utils.py | Seonwhee-Finance/Dynamic_Programming | 3982d42138554f1d53e2d74eb9a93fa3abe2a74a | [
"MIT"
] | null | null | null | import numpy as np
import torch
def to_tensor(np_array: np.ndarray, size=None) -> torch.Tensor:
torch_tensor = torch.from_numpy(np_array).float()
if size is not None:
torch_tensor = torch_tensor.view(size)
return torch_tensor
def to_numpy(torch_tensor: torch.Tensor) -> np.ndarray:
return torch_tensor.cpu().detach().numpy()
class EMAMeter:
def __init__(self,
alpha: float = 0.5):
self.s = None
self.alpha = alpha
def update(self, y):
if self.s is None:
self.s = y
else:
self.s = self.alpha * y + (1 - self.alpha) * self.s
| 22.571429 | 63 | 0.601266 | 92 | 632 | 3.967391 | 0.347826 | 0.241096 | 0.175342 | 0.180822 | 0.142466 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006652 | 0.286392 | 632 | 27 | 64 | 23.407407 | 0.802661 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.105263 | 0.052632 | 0.473684 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
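`EMAMeter` implements the standard exponential moving average update s ← αy + (1 − α)s, seeded with the first observation. A torch-free sketch of the same update:

```python
class EMA:
    def __init__(self, alpha=0.5):
        self.alpha, self.s = alpha, None
    def update(self, y):
        # First observation seeds the average; later ones blend in with weight alpha.
        self.s = y if self.s is None else self.alpha * y + (1 - self.alpha) * self.s
        return self.s

m = EMA(alpha=0.5)
for y in [0.0, 4.0, 8.0]:
    m.update(y)
print(m.s)  # → 5.0
```

With α = 0.5 the sequence is 0.0 → 2.0 → 5.0: recent values dominate, which is why this meter is commonly used to smooth noisy episode returns.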
131850666e0df0de466e9e0772a0d27c78f140e3 | 14 | py | Python | iudx/auth/__init__.py | dilipan97/iudx-python-sdk | 03aa1d717af6e600726e17a8065b2b6532830371 | [
"MIT"
] | 6 | 2021-06-21T04:45:42.000Z | 2022-02-21T11:00:37.000Z | iudx/auth/__init__.py | dilipan97/iudx-python-sdk | 03aa1d717af6e600726e17a8065b2b6532830371 | [
"MIT"
] | 6 | 2021-03-03T09:25:00.000Z | 2022-01-27T09:50:27.000Z | iudx/auth/__init__.py | dilipan97/iudx-python-sdk | 03aa1d717af6e600726e17a8065b2b6532830371 | [
"MIT"
] | 8 | 2021-03-03T09:11:28.000Z | 2022-02-02T07:24:13.000Z | name = "auth"
| 7 | 13 | 0.571429 | 2 | 14 | 4 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214286 | 14 | 1 | 14 | 14 | 0.727273 | 0 | 0 | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
13232bbe10b2474dd2b8bcb465edde6602d3f0e0 | 554 | py | Python | Kelas Rabu/Nilai Hash/1310651030 Dian Agustin/dian.py | umjembersoft/TI20151-Keamanan-Komputer | 6e8a4e75fafa59149f5e96b71eb83de935642c38 | [
"MIT"
] | null | null | null | Kelas Rabu/Nilai Hash/1310651030 Dian Agustin/dian.py | umjembersoft/TI20151-Keamanan-Komputer | 6e8a4e75fafa59149f5e96b71eb83de935642c38 | [
"MIT"
] | null | null | null | Kelas Rabu/Nilai Hash/1310651030 Dian Agustin/dian.py | umjembersoft/TI20151-Keamanan-Komputer | 6e8a4e75fafa59149f5e96b71eb83de935642c38 | [
"MIT"
] | null | null | null | __author__ = 'triawan'
import hashlib
print
print "- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -"
print("Program Sederhana Untuk Melakukan Generate Terhadap Nilai Hash dari SHA 256")
print "- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -"
print
NamaString = "Dian A"
hasher = hashlib.sha256()
hasher.update(NamaString)
hexSHA256 = hasher.hexdigest()
print("Nilai hash SHA 256 dari String " +NamaString+ " adalah : " + hexSHA256.upper())
print
print("Penghitungan Nilai Hash Selesai")
| 19.103448 | 86 | 0.527076 | 49 | 554 | 5.877551 | 0.571429 | 0.138889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.036408 | 0.256318 | 554 | 28 | 87 | 19.785714 | 0.662621 | 0 | 0 | 0.357143 | 1 | 0.142857 | 0.571949 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.071429 | null | null | 0.571429 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
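The script above uses Python 2 `print` statements; the same SHA-256 digest in Python 3, where `hashlib` requires bytes rather than `str`:

```python
import hashlib

name = "Dian A"
# encode() converts str to bytes; sha256() accepts the data directly.
digest = hashlib.sha256(name.encode("utf-8")).hexdigest().upper()
print(len(digest))  # SHA-256 hex digests are always 64 characters
```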
132cfbfc2add28afde9f06e2b781136861d352b5 | 78 | py | Python | venv/lib/python3.6/genericpath.py | tchengatcincoai/cryptocoin-prices-compare | f295fecc7213a877bf717af0eb98414e9137b554 | [
"MIT"
] | 72 | 2018-07-02T07:47:15.000Z | 2022-03-29T10:02:14.000Z | venv/lib/python3.6/genericpath.py | zubeir-Abubakar/overflow | 86fc20d860bde8b872e4b16a7b0f857b574528c8 | [
"MIT"
] | 10 | 2021-05-06T21:56:20.000Z | 2022-03-02T02:49:07.000Z | venv/lib/python3.6/genericpath.py | zubeir-Abubakar/overflow | 86fc20d860bde8b872e4b16a7b0f857b574528c8 | [
"MIT"
] | 29 | 2018-09-17T06:10:32.000Z | 2022-03-19T13:15:30.000Z | /Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/genericpath.py | 78 | 78 | 0.846154 | 12 | 78 | 5.5 | 0.916667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.051282 | 0 | 78 | 1 | 78 | 78 | 0.794872 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
1345c876bec926f1da41203fd2e66b3b3d626cfc | 597 | py | Python | app/parser/types.py | VladisP/StreamQL | 83a2ec70aa481f5bb1ace337284ad2e304b0de2d | [
"MIT"
] | null | null | null | app/parser/types.py | VladisP/StreamQL | 83a2ec70aa481f5bb1ace337284ad2e304b0de2d | [
"MIT"
] | null | null | null | app/parser/types.py | VladisP/StreamQL | 83a2ec70aa481f5bb1ace337284ad2e304b0de2d | [
"MIT"
] | null | null | null | from dataclasses import dataclass
from typing import Any, List, Union
from app.lexer import token
@dataclass(frozen=True)
class AstAtom:
domain: str
value: str
def __str__(self) -> str:
return f"{self.domain} : {self.value}"
def token_to_atom(tok: token.Token) -> AstAtom:
return AstAtom(tok.domain, tok.value)
AST = List[Union[AstAtom, List[Any]]]
AstNode = Union[AstAtom, AST]
class ParseError(Exception):
def __init__(self, message: str):
        super().__init__(message)
self.message = message
def __str__(self) -> str:
return self.message
| 19.258065 | 47 | 0.670017 | 78 | 597 | 4.897436 | 0.410256 | 0.086387 | 0.052356 | 0.068063 | 0.099476 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214405 | 597 | 30 | 48 | 19.9 | 0.814499 | 0 | 0 | 0.105263 | 0 | 0 | 0.046901 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.210526 | false | 0 | 0.157895 | 0.157895 | 0.736842 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
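`AstAtom` is declared with `@dataclass(frozen=True)`, so atoms are immutable (and therefore hashable) once built. A quick self-contained demonstration of what that buys:

```python
from dataclasses import dataclass, FrozenInstanceError

@dataclass(frozen=True)
class AstAtom:
    domain: str
    value: str

    def __str__(self) -> str:
        return f"{self.domain} : {self.value}"

atom = AstAtom("NUMBER", "42")
print(str(atom))  # → NUMBER : 42
try:
    atom.value = "43"
except FrozenInstanceError:
    print("immutable")  # frozen dataclasses reject attribute assignment
```

Immutability matters here because AST nodes get shared and compared during parsing; frozen atoms can be reused safely and placed in sets or dict keys.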
1349dd6554643ed7448e09bbae26628b7cf1c556 | 2,281 | py | Python | armrest/utils/misc.py | xBrite/armrest | 73942555f3bfeebe8f5b0c701bb814917e7e6ef7 | [
"Apache-2.0"
] | 3 | 2018-08-17T22:27:43.000Z | 2019-10-29T17:26:15.000Z | armrest/utils/misc.py | xBrite/armrest | 73942555f3bfeebe8f5b0c701bb814917e7e6ef7 | [
"Apache-2.0"
] | null | null | null | armrest/utils/misc.py | xBrite/armrest | 73942555f3bfeebe8f5b0c701bb814917e7e6ef7 | [
"Apache-2.0"
] | 1 | 2018-05-15T17:16:56.000Z | 2018-05-15T17:16:56.000Z | # -*- coding: utf-8 -*-
import datetime
import json
def hexdump(src, length=8):
    """Generate ASCII hexdump of bytes or str data."""
    result = []
    digits = 4 if isinstance(src, str) else 2
    for i in range(0, len(src), length):
        chunk = src[i:i + length]
        # str slices yield characters, bytes slices yield ints; normalize to ints.
        codes = [ord(c) for c in chunk] if isinstance(chunk, str) else list(chunk)
        hexa = ' '.join("%0*X" % (digits, c) for c in codes)
        text = ''.join(chr(c) if 0x20 <= c < 0x7F else '.' for c in codes)
        result.append("%04X   %-*s   %s" % (i, length * (digits + 1), hexa, text))
    return '\n'.join(result)
def seconds_to_minutes_rounded_up(seconds):
return (seconds + 59) // 60
def datetime_date(dt=None):
"""Extract YMD from a datetime.datetime. Defaults to "today"."""
# Note: For mocking reasons, this function must not be in ./date.py
dt = dt or datetime.datetime.today()
return dt.replace(hour=0, minute=0, second=0, microsecond=0)
truthy = frozenset(('t', 'true', 'y', 'yes', 'on', '1'))
def asbool(s):
""" Return the boolean value ``True`` if the case-lowered value of string
input ``s`` is any of ``t``, ``true``, ``y``, ``on``, or ``1``;
otherwise return the boolean value ``False``. """
if s is None:
return False
if isinstance(s, bool):
return s
s = str(s).strip()
return s.lower() in truthy
def as_lower(s):
""" Return s.lower(), even if s is None """
return str(s).lower() if s else ''
def pretty_print_json(obj, indent=1, sort_keys=True):
return json.dumps(obj, indent=indent, sort_keys=sort_keys)
def pretty_number(n):
"""Format number with comma as thousands separator"""
return "{:,}".format(n)
def sorted_list_of_dicts_by_common_key(lst, key):
    """Return the given list of dicts sorted by the given key they all have in common."""
    return sorted(lst, key=lambda d: d[key])
def sorted_by_value(d):
"""Return a list of (key, value) tuples from the given dict, sorted by value."""
return [(k, d[k]) for k in sorted(d, key=d.get, reverse=True)]
def chunks(lst, *lens):
"""Split the supplied list into sub-lists of the given lengths, and return the list of sub-lists."""
retval = []
start = 0
for n in lens:
retval.append(lst[start:start + n])
start += n
return retval
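A minimal, self-contained usage sketch of two of the helpers above (the functions are re-stated here so the example runs on its own):

```python
def seconds_to_minutes_rounded_up(seconds):
    # Integer ceiling division: any partial minute counts as a full minute.
    return (seconds + 59) // 60


def chunks(lst, *lens):
    # Split lst into consecutive sub-lists of the given lengths.
    retval = []
    start = 0
    for n in lens:
        retval.append(lst[start:start + n])
        start += n
    return retval


print(seconds_to_minutes_rounded_up(61))    # 2
print(chunks([1, 2, 3, 4, 5], 2, 3))        # [[1, 2], [3, 4, 5]]
```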
| 29.24359 | 104 | 0.613766 | 364 | 2,281 | 3.791209 | 0.39011 | 0.026087 | 0.008696 | 0.010145 | 0.021739 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014764 | 0.22797 | 2,281 | 77 | 105 | 29.623377 | 0.768881 | 0.312144 | 0 | 0 | 1 | 0 | 0.026385 | 0 | 0 | 0 | 0.005277 | 0 | 0 | 1 | 0.243902 | false | 0 | 0.04878 | 0.04878 | 0.585366 | 0.02439 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 3 |
1369af84fc50d84adc0143533cb65c6769b01442 | 4,789 | py | Python | assaytools2/analyzer.py | choderalab/assaytools2 | 32c02aeddf0afef69c91e16135a159e4735ad532 | [
"MIT"
] | 2 | 2019-02-08T19:42:07.000Z | 2019-03-25T11:10:17.000Z | assaytools2/analyzer.py | choderalab/assaytools2 | 32c02aeddf0afef69c91e16135a159e4735ad532 | [
"MIT"
] | null | null | null | assaytools2/analyzer.py | choderalab/assaytools2 | 32c02aeddf0afef69c91e16135a159e4735ad532 | [
"MIT"
] | 2 | 2021-04-01T00:35:36.000Z | 2021-04-13T03:52:32.000Z | import numpy as np
import matplotlib
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
import sklearn.decomposition
import tensorflow as tf
def plot_rv(rv, n_sample = 10000, style = 'pca'):
"""plt some distribution.
Parameters
----------
rv :
n_sample :
(Default value = 10000)
style :
(Default value = 'pca')
Returns
-------
"""
samples = rv.sample(n_sample).numpy()
print(rv.name)
if 'LogNormal' in str(rv.name):
samples = np.log(samples)
print(samples)
if samples.ndim == 1 or samples.shape[1] == 1:
samples = samples.flatten()
fig = plt.figure(figsize=[10, 10])
ax = fig.add_subplot(111)
hist, _ = np.histogram(samples, bins=50)
x_axis = np.array(range(50))
ax.plot(x_axis, np.true_divide(hist, hist.sum()))
return ax, None
elif style == 'sep':
assert len(samples.shape) == 2
dims = samples.shape[1]
figs = []
for dim in range(dims):
fig = plt.figure()
figs.append(fig)
hist, _ = np.histogram(samples[:, dim], bins=50)
x_axis = np.array(range(50))
ax = fig.add_subplot(111)
ax.plot(x_axis, np.true_divide(hist, hist.sum()))
return figs, None
elif style == 'pca':
from matplotlib import cm
fig = plt.figure(figsize=[10, 10])
ax = fig.add_subplot(111, projection='3d')
ax._axis3don = False
samples = rv.sample(n_sample).numpy()
pca = sklearn.decomposition.PCA(2)
samples_transformed = pca.fit_transform(samples)
hist, x_edges, y_edges = np.histogram2d(samples_transformed[:, 0], samples_transformed[:, 1], bins=50)
hist = np.true_divide(hist, hist.sum())
x_pos, y_pos = np.meshgrid(x_edges[:-1], y_edges[:-1])
ax.plot_wireframe(x_pos, y_pos, hist, cmap=cm.coolwarm)
return ax, pca
def plot_est(points, rv, rv_ax, style = 'pca', pca=None):
"""plot the estimation points
Parameters
----------
points :
rv :
rv_ax :
style :
(Default value = 'pca')
pca :
(Default value = None)
Returns
-------
"""
zs = np.array([])
for idx in range(points.shape[0]):
if 'LogNormal' in rv.name:
zs = np.append(zs, np.exp(np.sum(rv.log_prob(tf.constant(np.exp(points[idx]), dtype=tf.float32)))))
else:
zs = np.append(zs, np.exp(np.sum(rv.log_prob(tf.constant(points[idx], dtype=tf.float32)))))
    if points.ndim == 1 or points.shape[1] == 1:
xs = points.flatten()
rv_ax.plot(xs, zs, 'x')
return rv_ax
if style == 'sep':
raise NotImplementedError
if style == 'pca':
size = points.shape[1]
points = pca.transform(points)
xs = points[:, 0]
ys = points[:, 1]
rv_ax.plot(xs, ys, zs, 'r-')
return rv_ax
def plot_all(points, rv, n_sample = 10000):
"""plot the distribution and the sample points
Parameters
----------
points :
rv :
n_sample :
(Default value = 10000)
Returns
-------
"""
samples = rv.sample(n_sample).numpy()
if 'LogNormal' in str(rv.name):
samples = np.log(samples)
zs = np.array([])
for idx in range(points.shape[0]):
if 'LogNormal' in rv.name:
zs = np.append(zs, np.exp(np.sum(rv.log_prob(tf.constant(np.exp(points[idx]), dtype=tf.float32)))))
else:
zs = np.append(zs, np.exp(np.sum(rv.log_prob(tf.constant(points[idx], dtype=tf.float32)))))
if samples.ndim == 1 or samples.shape[1] == 1:
samples = samples.flatten()
fig = plt.figure(figsize=[10, 10])
ax = fig.add_subplot(111)
hist, _ = np.histogram(samples, bins=50)
x_axis = np.array(range(50))
ax.plot(x_axis, np.true_divide(hist, hist.sum()))
xs = points.flatten()
ax.plot(xs, zs, 'x')
else:
from matplotlib import cm
fig = plt.figure(figsize=[10, 10])
ax = fig.add_subplot(111, projection='3d')
ax._axis3don = False
samples = rv.sample(n_sample).numpy()
pca = sklearn.decomposition.PCA(2)
samples_transformed = pca.fit_transform(samples)
hist, x_edges, y_edges = np.histogram2d(samples_transformed[:, 0], samples_transformed[:, 1], bins=50)
hist = np.true_divide(hist, hist.sum())
x_pos, y_pos = np.meshgrid(x_edges[:-1], y_edges[:-1])
ax.plot_wireframe(x_pos, y_pos, hist, cmap=cm.coolwarm)
size = points.shape[1]
points = pca.transform(points)
xs = points[:, 0]
ys = points[:, 1]
# ax.plot(xs, ys, zs, 'r-')
| 28.505952 | 111 | 0.558154 | 641 | 4,789 | 4.062403 | 0.171607 | 0.015361 | 0.016129 | 0.028802 | 0.749616 | 0.721966 | 0.692012 | 0.657834 | 0.657834 | 0.647465 | 0 | 0.033019 | 0.29171 | 4,789 | 167 | 112 | 28.676647 | 0.73467 | 0.104197 | 0 | 0.707071 | 0 | 0 | 0.015049 | 0 | 0 | 0 | 0 | 0 | 0.010101 | 1 | 0.030303 | false | 0 | 0.070707 | 0 | 0.151515 | 0.020202 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
1378195015b5d82f44145f96a67b57d6e272c93d | 1,583 | py | Python | migrations/versions/2019_09_16_933060cd8eeb_remove_server_defaults.py | tobikrs/ultimate-poll-bot | eaa190ba1fec852c1a7d12c8a4633245f00c435f | [
"MIT"
] | 1 | 2020-03-22T05:49:44.000Z | 2020-03-22T05:49:44.000Z | migrations/versions/2019_09_16_933060cd8eeb_remove_server_defaults.py | RuslanBitcash/ultimate-poll-bot | 33bc71b56f79453359043bd0e778cd153d3a83a3 | [
"MIT"
] | null | null | null | migrations/versions/2019_09_16_933060cd8eeb_remove_server_defaults.py | RuslanBitcash/ultimate-poll-bot | 33bc71b56f79453359043bd0e778cd153d3a83a3 | [
"MIT"
] | 1 | 2021-01-29T17:10:11.000Z | 2021-01-29T17:10:11.000Z | """Remove server defaults
Revision ID: 933060cd8eeb
Revises: 4b06fe3d82ab
Create Date: 2019-09-16 22:22:57.944053
"""
from alembic import op
import sqlalchemy as sa
# revision identifiers, used by Alembic.
revision = '933060cd8eeb'
down_revision = '4b06fe3d82ab'
branch_labels = None
depends_on = None
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('poll', 'compact_doodle_buttons',
existing_type=sa.BOOLEAN(),
server_default=None,
existing_nullable=False)
op.alter_column('user', 'european_date_format',
existing_type=sa.BOOLEAN(),
server_default=None,
existing_nullable=False)
op.alter_column('user', 'started',
existing_type=sa.BOOLEAN(),
server_default=None,
existing_nullable=False)
# ### end Alembic commands ###
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.alter_column('user', 'started',
existing_type=sa.BOOLEAN(),
server_default=sa.text('false'),
existing_nullable=False)
op.alter_column('user', 'european_date_format',
existing_type=sa.BOOLEAN(),
server_default=sa.text('false'),
existing_nullable=False)
op.alter_column('poll', 'compact_doodle_buttons',
existing_type=sa.BOOLEAN(),
server_default=sa.text('true'),
existing_nullable=False)
# ### end Alembic commands ###
| 31.039216 | 65 | 0.617814 | 169 | 1,583 | 5.579882 | 0.349112 | 0.044539 | 0.082715 | 0.133616 | 0.723224 | 0.723224 | 0.662778 | 0.662778 | 0.656416 | 0.656416 | 0 | 0.039724 | 0.268478 | 1,583 | 50 | 66 | 31.66 | 0.774611 | 0.19204 | 0 | 0.71875 | 0 | 0 | 0.128824 | 0.035427 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.0625 | 0 | 0.125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
138cb00961b7566af37a8b19d9c6d4709bb01062 | 13,170 | py | Python | kinow_client/models/create_product_request.py | kinow-io/kaemo-python-sdk | 610fce09e3a9e631babf09195b0492959d9e4d56 | [
"Apache-2.0"
] | 1 | 2017-05-03T12:48:22.000Z | 2017-05-03T12:48:22.000Z | kinow_client/models/create_product_request.py | kinow-io/kaemo-python-sdk | 610fce09e3a9e631babf09195b0492959d9e4d56 | [
"Apache-2.0"
] | null | null | null | kinow_client/models/create_product_request.py | kinow-io/kaemo-python-sdk | 610fce09e3a9e631babf09195b0492959d9e4d56 | [
"Apache-2.0"
] | null | null | null | # coding: utf-8
"""
Server API
Reference for Server API (REST/Json)
OpenAPI spec version: 2.0.9
Generated by: https://github.com/swagger-api/swagger-codegen.git
"""
from pprint import pformat
from six import iteritems
import re
class CreateProductRequest(object):
"""
NOTE: This class is auto generated by the swagger code generator program.
Do not edit the class manually.
"""
def __init__(self, name=None, description_short=None, description=None, meta_title=None, meta_description=None, meta_keywords=None, link_rewrite=None, active=None, reference=None, date_from=None, date_to=None, availability_before=None, availability_after=None, id_category_default=None, tags=None):
"""
CreateProductRequest - a model defined in Swagger
:param dict swaggerTypes: The key is attribute name
and the value is attribute type.
:param dict attributeMap: The key is attribute name
and the value is json key in definition.
"""
self.swagger_types = {
'name': 'list[I18nFieldInput]',
'description_short': 'list[I18nFieldInput]',
'description': 'list[I18nFieldInput]',
'meta_title': 'list[I18nFieldInput]',
'meta_description': 'list[I18nFieldInput]',
'meta_keywords': 'list[I18nFieldInput]',
'link_rewrite': 'list[I18nFieldInput]',
'active': 'bool',
'reference': 'str',
'date_from': 'str',
'date_to': 'str',
'availability_before': 'int',
'availability_after': 'int',
'id_category_default': 'int',
'tags': 'list[I18nField]'
}
self.attribute_map = {
'name': 'name',
'description_short': 'description_short',
'description': 'description',
'meta_title': 'meta_title',
'meta_description': 'meta_description',
'meta_keywords': 'meta_keywords',
'link_rewrite': 'link_rewrite',
'active': 'active',
'reference': 'reference',
'date_from': 'date_from',
'date_to': 'date_to',
'availability_before': 'availability_before',
'availability_after': 'availability_after',
'id_category_default': 'id_category_default',
'tags': 'tags'
}
self._name = name
self._description_short = description_short
self._description = description
self._meta_title = meta_title
self._meta_description = meta_description
self._meta_keywords = meta_keywords
self._link_rewrite = link_rewrite
self._active = active
self._reference = reference
self._date_from = date_from
self._date_to = date_to
self._availability_before = availability_before
self._availability_after = availability_after
self._id_category_default = id_category_default
self._tags = tags
@property
def name(self):
"""
Gets the name of this CreateProductRequest.
:return: The name of this CreateProductRequest.
:rtype: list[I18nFieldInput]
"""
return self._name
@name.setter
def name(self, name):
"""
Sets the name of this CreateProductRequest.
:param name: The name of this CreateProductRequest.
:type: list[I18nFieldInput]
"""
if name is None:
raise ValueError("Invalid value for `name`, must not be `None`")
self._name = name
@property
def description_short(self):
"""
Gets the description_short of this CreateProductRequest.
:return: The description_short of this CreateProductRequest.
:rtype: list[I18nFieldInput]
"""
return self._description_short
@description_short.setter
def description_short(self, description_short):
"""
Sets the description_short of this CreateProductRequest.
:param description_short: The description_short of this CreateProductRequest.
:type: list[I18nFieldInput]
"""
self._description_short = description_short
@property
def description(self):
"""
Gets the description of this CreateProductRequest.
:return: The description of this CreateProductRequest.
:rtype: list[I18nFieldInput]
"""
return self._description
@description.setter
def description(self, description):
"""
Sets the description of this CreateProductRequest.
:param description: The description of this CreateProductRequest.
:type: list[I18nFieldInput]
"""
self._description = description
@property
def meta_title(self):
"""
Gets the meta_title of this CreateProductRequest.
:return: The meta_title of this CreateProductRequest.
:rtype: list[I18nFieldInput]
"""
return self._meta_title
@meta_title.setter
def meta_title(self, meta_title):
"""
Sets the meta_title of this CreateProductRequest.
:param meta_title: The meta_title of this CreateProductRequest.
:type: list[I18nFieldInput]
"""
self._meta_title = meta_title
@property
def meta_description(self):
"""
Gets the meta_description of this CreateProductRequest.
:return: The meta_description of this CreateProductRequest.
:rtype: list[I18nFieldInput]
"""
return self._meta_description
@meta_description.setter
def meta_description(self, meta_description):
"""
Sets the meta_description of this CreateProductRequest.
:param meta_description: The meta_description of this CreateProductRequest.
:type: list[I18nFieldInput]
"""
self._meta_description = meta_description
@property
def meta_keywords(self):
"""
Gets the meta_keywords of this CreateProductRequest.
:return: The meta_keywords of this CreateProductRequest.
:rtype: list[I18nFieldInput]
"""
return self._meta_keywords
@meta_keywords.setter
def meta_keywords(self, meta_keywords):
"""
Sets the meta_keywords of this CreateProductRequest.
:param meta_keywords: The meta_keywords of this CreateProductRequest.
:type: list[I18nFieldInput]
"""
self._meta_keywords = meta_keywords
@property
def link_rewrite(self):
"""
Gets the link_rewrite of this CreateProductRequest.
:return: The link_rewrite of this CreateProductRequest.
:rtype: list[I18nFieldInput]
"""
return self._link_rewrite
@link_rewrite.setter
def link_rewrite(self, link_rewrite):
"""
Sets the link_rewrite of this CreateProductRequest.
:param link_rewrite: The link_rewrite of this CreateProductRequest.
:type: list[I18nFieldInput]
"""
if link_rewrite is None:
raise ValueError("Invalid value for `link_rewrite`, must not be `None`")
self._link_rewrite = link_rewrite
@property
def active(self):
"""
Gets the active of this CreateProductRequest.
:return: The active of this CreateProductRequest.
:rtype: bool
"""
return self._active
@active.setter
def active(self, active):
"""
Sets the active of this CreateProductRequest.
:param active: The active of this CreateProductRequest.
:type: bool
"""
self._active = active
@property
def reference(self):
"""
Gets the reference of this CreateProductRequest.
:return: The reference of this CreateProductRequest.
:rtype: str
"""
return self._reference
@reference.setter
def reference(self, reference):
"""
Sets the reference of this CreateProductRequest.
:param reference: The reference of this CreateProductRequest.
:type: str
"""
self._reference = reference
@property
def date_from(self):
"""
Gets the date_from of this CreateProductRequest.
:return: The date_from of this CreateProductRequest.
:rtype: str
"""
return self._date_from
@date_from.setter
def date_from(self, date_from):
"""
Sets the date_from of this CreateProductRequest.
:param date_from: The date_from of this CreateProductRequest.
:type: str
"""
self._date_from = date_from
@property
def date_to(self):
"""
Gets the date_to of this CreateProductRequest.
:return: The date_to of this CreateProductRequest.
:rtype: str
"""
return self._date_to
@date_to.setter
def date_to(self, date_to):
"""
Sets the date_to of this CreateProductRequest.
:param date_to: The date_to of this CreateProductRequest.
:type: str
"""
self._date_to = date_to
@property
def availability_before(self):
"""
Gets the availability_before of this CreateProductRequest.
Value can be 0, 1 or 2
:return: The availability_before of this CreateProductRequest.
:rtype: int
"""
return self._availability_before
@availability_before.setter
def availability_before(self, availability_before):
"""
Sets the availability_before of this CreateProductRequest.
Value can be 0, 1 or 2
:param availability_before: The availability_before of this CreateProductRequest.
:type: int
"""
self._availability_before = availability_before
@property
def availability_after(self):
"""
Gets the availability_after of this CreateProductRequest.
Value can be 0, 1 or 2
:return: The availability_after of this CreateProductRequest.
:rtype: int
"""
return self._availability_after
@availability_after.setter
def availability_after(self, availability_after):
"""
Sets the availability_after of this CreateProductRequest.
Value can be 0, 1 or 2
:param availability_after: The availability_after of this CreateProductRequest.
:type: int
"""
self._availability_after = availability_after
@property
def id_category_default(self):
"""
Gets the id_category_default of this CreateProductRequest.
:return: The id_category_default of this CreateProductRequest.
:rtype: int
"""
return self._id_category_default
@id_category_default.setter
def id_category_default(self, id_category_default):
"""
Sets the id_category_default of this CreateProductRequest.
:param id_category_default: The id_category_default of this CreateProductRequest.
:type: int
"""
if id_category_default is None:
raise ValueError("Invalid value for `id_category_default`, must not be `None`")
self._id_category_default = id_category_default
@property
def tags(self):
"""
Gets the tags of this CreateProductRequest.
:return: The tags of this CreateProductRequest.
:rtype: list[I18nField]
"""
return self._tags
@tags.setter
def tags(self, tags):
"""
Sets the tags of this CreateProductRequest.
:param tags: The tags of this CreateProductRequest.
:type: list[I18nField]
"""
self._tags = tags
def to_dict(self):
"""
Returns the model properties as a dict
"""
result = {}
for attr, _ in iteritems(self.swagger_types):
value = getattr(self, attr)
if isinstance(value, list):
result[attr] = list(map(
lambda x: x.to_dict() if hasattr(x, "to_dict") else x,
value
))
elif hasattr(value, "to_dict"):
result[attr] = value.to_dict()
elif isinstance(value, dict):
result[attr] = dict(map(
lambda item: (item[0], item[1].to_dict())
if hasattr(item[1], "to_dict") else item,
value.items()
))
else:
result[attr] = value
return result
def to_str(self):
"""
Returns the string representation of the model
"""
return pformat(self.to_dict())
def __repr__(self):
"""
For `print` and `pprint`
"""
return self.to_str()
def __eq__(self, other):
"""
Returns true if both objects are equal
"""
return self.__dict__ == other.__dict__
def __ne__(self, other):
"""
Returns true if both objects are not equal
"""
return not self == other
| 28.69281 | 302 | 0.609188 | 1,356 | 13,170 | 5.705015 | 0.096608 | 0.046536 | 0.201655 | 0.060109 | 0.577301 | 0.405636 | 0.284255 | 0.188082 | 0.095786 | 0.041624 | 0 | 0.007358 | 0.30858 | 13,170 | 458 | 303 | 28.755459 | 0.842192 | 0.368945 | 0 | 0.264045 | 1 | 0 | 0.126959 | 0.003192 | 0 | 0 | 0 | 0 | 0 | 1 | 0.202247 | false | 0 | 0.016854 | 0 | 0.337079 | 0.005618 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |
1392a058dfd2e1a43a6f6773b1dae4df4e5fd25c | 678 | py | Python | mopidy_oe1/backend.py | movermeyer/mopidy-oe1 | 9d0b31df13cfedc9af10cefe575d82d5b4407d0f | [
"Apache-2.0"
] | 9 | 2015-03-08T15:53:33.000Z | 2020-04-01T10:11:22.000Z | mopidy_oe1/backend.py | movermeyer/mopidy-oe1 | 9d0b31df13cfedc9af10cefe575d82d5b4407d0f | [
"Apache-2.0"
] | 4 | 2015-04-05T23:45:14.000Z | 2019-12-08T16:29:35.000Z | mopidy_oe1/backend.py | movermeyer/mopidy-oe1 | 9d0b31df13cfedc9af10cefe575d82d5b4407d0f | [
"Apache-2.0"
] | 5 | 2015-04-05T23:30:53.000Z | 2020-03-28T19:13:16.000Z | from __future__ import unicode_literals
import logging
from mopidy import backend
import pykka
from mopidy_oe1.library import OE1LibraryProvider
from mopidy_oe1.playback import OE1PlaybackProvider
logger = logging.getLogger(__name__)
class OE1Backend(pykka.ThreadingActor, backend.Backend):
def __init__(self, config, audio):
super(OE1Backend, self).__init__()
self.config = config
self.library = OE1LibraryProvider(backend=self)
self.playback = OE1PlaybackProvider(audio=audio, backend=self)
self.uri_schemes = ['oe1']
def on_start(self):
logger.info('Starting OE1Backend')
def on_stop(self):
pass
| 22.6 | 70 | 0.727139 | 76 | 678 | 6.197368 | 0.447368 | 0.063694 | 0.055202 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018282 | 0.193215 | 678 | 29 | 71 | 23.37931 | 0.842779 | 0 | 0 | 0 | 0 | 0 | 0.032448 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.166667 | false | 0.055556 | 0.333333 | 0 | 0.555556 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | 3 |
1396ddca2a35c1d445048a714bba27e7511237ef | 4,114 | py | Python | nameko/testing/rabbit.py | vlcinsky/nameko | 88d7e5211de4fcc1c34cd7f84d7c77f0619c5f5d | [
"Apache-2.0"
] | 3,425 | 2016-11-10T17:12:42.000Z | 2022-03-31T19:07:49.000Z | nameko/testing/rabbit.py | vlcinsky/nameko | 88d7e5211de4fcc1c34cd7f84d7c77f0619c5f5d | [
"Apache-2.0"
] | 371 | 2020-03-04T21:51:56.000Z | 2022-03-31T20:59:11.000Z | nameko/testing/rabbit.py | vlcinsky/nameko | 88d7e5211de4fcc1c34cd7f84d7c77f0619c5f5d | [
"Apache-2.0"
] | 420 | 2016-11-17T05:46:42.000Z | 2022-03-23T12:36:06.000Z | import json
import six
from requests import ConnectionError, HTTPError, Session
from requests.auth import HTTPBasicAuth
from requests.utils import get_auth_from_url, urldefragauth
from six.moves.urllib.parse import quote # pylint: disable=E0401
__all__ = ['Client', 'HTTPError']
def _quote(value):
return quote(value, '')
class Client(object):
"""Pyrabbit replacement using requests instead of httplib2 """
def __init__(self, uri):
# move basic auth creds into headers to avoid
# https://github.com/requests/requests/issues/4275
username, password = get_auth_from_url(uri)
uri = urldefragauth(uri)
self._base_url = '{}/api'.format(uri)
self._session = Session()
self._session.auth = HTTPBasicAuth(username, password)
self._session.headers['content-type'] = 'application/json'
self._verify_api_connection()
def _build_url(self, args):
args = map(_quote, args)
return '{}/{}'.format(
self._base_url,
'/'.join(args),
)
def _request(self, method, *args, **kwargs):
url = self._build_url(args)
json_data = kwargs.pop('json', None)
if json_data is not None:
kwargs['data'] = json.dumps(json_data)
try:
result = self._session.request(method, url, **kwargs)
except ConnectionError as exc:
six.raise_from(Exception(
'Connection error for the RabbitMQ management HTTP'
' API at {}, is it enabled?'.format(url)
), exc)
result.raise_for_status()
if result.content:
return result.json()
def _get(self, *args, **kwargs):
return self._request('GET', *args, **kwargs)
def _put(self, *args, **kwargs):
return self._request('PUT', *args, **kwargs)
def _delete(self, *args, **kwargs):
return self._request('DELETE', *args, **kwargs)
def _post(self, *args, **kwargs):
return self._request('POST', *args, **kwargs)
def _verify_api_connection(self):
self._get('overview')
def get_connections(self):
return self._get('connections')
def delete_connection(self, name):
return self._delete('connections', name)
def get_exchanges(self, vhost):
return self._get('exchanges', vhost)
def get_all_vhosts(self):
return self._get('vhosts')
def create_vhost(self, vhost):
return self._put('vhosts', vhost)
def delete_vhost(self, vhost):
return self._delete('vhosts', vhost)
def set_vhost_permissions(self, vhost, username, configure, read, write):
permissions = {
'configure': configure,
'read': read,
'write': write,
}
return self._put(
'permissions', vhost, username,
json=permissions)
def get_queue(self, vhost, name):
return self._get('queues', vhost, name)
def create_queue(self, vhost, name, **properties):
return self._put('queues', vhost, name, json=properties)
def get_queues(self, vhost):
return self._get('queues', vhost)
def get_queue_bindings(self, vhost, name):
return self._get('queues', vhost, name, 'bindings')
def create_queue_binding(self, vhost, exchange, queue, routing_key):
body = {
'routing_key': routing_key,
}
return self._post(
'bindings', vhost, 'e', exchange, 'q', queue, json=body
)
def publish(self, vhost, name, routing_key, payload, properties=None):
body = {
'routing_key': routing_key,
'payload': payload,
'properties': properties or {},
'payload_encoding': 'string',
}
return self._post('exchanges', vhost, name, 'publish', json=body)
def get_messages(self, vhost, name, count=1, requeue=False):
body = {
'count': count,
'encoding': 'auto',
'requeue': requeue,
}
return self._post('queues', vhost, name, 'get', json=body)
| 30.029197 | 77 | 0.600146 | 463 | 4,114 | 5.142549 | 0.274298 | 0.075598 | 0.032759 | 0.033599 | 0.149937 | 0.086518 | 0.034439 | 0.034439 | 0.034439 | 0 | 0 | 0.003349 | 0.274186 | 4,114 | 136 | 78 | 30.25 | 0.794039 | 0.041808 | 0 | 0.05 | 0 | 0 | 0.099644 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.23 | false | 0.02 | 0.06 | 0.15 | 0.51 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 3 |
1399c38729ca78ab7130d1f7eac4a60b339c59c0 | 109 | py | Python | execicios/ex010/mat06.py | Israel97f/Exercicios-de-Python | 5d3054187977deeb3fadbd7bb1cdee035c609a61 | [
"MIT"
] | null | null | null | execicios/ex010/mat06.py | Israel97f/Exercicios-de-Python | 5d3054187977deeb3fadbd7bb1cdee035c609a61 | [
"MIT"
] | null | null | null | execicios/ex010/mat06.py | Israel97f/Exercicios-de-Python | 5d3054187977deeb3fadbd7bb1cdee035c609a61 | [
"MIT"
] | null | null | null | n = float(input('Quanto dinheiro você ten na carteira '))
print('voce pode compra $ {:.2f}'.format(n/5.29))
| 27.25 | 57 | 0.669725 | 18 | 109 | 4.055556 | 0.944444 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.042553 | 0.137615 | 109 | 3 | 58 | 36.333333 | 0.734043 | 0 | 0 | 0 | 0 | 0 | 0.568807 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.5 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 3 |
139ab786d1d833a711cdd4232567fb413c7a4abd | 3,775 | py | Python | tests/components/plum_lightpad/test_init.py | erogleva/core | 994ae09f69afe772150a698953c0d7386a745de2 | [
"Apache-2.0"
] | 6 | 2016-11-25T06:36:27.000Z | 2021-11-16T11:20:23.000Z | tests/components/plum_lightpad/test_init.py | erogleva/core | 994ae09f69afe772150a698953c0d7386a745de2 | [
"Apache-2.0"
] | 56 | 2020-08-03T07:30:54.000Z | 2022-03-31T06:02:04.000Z | tests/components/plum_lightpad/test_init.py | erogleva/core | 994ae09f69afe772150a698953c0d7386a745de2 | [
"Apache-2.0"
] | 2 | 2020-12-25T16:31:22.000Z | 2020-12-30T20:53:56.000Z | """Tests for the Plum Lightpad config flow."""
from aiohttp import ContentTypeError
from requests.exceptions import HTTPError
from homeassistant.components.plum_lightpad.const import DOMAIN
from homeassistant.core import HomeAssistant
from homeassistant.setup import async_setup_component
from tests.async_mock import Mock, patch
from tests.common import MockConfigEntry
async def test_async_setup_no_domain_config(hass: HomeAssistant):
"""Test setup without configuration is noop."""
result = await async_setup_component(hass, DOMAIN, {})
assert result is True
assert DOMAIN not in hass.data
async def test_async_setup_imports_from_config(hass: HomeAssistant):
"""Test that specifying config will setup an entry."""
with patch(
"homeassistant.components.plum_lightpad.utils.Plum.loadCloudData"
) as mock_loadCloudData, patch(
"homeassistant.components.plum_lightpad.async_setup_entry",
return_value=True,
) as mock_async_setup_entry:
result = await async_setup_component(
hass,
DOMAIN,
{
DOMAIN: {
"username": "test-plum-username",
"password": "test-plum-password",
}
},
)
await hass.async_block_till_done()
assert result is True
assert len(mock_loadCloudData.mock_calls) == 1
assert len(mock_async_setup_entry.mock_calls) == 1
async def test_async_setup_entry_sets_up_light(hass: HomeAssistant):
"""Test that configuring entry sets up light domain."""
config_entry = MockConfigEntry(
domain=DOMAIN,
data={"username": "test-plum-username", "password": "test-plum-password"},
)
config_entry.add_to_hass(hass)
with patch(
"homeassistant.components.plum_lightpad.utils.Plum.loadCloudData"
) as mock_loadCloudData, patch(
"homeassistant.components.plum_lightpad.light.async_setup_entry"
) as mock_light_async_setup_entry:
result = await hass.config_entries.async_setup(config_entry.entry_id)
assert result is True
await hass.async_block_till_done()
assert len(mock_loadCloudData.mock_calls) == 1
assert len(mock_light_async_setup_entry.mock_calls) == 1
async def test_async_setup_entry_handles_auth_error(hass: HomeAssistant):
"""Test that configuring entry handles Plum Cloud authentication error."""
config_entry = MockConfigEntry(
domain=DOMAIN,
data={"username": "test-plum-username", "password": "test-plum-password"},
)
config_entry.add_to_hass(hass)
with patch(
"homeassistant.components.plum_lightpad.utils.Plum.loadCloudData",
side_effect=ContentTypeError(Mock(), None),
), patch(
"homeassistant.components.plum_lightpad.light.async_setup_entry"
) as mock_light_async_setup_entry:
result = await hass.config_entries.async_setup(config_entry.entry_id)
assert result is False
assert len(mock_light_async_setup_entry.mock_calls) == 0
async def test_async_setup_entry_handles_http_error(hass: HomeAssistant):
"""Test that configuring entry handles HTTP error."""
config_entry = MockConfigEntry(
domain=DOMAIN,
data={"username": "test-plum-username", "password": "test-plum-password"},
)
config_entry.add_to_hass(hass)
with patch(
"homeassistant.components.plum_lightpad.utils.Plum.loadCloudData",
side_effect=HTTPError,
), patch(
"homeassistant.components.plum_lightpad.light.async_setup_entry"
) as mock_light_async_setup_entry:
result = await hass.config_entries.async_setup(config_entry.entry_id)
assert result is False
assert len(mock_light_async_setup_entry.mock_calls) == 0


# src/ggrc/models/hooks/__init__.py (mrR2D2/ggrc-core)

# Copyright (C) 2019 Google Inc.
# Licensed under http://www.apache.org/licenses/LICENSE-2.0 <see LICENSE file>
"""Import GGRC model hooks."""
from ggrc.models.hooks import common
from ggrc.models.hooks import assessment
from ggrc.models.hooks import audit
from ggrc.models.hooks import comment
from ggrc.models.hooks import custom_attribute_definition
from ggrc.models.hooks import issue
from ggrc.models.hooks import issue_tracker
from ggrc.models.hooks import relationship
from ggrc.models.hooks import acl
from ggrc.models.hooks import access_control_role
from ggrc.models.hooks import with_action

ALL_HOOKS = [
access_control_role,
assessment,
audit,
comment,
issue,
relationship,
with_action,
custom_attribute_definition,
acl,
common,
# Keep IssueTracker at the end of list to make sure that all other hooks
# are already executed and all data is final.
issue_tracker,
]


def init_hooks():
for hook in ALL_HOOKS:
hook.init_hook()
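
Every module listed in `ALL_HOOKS` is expected to expose an `init_hook()` callable that `init_hooks()` invokes in order. A minimal sketch of that contract (the `SimpleNamespace` stand-ins are hypothetical, not real ggrc hook modules):

```python
from types import SimpleNamespace

# Anything exposing init_hook() satisfies the contract init_hooks() relies on.
calls = []
fake_common = SimpleNamespace(init_hook=lambda: calls.append("common"))
fake_issue_tracker = SimpleNamespace(init_hook=lambda: calls.append("issue_tracker"))

# Order matters: issue_tracker stays last so it runs after all other hooks.
hooks = [fake_common, fake_issue_tracker]
for hook in hooks:
    hook.init_hook()

print(calls)  # → ['common', 'issue_tracker']
```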


# test/jit/test_hash.py (Stonepia/pytorch)

import os
import sys
import torch
from typing import Tuple, List

# Make the helper files in test/ importable
pytorch_test_dir = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))
sys.path.append(pytorch_test_dir)
from torch.testing._internal.jit_utils import JitTestCase

if __name__ == "__main__":
raise RuntimeError("This test file is not meant to be run directly, use:\n\n"
"\tpython test/test_jit.py TESTNAME\n\n"
"instead.")

class TestHash(JitTestCase):
def test_hash_tuple(self):
def fn(t1: Tuple[int, int], t2: Tuple[int, int]) -> bool:
return hash(t1) == hash(t2)
self.checkScript(fn, ((1, 2), (1, 2)))
self.checkScript(fn, ((1, 2), (3, 4)))
self.checkScript(fn, ((1, 2), (2, 1)))

    def test_hash_tuple_nested_unhashable_type(self):
# Tuples may contain unhashable types like `list`, check that we error
# properly in that case.
@torch.jit.script
def fn_unhashable(t1: Tuple[int, List[int]]):
return hash(t1)
with self.assertRaisesRegex(RuntimeError, "unhashable"):
fn_unhashable((1, [1]))

    def test_hash_tensor(self):
"""Tensors should hash by identity"""
def fn(t1, t2):
return hash(t1) == hash(t2)
tensor1 = torch.tensor(1)
tensor1_clone = torch.tensor(1)
tensor2 = torch.tensor(2)
self.checkScript(fn, (tensor1, tensor1))
self.checkScript(fn, (tensor1, tensor1_clone))
self.checkScript(fn, (tensor1, tensor2))

    def test_hash_none(self):
def fn():
n1 = None
n2 = None
return hash(n1) == hash(n2)
self.checkScript(fn, ())

    def test_hash_bool(self):
def fn(b1: bool, b2: bool):
return hash(b1) == hash(b2)
self.checkScript(fn, (True, False))
self.checkScript(fn, (True, True))
self.checkScript(fn, (False, True))
self.checkScript(fn, (False, False))

    def test_hash_float(self):
def fn(f1: float, f2: float):
return hash(f1) == hash(f2)
self.checkScript(fn, (1.2345, 1.2345))
self.checkScript(fn, (1.2345, 6.789))
self.checkScript(fn, (1.2345, float("inf")))
self.checkScript(fn, (float("inf"), float("inf")))
self.checkScript(fn, (1.2345, float('nan')))
self.checkScript(fn, (float("nan"), float("nan")))
self.checkScript(fn, (float("nan"), float("inf")))

    def test_hash_int(self):
def fn(i1: int, i2: int):
return hash(i1) == hash(i2)
self.checkScript(fn, (123, 456))
self.checkScript(fn, (123, 123))
self.checkScript(fn, (123, -123))
self.checkScript(fn, (-123, -123))
self.checkScript(fn, (123, 0))

    def test_hash_string(self):
def fn(s1: str, s2: str):
return hash(s1) == hash(s2)
self.checkScript(fn, ("foo", "foo"))
self.checkScript(fn, ("foo", "bar"))
self.checkScript(fn, ("foo", ""))

    def test_hash_device(self):
def fn(d1: torch.device, d2: torch.device):
return hash(d1) == hash(d2)
gpu0 = torch.device('cuda:0')
gpu1 = torch.device('cuda:1')
cpu = torch.device('cpu')
self.checkScript(fn, (gpu0, gpu0))
self.checkScript(fn, (gpu0, gpu1))
self.checkScript(fn, (gpu0, cpu))
self.checkScript(fn, (cpu, cpu))
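
The behaviors these TorchScript tests pin down mirror plain-Python hashing rules: tuples hash by value, a tuple containing a list is unhashable, and objects without a custom `__hash__` hash by identity (which is what `test_hash_tensor` expects of tensors). A torch-free sketch:

```python
# Tuples hash by value: equal tuples produce equal hashes.
assert hash((1, 2)) == hash((1, 2))

# A tuple containing a list is unhashable, matching the TorchScript error.
try:
    hash((1, [1]))
except TypeError as err:
    print(err)  # → unhashable type: 'list'

# Plain objects hash by identity, the behavior test_hash_tensor expects
# of tensors: the same object hashes equal, distinct instances do not.
class Box:
    def __init__(self, value):
        self.value = value

a, b = Box(1), Box(1)
assert hash(a) == hash(a)
assert hash(a) != hash(b)
```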


# KGEkeras/__init__.py (Erik-BM/KGE-Keras)

# -*- coding: utf-8 -*-
__version__ = '0.1.0'
__doc__ = """
Knowledge graph embedding models implemented as keras.Model sub-classes.
"""

from .models import *
from .utils import *


# acos_client/v21/nat.py (alonbg/acos-client)

# Licensed under the Apache License, Version 2.0 (the "License"); you may
# not use this file except in compliance with the License. You may obtain
# a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations
# under the License.

import base


class Nat(base.BaseV21):
@property
def pool(self):
return self.Pool(self.client)

    class Pool(base.BaseV21):
def _set(self, action, name, start_ip, end_ip, mask, **kwargs):
params = {
'name': name,
'start_ip_addr': start_ip,
'end_ip_addr': end_ip,
'netmask': mask,
}
return self._post(action, params, **kwargs)

        def all(self):
return self._get('nat.pool.getAll')

        def create(self, name, start_ip, end_ip, mask, **kwargs):
return self._set('nat.pool.create', name, start_ip, end_ip, mask,
**kwargs)

        def update(self, name, start_ip, end_ip, mask, **kwargs):
return self._set('nat.pool.create', name, start_ip, end_ip, mask,
**kwargs)

        def delete(self, name, **kwargs):
return self._post('nat.pool.delete', {"name": name}, **kwargs)

        def stats(self, name, **kwargs):
return self._post('nat.pool.fetchStatistics', {"name": name},
**kwargs)

        def all_stats(self, **kwargs):
return self._get('nat.pool.fetchALLStatistics', **kwargs)
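
The `Nat`/`Pool` pairing above is a small service-composition pattern: the outer service exposes the inner one through a property so calls read as `client.nat.pool.create(...)`, and every method reduces to one AXAPI action name plus a params dict. A self-contained sketch of that shape (the recording client is illustrative, not acos_client's real transport):

```python
class BaseService:
    def __init__(self, client):
        self.client = client

    def _post(self, action, params=None):
        # Stand-in for the HTTP layer: record the call instead of sending it.
        self.client.append((action, params or {}))
        return {"status": "ok"}


class Nat(BaseService):
    class Pool(BaseService):
        def create(self, name, start_ip, end_ip, mask):
            return self._post("nat.pool.create", {
                "name": name,
                "start_ip_addr": start_ip,
                "end_ip_addr": end_ip,
                "netmask": mask,
            })

    @property
    def pool(self):
        return self.Pool(self.client)


recorded = []
nat = Nat(recorded)
nat.pool.create("snat1", "10.0.0.10", "10.0.0.20", "255.255.255.0")
print(recorded[0][0])  # → nat.pool.create
```

Because `pool` is a property, each access builds a fresh `Pool` bound to the same client, exactly as `Nat.pool` does above.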


# models/__init__.py (yshanyes/Pytorch-ECG-Classifier-Cinc2020-Official)

# -*- coding: utf-8 -*-
'''
@time: 2019/10/1 10:20
@author: ys
'''

from .mixnet_sm import mixnet_sm_pretrain, mixnet_sm, mixnet_sm_predict
from .mixnet_mm import mixnet_mm_pretrain, mixnet_mm, mixnet_mm_predict
from .resnet import resnet34
from .senet import seresnet34, seresnet50, seresnet101, seresnext26_32x4d, seresnext50_32x4d
from .resnest import resnest50, resnest101
from .iresnest import iresnest50_predict, iresnest101, iresnest50_pretrain
from .semknet import semkresnet34, semkresnet18
from .multi_scale_resnet import MSResNet
# from .utils import SelectAdaptivePool1d


# ThunkLibs/Generators/libXrender.py (phire/FEX)

#!/usr/bin/python3
from ThunkHelpers import *

lib("libXrender")

fn("Cursor XRenderCreateAnimCursor(Display*, int, XAnimCursor*)")
fn("Cursor XRenderCreateCursor(Display*, Picture, unsigned int, unsigned int)")
fn("GlyphSet XRenderCreateGlyphSet(Display*, const XRenderPictFormat*)")
fn("GlyphSet XRenderReferenceGlyphSet(Display*, GlyphSet)")
fn("int XRenderParseColor(Display*, char*, XRenderColor*)")
fn("int XRenderQueryExtension(Display*, int*, int*)")
fn("int XRenderQueryFormats(Display*)")
fn("int XRenderQuerySubpixelOrder(Display*, int)")
fn("int XRenderQueryVersion(Display*, int*, int*)")
fn("int XRenderSetSubpixelOrder(Display*, int, int)")
fn("Picture XRenderCreateConicalGradient(Display*, const XConicalGradient*, const XFixed*, const XRenderColor*, int)")
fn("Picture XRenderCreateLinearGradient(Display*, const XLinearGradient*, const XFixed*, const XRenderColor*, int)")
fn("Picture XRenderCreatePicture(Display*, Drawable, const XRenderPictFormat*, long unsigned int, const XRenderPictureAttributes*)")
fn("Picture XRenderCreateRadialGradient(Display*, const XRadialGradient*, const XFixed*, const XRenderColor*, int)")
fn("Picture XRenderCreateSolidFill(Display*, const XRenderColor*)")
fn("void XRenderAddGlyphs(Display*, GlyphSet, const Glyph*, const XGlyphInfo*, int, const char*, int)")
fn("void XRenderAddTraps(Display*, Picture, int, int, const XTrap*, int)")
fn("void XRenderChangePicture(Display*, Picture, long unsigned int, const XRenderPictureAttributes*)")
fn("void XRenderComposite(Display*, int, Picture, Picture, Picture, int, int, int, int, int, int, unsigned int, unsigned int)")
fn("void XRenderCompositeDoublePoly(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, int, int, const XPointDouble*, int, int)")
fn("void XRenderCompositeString16(Display*, int, Picture, Picture, const XRenderPictFormat*, GlyphSet, int, int, int, int, const short unsigned int*, int)")
fn("void XRenderCompositeString32(Display*, int, Picture, Picture, const XRenderPictFormat*, GlyphSet, int, int, int, int, const unsigned int*, int)")
fn("void XRenderCompositeString8(Display*, int, Picture, Picture, const XRenderPictFormat*, GlyphSet, int, int, int, int, const char*, int)")
fn("void XRenderCompositeText16(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, int, int, const XGlyphElt16*, int)")
fn("void XRenderCompositeText32(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, int, int, const XGlyphElt32*, int)")
fn("void XRenderCompositeText8(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, int, int, const XGlyphElt8*, int)")
fn("void XRenderCompositeTrapezoids(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, const XTrapezoid*, int)")
fn("void XRenderCompositeTriangles(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, const XTriangle*, int)")
fn("void XRenderCompositeTriFan(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, const XPointFixed*, int)")
fn("void XRenderCompositeTriStrip(Display*, int, Picture, Picture, const XRenderPictFormat*, int, int, const XPointFixed*, int)")
fn("void XRenderFillRectangle(Display*, int, Picture, const XRenderColor*, int, int, unsigned int, unsigned int)")
fn("void XRenderFillRectangles(Display*, int, Picture, const XRenderColor*, const XRectangle*, int)")
fn("void XRenderFreeGlyphs(Display*, GlyphSet, const Glyph*, int)")
fn("void XRenderFreeGlyphSet(Display*, GlyphSet)")
fn("void XRenderFreePicture(Display*, Picture)")
fn("void XRenderSetPictureClipRectangles(Display*, Picture, int, int, const XRectangle*, int)")
fn("void XRenderSetPictureClipRegion(Display*, Picture, Region)")
fn("void XRenderSetPictureFilter(Display*, Picture, const char*, XFixed*, int)")
fn("void XRenderSetPictureTransform(Display*, Picture, XTransform*)")
fn("XFilters* XRenderQueryFilters(Display*, Drawable)")
fn("XIndexValue* XRenderQueryPictIndexValues(Display*, const XRenderPictFormat*, int*)")
fn("XRenderPictFormat* XRenderFindFormat(Display*, long unsigned int, const XRenderPictFormat*, int)")
fn("XRenderPictFormat* XRenderFindStandardFormat(Display*, int)")
fn("XRenderPictFormat* XRenderFindVisualFormat(Display*, const Visual*)")
Generate() | 81.653846 | 156 | 0.775789 | 452 | 4,246 | 7.287611 | 0.199115 | 0.071038 | 0.051913 | 0.087432 | 0.427444 | 0.317851 | 0.279599 | 0.243169 | 0.222526 | 0.187917 | 0 | 0.004123 | 0.085963 | 4,246 | 52 | 157 | 81.653846 | 0.844628 | 0.004004 | 0 | 0 | 0 | 0.276596 | 0.91511 | 0.346654 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.021277 | 0 | 0.021277 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 3 |


# project_admin/apps.py (madprime/oh-ancestrydna-source)

from django.apps import AppConfig


class ProjectAdminConfig(AppConfig):
name = 'project_admin'


# emission/core/wrapper/client.py (trevor-wu/e-mission-server)

from __future__ import unicode_literals
from __future__ import print_function
from __future__ import division
from __future__ import absolute_import
#
# In the current iteration, there is a client object that can be loaded from
# the filesystem into the database and its settings loaded from the database.
# There are no special settings (e.g. active/inactive).
#
# I have no idea how this will be used, but it is nice^H^H^H^H, unit tested code,
# so let us keep it around a bit longer
#
# Ah but this assumes that the settings file is in `emission/clients/` and we
# just deleted that entire directory. Changing this to conf for now...
from future import standard_library
standard_library.install_aliases()
from builtins import str
from builtins import *
from builtins import object
import json
import logging
import dateutil.parser
from datetime import datetime
# Our imports
from emission.core.get_database import get_profile_db, get_client_db


class Client(object):
def __init__(self, clientName):
# TODO: write background process to ensure that there is only one client with each name
# Maybe clean up unused clients?
self.clientName = clientName
self.settings_filename = "conf/clients/%s.settings.json" % self.clientName
self.__reload()

    # Smart settings call, which returns the override settings if the client is
# active, and
def getSettings(self):
logging.debug("For client %s, returning settings %s" % (self.clientName, self.clientJSON['client_settings']))
return self.clientJSON['client_settings']

    def __reload(self):
self.clientJSON = None
if self.clientName is not None:
self.clientJSON = get_client_db().find_one({'name': self.clientName})

    # Figure out if the JSON object here should always be passed in
# Having it be passed in is a lot more flexible
# Let's compromise for now by passing it in and seeing how much of a hassle it is
# That will also ensure that the update_client script is not a complete NOP
def __update(self, newEntry):
get_client_db().update({'name': self.clientName}, newEntry, upsert = True)
self.__reload()

    def update(self, createKey=True):
import uuid
newEntry = json.load(open(self.settings_filename))
if createKey:
newEntry['key'] = str(uuid.uuid4())
# logging.info("Updating with new entry %s" % newEntry)
self.__update(newEntry)
return newEntry['key']

    def getClientKey(self):
if self.clientJSON is None:
return None
logging.debug("About to return %s from JSON %s" % (self.clientJSON['key'], self.clientJSON))
return self.clientJSON['key']

    def clientSpecificSetters(self, uuid, sectionId, predictedModeMap):
return None
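
`__update` leans on MongoDB's upsert semantics: match on `name`, replace the document if one matches, insert it otherwise, so repeated updates never create duplicates. A dict-backed sketch of that contract (no database involved; the helper name is illustrative):

```python
def upsert(collection, query, new_entry):
    # Replace the first document matching `query`, or append new_entry.
    for i, doc in enumerate(collection):
        if all(doc.get(k) == v for k, v in query.items()):
            collection[i] = new_entry
            return
    collection.append(new_entry)

clients = []
upsert(clients, {"name": "acme"}, {"name": "acme", "key": "k1"})
upsert(clients, {"name": "acme"}, {"name": "acme", "key": "k2"})
print(len(clients), clients[0]["key"])  # → 1 k2
```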


# lhotse/dataset/__init__.py (freewym/lhotse)

from .diarization import DiarizationDataset
from .source_separation import (
SourceSeparationDataset,
PreMixedSourceSeparationDataset,
DynamicallyMixedSourceSeparationDataset
)
from .speech_recognition import SpeechRecognitionDataset
from .speech_synthesis import SpeechSynthesisDataset
from .unsupervised import (
UnsupervisedDataset,
UnsupervisedWaveformDataset,
DynamicUnsupervisedDataset
)
from .vad import VadDataset


# bitmovin/services/encodings/webm_drm_service.py (migutw42/bitmovin-python)

from .drm_service import DRMService
from .cenc_drm_service import CENCDRM


class WebMDRMService(DRMService):
MUXING_TYPE_URL = 'webm'

    def __init__(self, http_client):
super().__init__(http_client=http_client)
self.CENC = CENCDRM(http_client=http_client, muxing_type_url=self.MUXING_TYPE_URL)


# src/pattern-searching/boj_2775.py (joeyworld/algo)

def solve(floor, room):
    # BOJ 2775: floor 0 room i houses i people; on every higher floor,
    # room n houses the sum of rooms 1..n one floor below.
    people = list(range(1, room + 1))
    for _ in range(floor):
        for i in range(1, room):
            people[i] += people[i - 1]
    return people[room - 1]


if __name__ == '__main__':
num_tc = int(input())
for _ in range(num_tc):
k = int(input())
n = int(input())
print(solve(k, n))


# is_palindrome.py (YessOn/ProjectEuler-Toolkits)

def is_palindrome(n):
if str(n) == str(n)[::-1]:
return True
return False

# Examples:
is_palindrome(101)  # True
is_palindrome(147)  # False


# cohesity_management_sdk/models/object_class_enum.py (chandrashekar-cohesity/management-sdk-python)

# -*- coding: utf-8 -*-
# Copyright 2019 Cohesity Inc.


class ObjectClassEnum(object):
"""Implementation of the 'ObjectClass' enum.
Specifies the object class of the principal (either 'kGroup' or 'kUser').
'kUser' specifies a user object class.
'kGroup' specifies a group object class.
'kComputer' specifies a computer object class.
Attributes:
KUSER: TODO: type description here.
KGROUP: TODO: type description here.
KCOMPUTER: TODO: type description here.
"""

    KUSER = 'kUser'
KGROUP = 'kGroup'
KCOMPUTER = 'kComputer'


# ngcasa/flagging/shadow.py (wxiongccnu1990/cngi_prototype)

# Copyright 2020 AUI, Inc. Washington DC, USA
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

def shadow(vis_dataset, shadow_parms, storage_parms):
"""
.. todo::
This function is not yet implemented
Flag all baselines for antennas that are shadowed beyond the specified tolerance.
All antennas in the zarr-file metadata (and their corresponding diameters)
will be considered for shadow-flag calculations.
For a given timestep, an antenna is flagged if any of its baselines
(projected onto the uv-plane) is shorter than radius_1 + radius_2 - tolerance.
The value of 'w' is used to determine which antenna is behind the other.
The phase-reference center is used for antenna-pointing direction.
Antennas that are not part of the observation, but still physically
present and shadowing other antennas that are being used, must be added
to the meta-data list in the zarr prior to calling this method.
Inputs :
(1) shadowlimit or tolerance (in m)
(2) array name for output flags. Default = FLAG
Returns
-------
vis_dataset : xarray.core.dataset.Dataset
"""


# lib/python2.7/site-packages/networkx/algorithms/isomorphism/tests/test_isomorphism.py (nishaero/wifi-userseg-ryu)

#!/usr/bin/env python
from nose.tools import *
import networkx as nx
from networkx.algorithms import isomorphism as iso


class TestIsomorph:
    def setUp(self):
        self.G1 = nx.Graph()
        self.G2 = nx.Graph()
        self.G3 = nx.Graph()
        self.G4 = nx.Graph()
        self.G1.add_edges_from([[1, 2], [1, 3], [1, 5], [2, 3]])
        self.G2.add_edges_from([[10, 20], [20, 30], [10, 30], [10, 50]])
        self.G3.add_edges_from([[1, 2], [1, 3], [1, 5], [2, 5]])
        self.G4.add_edges_from([[1, 2], [1, 3], [1, 5], [2, 4]])

    def test_could_be_isomorphic(self):
        assert_true(iso.could_be_isomorphic(self.G1, self.G2))
        assert_true(iso.could_be_isomorphic(self.G1, self.G3))
        assert_false(iso.could_be_isomorphic(self.G1, self.G4))
        assert_true(iso.could_be_isomorphic(self.G3, self.G2))

    def test_fast_could_be_isomorphic(self):
        assert_true(iso.fast_could_be_isomorphic(self.G3, self.G2))

    def test_faster_could_be_isomorphic(self):
        assert_true(iso.faster_could_be_isomorphic(self.G3, self.G2))

    def test_is_isomorphic(self):
        assert_true(iso.is_isomorphic(self.G1, self.G2))
        assert_false(iso.is_isomorphic(self.G1, self.G4))
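
`could_be_isomorphic` is a fast necessary check built on graph invariants such as the degree sequence: isomorphic graphs must share it, but sharing it does not prove isomorphism. A networkx-free sketch of the idea, reusing the edge lists from `setUp` (G1/G2 agree, G1/G4 differ):

```python
from collections import Counter

def degree_sequence(edges):
    # Sorted multiset of vertex degrees; invariant under isomorphism.
    deg = Counter()
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return sorted(deg.values())

g1 = [(1, 2), (1, 3), (1, 5), (2, 3)]
g2 = [(10, 20), (20, 30), (10, 30), (10, 50)]
g4 = [(1, 2), (1, 3), (1, 5), (2, 4)]

print(degree_sequence(g1) == degree_sequence(g2))  # → True
print(degree_sequence(g1) == degree_sequence(g4))  # → False
```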


# 0x0B-python-input_output/9-student.py (coding-max/holbertonschool-higher_level_programming)

#!/usr/bin/python3
"""Student to JSON"""


class Student:
    """representation of a student"""

    def __init__(self, first_name, last_name, age):
        """instantiation of the student"""
        self.first_name = first_name
        self.last_name = last_name
        self.age = age

    def to_json(self):
        """retrieves a dictionary representation of a Student instance"""
        return self.__dict__
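A quick usage sketch (not part of the original file) showing what `to_json` returns; the class is restated minimally here so the snippet runs on its own, and the name/age values are arbitrary:

```python
class Student:
    """Minimal restatement of the Student class above."""

    def __init__(self, first_name, last_name, age):
        self.first_name = first_name
        self.last_name = last_name
        self.age = age

    def to_json(self):
        # __dict__ holds exactly the instance attributes set in __init__
        return self.__dict__


s = Student("John", "Doe", 23)
print(s.to_json())  # {'first_name': 'John', 'last_name': 'Doe', 'age': 23}
```

Because `to_json` returns `self.__dict__` directly (not a copy), mutating the returned dict also mutates the instance; the result is JSON-serializable as long as the attribute values are.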
# apps/dashboard/appconfig.py
# Kpaubert/onlineweb4 (MIT)
from django.apps import AppConfig


class DashboardConfig(AppConfig):
    name = "apps.dashboard"
    verbose_name = "dashboard"
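For context (not part of the original file): an `AppConfig` subclass like this takes effect once the app is listed in the project's settings. A hedged, illustrative settings fragment, assuming the file layout shown in the path comment above; the surrounding entries are placeholders:

```python
# settings.py (illustrative fragment, not taken from this repo)
INSTALLED_APPS = [
    # ... other apps ...
    "apps.dashboard",  # Django resolves this to DashboardConfig
]
```

Referencing the dotted app path lets Django pick up the config class; `verbose_name` is what appears in places like the admin index.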