hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
7ee0ad4229cbb8785cc665b7f1d02577367e5533 | 6,358 | py | Python | venv/lib/python3.6/site-packages/xero_python/accounting/models/branding_theme.py | 6enno/FarmXero | 881b1e6648e927631b276e66a4c5287e4de2cbc1 | [
"MIT"
] | null | null | null | venv/lib/python3.6/site-packages/xero_python/accounting/models/branding_theme.py | 6enno/FarmXero | 881b1e6648e927631b276e66a4c5287e4de2cbc1 | [
"MIT"
] | null | null | null | venv/lib/python3.6/site-packages/xero_python/accounting/models/branding_theme.py | 6enno/FarmXero | 881b1e6648e927631b276e66a4c5287e4de2cbc1 | [
"MIT"
] | null | null | null | # coding: utf-8
"""
Accounting API
No description provided (generated by Openapi Generator https://github.com/openapitools/openapi-generator) # noqa: E501
Contact: api@xero.com
Generated by: https://openapi-generator.tech
"""
import re # noqa: F401
from xero_python.models import BaseModel
class BrandingTheme(BaseModel):
"""NOTE: This class is auto generated by OpenAPI Generator.
Ref: https://openapi-generator.tech
Do not edit the class manually.
"""
"""
Attributes:
openapi_types (dict): The key is attribute name
and the value is attribute type.
attribute_map (dict): The key is attribute name
and the value is json key in definition.
"""
openapi_types = {
"branding_theme_id": "str",
"name": "str",
"logo_url": "str",
"type": "str",
"sort_order": "int",
"created_date_utc": "datetime[ms-format]",
}
attribute_map = {
"branding_theme_id": "BrandingThemeID",
"name": "Name",
"logo_url": "LogoUrl",
"type": "Type",
"sort_order": "SortOrder",
"created_date_utc": "CreatedDateUTC",
}
def __init__(
self,
branding_theme_id=None,
name=None,
logo_url=None,
type=None,
sort_order=None,
created_date_utc=None,
): # noqa: E501
"""BrandingTheme - a model defined in OpenAPI""" # noqa: E501
self._branding_theme_id = None
self._name = None
self._logo_url = None
self._type = None
self._sort_order = None
self._created_date_utc = None
self.discriminator = None
if branding_theme_id is not None:
self.branding_theme_id = branding_theme_id
if name is not None:
self.name = name
if logo_url is not None:
self.logo_url = logo_url
if type is not None:
self.type = type
if sort_order is not None:
self.sort_order = sort_order
if created_date_utc is not None:
self.created_date_utc = created_date_utc
@property
def branding_theme_id(self):
"""Gets the branding_theme_id of this BrandingTheme. # noqa: E501
Xero identifier # noqa: E501
:return: The branding_theme_id of this BrandingTheme. # noqa: E501
:rtype: str
"""
return self._branding_theme_id
@branding_theme_id.setter
def branding_theme_id(self, branding_theme_id):
"""Sets the branding_theme_id of this BrandingTheme.
Xero identifier # noqa: E501
:param branding_theme_id: The branding_theme_id of this BrandingTheme. # noqa: E501
:type: str
"""
self._branding_theme_id = branding_theme_id
@property
def name(self):
"""Gets the name of this BrandingTheme. # noqa: E501
Name of branding theme # noqa: E501
:return: The name of this BrandingTheme. # noqa: E501
:rtype: str
"""
return self._name
@name.setter
def name(self, name):
"""Sets the name of this BrandingTheme.
Name of branding theme # noqa: E501
:param name: The name of this BrandingTheme. # noqa: E501
:type: str
"""
self._name = name
@property
def logo_url(self):
"""Gets the logo_url of this BrandingTheme. # noqa: E501
The location of the image file used as the logo on this branding theme # noqa: E501
:return: The logo_url of this BrandingTheme. # noqa: E501
:rtype: str
"""
return self._logo_url
@logo_url.setter
def logo_url(self, logo_url):
"""Sets the logo_url of this BrandingTheme.
The location of the image file used as the logo on this branding theme # noqa: E501
:param logo_url: The logo_url of this BrandingTheme. # noqa: E501
:type: str
"""
self._logo_url = logo_url
@property
def type(self):
"""Gets the type of this BrandingTheme. # noqa: E501
Always INVOICE # noqa: E501
:return: The type of this BrandingTheme. # noqa: E501
:rtype: str
"""
return self._type
@type.setter
def type(self, type):
"""Sets the type of this BrandingTheme.
Always INVOICE # noqa: E501
:param type: The type of this BrandingTheme. # noqa: E501
:type: str
"""
allowed_values = ["INVOICE", "None"] # noqa: E501
if type:
if type not in allowed_values:
raise ValueError(
"Invalid value for `type` ({0}), must be one of {1}".format( # noqa: E501
type, allowed_values
)
)
self._type = type
@property
def sort_order(self):
"""Gets the sort_order of this BrandingTheme. # noqa: E501
Integer – ranked order of branding theme. The default branding theme has a value of 0 # noqa: E501
:return: The sort_order of this BrandingTheme. # noqa: E501
:rtype: int
"""
return self._sort_order
@sort_order.setter
def sort_order(self, sort_order):
"""Sets the sort_order of this BrandingTheme.
Integer – ranked order of branding theme. The default branding theme has a value of 0 # noqa: E501
:param sort_order: The sort_order of this BrandingTheme. # noqa: E501
:type: int
"""
self._sort_order = sort_order
@property
def created_date_utc(self):
"""Gets the created_date_utc of this BrandingTheme. # noqa: E501
UTC timestamp of creation date of branding theme # noqa: E501
:return: The created_date_utc of this BrandingTheme. # noqa: E501
:rtype: datetime
"""
return self._created_date_utc
@created_date_utc.setter
def created_date_utc(self, created_date_utc):
"""Sets the created_date_utc of this BrandingTheme.
UTC timestamp of creation date of branding theme # noqa: E501
:param created_date_utc: The created_date_utc of this BrandingTheme. # noqa: E501
:type: datetime
"""
self._created_date_utc = created_date_utc
| 27.885965 | 124 | 0.596886 | 786 | 6,358 | 4.647583 | 0.147583 | 0.076649 | 0.124829 | 0.113332 | 0.552696 | 0.463181 | 0.428963 | 0.311799 | 0.24035 | 0.125376 | 0 | 0.026176 | 0.321013 | 6,358 | 227 | 125 | 28.008811 | 0.819551 | 0.423089 | 0 | 0.065934 | 1 | 0 | 0.091883 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.021978 | 0 | 0.263736 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
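The generated BrandingTheme model above stores each attribute on a private field and routes writes through a validating setter (see its `type` setter with `allowed_values`). A minimal standalone sketch of that same property/setter pattern, with illustrative names that are not part of the xero_python API:

```python
# Standalone sketch of the validated-property pattern the generated model
# uses for `type`: the value lives on a private attribute and the setter
# rejects anything outside an allowed list. Class and attribute names here
# are illustrative only, not part of xero_python.

class ThemeType:
    ALLOWED = ["INVOICE", "None"]

    def __init__(self, type=None):
        self._type = None
        if type is not None:
            self.type = type  # assignment goes through the property setter

    @property
    def type(self):
        return self._type

    @type.setter
    def type(self, type):
        if type and type not in self.ALLOWED:
            raise ValueError(
                "Invalid value for `type` ({0}), must be one of {1}".format(
                    type, self.ALLOWED
                )
            )
        self._type = type

t = ThemeType(type="INVOICE")
print(t.type)  # prints "INVOICE"
```

Passing a value outside `ALLOWED` (e.g. `ThemeType(type="BAD")`) raises `ValueError`, mirroring how the generated setter guards the field.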
7ee1d07b4bc413cb7389c911457d3de03b101227 | 2,460 | py | Python | check_mana.py | CheapskateProjects/MtgManaRecognition | 9119a843f5c235ca09c695a46611bb46fea37573 | [
"MIT"
] | 7 | 2020-01-24T13:15:51.000Z | 2021-11-18T00:59:14.000Z | check_mana.py | CheapskateProjects/MtgManaRecognition | 9119a843f5c235ca09c695a46611bb46fea37573 | [
"MIT"
] | null | null | null | check_mana.py | CheapskateProjects/MtgManaRecognition | 9119a843f5c235ca09c695a46611bb46fea37573 | [
"MIT"
] | 3 | 2017-12-11T08:42:20.000Z | 2021-05-23T22:16:37.000Z | """
This code reads the file given as a parameter and lists which mana symbols it contains.
created Apr 2017
by CheapskateProjects
---------------------------
The MIT License (MIT)
Copyright (c) 2017 CheapskateProjects
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
"""
import cv2
import numpy as np
from os import listdir
import sys
if len(sys.argv) <= 1:
print("Usage: <file to check>")
sys.exit()
# Config
matchingThreshold = 0.7
# Read templates at the beginning. Only once
green = cv2.imread('mana_icons/green_mana.jpg',0)
blue = cv2.imread('mana_icons/blue_mana.jpg',0)
red = cv2.imread('mana_icons/red_mana.jpg',0)
white = cv2.imread('mana_icons/white_mana.jpg',0)
black = cv2.imread('mana_icons/black_mana.jpg',0)
# All the mana logos are about the same size
w, h = green.shape[::-1]
def colorcheck(color_template, draw_color, img_gray):
# Slide the template over the grayscale card image and score every position
results = cv2.matchTemplate(img_gray, color_template, cv2.TM_CCOEFF_NORMED)
locations = np.where(results >= matchingThreshold)
# locations[0] holds the row index of every match above the threshold;
# zip() is lazy in Python 3, so test the index array size instead of len(zip(...))
if locations[0].size > 0:
return "Yes"
else:
return "No"
filename=sys.argv[1]
img_to_check = cv2.imread(filename)
img_gray = cv2.cvtColor(img_to_check, cv2.COLOR_BGR2GRAY)
print("Green: " + colorcheck(green, (0,255,0), img_gray))
print("Red: " + colorcheck(red, (0,0,255), img_gray))
print("Black: " + colorcheck(black, (0,0,0), img_gray))
print("Blue: " + colorcheck(blue, (255,0,0), img_gray))
print("White: " + colorcheck(white, (255,255,255), img_gray))
| 46.415094 | 460 | 0.742683 | 385 | 2,460 | 4.672727 | 0.449351 | 0.048916 | 0.036131 | 0.050028 | 0.015564 | 0 | 0 | 0 | 0 | 0 | 0 | 0.028338 | 0.153659 | 2,460 | 52 | 461 | 47.307692 | 0.835735 | 0.036992 | 0 | 0 | 0 | 0 | 0.161752 | 0.109026 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.137931 | null | null | 0.206897 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
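The script above detects each mana symbol by sliding a template over the card image with `cv2.matchTemplate` and keeping positions whose score clears a threshold. A dependency-free sketch of that "score every position, keep those above a threshold" idea, using a simple mean-absolute-difference similarity rather than OpenCV's TM_CCOEFF_NORMED correlation (the scoring function here is an illustrative stand-in, not cv2's):

```python
# Pure-Python sketch of thresholded template matching. The similarity is
# 1 - (mean absolute difference / 255), which only illustrates the pattern;
# cv2.matchTemplate with TM_CCOEFF_NORMED computes a correlation coefficient.

def match_positions(image, template, threshold=0.7):
    """Return (row, col) positions where template similarity >= threshold.

    image and template are 2D lists of grayscale values in [0, 255].
    """
    ih, iw = len(image), len(image[0])
    th, tw = len(template), len(template[0])
    hits = []
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            # Sum of absolute differences between this window and the template
            diff = sum(
                abs(image[r + i][c + j] - template[i][j])
                for i in range(th)
                for j in range(tw)
            )
            score = 1.0 - diff / (255.0 * th * tw)
            if score >= threshold:
                hits.append((r, c))
    return hits

image = [
    [0, 0, 0, 0],
    [0, 255, 255, 0],
    [0, 255, 255, 0],
    [0, 0, 0, 0],
]
template = [
    [255, 255],
    [255, 255],
]
print(match_positions(image, template, threshold=0.99))  # prints [(1, 1)]
```

Raising or lowering `threshold` trades false negatives for false positives, which is the role `matchingThreshold = 0.7` plays in the script above.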
7eea074d109ec1681ca547e782a6c5293f0db45e | 43,464 | py | Python | backend/tests/baserow/contrib/database/field/test_formula_field_type.py | ashishdhngr/baserow | b098678d2165eb7c42930ee24dc6753a3cb520c3 | [
"MIT"
] | null | null | null | backend/tests/baserow/contrib/database/field/test_formula_field_type.py | ashishdhngr/baserow | b098678d2165eb7c42930ee24dc6753a3cb520c3 | [
"MIT"
] | null | null | null | backend/tests/baserow/contrib/database/field/test_formula_field_type.py | ashishdhngr/baserow | b098678d2165eb7c42930ee24dc6753a3cb520c3 | [
"MIT"
] | null | null | null | import inspect
import pytest
from django.db.models import TextField
from django.urls import reverse
from rest_framework.status import HTTP_200_OK, HTTP_204_NO_CONTENT
from baserow.contrib.database.table.cache import (
generated_models_cache,
)
from baserow.contrib.database.fields.dependencies.handler import FieldDependencyHandler
from baserow.contrib.database.fields.dependencies.update_collector import (
CachingFieldUpdateCollector,
)
from baserow.contrib.database.fields.field_cache import FieldCache
from baserow.contrib.database.fields.field_types import FormulaFieldType
from baserow.contrib.database.fields.fields import BaserowExpressionField
from baserow.contrib.database.fields.handler import FieldHandler
from baserow.contrib.database.fields.models import FormulaField, LookupField
from baserow.contrib.database.fields.registries import field_type_registry
from baserow.contrib.database.formula import (
BaserowFormulaInvalidType,
FormulaHandler,
BaserowFormulaTextType,
BaserowFormulaNumberType,
)
from baserow.contrib.database.formula.ast.tree import BaserowFunctionDefinition
from baserow.contrib.database.formula.registries import formula_function_registry
from baserow.contrib.database.rows.handler import RowHandler
from baserow.contrib.database.views.exceptions import (
ViewFilterTypeNotAllowedForField,
ViewSortFieldNotSupported,
)
from baserow.contrib.database.views.handler import ViewHandler
from baserow.contrib.database.views.models import SORT_ORDER_ASC, SORT_ORDER_DESC
from baserow.contrib.database.views.registries import view_filter_type_registry
@pytest.mark.django_db
def test_creating_a_model_with_formula_field_immediately_populates_it(data_fixture):
table = data_fixture.create_database_table()
formula_field = data_fixture.create_formula_field(
table=table, formula="'test'", formula_type="text"
)
formula_field_name = f"field_{formula_field.id}"
model = table.get_model()
row = model.objects.create()
assert getattr(row, formula_field_name) == "test"
@pytest.mark.django_db
def test_adding_a_formula_field_to_an_existing_table_populates_it_for_all_rows(
data_fixture,
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
before_model = table.get_model()
existing_row = before_model.objects.create()
formula_field = FieldHandler().create_field(
user, table, "formula", name="formula", formula="'test'"
)
formula_field_name = f"field_{formula_field.id}"
model = table.get_model()
row = model.objects.create()
assert getattr(row, formula_field_name) == "test"
assert getattr(model.objects.get(id=existing_row.id), formula_field_name) == "test"
@pytest.mark.django_db
def test_cant_change_the_value_of_a_formula_field_directly(data_fixture):
table = data_fixture.create_database_table()
data_fixture.create_formula_field(
name="formula", table=table, formula="'test'", formula_type="text"
)
data_fixture.create_text_field(name="text", table=table)
model = table.get_model(attribute_names=True)
row = model.objects.create(formula="not test")
assert row.formula == "test"
row.text = "update other field"
row.save()
row.formula = "not test"
row.save()
row.refresh_from_db()
assert row.formula == "test"
@pytest.mark.django_db
def test_get_set_export_serialized_value_formula_field(data_fixture):
table = data_fixture.create_database_table()
formula_field = data_fixture.create_formula_field(
table=table, formula="'test'", formula_type="text"
)
formula_field_name = f"field_{formula_field.id}"
formula_field_type = field_type_registry.get_by_model(formula_field)
model = table.get_model()
row_1 = model.objects.create()
row_2 = model.objects.create()
old_row_1_value = getattr(row_1, formula_field_name)
old_row_2_value = getattr(row_2, formula_field_name)
assert old_row_1_value == "test"
assert old_row_2_value == "test"
formula_field_type.set_import_serialized_value(
row_1,
formula_field_name,
formula_field_type.get_export_serialized_value(
row_1, formula_field_name, {}, None, None
),
{},
None,
None,
)
formula_field_type.set_import_serialized_value(
row_2,
formula_field_name,
formula_field_type.get_export_serialized_value(
row_2, formula_field_name, {}, None, None
),
{},
None,
None,
)
row_1.save()
row_2.save()
row_1.refresh_from_db()
row_2.refresh_from_db()
assert old_row_1_value == getattr(row_1, formula_field_name)
assert old_row_2_value == getattr(row_2, formula_field_name)
@pytest.mark.django_db
def test_changing_type_of_other_field_still_results_in_working_filter(data_fixture):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
grid_view = data_fixture.create_grid_view(user, table=table)
first_formula_field = data_fixture.create_formula_field(
table=table, formula="'test'", formula_type="text", name="source"
)
formula_field_referencing_first_field = data_fixture.create_formula_field(
table=table, formula="field('source')", formula_type="text"
)
data_fixture.create_view_filter(
user=user,
view=grid_view,
field=formula_field_referencing_first_field,
type="equal",
value="t",
)
# Change the first formula field to be a boolean field, meaning that the view
# filter on the referencing formula field is now invalid and should be deleted
FieldHandler().update_field(user, first_formula_field, formula="1")
queryset = ViewHandler().get_queryset(grid_view)
assert not queryset.exists()
assert queryset.count() == 0
@pytest.mark.django_db
def test_can_use_complex_date_filters_on_formula_field(data_fixture):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
grid_view = data_fixture.create_grid_view(user, table=table)
data_fixture.create_date_field(user=user, table=table, name="date_field")
formula_field = data_fixture.create_formula_field(
table=table, formula="field('date_field')", formula_type="date", name="formula"
)
data_fixture.create_view_filter(
user=user,
view=grid_view,
field=formula_field,
type="date_equals_today",
value="Europe/London",
)
queryset = ViewHandler().get_queryset(grid_view)
assert not queryset.exists()
assert queryset.count() == 0
@pytest.mark.django_db
def test_can_use_complex_contains_filters_on_formula_field(data_fixture):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
grid_view = data_fixture.create_grid_view(user, table=table)
data_fixture.create_date_field(
user=user, table=table, name="date_field", date_format="US"
)
formula_field = data_fixture.create_formula_field(
table=table,
formula="field('date_field')",
formula_type="date",
name="formula",
date_format="US",
date_time_format="24",
)
data_fixture.create_view_filter(
user=user,
view=grid_view,
field=formula_field,
type="contains",
value="23",
)
queryset = ViewHandler().get_queryset(grid_view)
assert not queryset.exists()
assert queryset.count() == 0
@pytest.mark.django_db
def test_can_change_formula_type_breaking_other_fields(data_fixture):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
first_formula_field = handler.create_field(
user=user, table=table, name="1", type_name="formula", formula="1+1"
)
second_formula_field = handler.create_field(
user=user, table=table, type_name="formula", name="2", formula="field('1')+1"
)
assert list(
second_formula_field.field_dependencies.values_list("id", flat=True)
) == [first_formula_field.id]
assert list(first_formula_field.dependant_fields.values_list("id", flat=True)) == [
second_formula_field.id
]
assert (
second_formula_field.dependencies.first().dependency.specific
== first_formula_field
)
handler.update_field(
user=user, field=first_formula_field, new_type_name="formula", formula="'a'"
)
second_formula_field.refresh_from_db()
assert second_formula_field.formula_type == BaserowFormulaInvalidType.type
assert "argument number 2" in second_formula_field.error
@pytest.mark.django_db
def test_can_still_insert_rows_with_an_invalid_but_previously_date_formula_field(
data_fixture,
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
date_field = handler.create_field(
user=user, table=table, name="1", type_name="date"
)
formula_field = handler.create_field(
user=user, table=table, type_name="formula", name="2", formula="field('1')"
)
handler.update_field(user=user, field=date_field, new_type_name="single_select")
row = RowHandler().create_row(user=user, table=table)
assert getattr(row, f"field_{formula_field.id}") is None
@pytest.mark.django_db
def test_formula_with_row_id_is_populated_after_creating_row(
data_fixture,
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
formula_field = handler.create_field(
user=user, table=table, type_name="formula", name="2", formula="row_id()"
)
row = RowHandler().create_row(user=user, table=table)
assert getattr(row, f"field_{formula_field.id}") == row.id
@pytest.mark.django_db
def test_can_rename_field_preserving_whitespace(
data_fixture,
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
test_field = handler.create_field(
user=user, table=table, type_name="text", name="a"
)
formula_field = handler.create_field(
user=user, table=table, type_name="formula", name="2", formula=" field('a') \n"
)
assert formula_field.formula == " field('a') \n"
handler.update_field(user=user, field=test_field, name="b")
formula_field.refresh_from_db()
assert formula_field.formula == " field('b') \n"
@pytest.mark.django_db
def test_recalculate_formulas_according_to_version(
data_fixture,
):
formula_with_default_internal_field = data_fixture.create_formula_field(
formula="1",
internal_formula="",
requires_refresh_after_insert=False,
name="a",
version=1,
recalculate=False,
create_field=False,
)
formula_that_needs_refresh = data_fixture.create_formula_field(
formula="row_id()",
internal_formula="",
formula_type="number",
requires_refresh_after_insert=False,
name="b",
version=1,
recalculate=False,
create_field=False,
)
broken_reference_formula = data_fixture.create_formula_field(
formula="field('unknown')",
internal_formula="",
requires_refresh_after_insert=False,
name="c",
version=1,
recalculate=False,
create_field=False,
)
dependant_formula = data_fixture.create_formula_field(
table=formula_that_needs_refresh.table,
formula="field('b')",
internal_formula="",
requires_refresh_after_insert=False,
name="d",
version=1,
recalculate=False,
create_field=False,
)
formula_already_at_correct_version = data_fixture.create_formula_field(
formula="'a'",
internal_formula="",
requires_refresh_after_insert=False,
name="e",
version=FormulaHandler.BASEROW_FORMULA_VERSION,
recalculate=False,
create_field=False,
)
upto_date_formula_depending_on_old_version = data_fixture.create_formula_field(
table=dependant_formula.table,
formula=f"field('{dependant_formula.name}')",
internal_formula="",
requires_refresh_after_insert=False,
name="f",
version=FormulaHandler.BASEROW_FORMULA_VERSION,
recalculate=False,
create_field=False,
)
assert (
formula_already_at_correct_version.version
== FormulaHandler.BASEROW_FORMULA_VERSION
)
assert dependant_formula.version == 1
field_cache = FieldCache()
for formula_field in FormulaField.objects.all():
FieldDependencyHandler().rebuild_dependencies(formula_field, field_cache)
FormulaHandler().recalculate_formulas_according_to_version()
formula_with_default_internal_field.refresh_from_db()
assert formula_with_default_internal_field.internal_formula == "error_to_nan(1)"
assert not formula_with_default_internal_field.requires_refresh_after_insert
formula_that_needs_refresh.refresh_from_db()
assert formula_that_needs_refresh.internal_formula == "error_to_nan(row_id())"
assert formula_that_needs_refresh.requires_refresh_after_insert
broken_reference_formula.refresh_from_db()
assert broken_reference_formula.internal_formula == "field('unknown')"
assert broken_reference_formula.formula_type == "invalid"
assert not broken_reference_formula.requires_refresh_after_insert
dependant_formula.refresh_from_db()
assert dependant_formula.internal_formula == "error_to_nan(row_id())"
assert dependant_formula.requires_refresh_after_insert
# The update is not done for this formula and hence the values are left alone
formula_already_at_correct_version.refresh_from_db()
assert formula_already_at_correct_version.internal_formula == ""
assert not formula_already_at_correct_version.requires_refresh_after_insert
upto_date_formula_depending_on_old_version.refresh_from_db()
assert (
upto_date_formula_depending_on_old_version.field_dependencies.get().specific
== dependant_formula
)
assert (
upto_date_formula_depending_on_old_version.internal_formula
== "error_to_nan(row_id())"
)
assert upto_date_formula_depending_on_old_version.requires_refresh_after_insert
@pytest.mark.django_db
def test_can_update_lookup_field_value(
data_fixture, api_client, django_assert_num_queries
):
user, token = data_fixture.create_user_and_token()
table = data_fixture.create_database_table(user=user)
table2 = data_fixture.create_database_table(user=user, database=table.database)
table_primary_field = data_fixture.create_text_field(
name="p", table=table, primary=True
)
data_fixture.create_text_field(name="primaryfield", table=table2, primary=True)
looked_up_field = data_fixture.create_date_field(
name="lookupfield",
table=table2,
date_include_time=False,
date_format="US",
)
linkrowfield = FieldHandler().create_field(
user,
table,
"link_row",
name="linkrowfield",
link_row_table=table2,
)
table2_model = table2.get_model(attribute_names=True)
a = table2_model.objects.create(lookupfield="2021-02-01", primaryfield="primary a")
b = table2_model.objects.create(lookupfield="2022-02-03", primaryfield="primary b")
table_model = table.get_model(attribute_names=True)
table_row = table_model.objects.create()
table_row.linkrowfield.add(a.id)
table_row.linkrowfield.add(b.id)
table_row.save()
formulafield = FieldHandler().create_field(
user,
table,
"formula",
name="formulafield",
formula=f"IF(datetime_format(lookup('{linkrowfield.name}',"
f"'{looked_up_field.name}'), "
f"'YYYY')='2021', 'yes', 'no')",
)
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 1,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": None,
f"field_{linkrowfield.id}": [
{"id": a.id, "value": "primary a"},
{"id": b.id, "value": "primary b"},
],
f"field_{formulafield.id}": [
{"value": "yes", "id": a.id},
{"value": "no", "id": b.id},
],
"id": table_row.id,
"order": "1.00000000000000000000",
}
],
}
response = api_client.patch(
reverse(
"api:database:rows:item",
kwargs={"table_id": table2.id, "row_id": a.id},
),
{f"field_{looked_up_field.id}": "2000-02-01"},
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.status_code == HTTP_200_OK
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 1,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": None,
f"field_{linkrowfield.id}": [
{"id": a.id, "value": "primary a"},
{"id": b.id, "value": "primary b"},
],
f"field_{formulafield.id}": [
{"value": "no", "id": a.id},
{"value": "no", "id": b.id},
],
"id": table_row.id,
"order": "1.00000000000000000000",
}
],
}
@pytest.mark.django_db
def test_nested_lookup_with_formula(
data_fixture, api_client, django_assert_num_queries
):
user, token = data_fixture.create_user_and_token()
table = data_fixture.create_database_table(user=user)
table2 = data_fixture.create_database_table(user=user, database=table.database)
table3 = data_fixture.create_database_table(user=user, database=table.database)
table_primary_field = data_fixture.create_text_field(
name="p", table=table, primary=True
)
data_fixture.create_text_field(name="p", table=table3, primary=True)
data_fixture.create_text_field(name="p", table=table2, primary=True)
data_fixture.create_text_field(name="lookupfield", table=table2)
linkrowfield = FieldHandler().create_field(
user,
table,
"link_row",
name="table_linkrowfield",
link_row_table=table2,
)
linkrowfield2 = FieldHandler().create_field(
user,
table2,
"link_row",
name="table2_linkrowfield",
link_row_table=table3,
)
table3_model = table3.get_model(attribute_names=True)
table3_a = table3_model.objects.create(p="table3 a")
table3_model.objects.create(p="table3 b")
table3_c = table3_model.objects.create(p="table3 c")
table3_d = table3_model.objects.create(p="table3 d")
table2_model = table2.get_model(attribute_names=True)
    table2_1 = table2_model.objects.create(lookupfield="lookup 1", p="primary 1")
table2_1.table2linkrowfield.add(table3_a.id)
table2_1.save()
    table2_2 = table2_model.objects.create(lookupfield="lookup 2", p="primary 2")
    table2_3 = table2_model.objects.create(lookupfield="lookup 3", p="primary 3")
table2_3.table2linkrowfield.add(table3_c.id)
table2_3.table2linkrowfield.add(table3_d.id)
table2_3.save()
table_model = table.get_model(attribute_names=True)
table1_x = table_model.objects.create(p="table1 x")
table1_x.tablelinkrowfield.add(table2_1.id)
table1_x.tablelinkrowfield.add(table2_2.id)
table1_x.save()
table1_y = table_model.objects.create(p="table1 y")
table1_y.tablelinkrowfield.add(table2_3.id)
table1_y.save()
# with django_assert_num_queries(1):
lookup_field = FieldHandler().create_field(
user,
table,
type_name="formula",
name="formula",
formula=f"lookup('{linkrowfield.name}','{linkrowfield2.name}')",
)
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 2,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": table1_x.p,
f"field_{linkrowfield.id}": [
{"id": table2_1.id, "value": table2_1.p},
{"id": table2_2.id, "value": table2_2.p},
],
f"field_{lookup_field.id}": [
{
"value": table3_a.p,
"ids": {
f"database_table_{table2.id}": table2_1.id,
f"database_table_{table3.id}": table3_a.id,
},
},
],
"id": table1_x.id,
"order": "1.00000000000000000000",
},
{
f"field_{table_primary_field.id}": table1_y.p,
f"field_{linkrowfield.id}": [{"id": table2_3.id, "value": table2_3.p}],
f"field_{lookup_field.id}": [
{
"value": table3_c.p,
"ids": {
f"database_table_{table2.id}": table2_3.id,
f"database_table_{table3.id}": table3_c.id,
},
},
{
"value": table3_d.p,
"ids": {
f"database_table_{table2.id}": table2_3.id,
f"database_table_{table3.id}": table3_d.id,
},
},
],
"id": table1_y.id,
"order": "1.00000000000000000000",
},
],
}
@pytest.mark.django_db
def test_can_delete_lookup_field_value(
data_fixture, api_client, django_assert_num_queries
):
user, token = data_fixture.create_user_and_token()
table = data_fixture.create_database_table(user=user)
table2 = data_fixture.create_database_table(user=user, database=table.database)
table_primary_field = data_fixture.create_text_field(
name="p", table=table, primary=True
)
data_fixture.create_text_field(name="primaryfield", table=table2, primary=True)
looked_up_field = data_fixture.create_date_field(
name="lookupfield",
table=table2,
date_include_time=False,
date_format="US",
)
linkrowfield = FieldHandler().create_field(
user,
table,
"link_row",
name="linkrowfield",
link_row_table=table2,
)
table2_model = table2.get_model(attribute_names=True)
    a = table2_model.objects.create(lookupfield="2021-02-01", primaryfield="primary a")
    b = table2_model.objects.create(lookupfield="2022-02-03", primaryfield="primary b")
table_model = table.get_model(attribute_names=True)
table_row = table_model.objects.create(p="table row 1")
table_row.linkrowfield.add(a.id)
table_row.linkrowfield.add(b.id)
table_row.save()
formulafield = FieldHandler().create_field(
user,
table,
"formula",
name="formulafield",
formula=f"IF(datetime_format(lookup('{linkrowfield.name}',"
f"'{looked_up_field.name}'), "
f"'YYYY')='2021', 'yes', 'no')",
)
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 1,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": "table row 1",
f"field_{linkrowfield.id}": [
{"id": a.id, "value": "primary a"},
{"id": b.id, "value": "primary b"},
],
f"field_{formulafield.id}": [
{"value": "yes", "id": a.id},
{"value": "no", "id": b.id},
],
"id": table_row.id,
"order": "1.00000000000000000000",
}
],
}
response = api_client.delete(
reverse(
"api:database:rows:item",
kwargs={"table_id": table2.id, "row_id": a.id},
),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.status_code == HTTP_204_NO_CONTENT
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 1,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": "table row 1",
f"field_{linkrowfield.id}": [
{"id": b.id, "value": "primary b"},
],
f"field_{formulafield.id}": [
{"value": "no", "id": b.id},
],
"id": table_row.id,
"order": "1.00000000000000000000",
}
],
}
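The formula in the test above is built from three adjacent f-strings; with the field names this test creates (`linkrowfield` and `lookupfield`), the assembled string can be checked standalone:

```python
# Field names as created earlier in the test (name="linkrowfield", name="lookupfield")
linkrowfield_name = "linkrowfield"
looked_up_field_name = "lookupfield"
formula = (
    f"IF(datetime_format(lookup('{linkrowfield_name}',"
    f"'{looked_up_field_name}'), "
    f"'YYYY')='2021', 'yes', 'no')"
)
assert formula == (
    "IF(datetime_format(lookup('linkrowfield','lookupfield'), "
    "'YYYY')='2021', 'yes', 'no')"
)
```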
@pytest.mark.django_db
def test_can_delete_double_link_lookup_field_value(
data_fixture, api_client, django_assert_num_queries
):
user, token = data_fixture.create_user_and_token()
table = data_fixture.create_database_table(user=user)
table2 = data_fixture.create_database_table(user=user, database=table.database)
table3 = data_fixture.create_database_table(user=user, database=table.database)
table_primary_field = data_fixture.create_text_field(
name="p", table=table, primary=True
)
data_fixture.create_text_field(name="primaryfield", table=table2, primary=True)
data_fixture.create_text_field(name="primaryfield", table=table3, primary=True)
table2_linkrowfield = FieldHandler().create_field(
user,
table2,
"link_row",
name="linkrowfield",
link_row_table=table3,
)
table3_model = table3.get_model(attribute_names=True)
table3_1 = table3_model.objects.create(primaryfield="table 3 row 1")
table3_2 = table3_model.objects.create(primaryfield="table 3 row 2")
linkrowfield = FieldHandler().create_field(
user,
table,
"link_row",
name="linkrowfield",
link_row_table=table2,
)
table2_model = table2.get_model(attribute_names=True)
table2_a = table2_model.objects.create(primaryfield="primary a")
table2_a.linkrowfield.add(table3_1.id)
table2_a.save()
table2_b = table2_model.objects.create(primaryfield="primary b")
table2_b.linkrowfield.add(table3_2.id)
table2_b.save()
table_model = table.get_model(attribute_names=True)
table_row = table_model.objects.create(p="table row 1")
table_row.linkrowfield.add(table2_a.id)
table_row.linkrowfield.add(table2_b.id)
table_row.save()
formulafield = FieldHandler().create_field(
user,
table,
"formula",
name="formulafield",
formula=f"lookup('{linkrowfield.name}','{table2_linkrowfield.name}')",
)
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 1,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": "table row 1",
f"field_{linkrowfield.id}": [
{"id": table2_a.id, "value": "primary a"},
{"id": table2_b.id, "value": "primary b"},
],
f"field_{formulafield.id}": [
{
"value": table3_1.primaryfield,
"ids": {
f"database_table_{table2.id}": table2_a.id,
f"database_table_{table3.id}": table3_1.id,
},
},
{
"value": table3_2.primaryfield,
"ids": {
f"database_table_{table2.id}": table2_b.id,
f"database_table_{table3.id}": table3_2.id,
},
},
],
"id": table_row.id,
"order": "1.00000000000000000000",
}
],
}
response = api_client.delete(
reverse(
"api:database:rows:item",
kwargs={"table_id": table2.id, "row_id": table2_a.id},
),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.status_code == HTTP_204_NO_CONTENT
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 1,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": "table row 1",
f"field_{linkrowfield.id}": [
{"id": table2_b.id, "value": "primary b"},
],
f"field_{formulafield.id}": [
{
"value": table3_2.primaryfield,
"ids": {
f"database_table_{table2.id}": table2_b.id,
f"database_table_{table3.id}": table3_2.id,
},
},
],
"id": table_row.id,
"order": "1.00000000000000000000",
}
],
}
response = api_client.delete(
reverse(
"api:database:rows:item",
kwargs={"table_id": table3.id, "row_id": table3_2.id},
),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.status_code == HTTP_204_NO_CONTENT
response = api_client.get(
reverse("api:database:rows:list", kwargs={"table_id": table.id}),
format="json",
HTTP_AUTHORIZATION=f"JWT {token}",
)
assert response.json() == {
"count": 1,
"next": None,
"previous": None,
"results": [
{
f"field_{table_primary_field.id}": "table row 1",
f"field_{linkrowfield.id}": [
{"id": table2_b.id, "value": "primary b"},
],
f"field_{formulafield.id}": [],
"id": table_row.id,
"order": "1.00000000000000000000",
}
],
}
@pytest.mark.django_db
def test_all_functions_are_registered():
def get_all_subclasses(cls):
all_subclasses = []
for subclass in cls.__subclasses__():
if not inspect.isabstract(subclass):
all_subclasses.append(subclass)
all_subclasses.extend(get_all_subclasses(subclass))
return all_subclasses
funcs = formula_function_registry.get_all()
names = [f.type for f in funcs]
assert len(names) == len(get_all_subclasses(BaserowFunctionDefinition))
# print(json.dumps(names, indent=4))
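The recursive `get_all_subclasses` helper above walks `__subclasses__()` depth-first in definition order; its behavior on a plain (non-abstract) hierarchy can be checked in isolation:

```python
import inspect

def get_all_subclasses(cls):
    # Depth-first walk over __subclasses__, skipping abstract classes
    all_subclasses = []
    for subclass in cls.__subclasses__():
        if not inspect.isabstract(subclass):
            all_subclasses.append(subclass)
            all_subclasses.extend(get_all_subclasses(subclass))
    return all_subclasses

class A: pass
class B(A): pass
class C(B): pass
class D(A): pass

# B is visited first (definition order), and its subtree before D
assert get_all_subclasses(A) == [B, C, D]
```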
@pytest.mark.django_db
def test_row_dependency_update_functions_do_no_row_updates_for_same_table(
data_fixture, django_assert_num_queries
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
handler.create_field(user=user, table=table, type_name="text", name="a")
formula_field = handler.create_field(
user=user,
table=table,
type_name="formula",
name="formula",
formula="field('a')",
)
table_model = table.get_model()
row = table_model.objects.create()
formula_field_type = FormulaFieldType()
update_collector = CachingFieldUpdateCollector(table, existing_model=table_model)
formula_field_type.row_of_dependency_updated(
formula_field, row, update_collector, None
)
formula_field_type.row_of_dependency_updated(
formula_field, row, update_collector, []
)
formula_field_type.row_of_dependency_created(
formula_field, row, update_collector, None
)
formula_field_type.row_of_dependency_created(
formula_field, row, update_collector, []
)
formula_field_type.row_of_dependency_deleted(
formula_field, row, update_collector, None
)
formula_field_type.row_of_dependency_deleted(
formula_field, row, update_collector, []
)
with django_assert_num_queries(0):
update_collector.apply_updates_and_get_updated_fields()
@pytest.mark.django_db
def test_recalculated_internal_type_with_incorrect_syntax_formula_sets_to_invalid(
data_fixture,
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
handler.create_field(user=user, table=table, type_name="text", name="a")
formula_field = handler.create_field(
user=user,
table=table,
type_name="formula",
name="formula",
formula="field('a')",
)
formula_field.formula = "invalid"
formula_field.save()
assert formula_field.formula_type == BaserowFormulaInvalidType.type
assert "Invalid syntax" in formula_field.error
@pytest.mark.django_db
def test_accessing_cached_internal_formula_second_time_does_no_queries(
data_fixture, django_assert_num_queries
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
a_field = handler.create_field(user=user, table=table, type_name="text", name="a")
formula_field = handler.create_field(
user=user,
table=table,
type_name="formula",
name="formula",
formula="field('a')",
)
with django_assert_num_queries(0):
assert str(formula_field.cached_untyped_expression) == formula_field.formula
assert (
str(formula_field.cached_typed_internal_expression)
== f"error_to_null(field('{a_field.db_column}'))"
)
assert formula_field.cached_formula_type.type == BaserowFormulaTextType.type
@pytest.mark.django_db
def test_saving_after_properties_have_been_cached_does_recalculation(data_fixture):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
a_field = handler.create_field(user=user, table=table, type_name="text", name="a")
formula_field = handler.create_field(
user=user,
table=table,
type_name="formula",
name="formula",
formula="field('a')",
)
assert str(formula_field.cached_untyped_expression) == formula_field.formula
assert (
str(formula_field.cached_typed_internal_expression)
== f"error_to_null(field('{a_field.db_column}'))"
)
assert formula_field.cached_formula_type.type == BaserowFormulaTextType.type
formula_field.formula = "1"
formula_field.save()
assert str(formula_field.cached_untyped_expression) == "1"
    assert str(formula_field.cached_typed_internal_expression) == "error_to_nan(1)"
assert formula_field.cached_formula_type.type == BaserowFormulaNumberType.type
@pytest.mark.django_db
def test_renaming_dependency_maintains_dependency_link(data_fixture):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
a_field = handler.create_field(user=user, table=table, type_name="text", name="a")
formula_field = handler.create_field(
user=user,
table=table,
type_name="formula",
name="formula",
formula="field('a')",
)
starting_dep = formula_field.dependencies.get()
assert formula_field.field_dependencies.get().id == a_field.id
assert starting_dep.broken_reference_field_name is None
assert starting_dep.dependency_id == a_field.id
handler.update_field(user, a_field, name="other")
formula_field.refresh_from_db()
assert formula_field.dependencies.get().id == starting_dep.id
assert formula_field.field_dependencies.get().id == a_field.id
assert formula_field.formula == "field('other')"
@pytest.mark.django_db
def test_can_insert_and_update_rows_with_formula_referencing_single_select(
data_fixture,
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
option_field = data_fixture.create_single_select_field(
table=table, name="option_field", order=1
)
option_a = data_fixture.create_select_option(
field=option_field, value="A", color="blue"
)
option_b = data_fixture.create_select_option(
field=option_field, value="B", color="red"
)
formula_field = handler.create_field(
user=user,
table=table,
type_name="formula",
name="2",
formula="field('option_field')",
)
row = RowHandler().create_row(
user=user, table=table, values={f"field_{option_field.id}": option_a.id}
)
row.refresh_from_db()
result = getattr(row, f"field_{formula_field.id}")
assert result == {
"id": option_a.id,
"color": option_a.color,
"value": option_a.value,
}
row = RowHandler().update_row(
user=user,
table=table,
row_id=row.id,
values={f"field_{option_field.id}": option_b.id},
)
row.refresh_from_db()
result = getattr(row, f"field_{formula_field.id}")
assert result == {
"id": option_b.id,
"color": option_b.color,
"value": option_b.value,
}
row = RowHandler().create_row(user=user, table=table, values={})
row.refresh_from_db()
result = getattr(row, f"field_{formula_field.id}")
assert result is None
@pytest.mark.django_db
def test_cannot_create_view_filter_or_sort_on_invalid_field(data_fixture):
user = data_fixture.create_user()
table, other_table, link = data_fixture.create_two_linked_tables(user=user)
grid_view = data_fixture.create_grid_view(user, table=table)
first_formula_field = FieldHandler().create_field(
user, table, "formula", formula="1", name="source"
)
broken_formula_field = FieldHandler().create_field(
user, table, "formula", formula="field('source')", name="a"
)
FieldHandler().delete_field(user, first_formula_field)
option_field = data_fixture.create_single_select_field(
table=table, name="option_field", order=1
)
data_fixture.create_select_option(field=option_field, value="A", color="blue")
data_fixture.create_select_option(field=option_field, value="B", color="red")
single_select_formula_field = FieldHandler().create_field(
user=user,
table=table,
type_name="formula",
name="2",
formula="field('option_field')",
)
lookup_field = FieldHandler().create_field(
user=user,
table=table,
type_name="lookup",
name="lookup",
through_field_name=link.name,
target_field_name="primary",
)
broken_formula_field = FormulaField.objects.get(id=broken_formula_field.id)
single_select_formula_field = FormulaField.objects.get(
id=single_select_formula_field.id
)
lookup_field = LookupField.objects.get(id=lookup_field.id)
assert broken_formula_field.formula_type == "invalid"
assert single_select_formula_field.formula_type == "single_select"
assert lookup_field.formula_type == "array"
fields_which_cant_yet_be_sorted_or_filtered = [
broken_formula_field,
single_select_formula_field,
lookup_field,
]
for field in fields_which_cant_yet_be_sorted_or_filtered:
for view_filter_type in view_filter_type_registry.get_all():
with pytest.raises(ViewFilterTypeNotAllowedForField):
ViewHandler().create_filter(
user,
grid_view,
field,
view_filter_type.type,
"",
)
for field in fields_which_cant_yet_be_sorted_or_filtered:
with pytest.raises(ViewSortFieldNotSupported):
ViewHandler().create_sort(user, grid_view, field, SORT_ORDER_ASC)
with pytest.raises(ViewSortFieldNotSupported):
ViewHandler().create_sort(user, grid_view, field, SORT_ORDER_DESC)
@pytest.mark.django_db
def test_can_cache_and_uncache_formula_model_field(
data_fixture,
):
user = data_fixture.create_user()
table = data_fixture.create_database_table(user=user)
handler = FieldHandler()
formula_field = handler.create_field(
user=user,
table=table,
type_name="formula",
name="2",
formula="'a'",
)
formula_field_type = field_type_registry.get_by_model(formula_field)
formula_model_field = formula_field_type.get_model_field(formula_field)
generated_models_cache.set("test_formula_key", formula_model_field)
uncached = generated_models_cache.get("test_formula_key")
assert uncached == formula_model_field
assert isinstance(uncached, BaserowExpressionField)
assert uncached.__class__ == TextField
assert str(uncached.expression) == str(formula_model_field.expression)
@pytest.mark.django_db
def test_inserting_a_row_with_lookup_field_immediately_populates_it_with_empty_list(
data_fixture,
):
user = data_fixture.create_user()
table_a, table_b, link_field = data_fixture.create_two_linked_tables(user=user)
target_field = data_fixture.create_text_field(name="target", table=table_b)
table_a_model = table_a.get_model(attribute_names=True)
table_b_model = table_b.get_model(attribute_names=True)
row_1 = table_b_model.objects.create(primary="1", target="target 1")
row_2 = table_b_model.objects.create(primary="2", target="target 2")
row_a = table_a_model.objects.create(primary="a")
row_a.link.add(row_1.id)
row_a.link.add(row_2.id)
row_a.save()
lookup = FieldHandler().create_field(
user,
table_a,
"lookup",
name="lookup",
through_field_name="link",
target_field_name="target",
)
model_with_lookup = table_a.get_model()
inserted_row = model_with_lookup.objects.create()
default_empty_value_for_lookup = getattr(inserted_row, f"field_{lookup.id}")
assert default_empty_value_for_lookup is not None
assert default_empty_value_for_lookup == "[]"
# File: asyncpg_opentracing/tracing.py (repo: condorcet/asyncpg_opentracing, MIT license)
from opentracing import global_tracer, tags, logs
from contextlib import contextmanager
def operation_name(query: str):
# TODO: some statement should contain two words. For example CREATE TABLE.
query = query.strip().split(' ')[0].strip(';').upper()
return 'asyncpg ' + query
@contextmanager
def con_context(handler, query, query_args):
_tags = {
tags.DATABASE_TYPE: 'SQL',
tags.DATABASE_STATEMENT: query,
tags.DATABASE_USER: handler._params.user,
tags.DATABASE_INSTANCE: handler._params.database,
'db.params': query_args,
tags.SPAN_KIND: tags.SPAN_KIND_RPC_CLIENT,
}
with global_tracer().start_active_span(
operation_name=operation_name(query),
tags=_tags
) as scope:
try:
yield
except Exception as e:
scope.span.log_kv({
logs.EVENT: 'error',
logs.ERROR_KIND: type(e).__name__,
logs.ERROR_OBJECT: e,
logs.MESSAGE: str(e)
})
raise
def wrap(coro):
@wraps(coro)
async def wrapped(self, query, *args, **kwargs):
with con_context(self, query, args):
return await coro(self, query, *args, **kwargs)
return wrapped
def wrap_executemany(coro):
@wraps(coro)
async def wrapped(self, query, args, *_args, **kwargs):
with con_context(self, query, args):
return await coro(self, query, args, *_args, **kwargs)
return wrapped
def tracing_connection(cls):
cls.fetch = wrap(cls.fetch)
cls.fetchval = wrap(cls.fetchval)
cls.fetchrow = wrap(cls.fetchrow)
cls.execute = wrap(cls.execute)
cls.executemany = wrap_executemany(cls.executemany)
return cls
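The `tracing_connection` decorator above names each span after the leading SQL keyword via `operation_name`; that naming logic can be exercised standalone (copied here so the check needs neither asyncpg nor opentracing):

```python
def operation_name(query: str) -> str:
    # Take the first whitespace-delimited token, drop a trailing ';', upper-case it
    return 'asyncpg ' + query.strip().split(' ')[0].strip(';').upper()

assert operation_name("  select * from users;") == "asyncpg SELECT"
assert operation_name("INSERT INTO t VALUES (1)") == "asyncpg INSERT"
# single-token statements work too
assert operation_name("commit;") == "asyncpg COMMIT"
# the limitation noted in the TODO: two-word statements lose the second word
assert operation_name("CREATE TABLE t (id int)") == "asyncpg CREATE"
```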
# File: acro/train_uncond_dcgan.py (repo: udibr/dcgan_code, MIT license)
uncond_dcgan1 made with 64x64 images from https://s3.amazonaws.com/udipublic/acro.images.tgz for train.tar.gz
"""
import argparse
parser = argparse.ArgumentParser(description='train uncoditional dcgan')
parser.add_argument('--desc',
default='uncond_dcgan',
help='name to uniquely describe this run')
parser.add_argument('--path',
default='data/jpg.hdf5',
help='where to read fuel hdf5 data file with training')
parser.add_argument('--val', type=float,
default=0.,
help="what part of the training data to use for validation")
parser.add_argument('--model',
help='start from a pre-existing model.'
' The suffixes _gen_params.jl'
' and _discrim_params.jl'
' are added to the path you supply')
parser.add_argument('--batch', type=int,
default=128,
help='batch size')
parser.add_argument('-k', type=int,
default=0,
help='# of discrim updates for each gen update.'
' 0 - alternate > 0 more d, < 0 more g')
parser.add_argument('--maxk', type=int,
default=1,
help='max value for k')
parser.add_argument('--mink', type=int,
default=-1,
help='min value for k')
parser.add_argument('--l2d', type=float,
default=1.e-5,
help="discriminator l2")
parser.add_argument('--l2decay', type=float,
default=0.,
help="reduce l2d by 1-l2decay")
parser.add_argument('--l2step', type=float,
default=0.,
help="increase(decrease) discriminator's l2"
" when generator cost is above 1.3(below 0.9)")
parser.add_argument('--dropout', type=float,
default=0.,
help="discriminator dropout")
parser.add_argument('--lr', type=float,
default=0.0002,
help="initial learning rate for adam")
parser.add_argument('--lrstep', type=float,
default=1.,
help="increa/decrease g/d learning rate")
parser.add_argument('--dbn', action='store_false',
help='dont perfrom batch normalization on discriminator')
parser.add_argument('--db1', action='store_true',
help='add bias to first layer of discriminator')
parser.add_argument('--ngf', type=int,
default=128,
help='# of gen filters')
parser.add_argument('--ndf', type=int,
default=128,
help='# of discriminator filters')
parser.add_argument('--updates', type=int,
default=100,
help='compute score every n_updates')
parser.add_argument('-z', type=int,
default=100,
help='number of hidden variables')
parser.add_argument('--znorm', action='store_true',
help='normalize z values to unit sphere')
parser.add_argument('--generate', action='store_true',
help='generate sample png and gif')
parser.add_argument('--ngif', type=int, default=1,
help='# of png images to generate. If 1 then no gif')
parser.add_argument('--nvis2', type=int,
default=14,
help='number of rows/cols of sub-images to generate')
parser.add_argument('--generate_d', type=float, default=0.,
help="minimal discrimation score when generating samples")
parser.add_argument('--generate_c', type=float, default=0.,
help="minimal classification score when generating samples")
parser.add_argument('--generate_v', type=float,
help='generate sample along a random direction with this step size')
parser.add_argument('--classify', action='store_true',
help='classify target')
parser.add_argument('--onlyclassify', action='store_true',
help='just do classify target')
parser.add_argument('--seed', type=int,
default=123,
help='seed all random generators')
parser.add_argument('--filter_label', type=int,
                    help='take only training data with this label (does not work with --classify)')
parser.add_argument('--nepochs', type=int,
default=25,
help='total number of epochs')
parser.add_argument('--niter', type=int,
default=25,
help='# of iter at starting learning rate')
parser.add_argument('--start', type=int,
default=0,
help='If not 0 then start from this epoch after loading the last model')
args = parser.parse_args()
if args.onlyclassify:
args.classify = True
if args.classify:
    assert args.filter_label is None, "you can't classify and limit your data to one label"
if args.model is None and args.start > 0:
args.model = 'models/%s/%d'%(args.desc, args.start)
import random
random.seed(args.seed)
import numpy as np
np.random.seed(args.seed)
import sys
sys.path.append('..')
import os
import json
from time import sleep
from time import time
from tqdm import tqdm, trange
from matplotlib import pyplot as plt
from sklearn.externals import joblib
import theano
import theano.tensor as T
from theano.sandbox.cuda.dnn import dnn_conv
from lib import activations
from lib import updates
from lib import inits
from lib.vis import color_grid_vis
from lib.rng import py_rng, np_rng
from lib.ops import batchnorm, conv_cond_concat, deconv, dropout, l2normalize
from lib.metrics import nnc_score, nnd_score
from lib.theano_utils import floatX, sharedX
from lib.data_utils import OneHot, shuffle, iter_data, center_crop, patch
from load import streams
def transform(X):
# X = [center_crop(x, npx) for x in X] # only works for (H,W,3)
assert X[0].shape == (npx,npx,3) or X[0].shape == (3,npx,npx)
if X[0].shape == (npx,npx,3):
X = X.transpose(0, 3, 1, 2)
return floatX(X/127.5 - 1.)
def inverse_transform(X):
X = (X.reshape(-1, nc, npx, npx).transpose(0, 2, 3, 1)+1.)/2.
return X
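`transform` maps uint8 pixels into [-1, 1] (matching the generator's tanh output) and `inverse_transform` maps back into [0, 1]; a standalone numpy sketch of that round trip (reimplemented here without the `floatX`/Theano dependency, assuming float32 and npx=64 as in this script):

```python
import numpy as np

npx, nc = 64, 3

def transform(X):
    # (N, H, W, 3) uint8 -> (N, 3, H, W) float32 in [-1, 1]
    if X.shape[1:] == (npx, npx, nc):
        X = X.transpose(0, 3, 1, 2)
    return (X / 127.5 - 1.).astype(np.float32)

def inverse_transform(X):
    # (N, 3, H, W) in [-1, 1] -> (N, H, W, 3) in [0, 1]
    return (X.reshape(-1, nc, npx, npx).transpose(0, 2, 3, 1) + 1.) / 2.

X = np.random.randint(0, 256, size=(2, npx, npx, nc), dtype=np.uint8)
back = inverse_transform(transform(X))
assert back.shape == (2, npx, npx, nc)
assert np.allclose(back, X / 255., atol=1e-3)  # round trip recovers X scaled to [0, 1]
```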
k = 0 # # of discrim updates for each gen update. 0 - alternate > 0 more d, < 0 more g
l2 = 1e-5 # l2 weight decay
l2d = args.l2d # discriminator l2
l2step = args.l2step # increase(decrease) discriminator l2 when generator cost is above 1.3(below 0.9)
margin = 0.3      # Don't optimize the discriminator (generator) when classification error is below margin (above 1-margin)
nvis2 = args.nvis2
nvis = nvis2*nvis2 # # of samples to visualize during training
b1 = 0.5 # momentum term of adam
nc = 3 # # of channels in image
nbatch = args.batch # # of examples in batch
npx = 64 # # of pixels width/height of images
nz = args.z # # of dim for Z
ngf = args.ngf # # of gen filters in first conv layer
ndf = args.ndf # # of discrim filters in first conv layer
nx = npx*npx*nc # # of dimensions in X
niter = args.niter # # of iter at starting learning rate
niter_decay = args.nepochs - niter # # of iter to linearly decay learning rate to zero
lr = args.lr # initial learning rate for adam
ntrain = None # # of examples to train on. None take all
ngif = args.ngif # # of images in a gif
desc = args.desc
model_dir = 'models/%s'%desc
samples_dir = 'samples/%s'%desc
if not os.path.exists('logs/'):
os.makedirs('logs/')
if not os.path.exists(model_dir):
os.makedirs(model_dir)
if not os.path.exists(samples_dir):
os.makedirs(samples_dir)
###########################################
# data
if not args.generate:
tr_data, tr_stream, val_stream, ntrain_s, nval_s = streams(ntrain=ntrain,
batch_size=args.batch,
path=args.path,
val = args.val,
filter_label=args.filter_label)
if ntrain is None:
ntrain = tr_data.num_examples
print '# examples', tr_data.num_examples
print '# training examples', ntrain_s
print '# validation examples', nval_s
tr_handle = tr_data.open()
vaX,labels = tr_data.get_data(tr_handle, slice(0, 10000))
vaX = transform(vaX)
means = labels.mean(axis=0)
print('labels ',labels.shape,means,means[0]/means[1])
vaY,labels = tr_data.get_data(tr_handle, slice(10000, min(ntrain, 20000)))
vaY = transform(vaY)
va_nnd_1k = nnd_score(vaY.reshape((len(vaY),-1)), vaX.reshape((len(vaX),-1)), metric='euclidean')
print 'va_nnd_1k = %.2f'%(va_nnd_1k)
means = labels.mean(axis=0)
print('labels ',labels.shape,means,means[0]/means[1])
#####################################
# shared variables
gifn = inits.Normal(scale=0.02)
difn = inits.Normal(scale=0.02)
gain_ifn = inits.Normal(loc=1., scale=0.02)
bias_ifn = inits.Constant(c=0.)
gw = gifn((nz, ngf*8*4*4), 'gw')
gg = gain_ifn((ngf*8*4*4), 'gg')
gb = bias_ifn((ngf*8*4*4), 'gb')
gw2 = gifn((ngf*8, ngf*4, 5, 5), 'gw2')
gg2 = gain_ifn((ngf*4), 'gg2')
gb2 = bias_ifn((ngf*4), 'gb2')
gw3 = gifn((ngf*4, ngf*2, 5, 5), 'gw3')
gg3 = gain_ifn((ngf*2), 'gg3')
gb3 = bias_ifn((ngf*2), 'gb3')
gw4 = gifn((ngf*2, ngf, 5, 5), 'gw4')
gg4 = gain_ifn((ngf), 'gg4')
gb4 = bias_ifn((ngf), 'gb4')
gwx = gifn((ngf, nc, 5, 5), 'gwx')
dw = difn((ndf, nc, 5, 5), 'dw')
db = bias_ifn((ndf), 'db')
dw2 = difn((ndf*2, ndf, 5, 5), 'dw2')
dg2 = gain_ifn((ndf*2), 'dg2')
db2 = bias_ifn((ndf*2), 'db2')
dw3 = difn((ndf*4, ndf*2, 5, 5), 'dw3')
dg3 = gain_ifn((ndf*4), 'dg3')
db3 = bias_ifn((ndf*4), 'db3')
dw4 = difn((ndf*8, ndf*4, 5, 5), 'dw4')
dg4 = gain_ifn((ndf*8), 'dg4')
db4 = bias_ifn((ndf*8), 'db4')
dwy = difn((ndf*8*4*4, 1), 'dwy')
dwy1 = difn((ndf*8*4*4, 1), 'dwy1')
# models
relu = activations.Rectify()
sigmoid = activations.Sigmoid()
lrelu = activations.LeakyRectify()
tanh = activations.Tanh()
bce = T.nnet.binary_crossentropy
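`bce` is binary cross-entropy, the loss both players optimize (the discriminator pushes real toward 1 and generated toward 0, the generator pushes generated toward 1); a scalar sketch of the formula it computes:

```python
import math

def bce(p, t):
    # binary cross-entropy for prediction p in (0, 1) and target t in {0, 1}
    return -(t * math.log(p) + (1 - t) * math.log(1 - p))

assert math.isclose(bce(0.9, 1.0), -math.log(0.9))  # confident and correct: low loss
assert math.isclose(bce(0.9, 0.0), -math.log(0.1))  # confident and wrong: high loss
assert math.isclose(bce(0.5, 1.0), bce(0.5, 0.0))   # maximal uncertainty either way
```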
# generator model
gen_params = [gw, gg, gb, gw2, gg2, gb2, gw3, gg3, gb3, gw4, gg4, gb4, gwx]
def gen(Z, w, g, b, w2, g2, b2, w3, g3, b3, w4, g4, b4, wx):
h = relu(batchnorm(T.dot(Z, w), g=g, b=b))
h = h.reshape((h.shape[0], ngf*8, 4, 4))
h2 = relu(batchnorm(deconv(h, w2, subsample=(2, 2), border_mode=(2, 2)), g=g2, b=b2))
h3 = relu(batchnorm(deconv(h2, w3, subsample=(2, 2), border_mode=(2, 2)), g=g3, b=b3))
h4 = relu(batchnorm(deconv(h3, w4, subsample=(2, 2), border_mode=(2, 2)), g=g4, b=b4))
x = tanh(deconv(h4, wx, subsample=(2, 2), border_mode=(2, 2)))
return x
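Each `deconv` with `subsample=(2, 2)` doubles height and width, so the 4x4 seed produced by the dense layer reaches the 64x64 RGB output in four steps; a quick check of that shape arithmetic (plain Python, not the Theano graph):

```python
# ngf matches the --ngf default; each stride-2 deconv doubles H and W
ngf = 128
shapes = [(ngf * 8, 4, 4)]           # after the dense layer + reshape
for ch in (ngf * 4, ngf * 2, ngf, 3):
    _, h, w = shapes[-1]
    shapes.append((ch, h * 2, w * 2))
assert shapes[-1] == (3, 64, 64)     # nc x npx x npx output image
```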
# discriminator model
"""
#old model
if args.dbn:
if args.db1:
print "Bias on layer 1 + batch normalization"
discrim_params = [dw, db, dw2, dg2, db2, dw3, dg3, db3, dw4, dg4, db4, dwy, dwy1]
def discrim(X, w, b, w2, g2, b2, w3, g3, b3, w4, g4, b4, wy, wy1):
h = lrelu(dnn_conv(X, w, subsample=(2, 2), border_mode=(2, 2))+b.dimshuffle('x', 0, 'x', 'x'))
h = dropout(h, args.dropout)
h2 = lrelu(batchnorm(dnn_conv(h, w2, subsample=(2, 2), border_mode=(2, 2)), g=g2, b=b2))
h2 = dropout(h2, args.dropout)
h3 = lrelu(batchnorm(dnn_conv(h2, w3, subsample=(2, 2), border_mode=(2, 2)), g=g3, b=b3))
h3 = dropout(h3, args.dropout)
h4 = lrelu(batchnorm(dnn_conv(h3, w4, subsample=(2, 2), border_mode=(2, 2)), g=g4, b=b4))
h4 = dropout(h4, args.dropout)
h4 = T.flatten(h4, 2)
y = sigmoid(T.dot(h4, wy))
y1 = sigmoid(T.dot(h4, wy1))
return y, y1
else:
print "Batch normalization"
discrim_params = [dw, dw2, dg2, db2, dw3, dg3, db3, dw4, dg4, db4, dwy, dwy1]
def discrim(X, w, w2, g2, b2, w3, g3, b3, w4, g4, b4, wy, wy1):
h = lrelu(dnn_conv(X, w, subsample=(2, 2), border_mode=(2, 2)))
h = dropout(h, args.dropout)
h2 = lrelu(batchnorm(dnn_conv(h, w2, subsample=(2, 2), border_mode=(2, 2)), g=g2, b=b2))
h2 = dropout(h2, args.dropout)
h3 = lrelu(batchnorm(dnn_conv(h2, w3, subsample=(2, 2), border_mode=(2, 2)), g=g3, b=b3))
h3 = dropout(h3, args.dropout)
h4 = lrelu(batchnorm(dnn_conv(h3, w4, subsample=(2, 2), border_mode=(2, 2)), g=g4, b=b4))
h4 = dropout(h4, args.dropout)
h4 = T.flatten(h4, 2)
y = sigmoid(T.dot(h4, wy))
y1 = sigmoid(T.dot(h4, wy1))
return y, y1
else:
if args.db1:
print "Bias on layer 1"
discrim_params = [dw, db, dw2, db2, dw3, db3, dw4, db4, dwy, dwy1]
def discrim(X, w, b, w2, b2, w3, b3, w4, b4, wy, wy1):
h = lrelu(dnn_conv(X, w, subsample=(2, 2), border_mode=(2, 2))+b.dimshuffle('x', 0, 'x', 'x'))
h = dropout(h, args.dropout)
h2 = lrelu(dnn_conv(h, w2, subsample=(2, 2), border_mode=(2, 2))+b2.dimshuffle('x', 0, 'x', 'x'))
h2 = dropout(h2, args.dropout)
h3 = lrelu(dnn_conv(h2, w3, subsample=(2, 2), border_mode=(2, 2))+b3.dimshuffle('x', 0, 'x', 'x'))
h3 = dropout(h3, args.dropout)
h4 = lrelu(dnn_conv(h3, w4, subsample=(2, 2), border_mode=(2, 2))+b4.dimshuffle('x', 0, 'x', 'x'))
h4 = dropout(h4, args.dropout)
h4 = T.flatten(h4, 2)
y = sigmoid(T.dot(h4, wy))
y1 = sigmoid(T.dot(h4, wy1))
return y, y1
else:
discrim_params = [dw, dw2, db2, dw3, db3, dw4, db4, dwy, dwy1]
def discrim(X, w, w2, b2, w3, b3, w4, b4, wy, wy1):
h = lrelu(dnn_conv(X, w, subsample=(2, 2), border_mode=(2, 2)))
h = dropout(h, args.dropout)
h2 = lrelu(dnn_conv(h, w2, subsample=(2, 2), border_mode=(2, 2))+b2.dimshuffle('x', 0, 'x', 'x'))
h2 = dropout(h2, args.dropout)
h3 = lrelu(dnn_conv(h2, w3, subsample=(2, 2), border_mode=(2, 2))+b3.dimshuffle('x', 0, 'x', 'x'))
h3 = dropout(h3, args.dropout)
h4 = lrelu(dnn_conv(h3, w4, subsample=(2, 2), border_mode=(2, 2))+b4.dimshuffle('x', 0, 'x', 'x'))
h4 = dropout(h4, args.dropout)
h4 = T.flatten(h4, 2)
y = sigmoid(T.dot(h4, wy))
y1 = sigmoid(T.dot(h4, wy1))
return y, y1
"""
#new model
discrim_params = [dw, db, dw2, dg2, db2, dw3, dg3, db3, dw4, dg4, db4, dwy, dwy1]
def discrim(X, w, b, w2, g2, b2, w3, g3, b3, w4, g4, b4, wy, wy1):
    h0 = dnn_conv(X, w, subsample=(2, 2), border_mode=(2, 2))
    if args.db1:
        h0 += b.dimshuffle('x', 0, 'x', 'x')
    h1 = lrelu(h0)
    h1 = dropout(h1, args.dropout)
    h1 = dnn_conv(h1, w2, subsample=(2, 2), border_mode=(2, 2))
    if args.dbn:
        h1 = batchnorm(h1, g=g2, b=b2)
    else:
        h1 += b2.dimshuffle('x', 0, 'x', 'x')
    h2 = lrelu(h1)
    h2 = dropout(h2, args.dropout)
    h2 = dnn_conv(h2, w3, subsample=(2, 2), border_mode=(2, 2))
    if args.dbn:
        h2 = batchnorm(h2, g=g3, b=b3)
    else:
        h2 += b3.dimshuffle('x', 0, 'x', 'x')
    h3 = lrelu(h2)
    h3 = dropout(h3, args.dropout)
    h3 = dnn_conv(h3, w4, subsample=(2, 2), border_mode=(2, 2))
    if args.dbn:
        h3 = batchnorm(h3, g=g4, b=b4)
    else:
        h3 += b4.dimshuffle('x', 0, 'x', 'x')
    h4 = lrelu(h3)
    h4 = dropout(h4, args.dropout)
    h4 = T.flatten(h4, 2)
    y = sigmoid(T.dot(h4, wy))
    y1 = sigmoid(T.dot(h4, wy1))
    return y, y1
X = T.tensor4()
Z = T.matrix()
Y = T.matrix()
MASK = T.matrix()
gX = gen(Z, *gen_params)
p_gen, p_gen_classify = discrim(gX, *discrim_params)
p_real, p_classify = discrim(X, *discrim_params)
if args.model is not None:
    print 'loading', args.model
    from itertools import izip
    gen_params_values = joblib.load(args.model + '_gen_params.jl')
    for p, v in izip(gen_params, gen_params_values):
        p.set_value(v)
    discrim_params_values = joblib.load(args.model + '_discrim_params.jl')
    if len(discrim_params) == len(discrim_params_values):
        load_params = discrim_params
    else:  # support old save format
        print 'loading old format', len(discrim_params), len(discrim_params_values)
        if args.dbn and args.db1:
            raise Exception('impossible')
            load_params = [dw, db, dw2, dg2, db2, dw3, dg3, db3, dw4, dg4, db4, dwy, dwy1]
        elif args.dbn:
            load_params = [dw, dw2, dg2, db2, dw3, dg3, db3, dw4, dg4, db4, dwy, dwy1]
        elif args.db1:
            load_params = [dw, db, dw2, db2, dw3, db3, dw4, db4, dwy, dwy1]
        else:
            load_params = [dw, dw2, db2, dw3, db3, dw4, db4, dwy, dwy1]
        assert len(discrim_params_values) == len(load_params), "# params in model does not match"
    for p, v in izip(load_params, discrim_params_values):
        p.set_value(v)
###############################
# generate
_gen = theano.function([Z], gX)
from sklearn.preprocessing import normalize
def gen_z(n):
    if args.znorm:
        return floatX(normalize(np_rng.uniform(-1., 1., size=(n, nz))))
    else:
        return floatX(np_rng.uniform(-1., 1., size=(n, nz)))
if args.generate:
    _genscore = theano.function([Z], [gX, p_gen, p_gen_classify])
    t = iter(trange(nvis))
    pgs = []
    pcs = []
    zmbs = []
    samples = []
    while len(zmbs) < nvis:
        zmb = gen_z(args.batch)
        xmb, pg, pc = _genscore(zmb)
        pgs.append(pg)
        pcs.append(pc)
        for i in range(args.batch):
            if pg[i] >= args.generate_d and pc[i] >= args.generate_c:
                zmbs.append(zmb[i])
                samples.append(xmb[i])
                t.next()
                if len(zmbs) >= nvis:
                    break
    pgs = np.concatenate(pgs)
    pcs = np.concatenate(pcs)
    print 'generate_d', pgs.mean(), pgs.std(), 'generate_c', pcs.mean(), pcs.std()
    samples = np.asarray(samples)
    color_grid_vis(inverse_transform(samples), (nvis2, nvis2),
                   '%s/Z_%03d.png' % (samples_dir, 0))
    if args.generate_v is None:
        sample_zmb0 = np.array(zmbs)
        sample_zmb1 = np.roll(sample_zmb0, 1, axis=0)
        for i in tqdm(range(1, ngif)):
            z = abs(1.-2.*i/(ngif-1.))  # from 1 to 0 and back to almost 1
            sample_zmb = z * sample_zmb0 + (1-z) * sample_zmb1
            samples = np.asarray(_gen(sample_zmb))
            color_grid_vis(inverse_transform(samples), (nvis2, nvis2),
                           '%s/Z_%03d.png' % (samples_dir, i))
    else:
        sample_zmb = np.array(zmbs)
        v = gen_z(nvis)
        for i in tqdm(range(1, ngif)):
            sample_zmb += args.generate_v * v
            samples = np.asarray(_gen(sample_zmb))
            color_grid_vis(inverse_transform(samples), (nvis2, nvis2),
                           '%s/Z_%03d.png' % (samples_dir, i))
    if ngif > 1:
        os.system("convert -delay 15 -loop 0 {0}/Z_*.png {0}/Z.gif".format(samples_dir))
    exit(0)
def gen_samples(n, nbatch=128):
    samples = []
    n_gen = 0
    for i in range(n/nbatch):
        zmb = gen_z(nbatch)
        xmb = _gen(zmb)
        samples.append(xmb)
        n_gen += len(xmb)
    n_left = n-n_gen
    if n_left:
        zmb = gen_z(n_left)
        xmb = _gen(zmb)
        samples.append(xmb)
    return np.concatenate(samples, axis=0)
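The batching arithmetic in gen_samples above (full batches plus one remainder batch) can be checked in isolation; `batch_sizes` is a hypothetical helper, not part of the script:

```python
def batch_sizes(n, nbatch=128):
    # Full batches followed by one remainder batch, mirroring gen_samples.
    sizes = [nbatch] * (n // nbatch)
    if n % nbatch:
        sizes.append(n % nbatch)
    return sizes

# e.g. 300 samples in batches of 128 -> two full batches and a remainder of 44
```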
####################
d_cost_real = bce(p_real, T.ones(p_real.shape)).mean()
d_classify = (bce(p_classify, Y) * MASK).sum() / MASK.sum()
d_classify_error = (T.neq(p_classify > 0.5, Y) * MASK).sum() / MASK.sum()
d_error_real = 1.-T.mean(p_real)
d_cost_gen = bce(p_gen, T.zeros(p_gen.shape)).mean()
d_error_gen = T.mean(p_gen)
g_cost_d = bce(p_gen, T.ones(p_gen.shape)).mean()
d_cost = d_cost_real + d_cost_gen
if args.onlyclassify:
    d_cost = d_classify
elif args.classify:
    d_cost += d_classify
g_cost = g_cost_d
cost_target = [g_cost, d_cost, g_cost_d, d_cost_real, d_cost_gen, d_error_real, d_error_gen, d_classify, d_classify_error]
lrg = sharedX(lr)
lrd = sharedX(lr)
l2t = sharedX(l2d)
d_updater = updates.Adam(lr=lrd, b1=b1, regularizer=updates.Regularizer(l2=l2t))
g_updater = updates.Adam(lr=lrg, b1=b1, regularizer=updates.Regularizer(l2=l2))
"""
#old model
if args.onlyclassify:
d_updates = d_updater(discrim_params[:-2]+discrim_params[-1:], d_cost)
elif args.classify:
d_updates = d_updater(discrim_params, d_cost)
else:
d_updates = d_updater(discrim_params[:-1], d_cost)
"""
#new model
d_updates = d_updater(discrim_params, d_cost)
g_updates = g_updater(gen_params, g_cost)
all_updates = d_updates + g_updates  # combined update list; a distinct name avoids shadowing the imported updates module
_train_g = theano.function([X, Z, Y, MASK], cost_target, updates=g_updates)
_train_d = theano.function([X, Z, Y, MASK], cost_target, updates=d_updates)
if args.onlyclassify:
    _train_classify = theano.function([X, Y, MASK], [d_classify, d_classify_error], updates=d_updates)
if args.classify:
    _classify_d = theano.function([X, Y, MASK], [d_classify, d_classify_error])
log_fields = [
    'n_epochs',
    'n_updates',
    'n_examples',
    'n_seconds',
    '1k_va_nnd',
    # '10k_va_nnd',
    # '100k_va_nnd',
    'g_cost',
    'd_cost',
    'error_r',
    'error_g',
    'd_cost_real',
    'd_cost_gen',
    'd_classify',
    'd_classify_error',
    'lrg', 'lrd',
    'l2d',
]
n_updates = 0
n_epochs = 0
n_examples = 0
do_initial_valid = True
log_lines = []
if args.start > 0:
    f_log = open('logs/%s.ndjson' % desc, 'rb')
    for l in f_log:
        j = json.loads(l.strip())
        if 'valid_classify' in j:
            do_initial_valid = False
            continue
        if j['n_epochs'] > args.start:
            break
        do_initial_valid = True
        n_epochs = j['n_epochs']
        n_updates = j['n_updates']
        n_examples = j['n_examples']
        lrg.set_value(floatX(j['lrg']))
        lrd.set_value(floatX(j['lrd']))
        l2t.set_value(floatX(j['l2d']))
        log_lines.append(l)
    f_log.close()
f_log = open('logs/%s.ndjson' % desc, 'wb')
for l in log_lines:
    f_log.write(l)
vis_idxs = py_rng.sample(np.arange(len(vaX)), nvis)
vaX_vis = inverse_transform(vaX[vis_idxs])
color_grid_vis(vaX_vis, (args.nvis2, args.nvis2), 'samples/%s_etl_test.png'%desc)
sample_zmb = gen_z(nvis)
vaX = vaX.reshape(len(vaX), -1)
print desc.upper()
t = time()
costs = []
label_sums = np.zeros(2)
def validate():
    if args.classify and args.val > 0.:
        sleep(5.)
        valid_label_sums = np.zeros(2)
        val_costs = []
        for imb, labels in tqdm(val_stream.get_epoch_iterator(), total=nval_s/nbatch):
            valid_label_sums += labels.sum(axis=0)
            y = labels[:, 0].reshape((-1, 1))
            mask = labels[:, 1].reshape((-1, 1))
            imb = transform(imb)
            cost = _classify_d(imb, y, mask)
            val_costs.append(cost)
        print 'valid label sums', valid_label_sums, valid_label_sums[0]/(valid_label_sums[1]+1e-8)
        val_cost = np.array(val_costs).mean(axis=0)
        d_cost_class = float(val_cost[0])
        d_error_class = float(val_cost[1])
        print("val_d_classify=%f val_d_classify_error=%f" % (d_cost_class, d_error_class))
        log = [d_cost_class, d_error_class]
        f_log.write(json.dumps(dict(zip(['valid_classify', 'valid_classify_error'], log)))+'\n')
        f_log.flush()
        sleep(5.)
if do_initial_valid:
    validate()
for epoch in range(args.start, args.nepochs):
    for imb, labels in tqdm(tr_stream.get_epoch_iterator(), total=ntrain_s/nbatch):
        label_sums += labels.sum(axis=0)
        y = labels[:, 0].reshape((-1, 1))
        mask = labels[:, 1].reshape((-1, 1))
        imb = transform(imb)
        if args.onlyclassify:
            cost = _train_classify(imb, y, mask)
            cost = [0]*(len(cost_target)-len(cost)) + cost
        else:
            zmb = gen_z(len(imb))
            if k >= 0:
                if n_updates % (k+2) == 0:
                    cost = _train_g(imb, zmb, y, mask)
                else:
                    cost = _train_d(imb, zmb, y, mask)
            else:
                if n_updates % (-k+2) == 0:
                    cost = _train_d(imb, zmb, y, mask)
                else:
                    cost = _train_g(imb, zmb, y, mask)
        n_updates += 1
        n_examples += len(imb)
        costs.append(cost)
        if n_updates % args.updates == 0:
            cost = np.array(costs).mean(axis=0)
            # [g_cost, d_cost, g_cost_d, d_cost_real, d_cost_gen, d_error_real, d_error_gen, d_classify, d_classify_error]
            print 'label sums', label_sums, label_sums[0]/(label_sums[1]+1e-8)
            label_sums = np.zeros(2)
            costs = []
            g_cost = float(cost[0])
            d_cost = float(cost[1])
            d_cost_real = float(cost[3])
            d_cost_gen = float(cost[4])
            d_error_r = float(cost[5])
            d_error_g = float(cost[6])
            d_cost_class = float(cost[7])
            d_error_class = float(cost[8])
            gX = gen_samples(10000)
            gX = gX.reshape(len(gX), -1)
            va_nnd_1k = nnd_score(gX[:1000], vaX, metric='euclidean')
            # va_nnd_10k = nnd_score(gX[:10000], vaX, metric='euclidean')
            # va_nnd_100k = nnd_score(gX[:100000], vaX, metric='euclidean')
            log = [n_epochs, n_updates, n_examples, time()-t,
                   va_nnd_1k, g_cost, d_cost,
                   d_error_r, d_error_g, d_cost_real, d_cost_gen,
                   d_cost_class, d_error_class,
                   float(lrg.get_value()), float(lrd.get_value()), float(l2t.get_value())
                   ]
            print '%d %d %.2f' % (epoch, n_updates, va_nnd_1k)
            print 'gc=%.4f dc=%.4f dcr=%.4f dcg=%.4f er=%.4f eg=%.4f cls=%.4f err=%.4f' % (
                g_cost, d_cost, d_cost_real, d_cost_gen,
                d_error_r, d_error_g, d_cost_class, d_error_class)
            f_log.write(json.dumps(dict(zip(log_fields, log)))+'\n')
            f_log.flush()
            # if g_cost > d_cost + .3:
            #     k -= 1
            # elif g_cost < d_cost - .3:
            #     k += 1
            # k = max(-3, min(3, k))
            # k positive is do more d, k negative is do more g
            if d_error_r < margin or d_error_g < margin:  # d is too good
                k += args.k
                lrg.set_value(floatX(lrg.get_value()*args.lrstep))
                lrd.set_value(floatX(lrd.get_value()/args.lrstep))
            elif d_error_r > 1.-margin or d_error_g > 1.-margin:  # d is too bad
                k -= args.k
                lrg.set_value(floatX(lrg.get_value()/args.lrstep))
                lrd.set_value(floatX(lrd.get_value()*args.lrstep))
            elif k > 0:  # unwind d
                k -= 1
                # lrd.set_value(floatX(lrd.get_value()/args.lrstep))
            elif k < 0:  # unwind g
                k += 1
                # lrg.set_value(floatX(lrg.get_value()/args.lrstep))
            k = max(args.mink, min(args.maxk, k))
            # http://torch.ch/blog/2015/11/13/gan.html#balancing-the-gan-game
            if g_cost > 1.3:  # g is bad -> increase regularization on d
                l2t.set_value(floatX(l2t.get_value() + l2step))
            elif g_cost < 0.9:  # g is good -> decrease regularization on d
                l2t.set_value(floatX(l2t.get_value() - l2step))
            else:
                l2t.set_value(floatX(l2t.get_value() * (1.-args.l2decay)))
            if l2t.get_value() < 0:
                l2t.set_value(floatX(0.))
            print k, l2t.get_value()
    validate()
    samples = np.asarray(_gen(sample_zmb))
    color_grid_vis(inverse_transform(samples), (args.nvis2, args.nvis2), 'samples/%s/%d.png' % (desc, n_epochs))
    n_epochs += 1
    if n_epochs > niter:
        lrg.set_value(floatX(lrg.get_value() - lr/niter_decay))
        lrd.set_value(floatX(lrd.get_value() - lr/niter_decay))
    if n_epochs <= 5 or n_epochs % 5 == 0:
        joblib.dump([p.get_value() for p in gen_params], 'models/%s/%d_gen_params.jl' % (desc, n_epochs))
        joblib.dump([p.get_value() for p in discrim_params], 'models/%s/%d_discrim_params.jl' % (desc, n_epochs))
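The k-balancing schedule used in the training loop above can be sketched in isolation; `pick_update` is a hypothetical helper name, not part of the script:

```python
def pick_update(k, n_updates):
    # Mirrors the scheduling in the training loop: positive k trains the
    # discriminator more often, negative k favors the generator.
    if k >= 0:
        return 'g' if n_updates % (k + 2) == 0 else 'd'
    return 'd' if n_updates % (-k + 2) == 0 else 'g'

# With k=2, one generator step is taken every k+2 = 4 updates.
schedule = [pick_update(2, n) for n in range(8)]
```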
# === tests/test_base_testclass.py (FrNecas/requre, MIT) ===
# Copyright Contributors to the Packit project.
# SPDX-License-Identifier: MIT
import os
import shutil
from requre import RequreTestCase
from requre.utils import get_datafile_filename
class CheckBaseTestClass(RequreTestCase):
    def tearDown(self):
        super().tearDown()
        data_file_path = get_datafile_filename(self)
        self.assertTrue(os.path.exists(data_file_path))
        # use just class and test name instead of full ID
        self.assertIn(".".join(self.id().split(".")[-2:]), data_file_path.name)
        self.assertIn("test_data", str(data_file_path))
        shutil.rmtree(os.path.dirname(get_datafile_filename(self)))

    def test(self):
        pass
# === src/view/services_read_page.py (nbilbo/services_manager, MIT) ===
"""Frame to show all service registers.
"""
import tkinter.ttk
from src.view import constants
from src.view.services_page import ServicesPage
class ServicesReadPage(ServicesPage):
    def __init__(self, parent, controller, *args, **kwargs):
        super(ServicesReadPage, self).__init__(parent, *args, **kwargs)
        self.handler = Handler(self, controller)
        self.create_treeview()
        self.create_crud_buttons()
        self.create_binds()
        self.set_title("Services")

    def create_treeview(self):
        """Create treeview to show data.
        """
        self.treeview = tkinter.ttk.Treeview(self)
        self.treeview.pack(side="top", fill="both", expand=True, padx=constants.PADX, pady=constants.PADY)

    def create_crud_buttons(self):
        """Create crud buttons.
        """
        container = tkinter.ttk.Frame(self)
        container.pack(side="top", fill="both")
        self.add_button = tkinter.ttk.Button(
            container,
            text="Add")
        self.update_button = tkinter.ttk.Button(
            container,
            text="update")
        self.delete_button = tkinter.ttk.Button(
            container,
            text="delete")
        for button in (
                self.add_button,
                self.update_button,
                self.delete_button):
            button.pack(
                side="left",
                fill="both",
                expand=True,
                padx=constants.PADX,
                pady=constants.PADY)

    def create_binds(self):
        """Connect events and handler.
        """
        self.back_button["command"] = self.handler.inicialize_home_page
        self.add_button["command"] = self.handler.inicialize_services_add_page
        self.delete_button["command"] = self.handler.inicialize_services_delete_page
        self.update_button["command"] = self.handler.inicialize_services_update_page

    def get_add_button(self):
        """
        return
            tkinter.ttk.Button
        """
        return self.add_button

    def get_update_button(self):
        """
        return
            tkinter.ttk.Button
        """
        return self.update_button

    def get_delete_button(self):
        """
        return
            tkinter.ttk.Button
        """
        return self.delete_button

    def get_treeview(self):
        """
        return
            tkinter.ttk.Treeview
        """
        return self.treeview
class Handler(object):
    def __init__(self, widget, controller):
        super(Handler, self).__init__()
        self.widget = widget
        self.controller = controller

    def inicialize_home_page(self):
        self.controller.inicialize_home_page()

    def inicialize_services_add_page(self):
        self.controller.inicialize_services_add_page()

    def inicialize_services_delete_page(self):
        self.controller.inicialize_services_delete_page()

    def inicialize_services_update_page(self):
        self.controller.inicialize_services_update_page()
# === anarky/interface.py (MulberryBeacon/anarky, MIT) ===
# -*- coding: utf8 -*-
"""
Common user interface operations.
Author: Eduardo Ferreira
License: MIT (see LICENSE for details)
"""
# Module import
# --------------------------------------------------------------------------------------------------
from os import walk
from os.path import isdir, isfile, join
import argparse
import logging
import sys
from .__version__ import __version__
# Constants
# --------------------------------------------------------------------------------------------------
ERROR = "{} '{}' is not available (doesn't exist or no privileges to access it)!"
ERROR_INVALID = "{} '{}' is invalid!"
ERROR_INVALID_LIST = 'The list of input files is invalid!'
ERROR_EMPTY_LIST = 'The list of input files is empty!'
# Logger
# --------------------------------------------------------------------------------------------------
logging.basicConfig(level=logging.INFO)
_logger = logging.getLogger(__name__)
def keyboard_interrupt():
    _logger.warn('\nThe program execution was interrupted!\n')
# Methods :: Command line options and instructions
# --------------------------------------------------------------------------------------------------
def parse_options(program, description, decode=False):
    """
    Parses and retrieves the values for the full set of command line arguments.

    :param program: The name of the program
    :param description: The description of the program
    :param decode: Flag that indicates if it's an encoding or decoding operation
    :return: The list of command line arguments
    """
    # Defines the parent parser
    parser = argparse.ArgumentParser(prog=program, description=description)
    parser.add_argument('-v', '--version', action='version', version='%(prog)s ' + __version__)
    group = parser.add_argument_group('options')
    group.add_argument('-f', '--files', nargs='+', metavar='FILES', dest='input_files',
                       help='input files', required=True)
    # TODO: the destination probably shouldn't be a required parameter. And the name could be
    # changed to "output"...
    group.add_argument('-o', '--output', metavar='OUTPUT', dest='output_dir', help='output directory')
    return parser.parse_args()
def get_options(program, description, decode=False):
    """
    Parses, retrieves and validates the values for the full set of command line arguments.

    :param program: The name of the program
    :param description: The description of the program
    :param decode: Flag that indicates if it's an encoding or decoding operation
    :return: The fully parsed and validated list of command line arguments
    """
    args = parse_options(program, description, decode)

    # Checks the input files
    files = get_input_files(args.input_files)
    if len(files) == 0:
        _logger.error(ERROR_EMPTY_LIST)
        sys.exit(1)

    # TODO: this bit needs to be completely reviewed!
    # Checks the output directory, cover and tag parameters
    """
    if not (directory_exists(args.output_dir) and not (
            not decode and args.cover is not None and not file_exists(args.cover))):
        sys.exit(1)
    """
    if not directory_exists(args.output_dir):
        _logger.error(ERROR.format('Directory', args.output_dir))
        sys.exit(1)

    # return files, args.output_dir, args.cover, args.tags, args.playlist
    return files, args.output_dir
# Methods :: File system library
# --------------------------------------------------------------------------------------------------
def file_exists(filename):
    """
    Checks if a file is a valid file system entry.

    :param filename: The name of a file
    :return: True if the given file name matches an actual file; False otherwise
    """
    try:
        if not isfile(filename):
            _logger.error(ERROR.format('File', filename))
            return False
    except TypeError:
        _logger.error(ERROR_INVALID.format('File', filename))
        return False
    return True
def directory_exists(directory):
    """
    Checks if a directory is a valid file system entry.

    :param directory: The name of a directory
    :return: True if the given directory name matches an actual directory; False otherwise
    """
    try:
        if not isdir(directory):
            _logger.error(ERROR.format('Directory', directory))
            return False
    except TypeError:
        _logger.error(ERROR_INVALID.format('Directory', directory))
        return False
    return True
def get_input_files(entries):
    """
    Checks and stores the input files provided in the command line interface.

    :param entries: The set of input entries (can be either files or directories)
    :return: A complete list of the input files
    """
    result = []
    try:
        for entry in entries:
            if isfile(entry):
                result.append(entry)
            elif isdir(entry):
                for root, directories, files in walk(entry):
                    for filename in files:
                        file_path = join(root, filename)
                        result.append(file_path)
            else:
                _logger.error(ERROR.format('File system entry', entry))
    except TypeError:
        _logger.error(ERROR_INVALID_LIST)
    return result
# === hospitals/urls.py (gilga98/ahalya, MIT) ===
from django.urls import path
from . import views
app_name = "hospitals"
urlpatterns = [
    path("hospitalList", views.HospitalDetailedList.as_view(), name="hospital_list"),
    path("hospitalDetail", views.HospitalDetailedSingle.as_view(), name="hospital_read"),
]
# === setup.py (klaasb/python-dmenuwrap, BSD-2-Clause) ===
from distutils.core import setup
setup(
    name='python-dmenuwrap',
    author='Klaas Boesche',
    author_email='klaas-dev@boesche.me',
    url='https://github.com/KaGeBe/python-dmenuwrap',
    version='0.1.0',
    license='BSD 2-clause',
    py_modules=['dmenuwrap']
)
# === FortressOfSolitude/_FortressOfSolitude/NeutrinoKey/models.py (BDD16/FortressOfSolitude, MIT) ===
"""
DBA 1337_TECH, AUSTIN TEXAS © MAY 2020
Proof of Concept code, No liabilities or warranties expressed or implied.
"""
# -*- coding: utf-8 -*-
from __future__ import unicode_literals
from django.db import models
from datetime import datetime
from .cryptoutils import CryptoTools
from base64 import b64encode, b64decode
from django.contrib.auth import get_user_model
from random import random
# Create your models here.
# Constants
LENGTH_OF_KEK = 32 # 256 bits or 32 bytes
LENGTH_OF_DEK = 32 # 256 bits or 32 bytes
LENGTH_OF_SALT = 32 # 256 bits or 32 bytes
'''
KeyMold is a models.Manager class extension that supports creating a KEK and retrieving a KEK.
No inputs.
'''
class KeyMold(models.Manager):
    def _create_kek(self, request, **kwargs):
        pwd = request.user.password
        # print("deriving kek")
        self.kek = DeriveKek_default(pwd)
        return self.kek

    def get_queryset(self):
        qs = models.QuerySet(self.model)
        if self._db is not None:
            qs = qs.using('default')
        return qs
'''
TelescopeCoord is a models.Manager that finds the neutron star used by the KeyMold
to make a Key Encryption Key [KEK]. No inputs.
'''
class TelescopeCoord(models.Manager):
    def get_queryset(self):
        qs = models.QuerySet(self.model)
        if self._db is not None:
            qs = qs.using('default')
        return qs
'''
QuasiPlasma is a models.Manager that derives Data Encryption Keys [DEKs] and
retrieves DEKs from the neutron star's plasma. No inputs.
'''
class QuasiPlasma(models.Manager):
    def _create_dek(self, request, **kwargs):
        pwd = request.user.password
        self.dek = DeriveDek_default(pwd)
        return self.dek

    def get_queryset(self):
        qs = models.QuerySet(self.model)
        if self._db is not None:
            qs = qs.using('default')
        return qs
'''
KEK is the Key Encryption Key [KEK] models.Model class extension that can derive a
new KEK as well as wrap the KEK. No inputs.
'''
class KEK(models.Model):
# Never should the key be passed as clear text always use the wrap or unwrap functions
crypto = CryptoTools()
kek = None
wrappedKek = None
result_wrapped_nonce = models.CharField(max_length=128, default=b64encode(int(55).to_bytes(4, 'big')))
result_wrapped_kek = models.CharField(max_length=128, default=None)
objects = TelescopeCoord()
class Meta:
verbose_name = 'KEK'
def unwrap_key(self, password):
if isinstance(password, str) and self.kek == None and self.wrappedKek == None:
self.crypto.nonce = b64decode(self.result_wrapped_nonce)
self.kek = self.crypto.AesDecryptEAX(b64decode(self.result_wrapped_kek),
self.crypto.Sha256(password.encode()))
if isinstance(password, bytes) and self.kek == None and self.wrappedKek == None:
if isinstance(self.result_wrapped_nonce, str):
result_wrapped_nonce = (self.result_wrapped_nonce.encode()).replace(b"b'", b'')
result_wrapped_nonce = result_wrapped_nonce[:-1]
result_wrapped_nonce = result_wrapped_nonce + b'=' * (len(self.result_wrapped_nonce) % 4)
self.crypto.nonce = b64decode(result_wrapped_nonce)
else:
self.crypto.nonce = b64decode(self.result_wrapped_nonce)
# print("wrappedKek: " + self.result_wrapped_kek)
if isinstance(self.result_wrapped_kek, str):
result_wrapped_kek = (self.result_wrapped_kek.encode()).replace(b"b'", b'')
result_wrapped_kek = result_wrapped_kek[:-1]
result_wrapped_kek = result_wrapped_kek + b'=' * (len(result_wrapped_kek) % 4)
elif isinstance(self.result_wrapped_kek, bytes):
result_wrapped_kek = self.result_wrapped_kek
self.kek = self.crypto.AesDecryptEAX(b64decode(result_wrapped_kek), CryptoTools().Sha256(password))
else:
try:
self.crypto.nonce = b64decode(self.result_wrapped_nonce)
if not isinstance(password, bytes):
password = password.encode()
self.kek = self.crypto.AesDecryptEAX(b64decode(self.result_wrapped_kek), self.crypto.Sha256(password))
self.wrappedKek = None
except:
print('someone has attempted to spoof the KEK (key encryption key)')
return self.kek
def wrap_key(self, password):
if isinstance(password, str) and self.kek == None:
self.kek = self.crypto.AesEncryptEAX(data, self.crypto.Sha256(password.encode()))
self.wrappedKek = self.kek
self.kek = None
elif isinstance(password, bytes) and self.kek == None:
self.kek = self.crypto.AesEncryptEAX(data, self.crypto.Sha256(password))
self.wrappedKek = b64encode(self.kek)
self.kek = None
elif self.kek is not None:
try:
# print("ATTEMPTING WRAPPING KEK")
self.crypto.nonce = b64decode(self.result_wrapped_nonce)
# print("set nonce")
if isinstance(password, bytes):
self.wrappedKek = b64encode(self.crypto.AesEncryptEAX(self.kek, self.crypto.Sha256(password)))
else:
self.wrappedKek = b64encode(
self.crypto.AesEncryptEAX(self.kek, self.crypto.Sha256(password.encode())))
self.kek = None
except OSError as ERROR:
print(ERROR)
print('Wrapping KEK (key encryption key) was unsuccessful')
return self.wrappedKek
'''
Using the KEK model, unwrap and re-wrap the KEK, then unwrap the DEK and pass it to a more usable object.
This may also fetch the DEK associated with a given data model, so a many-to-many relation is needed.
DEK is a models.Model (Data Encryption Key) class that stores, derives, and wraps Data Encryption Keys from a KEK and salt.
'''
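The derivation scheme used later in this module (DEK = sha256(256-bit SALT + KEK), with the KEK itself derived from the password) can be sketched with stdlib primitives. Names here are illustrative only, not the project's CryptoTools API:

```python
import hashlib
import os

password = b"hunter2"                      # hypothetical user password
kek = hashlib.sha256(password).digest()    # password-derived key-encryption key
salt = os.urandom(32)                      # 256-bit random salt
dek = hashlib.sha256(salt + kek).digest()  # DEK = sha256(SALT + KEK)
assert len(kek) == len(dek) == 32
```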
class DEK(models.Model):
crypto = CryptoTools()
dek = None
wrappedDek = None
SALT = None
result_wrapped_nonce = models.CharField(max_length=128, default=b64encode(int(55).to_bytes(4, 'big')))
result_wrappedDek = models.CharField(max_length=128)
result_SALT = models.CharField(max_length=45)
kek_to_retrieve = models.ManyToManyField(KEK)
objects = KeyMold()
class Meta:
verbose_name = 'DEK'
def wrap_key(self, kek, password):
if isinstance(kek, KEK) and isinstance(password, str):
kek.unwrap_key(password)
self.crypto.nonce = b64decode(kek.result_wrapped_nonce)
# print(self.result_wrappedDek)
self.dek = self.crypto.AesEncryptEAX(b64decode(self.result_wrappedDek), kek.kek)
kek.wrap_key(password)
return self.dek
elif isinstance(kek, KEK) and isinstance(password, bytes):
kek.unwrap_key(password)
self.crypto.nonce = b64decode(kek.result_wrapped_nonce)
self.dek = self.crypto.AesEncryptEAX(self.result_wrappedDek, kek.kek)
kek.wrap_key(password)
return self.dek
else:
try:
kek.unwrap_key(password)
self.crypto.nonce = b64decode(kek.result_wrapped_nonce)
self.dek = self.crypto.AesEncryptEAX(self.result_wrappedDek, self.crypto.Sha256(kek.kek))
kek.wrap_key(password)
return self.dek
except Exception:
print('someone has attempted to spoof the DEK (data encryption key)')
def unwrap_key(self, kek, password):
if isinstance(kek, KEK) and isinstance(password, str):
master = kek.unwrap_key(password.encode())
self.crypto.nonce = b64decode(self.result_wrapped_nonce)
self.dek = self.crypto.AesDecryptEAX(b64decode(self.result_wrappedDek), self.crypto.Sha256(master))
kek.wrap_key(password)
return self.dek
elif isinstance(kek, KEK) and isinstance(password, bytes):
kek.unwrap_key(password)
if isinstance(self.result_wrapped_nonce, str):
result_wrapped_nonce = (self.result_wrapped_nonce.encode()).replace(b"b'", b'')
result_wrapped_nonce = result_wrapped_nonce[:-1]
result_wrapped_nonce = result_wrapped_nonce + b'=' * (-len(result_wrapped_nonce) % 4)
self.crypto.nonce = b64decode(result_wrapped_nonce)
elif isinstance(self.result_wrapped_nonce, bytes):
self.crypto.nonce = b64decode(self.result_wrapped_nonce)
if not isinstance(self.result_wrappedDek, bytes):
result_wrappedDek = (self.result_wrappedDek.encode()).replace(b"b'", b'')
result_wrappedDek = result_wrappedDek[:-1]
wrapper = result_wrappedDek + b'=' * (-len(result_wrappedDek) % 4)
else:
result_wrappedDek = self.result_wrappedDek.replace(b"b'", b'')
wrapper = result_wrappedDek + b'=' * (-len(result_wrappedDek) % 4)
cryptoObj = CryptoTools()
self.dek = self.crypto.AesDecryptEAX(b64decode(wrapper), cryptoObj.Sha256(kek.kek))
kek.wrap_key(password)
return self.dek
else:
try:
if not isinstance(password, bytes):
password = password.encode()
kek.unwrap_key(password)
self.crypto.nonce = b64decode(self.result_wrapped_nonce)
# print("about to decrypt dek")
self.dek = self.crypto.AesDecryptEAX(b64decode(self.result_wrappedDek), self.crypto.Sha256(kek.kek))
kek.wrap_key(password)
return self.dek
except Exception:
print('someone has attempted to spoof the KEK2 (key encryption key)')
'''
Derive a default KEK from an arbitrary password.
'''
def DeriveKek_default(password):
crypto = CryptoTools()
if isinstance(password, str):
password = password.encode()
if not isinstance(password, bytes):
print("ERROR>UNABLE TO GENERATE WRAPPED KEK, USE A CORRECT KEY FORMAT FOR WRAPPING")
return None
if len(crypto.Sha256(password)) != LENGTH_OF_KEK:
print('ERROR> NOT ENOUGH BYTES IN PASSWORD FOR KEK, NEED 32')
somekek = crypto.Sha256(password)
somekek = crypto.AesEncryptEAX(password, somekek)
k = KEK(result_wrapped_kek=b64encode(somekek), result_wrapped_nonce=b64encode(crypto.nonce))
k.save()
return k
'''
NeutronCore is a models.Model class that records generated KEKs: a KEK generator relation, time_generated, and the KEK objects themselves.
This is the model for when you need access to multiple KEKs for a single user.
USE CASE: old data relies on an older KEK that is still active, but the user has changed their password;
from that point on, the DEK chain switches to a new KEK wrapped with the newly changed password.
'''
class NeutronCore(models.Model):
kek = models.ForeignKey(
get_user_model(), related_name='KEK',
on_delete=models.CASCADE,
default=1)
kekgenerator = models.ManyToManyField(KEK, related_name='KEK')
time_generated = models.DateTimeField('date star collapsed', auto_now_add=True)
objects = KeyMold()
class Meta:
verbose_name = 'neutron core'
ordering = ['-time_generated']
get_latest_by = 'time_generated'
def DeriveKek(self, password):
crypto = CryptoTools()
if isinstance(password, str):
password = password.encode()
if not isinstance(password, bytes):
print("ERROR>UNABLE TO GENERATE WRAPPED KEK, USE A CORRECT KEY FORMAT FOR WRAPPING")
return None
if len(crypto.Sha256(password)) != LENGTH_OF_KEK:
print('ERROR> NOT ENOUGH BYTES IN PASSWORD FOR KEK, NEED 32')
somekek = crypto.Sha256(password)
somekek = crypto.AesEncryptEAX(password, somekek)
k = KEK(result_wrapped_kek=b64encode(somekek), result_wrapped_nonce=b64encode(crypto.nonce))
k.save()
return k
def DeriveDek_default(password):
crypto = CryptoTools()
kekForDek = NeutronCore(get_user_model()).DeriveKek(password)
if isinstance(kekForDek, KEK) and isinstance(password, str):
# Generate DEK based off this formula sha256(256 bit SALT + KEK)
SALT = crypto.RandomNumber(32)
crypto.nonce = b64decode(kekForDek.result_wrapped_nonce)
DerivedDek = crypto.Sha256(bytes(SALT) + crypto.AesDecryptEAX(
b64decode(kekForDek.result_wrapped_kek),
crypto.Sha256(password.encode())))
dek = crypto.AesEncryptEAX(DerivedDek, crypto.Sha256(
crypto.AesDecryptEAX(b64decode(kekForDek.result_wrapped_kek), crypto.Sha256(password.encode()))))
newDek = DEK(result_wrappedDek=b64encode(dek), result_SALT=b64encode(SALT),
result_wrapped_nonce=b64encode(crypto.nonce))
newDek.save()
newDek.kek_to_retrieve.add(kekForDek)
return newDek
'''
NeutronMatterCollector is for generating a Data Encryption Key [DEK].
No inputs.
'''
class NeutronMatterCollector(models.Model):
dekgenerator = models.ManyToManyField(DEK,
related_name='kek_for_dek_generator') # length of 32 bytes (256bits) in base64 is 44, but will need to include an = ending and null so extending to 45.
try:
# print(get_user_model().user)
kekForDek = models.ForeignKey(
KEK, related_name='KEK_obj',
on_delete=models.CASCADE, default=1)
dek = models.ForeignKey(
DEK, related_name='DEK_obj',
on_delete=models.CASCADE,
default=1)
except Exception:
try:
print("unable to locate KEK for username creating new one, this could be due to a new user")
kekForDek = models.ForeignKey(KEK, related_name='KEK_obj',
on_delete=models.CASCADE, default=1)
dek = models.ForeignKey(DEK, related_name='DEK_obj', on_delete=models.CASCADE, default=1)
print("successfully made a KEK and DEK")
except Exception:
print("unable to create KEK")
print(get_user_model().natural_key(get_user_model()))
time_generated = models.DateTimeField('date integrated', auto_now_add=True)
objects = QuasiPlasma()
class Meta:
verbose_name = 'neutron matter collector'
ordering = ['-time_generated']
get_latest_by = 'time_generated'
def DeriveDek(self, password):
crypto = CryptoTools()
if isinstance(self.kekForDek, KEK):
if password != None and isinstance(password, str):
# Generate DEK based off this formula sha256(256 bit SALT + KEK)
self.SALT = crypto.RandomNumber(32)
crypto.nonce = b64decode(self.kekForDek.result_wrapped_nonce)
DerivedDek = crypto.Sha256(bytes(self.SALT) + crypto.AesDecryptEAX(
bytes(b64decode(str(self.kekForDek.result_wrapped_kek).encode())),
crypto.Sha256(bytes(password.encode()))))
dek = crypto.AesEncryptEAX(DerivedDek, crypto.Sha256(
crypto.AesDecryptEAX(b64decode(self.kekForDek.result_wrapped_kek),
crypto.Sha256(bytes(password.encode())))))
newDek = DEK(result_wrappedDek=b64encode(dek), result_SALT=b64encode(self.SALT),
result_wrapped_nonce=b64encode(crypto.nonce))
newDek.save()
newDek.kek_to_retrieve.add(self.kekForDek)
self.dekgenerator.add(newDek)
return newDek
else:
self.kekForDek = NeutronCore(get_user_model()).DeriveKek(password)
if isinstance(self.kekForDek, KEK):
if password != None and isinstance(password, str):
# Generate DEK based off this formula sha256(256 bit SALT + KEK)
self.SALT = crypto.RandomNumber(32)
crypto.nonce = b64decode(self.kekForDek.result_wrapped_nonce)
# print(self.kekForDek.result_wrapped_nonce)
# print(self.kekForDek.result_wrapped_kek)
# print(password)
DerivedDek = crypto.Sha256(
bytes(self.SALT) + crypto.AesDecryptEAX(b64decode(self.kekForDek.result_wrapped_kek),
crypto.Sha256(bytes(password.encode()))))
# self.dekgenerator.id.set(self.request.user)
dek = DerivedDek
# newkey = DEK()
# newkey.dek = dek
# dek = DEK.wrap_key(newkey, kek=self.kekForDek, password=password.encode())
dek = crypto.AesEncryptEAX(dek, crypto.Sha256(
crypto.AesDecryptEAX(b64decode(self.kekForDek.result_wrapped_kek),
crypto.Sha256(bytes(password.encode())))))
newDek = DEK(result_wrappedDek=b64encode(dek), result_SALT=b64encode(self.SALT),
result_wrapped_nonce=b64encode(crypto.nonce), id=self.id)
# newDek.kek_to_retrieve.set(self.dekgenerator)
# self.time_generated = models.DateTimeField('date integrated', auto_now_add=datetime.now().strftime("%Y-%m-%d %H:%M:%S"))
self.save()
newDek.save()
self.dekgenerator.add(newDek)
self.save()
return newDek
class KryptonianSpeak:
def db_for_read(self, model, **hints):
return 'default'
def db_for_write(self, model, **hints):
return 'default'
def allow_relation(self, obj1, obj2, **hints):
return True
'''
db_list = ('default', 'superHeros', 'icePick', 'neutronStarMatter', 'neutronStarMold')
if obj1._state.db in db_list and obj2._state.db in db_list:
return True
return None
'''
def allow_migrate(self, db, app_label, model_name=None, **hints):
return True
# ledis/datastructures.py (gianghta/Ledis, MIT License)
from enum import unique, Enum
from typing import Union
@unique
class DataType(Enum):
STR = "str"
SET = "set"
class BaseDataStructure:
__slots__ = {"data", "type", "expire_at"}
def __init__(self, data: Union[str, set]):
self.data = data
# This will raise an error if type is not supported
self.type = DataType(type(data).__name__)
# UTC expire timestamp, in seconds
self.expire_at = None
def __eq__(self, other):
if not isinstance(other, self.__class__):
# don't attempt to compare against unrelated types
return NotImplemented
return (
self.data == other.data
and self.expire_at == other.expire_at
and self.type == other.type
)
class String(BaseDataStructure):
pass
class Set(BaseDataStructure):
pass
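The `DataType(type(data).__name__)` lookup above doubles as validation: the `@unique` Enum maps a runtime type name to a member and raises ValueError for anything unsupported. A minimal standalone sketch:

```python
from enum import Enum, unique

@unique
class DataType(Enum):
    STR = "str"
    SET = "set"

# value lookup by the runtime type's name
assert DataType(type("abc").__name__) is DataType.STR
assert DataType(type({1, 2}).__name__) is DataType.SET

# unsupported types (e.g. list) are rejected with ValueError
try:
    DataType(type([]).__name__)
    raised = False
except ValueError:
    raised = True
assert raised
```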
# tests/normalizer/test_number_normalizer.py (tkscode/pyNormalizeNumExp, BSD-3-Clause License)
import pytest
from pynormalizenumexp.expression.base import NNumber, NotationType
from pynormalizenumexp.utility.dict_loader import DictLoader
from pynormalizenumexp.normalizer.number_normalizer import NumberNormalizer
@pytest.fixture(scope="class")
def number_normalizer():
return NumberNormalizer(DictLoader("ja"))
class TestNumberNormalizer:
def test_process_標準(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("その3,244人が3,456,789円で百二十三万四千五百六十七円")
expect = [NNumber("3,244", 2, 7), NNumber("3,456,789", 9, 18), NNumber("百二十三万四千五百六十七", 20, 32)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 3244
expect[0].notation_type = [NotationType.HANKAKU]
expect[1].value_lower_bound = expect[1].value_upper_bound = 3456789
expect[1].notation_type = [NotationType.ZENKAKU]
expect[2].value_lower_bound = expect[2].value_upper_bound = 1234567
expect[2].notation_type = [NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN,
NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_MAN, NotationType.KANSUJI_09,
NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN,
NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09]
assert res == expect
def test_process_小数点あり(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("その3,244.15人が3,456,789.456円")
expect = [NNumber("3,244.15", 2, 10), NNumber("3,456,789.456", 12, 25)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 3244.15
expect[0].notation_type = [NotationType.HANKAKU]
expect[1].value_lower_bound = expect[1].value_upper_bound = 3456789.456
expect[1].notation_type = [NotationType.ZENKAKU]
assert res == expect
res = number_normalizer.process("131.1ポイントというスコアを叩き出した")
expect = [NNumber("131.1", 0, 5)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 131.1
expect[0].notation_type = [NotationType.HANKAKU, NotationType.HANKAKU, NotationType.HANKAKU]
assert res == expect
res = number_normalizer.process("9.3万円も損した")
expect = [NNumber("9.3万", 0, 4)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 93000
expect[0].notation_type = [NotationType.HANKAKU]
assert res == expect
def test_process_プラスあり(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("その+3,244人が+3,456,789円でプラス百二十三万四千五百六十七円")
expect = [NNumber("+3,244", 2, 8), NNumber("+3,456,789", 10, 20), NNumber("プラス百二十三万四千五百六十七", 22, 37)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 3244
expect[0].notation_type = [NotationType.HANKAKU]
expect[1].value_lower_bound = expect[1].value_upper_bound = 3456789
expect[1].notation_type = [NotationType.ZENKAKU]
expect[2].value_lower_bound = expect[2].value_upper_bound = 1234567
expect[2].notation_type = [NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN,
NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_MAN, NotationType.KANSUJI_09,
NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN,
NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09]
assert res == expect
def test_process_マイナスあり(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("その-3,244人がー3,456,789円でマイナス百二十三万四千五百六十七円")
expect = [NNumber("-3,244", 2, 8), NNumber("ー3,456,789", 10, 20), NNumber("マイナス百二十三万四千五百六十七", 22, 38)]
expect[0].value_lower_bound = expect[0].value_upper_bound = -3244
expect[0].notation_type = [NotationType.HANKAKU]
expect[1].value_lower_bound = expect[1].value_upper_bound = -3456789
expect[1].notation_type = [NotationType.ZENKAKU]
expect[2].value_lower_bound = expect[2].value_upper_bound = -1234567
expect[2].notation_type = [NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN,
NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_MAN, NotationType.KANSUJI_09,
NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN,
NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN, NotationType.KANSUJI_09]
assert res == expect
def test_process_範囲あり(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("その10~20人が、100〜200円で")
expect = [NNumber("10~20", 2, 7), NNumber("100〜200", 10, 17)]
expect[0].value_lower_bound = 10
expect[0].value_upper_bound = 20
expect[0].notation_type = [NotationType.HANKAKU, NotationType.HANKAKU]
expect[1].value_lower_bound = 100
expect[1].value_upper_bound = 200
expect[1].notation_type = [NotationType.ZENKAKU, NotationType.ZENKAKU, NotationType.ZENKAKU]
assert res == expect
res = number_normalizer.process("1,2個")
expect = [NNumber("1,2", 0, 3)]
expect[0].value_lower_bound = 1
expect[0].value_upper_bound = 2
expect[0].notation_type = [NotationType.HANKAKU]
assert res == expect
def test_process_数値なし(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("あいうえお")
assert res == []
def test_process_invalid_notation(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("1千1千1千")
expect = [NNumber("1千1", 0, 3), NNumber("千1", 3, 5), NNumber("千", 5, 6)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 1001
expect[0].notation_type = [NotationType.HANKAKU, NotationType.KANSUJI_KURAI_SEN, NotationType.HANKAKU]
expect[1].value_lower_bound = expect[1].value_upper_bound = 1001
expect[1].notation_type = [NotationType.KANSUJI_KURAI_SEN, NotationType.HANKAKU]
expect[2].value_lower_bound = expect[2].value_upper_bound = 1000
expect[2].notation_type = [NotationType.KANSUJI_KURAI_SEN]
assert res == expect
res = number_normalizer.process("200720人がきた")
expect = [NNumber("2007", 0, 4), NNumber("20", 4, 6)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 2007
expect[0].notation_type = [NotationType.ZENKAKU, NotationType.ZENKAKU, NotationType.ZENKAKU, NotationType.ZENKAKU]
expect[1].value_lower_bound = expect[1].value_upper_bound = 20
expect[1].notation_type = [NotationType.HANKAKU, NotationType.HANKAKU]
assert res == expect
res = number_normalizer.process("2007二十人がきた")
expect = [NNumber("2007", 0, 4), NNumber("二十", 4, 6)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 2007
expect[0].notation_type = [NotationType.ZENKAKU, NotationType.ZENKAKU, NotationType.ZENKAKU, NotationType.ZENKAKU]
expect[1].value_lower_bound = expect[1].value_upper_bound = 20
expect[1].notation_type = [NotationType.KANSUJI_09, NotationType.KANSUJI_KURAI_SEN]
assert res == expect
def test_process_real(self, number_normalizer: NumberNormalizer):
res = number_normalizer.process("京・京")
assert res == []
res = number_normalizer.process("七〇〇万")
expect = [NNumber("七〇〇万", 0, 4)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 7000000
expect[0].notation_type = [NotationType.KANSUJI_09, NotationType.KANSUJI_09, NotationType.KANSUJI_09,
NotationType.KANSUJI_KURAI_MAN]
assert res == expect
res = number_normalizer.process("7000千人")
expect = [NNumber("7000千", 0, 5)]
expect[0].value_lower_bound = expect[0].value_upper_bound = 7000000
expect[0].notation_type = [NotationType.HANKAKU, NotationType.HANKAKU, NotationType.HANKAKU,
NotationType.HANKAKU, NotationType.KANSUJI_KURAI_SEN]
assert res == expect
res = number_normalizer.process("京京億億万万京億万")
assert res == []
res = number_normalizer.process("そうだ、京都いこう")
assert res == []
def test_suffix_is_arabic(self, number_normalizer: NumberNormalizer):
res = number_normalizer.suffix_is_arabic("10")
assert res == True
res = number_normalizer.suffix_is_arabic("10")
assert res == True
res = number_normalizer.suffix_is_arabic("10あ")
assert res == False
res = number_normalizer.suffix_is_arabic("")
assert res == False
def test_prefix_3digits_is_arabic(self, number_normalizer: NumberNormalizer):
res = number_normalizer.prefix_3digits_is_arabic("1000")
assert res == True
res = number_normalizer.prefix_3digits_is_arabic("1000")
assert res == True
res = number_normalizer.prefix_3digits_is_arabic("100")
assert res == True
res = number_normalizer.prefix_3digits_is_arabic("10")
assert res == False
res = number_normalizer.prefix_3digits_is_arabic("あ1000")
assert res == False
def test_is_valid_comma_notation(self, number_normalizer: NumberNormalizer):
res = number_normalizer.is_valid_comma_notation("3", "000")
assert res == True
res = number_normalizer.is_valid_comma_notation("3", "000円")
assert res == True
res = number_normalizer.is_valid_comma_notation("3あ", "000")
assert res == False
res = number_normalizer.is_valid_comma_notation("3", "00")
assert res == False
res = number_normalizer.is_valid_comma_notation("29", "30")
assert res == False
def test_join_numbers_by_comma(self, number_normalizer: NumberNormalizer):
numbers = [NNumber("3", 5, 6), NNumber("000", 7, 10)]
res = number_normalizer.join_numbers_by_comma("この商品は3,000円だ", numbers)
assert res == [NNumber("3,000", 5, 10)]
numbers = [NNumber("29", 6, 8), NNumber("30", 9, 11)]
res = number_normalizer.join_numbers_by_comma("当たり番号は29,30だ", numbers)
assert res == numbers
def test_convert_number(self, number_normalizer: NumberNormalizer):
numbers = [
NNumber("1,234"), NNumber("1,234,567"), NNumber("一二三四五六七"), NNumber("123万4567"),
NNumber("百二十三万四千五百六十七"), NNumber("百2十3万4千5百6十7")
]
res = number_normalizer.convert_number(numbers)
expect = [
NNumber("1,234"), NNumber("1,234,567"), NNumber("一二三四五六七"), NNumber("123万4567"),
NNumber("百二十三万四千五百六十七"), NNumber("百2十3万4千5百6十7")
]
expect[0].value_lower_bound = expect[0].value_upper_bound = 1234
expect[1].value_lower_bound = expect[1].value_upper_bound = 1234567
expect[2].value_lower_bound = expect[2].value_upper_bound = 1234567
expect[3].value_lower_bound = expect[3].value_upper_bound = 1234567
expect[4].value_lower_bound = expect[4].value_upper_bound = 1234567
expect[5].value_lower_bound = expect[5].value_upper_bound = 1234567
assert res == expect
def test_fix_prefix_su(self, number_normalizer: NumberNormalizer):
number = NNumber("十万", 0, 2)
res = number_normalizer.fix_prefix_su("十万円", number)
assert res == number
number = NNumber("十万", 3, 5)
res = number_normalizer.fix_prefix_su("これは十万円の価値がある", number)
assert res == number
number = NNumber("十万", 4, 6)
number.value_lower_bound = number.value_upper_bound = 100000
res = number_normalizer.fix_prefix_su("これは数十万円の価値がある", number)
expect = NNumber("数十万", 3, 6)
expect.value_lower_bound = 100000
expect.value_upper_bound = 900000
assert res == expect
def test_fix_intermediate_su(self, number_normalizer: NumberNormalizer):
cur_number = NNumber("十万", 0, 2)
next_number = NNumber("二十万", 2, 5)
res = number_normalizer.fix_intermediate_su("十万二十万", cur_number, next_number)
assert res == cur_number
cur_number = NNumber("十万", 0, 2)
next_number = NNumber("二十万", 3, 6)
res = number_normalizer.fix_intermediate_su("十万と二十万", cur_number, next_number)
assert res == cur_number
cur_number = NNumber("十", 3, 4)
cur_number.value_lower_bound = cur_number.value_upper_bound = 10
next_number = NNumber("万", 5, 6)
next_number.value_lower_bound = next_number.value_upper_bound = 10000
res = number_normalizer.fix_intermediate_su("これは十数万円の価値がある", cur_number, next_number)
expect = NNumber("十数万", 3, 6)
expect.value_lower_bound = 110000
expect.value_upper_bound = 190000
assert res == expect
def test_fix_suffix_su(self, number_normalizer: NumberNormalizer):
number = NNumber("十", 3, 4)
res = number_normalizer.fix_suffix_su("これは十円の価値がある", number)
assert res == number
number = NNumber("十", 3, 4)
number.value_lower_bound = number.value_upper_bound = 10
res = number_normalizer.fix_suffix_su("これは十数円の価値がある", number)
expect = NNumber("十数", 3, 5)
expect.value_lower_bound = 11
expect.value_upper_bound = 19
assert res == expect
def test_fix_numbers_by_su(self, number_normalizer: NumberNormalizer):
numbers = [
NNumber("十", 3, 4), NNumber("万", 8, 9), NNumber("十", 12, 13), NNumber("百", 17, 18), NNumber("十", 19, 20),
NNumber("一万", 23, 25), NNumber("千", 26, 27), NNumber("十", 30, 31), NNumber("万", 32, 33)
]
numbers[0].value_lower_bound = numbers[0].value_upper_bound = 10
numbers[1].value_lower_bound = numbers[1].value_upper_bound = 10000
numbers[2].value_lower_bound = numbers[2].value_upper_bound = 10
numbers[3].value_lower_bound = numbers[3].value_upper_bound = 100
numbers[4].value_lower_bound = numbers[4].value_upper_bound = 10
numbers[5].value_lower_bound = numbers[5].value_upper_bound = 10000
numbers[6].value_lower_bound = numbers[6].value_upper_bound = 1000
numbers[7].value_lower_bound = numbers[7].value_upper_bound = 10
numbers[8].value_lower_bound = numbers[8].value_upper_bound = 10000
res = number_normalizer.fix_numbers_by_su("その数十人が、数万人で、十数人で、百数十人で、一万数千人で、十数万人で、", numbers)
expect = [
NNumber("数十", 2, 4), NNumber("数万", 7, 9), NNumber("十数", 12, 14),
NNumber("百数十", 17, 20), NNumber("一万数千", 23, 27), NNumber("十数万", 30, 33)
]
expect[0].value_lower_bound = 10
expect[0].value_upper_bound = 90
expect[1].value_lower_bound = 10000
expect[1].value_upper_bound = 90000
expect[2].value_lower_bound = 11
expect[2].value_upper_bound = 19
expect[3].value_lower_bound = 110
expect[3].value_upper_bound = 190
expect[4].value_lower_bound = 11000
expect[4].value_upper_bound = 19000
expect[5].value_lower_bound = 110000
expect[5].value_upper_bound = 190000
assert res == expect
def test_is_only_kansuji_kurai_man(self, number_normalizer: NumberNormalizer):
res = number_normalizer.is_only_kansuji_kurai_man("十二")
assert res == False
res = number_normalizer.is_only_kansuji_kurai_man("億")
assert res == True
def test_remove_only_kansuji_kurai_man(self, number_normalizer: NumberNormalizer):
numbers = [NNumber("十二万"), NNumber("億"), NNumber("三万")]
res = number_normalizer.remove_only_kansuji_kurai_man(numbers)
expect = [NNumber("十二万"), NNumber("三万")]
assert res == expect
def test_remove_unnecessary_data(self, number_normalizer: NumberNormalizer):
numbers = [NNumber("十二万"), NNumber("2億", 0, 2), NNumber("2億", 0, 2), NNumber("三万", 3, 5)]
res = number_normalizer.remove_unnecessary_data(numbers)
expect = [NNumber("2億", 0, 2), NNumber("三万", 3, 5)]
assert res == expect
#! /usr/bin/env python2.7
# pkg_ros_iot_bridge/scripts/temp_for_salim/get_sheet.py (1arshan/Eyantra_Virgi-bot, MIT License)
class MyWidget(Widget):
def __init__(self, **kwargs):
super(MyWidget, self).__init__(**kwargs)
def callback(*l):
self.x = self.y
self.fbind('y', callback)
callback()
| 19.461538 | 48 | 0.58498 | 30 | 253 | 4.666667 | 0.566667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.280632 | 253 | 12 | 49 | 21.083333 | 0.769231 | 0 | 0 | 0 | 0 | 0 | 0.003953 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.125 | 0 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7d196d02b6dfdae637cba35d8a14ed891350c55a | 1,046 | py | Python | microcosm_flask/session.py | KensoDev/microcosm-flask | 3618333f4a0f45e673a33986877157208c9eac5f | [
"Apache-2.0"
] | 11 | 2017-01-30T21:53:20.000Z | 2020-05-29T22:39:19.000Z | microcosm_flask/session.py | KensoDev/microcosm-flask | 3618333f4a0f45e673a33986877157208c9eac5f | [
"Apache-2.0"
] | 139 | 2016-03-09T19:09:59.000Z | 2021-09-03T17:14:00.000Z | microcosm_flask/session.py | KensoDev/microcosm-flask | 3618333f4a0f45e673a33986877157208c9eac5f | [
"Apache-2.0"
] | 10 | 2016-12-19T22:39:42.000Z | 2021-03-09T19:23:15.000Z | """
Support a user-defined per-request session.
"""
from flask import g
def register_session_factory(graph, key, session_factory):
"""
Register a session creation function so that a new session (of user-defined type)
will be saved to `flask.g` on every request (and closed on teardown).
In other words: this os a mechanism to register a SQLAlchemy session instance
or similar without coupling the web and database tiers directly.
The session function should have the signature:
def session_factory(graph):
return Session()
If the session instance is closeable, it will be closed on teardown.
"""
@graph.flask.before_request
def begin_session():
setattr(g, key, session_factory(graph))
@graph.flask.teardown_request
def end_session(*args, **kwargs):
# NB: session will be none if there's an error raised in `before_request`
session = getattr(g, key, None)
if session is not None and hasattr(session, "close"):
session.close()
| 30.764706 | 85 | 0.688337 | 146 | 1,046 | 4.863014 | 0.506849 | 0.078873 | 0.080282 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.235182 | 1,046 | 33 | 86 | 31.69697 | 0.8875 | 0.563098 | 0 | 0 | 0 | 0 | 0.012376 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.3 | false | 0 | 0.1 | 0 | 0.4 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7d198a067cf29bfd3860f24dbef0396a06853828 | 5,188 | py | Python | pkg_ros_iot_bridge/scripts/temp_for_salim/get_sheet.py | 1arshan/Eyantra_Virgi-bot | 30ebe99fec6a0d4767fe94468b21bc00091bc527 | [
"MIT"
] | 1 | 2021-09-09T04:41:28.000Z | 2021-09-09T04:41:28.000Z | pkg_ros_iot_bridge/scripts/temp_for_salim/get_sheet.py | 1arshan/Eyantra_Virgi-bot | 30ebe99fec6a0d4767fe94468b21bc00091bc527 | [
"MIT"
] | null | null | null | pkg_ros_iot_bridge/scripts/temp_for_salim/get_sheet.py | 1arshan/Eyantra_Virgi-bot | 30ebe99fec6a0d4767fe94468b21bc00091bc527 | [
"MIT"
] | null | null | null | #! /usr/bin/env python2.7
import requests
import json
import heapq as hq  # heap (priority queue)


def check_order(order_id, order_info):
    for i in order_info:
        if i[1] == order_id:
            return True
    return False


def check_if_dispatched(order_id):
    # URL = "https://spreadsheets.google.com/feeds/list/1rianYVvWCIJeoa17Jlrg7GZTUwuI_SG3KaKaaHtgGvY/4/public/full?alt=json"  ## eyrc.vb.1637@gmail.com
    URL = "https://spreadsheets.google.com/feeds/list/1QTyFVQA0YheuERNtD7Vq1ASVJl6tQ4rPGh65vFpExhg/4/public/full?alt=json"  ## vb1637eyrc@gmail.com
    # URL = "https://spreadsheets.google.com/feeds/list/1Twkrdg5QvlTRH15SLgWfh8tom5Pxjp-6QphH_s3vPIk/4/public/full?alt=json"  ## 1637vbeyrc@gmail.com
    response = requests.get(URL)  # dispatched-orders sheet
    data = response.content
    res = json.loads(data)
    if u'entry' in res["feed"]:
        res2 = res["feed"][u'entry']
    else:
        return False
    for x in res2:
        content = x[u'content']
        content = content[u'$t']
        Dict = dict((a.strip(), b.strip())
                    for a, b in (element.split(': ')
                                 for element in content.split(', ')))
        if order_id == Dict[u'orderid'].encode('utf-8'):
            return True
    return False


def get_data_from_sheet(max_order_id, order_info):
    # URL = "https://spreadsheets.google.com/feeds/list/1rianYVvWCIJeoa17Jlrg7GZTUwuI_SG3KaKaaHtgGvY/3/public/full?alt=json"  ## eyrc.vb.1637@gmail.com
    URL = "https://spreadsheets.google.com/feeds/list/1QTyFVQA0YheuERNtD7Vq1ASVJl6tQ4rPGh65vFpExhg/3/public/full?alt=json"  ## vb1637eyrc@gmail.com
    # URL = "https://spreadsheets.google.com/feeds/list/1Twkrdg5QvlTRH15SLgWfh8tom5Pxjp-6QphH_s3vPIk/3/public/full?alt=json"  ## 1637vbeyrc@gmail.com
    response = requests.get(URL)  # incoming-orders sheet
    data = response.content
    res = json.loads(data)
    if u'entry' in res["feed"]:
        # print("entry present")
        res2 = res["feed"][u'entry']
    else:
        order_to_be_procced = ()
        # print("no data present")
        return order_to_be_procced, max_order_id, order_info
    res2 = res["feed"][u'entry']
    # order_info = []
    hq.heapify(order_info)
    # max_order_id = 0
    for x in res2:
        content = x[u'content']
        content = content[u'$t']
        Dict = dict((a.strip(), b.strip())
                    for a, b in (element.split(': ')
                                 for element in content.split(', ')))
        if Dict[u'item'] == "Medicines" or Dict[u'item'] == "Medicine":
            Dict[u'priority'] = 0
            color = "red"
        elif Dict[u'item'] == "Food":
            Dict[u'priority'] = 1
            color = "yellow"
        else:
            Dict[u'priority'] = 2
            color = "green"
        # if max_order_id < int(Dict[u'orderid']):
        order_id_encoded = Dict[u'orderid'].encode('utf-8')
        if not check_order(Dict[u'orderid'], order_info) and not check_if_dispatched(order_id_encoded):
            max_order_id = int(Dict[u'orderid'])
            tup = (Dict[u'priority'], Dict[u'orderid'], Dict[u'item'], Dict[u'city'])
            hq.heappush(order_info, tup)  # the heap always keeps the highest-priority order on top
    # print(order_info)
    if len(order_info) > 0:
        order_to_be_procced = hq.heappop(order_info)  # order with the highest priority
    else:
        order_to_be_procced = ()
    print("order_to_be_procced", order_to_be_procced)
    print("order_info: ", order_info)
    return order_to_be_procced, max_order_id, order_info


"""
order_info = []
hq.heapify(order_info)
max_order_id = 0
# order_to_be_procced, max_order_id, order_info = get_data_from_sheet(0, order_info)
# print(order_to_be_procced, max_order_id)
for i in range(8):
    order_to_be_procced, max_order_id, order_info = get_data_from_sheet(max_order_id, order_info)
    print(order_to_be_procced, max_order_id)
"""


def get_data_from_inventory_sheet():
    # URL = "https://spreadsheets.google.com/feeds/list/1rianYVvWCIJeoa17Jlrg7GZTUwuI_SG3KaKaaHtgGvY/2/public/full?alt=json"  ## eyrc.vb.1637@gmail.com
    URL = "https://spreadsheets.google.com/feeds/list/1QTyFVQA0YheuERNtD7Vq1ASVJl6tQ4rPGh65vFpExhg/2/public/full?alt=json"  ## vb1637eyrc@gmail.com
    # URL = "https://spreadsheets.google.com/feeds/list/1Twkrdg5QvlTRH15SLgWfh8tom5Pxjp-6QphH_s3vPIk/2/public/full?alt=json"  ## 1637vbeyrc@gmail.com
    response = requests.get(URL)  # inventory sheet
    data = response.content
    res = json.loads(data)
    if u'entry' in res["feed"]:
        res2 = res["feed"][u'entry']
    else:
        match_box_color_with_index = {}
        return match_box_color_with_index
    res2 = res["feed"][u'entry']
    match_box_color_with_index = {}
    for x in res2:
        content = x[u'content']
        content = content[u'$t']
        Dict = dict((a.strip(), b.strip())
                    for a, b in (element.split(': ')
                                 for element in content.split(', ')))
        box_index = Dict[u'sku']
        box_index = box_index[1:3]
        # dict that maps the storage index to the box item
        match_box_color_with_index.update({box_index.encode("utf-8"): Dict[u'item'].encode("utf-8")})
    # print(match_box_color_with_index)
    return match_box_color_with_index
check_if_dispatched('2002') | 39.907692 | 152 | 0.655551 | 715 | 5,188 | 4.560839 | 0.172028 | 0.055198 | 0.036799 | 0.053971 | 0.774302 | 0.709598 | 0.666053 | 0.650721 | 0.650721 | 0.562404 | 0 | 0.034088 | 0.208365 | 5,188 | 130 | 153 | 39.907692 | 0.759922 | 0.241133 | 0 | 0.602273 | 0 | 0.034091 | 0.176056 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.034091 | 0 | 0.181818 | 0.022727 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
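The ordering in `get_data_from_sheet` relies on `heapq` comparing `(priority, orderid, item, city)` tuples lexicographically, so Medicine (priority 0) pops before Food (1) before everything else (2). A minimal Python 3 sketch of that behaviour, with invented order data:

```python
import heapq

PRIORITY = {"Medicine": 0, "Medicines": 0, "Food": 1}  # anything else gets 2

orders = []
for order_id, item, city in [("3", "Clothes", "Delhi"),
                             ("1", "Food", "Mumbai"),
                             ("2", "Medicine", "Pune")]:
    priority = PRIORITY.get(item, 2)
    # Tuples compare element by element, so the priority field dominates.
    heapq.heappush(orders, (priority, order_id, item, city))

# The smallest tuple (i.e. the highest-priority order) always pops first.
first = heapq.heappop(orders)
print(first)  # (0, '2', 'Medicine', 'Pune')
```

Within the same priority, the string `orderid` breaks ties, which is why the original code stores it as the second tuple element.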
7d1e3411660fc6ff987dff3de950e6a48810d1d8 | 7,125 | py | Python | tests/tests_bibliotools/test_parse_and_group.py | wonjoonSeol/ScienceScape | 8d8a3cb76193b6f85b7a2a6c7219e249237d64c8 | [
"BSD-3-Clause"
] | 5 | 2018-02-14T21:11:06.000Z | 2020-02-23T14:53:11.000Z | tests/tests_bibliotools/test_parse_and_group.py | wonjoonSeol/ScienceScape | 8d8a3cb76193b6f85b7a2a6c7219e249237d64c8 | [
"BSD-3-Clause"
] | 106 | 2018-02-09T00:31:05.000Z | 2018-03-29T07:28:34.000Z | tests/tests_bibliotools/test_parse_and_group.py | wonjoonSeol/ScienceScape | 8d8a3cb76193b6f85b7a2a6c7219e249237d64c8 | [
"BSD-3-Clause"
] | 6 | 2018-02-23T17:48:03.000Z | 2020-05-14T13:39:36.000Z | from django.test import TestCase
import sys
import os

lib_path = os.path.abspath(os.path.join(__file__, '..', '..', '..', 'bibliotools3', 'scripts'))
sys.path.append(lib_path)

from parse_and_group import is_year_within_span
from parse_and_group import create_span_files
from parse_and_group import separate_years
from parse_and_group import get_span_parameters


class TestParseGroup(TestCase):

    def test_year_within_span_true(self):
        """
        Tests that is_year_within_span works correctly
        for years in the span.
        """
        allTrue = True
        for year in range(1990, 2010):
            if not is_year_within_span(1990, 2010, year):
                allTrue = False
        self.assertEqual(True, allTrue)

    def test_year_within_span_false(self):
        """
        Tests that is_year_within_span works correctly
        for years NOT in the span.
        """
        allFalse = True
        for year in range(1900, 1989):
            if is_year_within_span(1990, 2010, year):
                allFalse = False
        self.assertEqual(True, allFalse)

    def test_years_correctly_separated(self):
        """
        Tests that upon calling separate_years, the lines are correctly
        separated amongst the span files.
        """
        # Set up test folders/files (removed at the end of the test)
        wos_headers = "PT AU BA BE GP AF BF CA TI SO SE BS LA DT CT CY CL SP HO DE ID AB C1 RP EM RI OI FU FX CR NR TC Z9 U1 U2 PU PI PA SN EI BN J9 JI PD PY VL IS PN SU SI MA BP EP AR DI D2 EA EY PG WC SC GA UT PM OA HC HP DA"
        dir = os.path.dirname(os.path.dirname(__file__))
        os.makedirs(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears"))
        os.makedirs(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/firstSpan"))
        os.makedirs(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/secondSpan"))
        first_span_txt = open(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/firstSpan/firstSpan.txt"), "w")
        second_span_txt = open(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/secondSpan/secondSpan.txt"), "w")
        first_span_txt.write(wos_headers + "\n")
        second_span_txt.write(wos_headers + "\n")
        # This is a dummy line for testing
        line = """J Piersanti, S; Orlandi, A Piersanti, Stefano; Orlandi, Antonio Genetic Algorithm Optimization for the Total Radiated Power of a Meandered Line by Using an Artificial Neural Network IEEE TRANSACTIONS ON ELECTROMAGNETIC COMPATIBILITY English Article Artificial neural network (ANN); electromagnetic (EM) radiation; genetic algorithms (GAs); machine learning; meandered line; nature-inspired algorithms; signal integrity; total radiated power (TRP) One of the state-of-the-art optimization strategies is the introduction of an artificial neural network in place of a more time-consuming numerical tool to compute the cost function. This work describes the development of a genetic algorithm optimization strategy for a meandered microstrip line by using an artificial neural network whose training set has been designed by a uniform sampling of the global design space. The results in terms of the total radiated electromagnetic power are discussed and compared with those obtained by the initial and not optimized configuration. [Piersanti, Stefano; Orlandi, Antonio] Univ Aquila, Dept Ind & Informat Engn & Econ, UAq EMC Lab, I-67100 Laquila, Italy Orlandi, A (reprint author), Univ Aquila, Dept Ind & Informat Engn & Econ, UAq EMC Lab, I-67100 Laquila, Italy. stefano.piersanti@graduate.univaq.it; anto-nio.orlandi@univaq.it Computer Simulation Technology, 2017, CST STUD SUIT 2017; Cuthbert T. R., 1987, OPTIMIZATION USING P; Duffy AP, 2006, IEEE T ELECTROMAGN C, V48, P449, DOI 10.1109/TEMC.2006.879358; HAGAN MT, 1994, IEEE T NEURAL NETWOR, V5, P989, DOI 10.1109/72.329697; Hagan M. T., 1995, NEURAL NETWORK DESIG; Hall S. H., 2009, ADV SIGNAL INTEGRITY; Haupt R.L., 2004, PRACTICAL GENETIC AL; [Anonymous], 2008, P1597 IEEE; Orlandi A., 2017, ELECTROMAGNETIC BAND; Orlandi A, 2006, IEEE T ELECTROMAGN C, V48, P460, DOI 10.1109/TEMC.2006.879360; Qi Q, 2016, EL PACKAG TECH CONF, P85, DOI 10.1109/EPTC.2016.7861448; Tron S., 2013, MEANDERED TRANSMISSI; Uka S., 1990, IEEE T NEURAL NETWOR, V2, P675 13 0 0 0 0 IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC PISCATAWAY 445 HOES LANE, PISCATAWAY, NJ 08855-4141 USA 0018-9375 1558-187X IEEE T ELECTROMAGN C IEEE Trans. Electromagn. Compat. AUG 2018 60 4 1014 1017 10.1109/TEMC.2017.2764623 4 Engineering, Electrical & Electronic; Telecommunications Engineering; Telecommunications FT4JY WOS:000423122600025 2018-02-07"""
        # Mocking some time spans
        spans = {
            "firstSpan": {
                "years": [1900, 1999],
            },
            "secondSpan": {
                "years": [2000, 2018],
            }
        }
        # Mocking a folder structure with dummy input/output files/folders
        years_spans = dict((s, data["years"]) for s, data in spans.items())
        files = {
            "firstSpan": first_span_txt,
            "secondSpan": second_span_txt,
        }
        # Call to the method we want to test
        separate_years(line, years_spans, files, 44)
        first_span_txt.close()
        second_span_txt.close()
        first_span_read = open(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/firstSpan/firstSpan.txt"), "r")
        second_span_read = open(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/secondSpan/secondSpan.txt"), "r")
        first_span_read.readline()
        second_span_read.readline()
        # Check that the years have been correctly separated
        result = False
        if len(first_span_read.readlines()) == 0 and len(second_span_read.readlines()) == 1:
            result = True
        # Tear down
        first_span_read.close()
        second_span_read.close()
        os.remove(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/firstSpan/firstSpan.txt"))
        os.remove(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/secondSpan/secondSpan.txt"))
        os.rmdir(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/firstSpan"))
        os.rmdir(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears/secondSpan"))
        os.rmdir(os.path.join(dir, "tests_bibliotools/testFiles/foldersForSeparateYears"))
        self.assertEqual(True, result)

    def test_get_span_parameters(self):
        """
        Tests that upon calling get_span_parameters,
        correct uncorrupted parameters are returned (critical step).
        """
        mocked_spans = {
            "first_span": {
                "years": [1789, 2010]
            },
            "second_span": {
                "years": [2011, 2018]
            },
        }
        result = str(get_span_parameters(mocked_spans.items(), "years"))
        self.assertEqual(result, """{'first_span': [1789, 2010], 'second_span': [2011, 2018]}""")
| 60.897436 | 2,417 | 0.703439 | 981 | 7,125 | 4.993884 | 0.399592 | 0.019596 | 0.026536 | 0.031843 | 0.359869 | 0.326801 | 0.296183 | 0.270055 | 0.257808 | 0.253725 | 0 | 0.058927 | 0.204491 | 7,125 | 116 | 2,418 | 61.422414 | 0.805399 | 0.052211 | 0 | 0 | 0 | 0.026667 | 0.564006 | 0.152417 | 0 | 0 | 0 | 0 | 0.053333 | 1 | 0.053333 | false | 0 | 0.093333 | 0 | 0.16 | 0.013333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7d1f3ddbc8caa64dc170bb034f2e11f6a498e3f3 | 968 | py | Python | src/hw_conversion/HWPreprocessor.py | jmbarrios/hw-conversion | 8addd24e726e7284ade3195df14f96ea51c332b7 | [
"MIT"
] | null | null | null | src/hw_conversion/HWPreprocessor.py | jmbarrios/hw-conversion | 8addd24e726e7284ade3195df14f96ea51c332b7 | [
"MIT"
] | null | null | null | src/hw_conversion/HWPreprocessor.py | jmbarrios/hw-conversion | 8addd24e726e7284ade3195df14f96ea51c332b7 | [
"MIT"
] | null | null | null | '''
Module containing a preprocessor that keeps cells only if they match a given
expression.
'''
# Author: Juan M. Barrios <j.m.barrios@gmail.com>
import re
from typing import Pattern

from traitlets import Unicode
from nbconvert.preprocessors import Preprocessor


class HomeworkPreproccessor(Preprocessor):
    '''Keeps cells from a notebook that match a regular expression'''
    pattern = Unicode().tag(config=True)

    def check_conditions(self, cell):
        '''Checks that a cell matches the pattern.

        Returns: Boolean.
        True means the cell should be kept.
        '''
        regexp_compiled = re.compile(self.pattern)
        return regexp_compiled.match(cell.source)

    def preprocess(self, nb, resources):
        '''Preprocessing to apply to each notebook.'''
        if not self.pattern:
            return nb, resources
        nb.cells = [cell for cell in nb.cells if self.check_conditions(cell)]
return nb, resources | 27.657143 | 77 | 0.67562 | 120 | 968 | 5.416667 | 0.541667 | 0.050769 | 0.052308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.241736 | 968 | 35 | 78 | 27.657143 | 0.885559 | 0.332645 | 0 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.285714 | 0 | 0.785714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
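The `check_conditions`/`preprocess` pair above reduces to "keep a cell when the regex matches its source from the start; an empty pattern keeps everything". A self-contained sketch with plain dicts standing in for nbformat cells (the `# HOMEWORK` pattern is an invented example, not one mandated by the class):

```python
import re

def keep_matching_cells(cells, pattern):
    # Mirrors HomeworkPreproccessor: an empty pattern keeps all cells,
    # otherwise keep only cells whose source matches from the start.
    if not pattern:
        return cells
    regexp = re.compile(pattern)
    return [c for c in cells if regexp.match(c["source"])]

cells = [{"source": "# HOMEWORK: task 1"},
         {"source": "print('scratch work')"},
         {"source": "# HOMEWORK: task 2"}]
kept = keep_matching_cells(cells, r"# HOMEWORK")
print(len(kept))  # 2
```

Note that `re.match` anchors at the beginning of the string only, which is why mid-cell occurrences of the pattern would not keep a cell.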
7d27c111549ca054eb1d4350e5f213c7b661a06c | 500 | py | Python | drinks/migrations/0002_drink_ingredients.py | jmhubbard/cocktail_api | 47c2cca699f02dc14af04b989beeee9855a797f0 | [
"Unlicense"
] | 1 | 2020-11-25T04:57:34.000Z | 2020-11-25T04:57:34.000Z | drinks/migrations/0002_drink_ingredients.py | jmhubbard/cocktail_api | 47c2cca699f02dc14af04b989beeee9855a797f0 | [
"Unlicense"
] | null | null | null | drinks/migrations/0002_drink_ingredients.py | jmhubbard/cocktail_api | 47c2cca699f02dc14af04b989beeee9855a797f0 | [
"Unlicense"
] | null | null | null | # Generated by Django 3.1.2 on 2020-10-29 04:46
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('recipes', '0001_initial'),
        ('ingredients', '0001_initial'),
        ('drinks', '0001_initial'),
    ]

    operations = [
        migrations.AddField(
            model_name='drink',
            name='ingredients',
            field=models.ManyToManyField(through='recipes.Recipe', to='ingredients.Ingredient'),
        ),
    ]
| 23.809524 | 96 | 0.6 | 49 | 500 | 6.040816 | 0.734694 | 0.111486 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.07337 | 0.264 | 500 | 20 | 97 | 25 | 0.730978 | 0.09 | 0 | 0 | 1 | 0 | 0.247241 | 0.048565 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7d2a2fea07d41d19ee631745dc1ae58b9dcafc22 | 7,363 | py | Python | src/ResourceManager.py | NEKERAFA/Soul-Tower | d37c0bf6bcbf253ec5b2c41f802adeeca31fb384 | [
"MIT"
] | null | null | null | src/ResourceManager.py | NEKERAFA/Soul-Tower | d37c0bf6bcbf253ec5b2c41f802adeeca31fb384 | [
"MIT"
] | null | null | null | src/ResourceManager.py | NEKERAFA/Soul-Tower | d37c0bf6bcbf253ec5b2c41f802adeeca31fb384 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
import pygame, sys, os, json
from pygame.locals import *

IMAGE_PATH = os.path.join('assets', 'images')
SPRITE_SHEET_PATH = os.path.join('assets', 'sprites')
STAGE_CONF_PATH = os.path.join('assets', 'stages')
ROOM_CONF_PATH = os.path.join('assets', 'rooms')
DIALOGUE_CONF_PATH = os.path.join('assets', 'dialogues')
FONT_PATH = os.path.join('assets', 'fonts')
SOUND_PATH = os.path.join('assets', 'sounds')
MUSIC_PATH = os.path.join(SOUND_PATH, 'music')
EFFECT_PATH = os.path.join(SOUND_PATH, 'effects')

# -------------------------------------------------
# ResourceManager class
# In this case it is implemented as an empty class, with class methods only
class ResourceManager(object):
    resources = {}

    @classmethod
    def load_music(cls, name):
        fullname = os.path.join(MUSIC_PATH, name)
        pygame.mixer.music.load(fullname)

    @classmethod
    def load_effect_sound(cls, name):
        if name in cls.resources:
            return cls.resources[name]
        else:
            fullname = os.path.join(EFFECT_PATH, name)
            try:
                sound_effect = pygame.mixer.Sound(fullname)
                # sound_effect.set_volume(0.7)
                # print(fullname)
                # print(sound_effect.get_volume())
            except pygame.error, message:
                print 'Cannot load sound effect file:', fullname
                raise SystemExit, message
            # Cache it
            cls.resources[name] = sound_effect
            return sound_effect

    @classmethod
    def load_image(cls, name, colorkey=None):
        fullname = os.path.join(IMAGE_PATH, name)
        # If the file name is among the resources already loaded
        if fullname in cls.resources:
            # Return that resource
            return cls.resources[fullname]
        # If it has not been loaded before
        else:
            # Load the image from the folder it lives in
            try:
                image = pygame.image.load(fullname)
            except pygame.error, message:
                print 'Cannot load image:', fullname
                raise SystemExit, message
            # Get the colorkey
            if colorkey is not None:
                if colorkey == -1:
                    colorkey = image.get_at((0, 0))
                image.set_colorkey(colorkey, RLEACCEL)
            # Convert the alpha channel
            image = image.convert_alpha()
            # Cache it
            cls.resources[fullname] = image
            # Return it
            return image

    @classmethod
    def free_image(cls, name):
        fullname = os.path.join(IMAGE_PATH, name)
        if fullname in cls.resources:
            del cls.resources[fullname]

    @classmethod
    def load_sprite_conf(cls, name):
        fullname = os.path.join(SPRITE_SHEET_PATH, name)
        # If the file name is among the resources already loaded
        if fullname in cls.resources:
            # Return that resource
            return cls.resources[fullname]
        # If it has not been loaded before
        else:
            # Load the resource from its folder
            try:
                pfile = open(fullname, 'r')
            except IOError as e:
                print 'Cannot load sprite sheet:', fullname
                raise SystemExit, e.strerror
            # Load and parse the JSON
            data = json.load(pfile)
            pfile.close()
            # Cache it
            cls.resources[fullname] = data
            # Return it
            return data

    @classmethod
    def free_sprite_conf(cls, name):
        fullname = os.path.join(SPRITE_SHEET_PATH, name)
        if fullname in cls.resources:
            del cls.resources[fullname]

    @classmethod
    def load_room(cls, name):
        fullname = os.path.join(ROOM_CONF_PATH, name)
        # If the file name is among the resources already loaded
        if fullname in cls.resources:
            # Return that resource
            return cls.resources[fullname]
        # If it has not been loaded before
        else:
            # Load the resource from its folder
            try:
                pfile = open(fullname, 'r')
            except IOError as e:
                print 'Cannot load room:', fullname
                raise SystemExit, e.strerror
            data = json.load(pfile)
            pfile.close()
            # Cache it
            cls.resources[fullname] = data
            # Return it
            return data

    @classmethod
    def free_room(cls, name):
        fullname = os.path.join(ROOM_CONF_PATH, name)
        if fullname in cls.resources:
            del cls.resources[fullname]

    @classmethod
    def load_stage(cls, name):
        fullname = os.path.join(STAGE_CONF_PATH, name)
        # If the file name is among the resources already loaded
        if fullname in cls.resources:
            # Return that resource
            return cls.resources[fullname]
        # If it has not been loaded before
        else:
            # Load the resource from its folder
            try:
                pfile = open(fullname, 'r')
            except IOError as e:
                print 'Cannot load stage:', fullname
                raise SystemExit, e.strerror
            data = json.load(pfile)
            pfile.close()
            # Cache it
            cls.resources[fullname] = data
            # Return it
            return data

    @classmethod
    def free_stage(cls, name):
        fullname = os.path.join(STAGE_CONF_PATH, name)
        if fullname in cls.resources:
            del cls.resources[fullname]

    @classmethod
    def load_dialogue(cls, name):
        fullname = os.path.join(DIALOGUE_CONF_PATH, name)
        # If the file name is among the resources already loaded
        if fullname in cls.resources:
            # Return that resource
            return cls.resources[fullname]
        # If it has not been loaded before
        else:
            try:
                pfile = open(fullname, 'r')
            except IOError as e:
                print 'Cannot load dialogue:', fullname
                raise SystemExit, e.strerror
            data = json.load(pfile)
            pfile.close()
            # Cache it
            cls.resources[fullname] = data
            # Return it
            return data

    @classmethod
    def free_dialogue(cls, name):
        fullname = os.path.join(DIALOGUE_CONF_PATH, name)
        if fullname in cls.resources:
            del cls.resources[fullname]

    @classmethod
    def load_font(cls, name, size):
        fullname = os.path.join(FONT_PATH, name)
        if (fullname, size) in cls.resources:
            return cls.resources[(fullname, size)]
        else:
            try:
                font = pygame.font.Font(fullname, size)
            except pygame.error, message:
                print 'Cannot load font:', fullname
                raise SystemExit, message
            cls.resources[(fullname, size)] = font
            return font

    @classmethod
    def free_font(cls, name, size):
        fullname = os.path.join(FONT_PATH, name)
        if (fullname, size) in cls.resources:
            del cls.resources[(fullname, size)]
| 33.621005 | 76 | 0.573408 | 849 | 7,363 | 4.897527 | 0.147232 | 0.095238 | 0.055315 | 0.060606 | 0.721982 | 0.687831 | 0.639731 | 0.594998 | 0.594998 | 0.594998 | 0 | 0.001237 | 0.341029 | 7,363 | 218 | 77 | 33.775229 | 0.85573 | 0.170583 | 0 | 0.635762 | 0 | 0 | 0.040884 | 0 | 0 | 0 | 0 | 0.004587 | 0 | 0 | null | null | 0 | 0.013245 | null | null | 0.046358 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
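Every `load_*` method in the ResourceManager above follows the same caching pattern: check a class-level dict, load on miss, store, return. The pattern can be shown in isolation; `CachingLoader` and its fake `"data:" + name` payload are illustrative stand-ins, not the game's API:

```python
class CachingLoader(object):
    """Minimal sketch of the ResourceManager caching idea:
    load once, then serve repeated requests from a class-level dict."""
    resources = {}
    load_count = 0

    @classmethod
    def load(cls, name):
        if name in cls.resources:
            return cls.resources[name]
        cls.load_count += 1          # pretend this is an expensive load
        resource = "data:" + name
        cls.resources[name] = resource
        return resource

    @classmethod
    def free(cls, name):
        if name in cls.resources:
            del cls.resources[name]


a = CachingLoader.load("hero.png")
b = CachingLoader.load("hero.png")  # served from the cache, no second load
print(a is b, CachingLoader.load_count)  # True 1
```

Because the cache lives on the class rather than on instances, every caller shares the same loaded resources, which is exactly why the original manager never needs to be instantiated.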
7d2c1800b1cf775906aaeca97219fcb2b7436072 | 1,307 | py | Python | valuation/migrations/0001_initial.py | jiun0507/minestock | b333298575cae1c426cc4450e85e9e576458b74a | [
"Unlicense"
] | null | null | null | valuation/migrations/0001_initial.py | jiun0507/minestock | b333298575cae1c426cc4450e85e9e576458b74a | [
"Unlicense"
] | null | null | null | valuation/migrations/0001_initial.py | jiun0507/minestock | b333298575cae1c426cc4450e85e9e576458b74a | [
"Unlicense"
] | 1 | 2021-10-15T20:10:39.000Z | 2021-10-15T20:10:39.000Z | # Generated by Django 3.2 on 2021-04-28 12:31
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion


class Migration(migrations.Migration):

    initial = True

    dependencies = [
        migrations.swappable_dependency(settings.AUTH_USER_MODEL),
    ]

    operations = [
        migrations.CreateModel(
            name='ValuationCategory',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('name', models.CharField(max_length=200)),
            ],
        ),
        migrations.CreateModel(
            name='Valuation',
            fields=[
                ('id', models.BigAutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
                ('ticker', models.CharField(max_length=10, null=True)),
                ('review', models.TextField()),
                ('method', models.CharField(blank=True, choices=[('dcf', 'DCF'), ('reproduction_cst', 'Reproduction_cost'), ('other', 'Other')], max_length=20)),
                ('value', models.FloatField()),
                ('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
            ],
        ),
    ]
| 36.305556 | 161 | 0.599847 | 132 | 1,307 | 5.810606 | 0.515152 | 0.031291 | 0.036506 | 0.057366 | 0.21382 | 0.21382 | 0.21382 | 0.21382 | 0.21382 | 0.21382 | 0 | 0.021717 | 0.260138 | 1,307 | 35 | 162 | 37.342857 | 0.771458 | 0.0329 | 0 | 0.357143 | 1 | 0 | 0.090333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.107143 | 0 | 0.25 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7d2eedcb594966e266531a38d18f3efe92684a79 | 1,615 | py | Python | linear regression/simple_linear_regression.py | liangjisheng/Machine-Learning | 55b6781d621e2de09c6e750ecc993178fb247c7b | [
"MIT"
] | null | null | null | linear regression/simple_linear_regression.py | liangjisheng/Machine-Learning | 55b6781d621e2de09c6e750ecc993178fb247c7b | [
"MIT"
] | null | null | null | linear regression/simple_linear_regression.py | liangjisheng/Machine-Learning | 55b6781d621e2de09c6e750ecc993178fb247c7b | [
"MIT"
] | null | null | null | # -*- coding:utf-8 -*-
"""
@project = 0602-1
@file = simple_linear_regression
@author = Liangjisheng
@create_time = 2018/6/2 17:16
"""
import matplotlib.pyplot as plt
import numpy as np
from sklearn import datasets, linear_model
# Load a dataset intended for regression models
# The dataset contains 442 samples; each feature vector has 10 dimensions
# Every feature variable is a real number in the range (-.2, .2)
# The target output is a real number in the range (25, 346)
diabetes = datasets.load_diabetes()

# Inspect basic information about the dataset
print(type(diabetes))
print(diabetes.data.shape)
print(diabetes.data.dtype)
print(diabetes.target.shape)
print(diabetes.target.dtype)

# To keep the plot simple,
# use only a single feature as the training X.
# np.newaxis turns the row vector into a column vector,
# so each row of diabetes_X represents one sample
diabetes_X = diabetes.data[:, np.newaxis, 2]
# Now the shape of diabetes_X is (442L, 1L)
# If the line above were: diabetes_X = diabetes.data[:, 2]
# then diabetes_X would have shape (442L,), i.e. a row vector
print(diabetes_X.shape)
print(type(diabetes_X))

# Manually split the data into a training set and a test set:
# all but the last 20 samples for training, the last 20 for testing
diabetes_X_train = diabetes_X[:-20]
diabetes_X_test = diabetes_X[-20:]
diabetes_y_train = diabetes.target[:-20]
diabetes_y_test = diabetes.target[-20:]

# Initialize a linear regression model
regr = linear_model.LinearRegression()

# Train the linear regression model on the training data
regr.fit(diabetes_X_train, diabetes_y_train)

# Model parameters
print('Model coefficients:', regr.coef_)
print('Model intercept:', regr.intercept_)

# Mean squared error of the model on the test set
print('Mean squared error on the test set: %.2f'
      % np.mean((regr.predict(diabetes_X_test) - diabetes_y_test) ** 2))
# Model score on the test set, between 0 and 1; the higher, the better
print('Model score: %.2f' % regr.score(diabetes_X_test, diabetes_y_test))

# Plot the model's fit on the test set
plt.scatter(diabetes_X_test, diabetes_y_test, color='black')
plt.plot(diabetes_X_test, regr.predict(diabetes_X_test), color='blue', linewidth=3)
plt.grid()
plt.show()
| 24.469697 | 83 | 0.763467 | 219 | 1,615 | 5.424658 | 0.479452 | 0.106061 | 0.065657 | 0.070707 | 0.095118 | 0.065657 | 0 | 0 | 0 | 0 | 0 | 0.041181 | 0.097833 | 1,615 | 65 | 84 | 24.846154 | 0.774194 | 0.35418 | 0 | 0 | 0 | 0 | 0.042365 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.111111 | 0 | 0.111111 | 0.407407 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
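With a single feature, the `LinearRegression` fit above reduces to the closed-form least-squares slope and intercept. `fit_line` below is an illustrative, dependency-free sketch of that formula on toy data, not the diabetes set:

```python
def fit_line(xs, ys):
    # Least squares for y = coef * x + intercept:
    #   coef = cov(x, y) / var(x)
    #   intercept = mean(y) - coef * mean(x)
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    coef = cov / var
    return coef, my - coef * mx

coef, intercept = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
print(coef, intercept)  # 2.0 1.0
```

These are the same quantities scikit-learn exposes as `regr.coef_` and `regr.intercept_` in the one-feature case.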
7d3ea37ac9acf74a01f56012eb7dac104176c0aa | 1,392 | py | Python | mititools/mititools/serializers/frictionless.py | jimmymathews/MITI | 0745b051a02fd1055ff80af560683fdbb18d5651 | [
"MIT"
] | null | null | null | mititools/mititools/serializers/frictionless.py | jimmymathews/MITI | 0745b051a02fd1055ff80af560683fdbb18d5651 | [
"MIT"
] | null | null | null | mititools/mititools/serializers/frictionless.py | jimmymathews/MITI | 0745b051a02fd1055ff80af560683fdbb18d5651 | [
"MIT"
] | null | null | null | import os
from os import mkdir
from os.path import join
from os.path import exists
import json
import importlib.resources

import jinja2
from jinja2 import Environment
from jinja2 import BaseLoader

with importlib.resources.path('mititools', 'fd_schema.json.jinja') as file:
    jinja_environment = Environment(loader=BaseLoader)
    fd_schema_file_contents = open(file, 'rt').read()

from ..default_values import fd_package_path
from ..name_manipulation import create_table_filename
from ..name_manipulation import create_auxiliary_table_filename


def write_frictionless(top_variables, data_tables):
    json_str = render_json_data_package(top_variables)
    json_object = json.loads(json_str)
    payload = json.dumps(json_object, indent=2)
    json_filename = 'datapackage.json'
    if not exists(fd_package_path):
        mkdir(fd_package_path)
    with open(join(fd_package_path, json_filename), 'wt') as f:
        f.write(payload)
    for tablename, df in data_tables.items():
        if list(df.columns) != ['value']:
            filename = create_table_filename(tablename)
        else:
            filename = create_auxiliary_table_filename(tablename)
        df.to_csv(join(fd_package_path, filename), sep='\t', index=False)


def render_json_data_package(variables):
    template = jinja_environment.from_string(fd_schema_file_contents)
    return template.render(**variables)
| 32.372093 | 75 | 0.75431 | 188 | 1,392 | 5.31383 | 0.382979 | 0.045045 | 0.065065 | 0.032032 | 0.064064 | 0 | 0 | 0 | 0 | 0 | 0 | 0.003422 | 0.160201 | 1,392 | 42 | 76 | 33.142857 | 0.851155 | 0 | 0 | 0 | 0 | 0 | 0.04023 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0 | 0.393939 | 0 | 0.484848 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
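The render-then-validate flow in `write_frictionless`/`render_json_data_package` (substitute variables into a template, parse the result as JSON, re-serialize with `indent=2`) can be sketched with the standard library. `string.Template` here is a stand-in for the jinja2 environment, and the template text is invented for illustration:

```python
import json
from string import Template

# Stand-in for the jinja2 schema template used by the serializer.
template = Template('{"name": "$name", "resources": $count}')

def render_package(variables):
    json_str = template.substitute(variables)
    obj = json.loads(json_str)           # parsing validates it is real JSON
    return json.dumps(obj, indent=2)     # pretty-printed payload to write out

payload = render_package({"name": "demo", "count": 3})
print(payload)
```

The intermediate `json.loads` is the useful design choice: a template typo that produces malformed JSON fails loudly at render time rather than producing a broken `datapackage.json` on disk.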
7d400f7a4ed47d3dc5c52007ae1f8fcbedc5ec4c | 2,216 | py | Python | mvj/urls.py | tuomas777/mvj | e9a12e42c399b9fb77fd8fad85fc8f0f6d4ce405 | [
"MIT"
] | null | null | null | mvj/urls.py | tuomas777/mvj | e9a12e42c399b9fb77fd8fad85fc8f0f6d4ce405 | [
"MIT"
] | null | null | null | mvj/urls.py | tuomas777/mvj | e9a12e42c399b9fb77fd8fad85fc8f0f6d4ce405 | [
"MIT"
] | null | null | null | import rest_framework.urls
from django.conf import settings
from django.contrib import admin
from django.urls import include, path, re_path
from rest_framework import routers
from rest_framework_swagger.views import get_swagger_view
from leasing.views import ktj_proxy
from leasing.viewsets.basis_of_rent import BasisOfRentViewSet
from leasing.viewsets.comment import CommentTopicViewSet, CommentViewSet
from leasing.viewsets.contact import ContactViewSet
from leasing.viewsets.decision import DecisionViewSet
from leasing.viewsets.lease import (
    DistrictViewSet, FinancingViewSet, HitasViewSet, IntendedUseViewSet, LeaseTypeViewSet, LeaseViewSet,
    ManagementViewSet, MunicipalityViewSet, NoticePeriodViewSet, RegulationViewSet, StatisticalUseViewSet,
    SupportiveHousingViewSet)
from users.viewsets import UserViewSet
router = routers.DefaultRouter()
router.register(r'basis_of_rent', BasisOfRentViewSet)
router.register(r'comment', CommentViewSet)
router.register(r'comment_topic', CommentTopicViewSet)
router.register(r'contact', ContactViewSet)
router.register(r'decision', DecisionViewSet)
router.register(r'district', DistrictViewSet)
router.register(r'financing', FinancingViewSet)
router.register(r'hitas', HitasViewSet)
router.register(r'intended_use', IntendedUseViewSet)
router.register(r'lease', LeaseViewSet)
router.register(r'lease_type', LeaseTypeViewSet)
router.register(r'management', ManagementViewSet)
router.register(r'municipality', MunicipalityViewSet)
router.register(r'notice_period', NoticePeriodViewSet)
router.register(r'regulation', RegulationViewSet)
router.register(r'statistical_use', StatisticalUseViewSet)
router.register(r'supportive_housing', SupportiveHousingViewSet)
router.register(r'user', UserViewSet)
urlpatterns = [
path('v1/', include(router.urls)),
re_path(r'(?P<base_type>ktjki[ir])/tuloste/(?P<print_type>[\w/]+)/pdf', ktj_proxy),
path('admin/', admin.site.urls),
path('auth/', include(rest_framework.urls)),
path('docs/', get_swagger_view(title='MVJ API')),
]
if settings.DEBUG and 'debug_toolbar' in settings.INSTALLED_APPS:
import debug_toolbar
urlpatterns = [path('__debug__/', include(debug_toolbar.urls)), ] + urlpatterns
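The `ktj_proxy` route above is matched by a fairly dense raw regex; a standalone look (plain `re`, no Django required) at what its two named groups capture:

```python
import re

# Same pattern as the re_path() above; in Django the named groups
# (base_type, print_type) are passed to the view as keyword arguments.
KTJ_PATTERN = re.compile(r'(?P<base_type>ktjki[ir])/tuloste/(?P<print_type>[\w/]+)/pdf')

m = KTJ_PATTERN.search('ktjkii/tuloste/karttaote/rekisteriyksikko/pdf')
print(m.group('base_type'))   # ktjkii
print(m.group('print_type'))  # karttaote/rekisteriyksikko
```

Note that `[\w/]+` is greedy but backtracks before the literal `/pdf`, so multi-segment print types like `karttaote/rekisteriyksikko` are captured whole.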
| 43.45098 | 106 | 0.813628 | 254 | 2,216 | 6.968504 | 0.358268 | 0.142373 | 0.152542 | 0.024859 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.000491 | 0.080776 | 2,216 | 50 | 107 | 44.32 | 0.868434 | 0 | 0 | 0 | 0 | 0 | 0.129513 | 0.026625 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.311111 | 0 | 0.311111 | 0.022222 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
7d44b7e8f5a379cc7b50059795fc7b51e4005b04 | 361 | py | Python | hy-data-analysis-with-python-spring-2020/part03-e07_meeting_planes/src/meeting_planes.py | Melimet/DAP2020 | 0854fe4ce8ace6abf6dc0bbcf71984595ff6d42a | [
"MIT"
] | null | null | null | hy-data-analysis-with-python-spring-2020/part03-e07_meeting_planes/src/meeting_planes.py | Melimet/DAP2020 | 0854fe4ce8ace6abf6dc0bbcf71984595ff6d42a | [
"MIT"
] | null | null | null | hy-data-analysis-with-python-spring-2020/part03-e07_meeting_planes/src/meeting_planes.py | Melimet/DAP2020 | 0854fe4ce8ace6abf6dc0bbcf71984595ff6d42a | [
"MIT"
] | null | null | null | #!/usr/bin/python3
import numpy as np
def meeting_planes(a1, b1, c1, a2, b2, c2, a3, b3, c3):
    # Each plane is z = a*x + b*y + c; rewrite as a*x + b*y - z = -c and
    # solve the resulting 3x3 linear system for the meeting point (x, y, z).
    coefficients = np.array([[a1, b1, -1.0], [a2, b2, -1.0], [a3, b3, -1.0]])
    constants = np.array([-c1, -c2, -c3])
    return np.linalg.solve(coefficients, constants)
def main():
a1=1
b1=4
c1=5
a2=3
b2=2
c2=1
a3=2
b3=4
c3=1
x, y, z = meeting_planes(a1, b1, c1, a2, b2, c2, a3, b3, c3)
print(f"Planes meet at x={x}, y={y} and z={z}")
if __name__ == "__main__":
main()
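For planes of the form z = a·x + b·y + c (an assumption; the exercise template doesn't state it), the meeting point is the solution of a 3×3 linear system. A dependency-free sketch via Cramer's rule, using the sample coefficients from `main`:

```python
def det3(m):
    # 3x3 determinant by cofactor expansion along the first row.
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def meet(a1, b1, c1, a2, b2, c2, a3, b3, c3):
    # Planes z = a*x + b*y + c become a*x + b*y - z = -c; solve by Cramer's rule.
    A = [[a1, b1, -1], [a2, b2, -1], [a3, b3, -1]]
    rhs = [-c1, -c2, -c3]
    d = det3(A)
    solution = []
    for col in range(3):
        m = [row[:] for row in A]
        for r in range(3):
            m[r][col] = rhs[r]
        solution.append(det3(m) / d)
    return tuple(solution)

print(meet(1, 4, 5, 3, 2, 1, 2, 4, 1))  # (4.0, 2.0, 17.0)
```

With the sample coefficients, the three planes meet at (4, 2, 17), which satisfies all three plane equations.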
| 15.041667 | 64 | 0.518006 | 70 | 361 | 2.528571 | 0.514286 | 0.146893 | 0.169492 | 0.19209 | 0.350282 | 0.350282 | 0.350282 | 0.350282 | 0.350282 | 0.350282 | 0 | 0.14741 | 0.304709 | 361 | 23 | 65 | 15.695652 | 0.557769 | 0.047091 | 0 | 0 | 0 | 0 | 0.131195 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.117647 | false | 0 | 0.058824 | 0.058824 | 0.235294 | 0.058824 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7d473618101c1bf818cfd31f50f7230e32057c47 | 566 | py | Python | setup.py | iatlab/datas-utils | b8eef303de5a5d5a57182c0627b721dde0b6b300 | [
"MIT"
] | null | null | null | setup.py | iatlab/datas-utils | b8eef303de5a5d5a57182c0627b721dde0b6b300 | [
"MIT"
] | null | null | null | setup.py | iatlab/datas-utils | b8eef303de5a5d5a57182c0627b721dde0b6b300 | [
"MIT"
] | null | null | null | # -*- coding:utf-8 -*-
from setuptools import setup
setup(
    name="datas_utils",
    packages=["datas_utils",
              "datas_utils.env",
              "datas_utils.log",
              "datas_utils.aws",
              ],
    version="0.0.1",
    description="Tools for Datas Project",
    author="Makoto P. Kato",
    author_email="mpkato@acm.org",
    license="MIT License",
    url="https://github.com/iatlab/datas_utils",
    install_requires=['boto3>=1.9.3', 'mysql-connector-python>=8.0.12'],
    tests_require=['nose'],
)
| 28.3 | 74 | 0.55477 | 66 | 566 | 4.621212 | 0.727273 | 0.196721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.029412 | 0.279152 | 566 | 19 | 75 | 29.789474 | 0.718137 | 0.035336 | 0 | 0 | 0 | 0 | 0.398897 | 0.055147 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.058824 | 0 | 0.058824 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
7d507f34d285e67fb744c8b50084ce59c5e7e8eb | 2,065 | py | Python | script/sklearn_like_toolkit/warpper/wrapperGridSearchCV.py | demetoir/MLtools | 8c42fcd4cc71728333d9c116ade639fe57d50d37 | [
"MIT"
] | null | null | null | script/sklearn_like_toolkit/warpper/wrapperGridSearchCV.py | demetoir/MLtools | 8c42fcd4cc71728333d9c116ade639fe57d50d37 | [
"MIT"
] | null | null | null | script/sklearn_like_toolkit/warpper/wrapperGridSearchCV.py | demetoir/MLtools | 8c42fcd4cc71728333d9c116ade639fe57d50d37 | [
"MIT"
] | null | null | null | from sklearn import model_selection
from sklearn.externals.joblib import Parallel
from tqdm import tqdm
from script.sklearn_like_toolkit.warpper.base.MixIn import ClfWrapperMixIn, MetaBaseWrapperClfWithABC
import multiprocessing
CPU_COUNT = multiprocessing.cpu_count()
# TODO using packtools.grid_search GridSearchCVProgressBar make warning ...
# but copied code just work fine, wtf??
# from pactools.grid_search import GridSearchCVProgressBar as _GridSearchCVProgressBar
class GridSearchCVProgressBar(model_selection.GridSearchCV):
"""Monkey patch Parallel to have a progress bar during grid search"""
def _get_param_iterator(self):
"""Return ParameterGrid instance for the given param_grid"""
iterator = super(GridSearchCVProgressBar, self)._get_param_iterator()
iterator = list(iterator)
n_candidates = len(iterator)
cv = model_selection._split.check_cv(self.cv, None)
n_splits = getattr(cv, 'n_splits', 3)
max_value = n_candidates * n_splits
class ParallelProgressBar(Parallel):
def __call__(self, iterable):
                # tqdm takes total=/desc= (max_value=/title= belong to the
                # progressbar2 API this was copied from); wrapping the
                # iterable is what actually drives the progress bar.
                iterable = tqdm(iterable, total=max_value, desc='GridSearchCV')
return super(ParallelProgressBar, self).__call__(iterable)
# Monkey patch
model_selection._search.Parallel = ParallelProgressBar
return iterator
class wrapperGridSearchCV(GridSearchCVProgressBar, ClfWrapperMixIn, metaclass=MetaBaseWrapperClfWithABC):
def __init__(self, estimator, param_grid, scoring=None, fit_params=None, n_jobs=CPU_COUNT, iid=True, refit=True,
cv=None, verbose=0, pre_dispatch='2*n_jobs', error_score='raise', return_train_score="warn"):
GridSearchCVProgressBar.__init__(
self, estimator, param_grid, scoring, fit_params, n_jobs, iid, refit, cv, verbose, pre_dispatch,
error_score, return_train_score)
ClfWrapperMixIn.__init__(self)
| 42.142857 | 117 | 0.708475 | 223 | 2,065 | 6.269058 | 0.434978 | 0.040057 | 0.032904 | 0.031474 | 0.04721 | 0.04721 | 0 | 0 | 0 | 0 | 0 | 0.001852 | 0.215496 | 2,065 | 48 | 118 | 43.020833 | 0.861111 | 0.171429 | 0 | 0 | 0 | 0 | 0.022465 | 0 | 0 | 0 | 0 | 0.020833 | 0 | 1 | 0.107143 | false | 0 | 0.178571 | 0 | 0.464286 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ada4fe020a9277ea274b6aa23f3bc9a49595bbe7 | 430 | py | Python | angr/procedures/libc/tolower.py | mariusmue/angr | f8304c4b1f0097a721a6692b02a45cabaae137c5 | [
"BSD-2-Clause"
] | 2 | 2018-12-03T23:14:56.000Z | 2018-12-03T23:15:57.000Z | angr/procedures/libc/tolower.py | mariusmue/angr | f8304c4b1f0097a721a6692b02a45cabaae137c5 | [
"BSD-2-Clause"
] | null | null | null | angr/procedures/libc/tolower.py | mariusmue/angr | f8304c4b1f0097a721a6692b02a45cabaae137c5 | [
"BSD-2-Clause"
] | 1 | 2019-08-07T01:42:01.000Z | 2019-08-07T01:42:01.000Z | import angr
from angr.sim_type import SimTypeInt
import logging
l = logging.getLogger("angr.procedures.libc.tolower")
class tolower(angr.SimProcedure):
def run(self, c):
self.argument_types = {0: SimTypeInt(self.state.arch, True)}
self.return_type = SimTypeInt(self.state.arch, True)
return self.state.solver.If(
self.state.solver.And(c >= 65, c <= 90), # A - Z
c + 32, c)
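The `If`/`And` above builds a symbolic expression over the solver state rather than branching in Python; its concrete semantics are plain ASCII case folding, which can be sketched without angr:

```python
def concrete_tolower(c: int) -> int:
    # Mirrors the symbolic If(And(65 <= c, c <= 90), c + 32, c) above:
    # 'A'..'Z' (65..90) shift down by 32 to 'a'..'z'; everything else is unchanged.
    return c + 32 if 65 <= c <= 90 else c

print(chr(concrete_tolower(ord('A'))))  # a
print(chr(concrete_tolower(ord('a'))))  # a
print(chr(concrete_tolower(ord('!'))))  # !
```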
| 26.875 | 68 | 0.644186 | 60 | 430 | 4.566667 | 0.55 | 0.131387 | 0.138686 | 0.167883 | 0.19708 | 0 | 0 | 0 | 0 | 0 | 0 | 0.021084 | 0.227907 | 430 | 15 | 69 | 28.666667 | 0.804217 | 0.011628 | 0 | 0 | 0 | 0 | 0.066194 | 0.066194 | 0 | 0 | 0 | 0 | 0 | 1 | 0.090909 | false | 0 | 0.272727 | 0 | 0.545455 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adb71119cdfc222b0935d1f0fd6885370fac21e0 | 922 | py | Python | movie/migrations/0003_auto_20200718_0759.py | edith007/The-Movie-Database | fef4aba56be66b93de5665da374ec8aab05c40f9 | [
"CC0-1.0"
] | 2 | 2020-07-12T20:15:53.000Z | 2020-07-19T12:07:48.000Z | movie/migrations/0003_auto_20200718_0759.py | edith007/The-Movie-DataBase | fef4aba56be66b93de5665da374ec8aab05c40f9 | [
"CC0-1.0"
] | 1 | 2020-07-12T07:50:55.000Z | 2020-07-12T07:50:55.000Z | movie/migrations/0003_auto_20200718_0759.py | edith007/The-Movie-DataBase | fef4aba56be66b93de5665da374ec8aab05c40f9 | [
"CC0-1.0"
] | null | null | null | # Generated by Django 2.2.12 on 2020-07-18 07:59
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('movie', '0002_auto_20200717_1039'),
]
operations = [
migrations.RemoveField(
model_name='show',
name='plot',
),
migrations.AddField(
model_name='show',
name='genres',
field=models.TextField(default=''),
),
migrations.AddField(
model_name='show',
name='image',
field=models.TextField(default=''),
),
migrations.AddField(
model_name='show',
name='name',
field=models.TextField(default=''),
),
migrations.AddField(
model_name='userrating',
name='position',
field=models.IntegerField(default=0),
),
]
| 24.263158 | 49 | 0.520607 | 81 | 922 | 5.82716 | 0.481481 | 0.095339 | 0.110169 | 0.144068 | 0.451271 | 0.451271 | 0.377119 | 0.377119 | 0.262712 | 0.262712 | 0 | 0.055369 | 0.353579 | 922 | 37 | 50 | 24.918919 | 0.736577 | 0.049892 | 0 | 0.516129 | 1 | 0 | 0.092677 | 0.026316 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.032258 | 0 | 0.129032 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adb85db75ca4685e65491305d77f76efef993ae2 | 220 | py | Python | figures/styles.py | Jakob-Unfried/msc-legacy | 2c41f3f714936c25dd534bd66da802c26176fcfa | [
"MIT"
] | 1 | 2021-03-22T14:16:43.000Z | 2021-03-22T14:16:43.000Z | figures/styles.py | Jakob-Unfried/msc-legacy | 2c41f3f714936c25dd534bd66da802c26176fcfa | [
"MIT"
] | null | null | null | figures/styles.py | Jakob-Unfried/msc-legacy | 2c41f3f714936c25dd534bd66da802c26176fcfa | [
"MIT"
] | null | null | null | colors_per_chi = {2: 'green', 3: 'orange', 4: 'purple', 5: 'pink', 6: 'red'}
style_per_chi = {2: '-', 3: '-.', 4: 'dotted'}
markers_per_reason = {'converged': 'o', 'progress': 'x', 'ressources': 'v'}
linewidth = 5.31596
| 44 | 76 | 0.581818 | 32 | 220 | 3.8125 | 0.78125 | 0.098361 | 0.114754 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 0.140909 | 220 | 4 | 77 | 55 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0.286364 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adb871b1b9356162327f5a254f4edf184c40e44c | 2,222 | py | Python | index.py | tapanbk/compass-distance-and-bearing | 2554a39c79570cb675b8d02d1fe8de7a86a7d0f4 | [
"Apache-2.0"
] | null | null | null | index.py | tapanbk/compass-distance-and-bearing | 2554a39c79570cb675b8d02d1fe8de7a86a7d0f4 | [
"Apache-2.0"
] | null | null | null | index.py | tapanbk/compass-distance-and-bearing | 2554a39c79570cb675b8d02d1fe8de7a86a7d0f4 | [
"Apache-2.0"
] | null | null | null | def calculate_compass_distance(origin, destination):
import math
origin_latitude, origin_longitude = origin
destination_latitude, destination_longitude = destination
    # radius: 3959 => miles, 6371 => km
# unit in meters
radius = 6371*1000
dlat = math.radians(destination_latitude-origin_latitude)
dlon = math.radians(destination_longitude-origin_longitude)
a = math.sin(dlat/2) * math.sin(dlat/2) + math.cos(math.radians(origin_latitude)) \
* math.cos(math.radians(destination_latitude)) * math.sin(dlon/2) * math.sin(dlon/2)
c = 2 * math.atan2(math.sqrt(a), math.sqrt(1-a))
return radius * c
def calculate_initial_compass_bearing(origin, destination):
import math
if (type(origin) != tuple) or (type(destination) != tuple):
raise TypeError("Only tuples are supported as arguments")
origin_latitude, origin_longitude = origin
destination_latitude, destination_longitude = destination
origin_latitude = math.radians(origin_latitude)
destination_latitude = math.radians(destination_latitude)
diff_long = math.radians(destination_longitude - origin_longitude)
x = math.sin(diff_long) * math.cos(destination_latitude)
y = math.cos(origin_latitude) * math.sin(destination_latitude) - (
math.sin(origin_latitude) * math.cos(destination_latitude) * math.cos(diff_long))
initial_bearing = math.atan2(x, y)
initial_bearing = math.degrees(initial_bearing)
compass_bearing = (initial_bearing + 360) % 360
return compass_bearing
if __name__ == '__main__':
# Checked on
    # http://instantglobe.com/CRANES/GeoCoordTool.html
pointa = (27.672944, 85.313551)
# pointb = (27.674198, 85.313379) # 353.074198377
# pointb = (27.674312, 85.313701) # 5.54634422036
# pointb = (27.673761, 85.314173) # 33.989047
# pointb = (27.673723, 85.314581) # 49.5024397615
pointb = (27.672792, 85.315011) # distance: 144.764626263 bearing:96.7043766096
# pointb = (27.671747, 85.313615) # distance: 133.249459006 bearing: 177.288978602
distance = calculate_compass_distance(pointa, pointb)
bearing = calculate_initial_compass_bearing(pointa, pointb)
print(distance, bearing)
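As a quick sanity check of the two formulas above (re-implemented compactly here so the snippet stands on its own): one degree of longitude along the equator is about 111.2 km, heading due east.

```python
import math

def haversine_m(origin, destination, radius=6371 * 1000):
    # Great-circle distance in metres, same haversine form as above.
    lat1, lon1 = origin
    lat2, lon2 = destination
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2)) * math.sin(dlon / 2) ** 2)
    return radius * 2 * math.atan2(math.sqrt(a), math.sqrt(1 - a))

def bearing_deg(origin, destination):
    # Initial compass bearing in degrees, normalised to [0, 360).
    lat1, lon1 = map(math.radians, origin)
    lat2, lon2 = map(math.radians, destination)
    dlon = lon2 - lon1
    x = math.sin(dlon) * math.cos(lat2)
    y = math.cos(lat1) * math.sin(lat2) - math.sin(lat1) * math.cos(lat2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

print(round(haversine_m((0, 0), (0, 1))))  # ~111195 metres
print(bearing_deg((0, 0), (0, 1)))         # 90.0 (due east)
```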
| 47.276596 | 93 | 0.713771 | 274 | 2,222 | 5.59854 | 0.354015 | 0.111473 | 0.071708 | 0.05867 | 0.189048 | 0.170795 | 0.110821 | 0.110821 | 0.110821 | 0.110821 | 0 | 0.12766 | 0.175068 | 2,222 | 46 | 94 | 48.304348 | 0.70922 | 0.191719 | 0 | 0.181818 | 0 | 0 | 0.025843 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.060606 | false | 0.181818 | 0.060606 | 0 | 0.181818 | 0.030303 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
adbb784f60c50615a64e5766faabdc65dde7543d | 446 | py | Python | setup.py | CubexX/shortest-python | 0b778ad88cc329e00ecd94236178f13735451ded | [
"MIT"
] | null | null | null | setup.py | CubexX/shortest-python | 0b778ad88cc329e00ecd94236178f13735451ded | [
"MIT"
] | null | null | null | setup.py | CubexX/shortest-python | 0b778ad88cc329e00ecd94236178f13735451ded | [
"MIT"
] | null | null | null | from distutils.core import setup
setup(
name='shortest-python',
packages=['shortest'],
version='0.1',
description='Python library for shorte.st url shortener',
long_description="More on github: https://github.com/CubexX/shortest-python",
author='CubexX',
author_email='root@cubexx.xyz',
url='https://github.com/CubexX/shortest-python',
keywords=['shortest', 'shorte.st', 'links'],
license='MIT License'
)
| 29.733333 | 81 | 0.683857 | 55 | 446 | 5.509091 | 0.618182 | 0.138614 | 0.092409 | 0.132013 | 0.224422 | 0.224422 | 0 | 0 | 0 | 0 | 0 | 0.005291 | 0.152466 | 446 | 14 | 82 | 31.857143 | 0.796296 | 0 | 0 | 0 | 0 | 0 | 0.493274 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.076923 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adbd54f48752ea1ae6b17c7029c2a22c69b5f6e2 | 3,381 | py | Python | 04_random_forest_exp.py | markysamson/CDSWcreditcardfraud | 31dc6d417baee4ef5736bce74059815b4f51542a | [
"Apache-2.0"
] | null | null | null | 04_random_forest_exp.py | markysamson/CDSWcreditcardfraud | 31dc6d417baee4ef5736bce74059815b4f51542a | [
"Apache-2.0"
] | null | null | null | 04_random_forest_exp.py | markysamson/CDSWcreditcardfraud | 31dc6d417baee4ef5736bce74059815b4f51542a | [
"Apache-2.0"
] | null | null | null | # # Building and Evaluating Random Forest Model
# ## Setup
# Import useful packages, modules, classes, and functions:
from __future__ import print_function
from pyspark.sql import SparkSession
from pyspark.sql.functions import col
#import numpy as np
#import pandas as pd
import matplotlib.pyplot as plt
#import seaborn as sns
import cdsw
# Create a SparkSession:
spark = SparkSession.builder.master("local").appName("creditcard_exp").getOrCreate()
import sys  # needed for the command-line argument read below
param_numTrees = int(sys.argv[1])
# param_numTrees=10
# ## Preprocess the modeling data
# Read the explored data from HDFS:
df = spark.read.parquet("creditcardfraud/exploredata/")
# Now we manually select our features and label:
# Features selected
feature_selected = ["V1","V2","V3","V4","V9","V10","V11","V12","V14","V16","V17","V18","V19"]
df_selected = df.select("Time","V1","V2","V3","V4","V9","V10","V11","V12","V14","V16","V17","V18","V19","Class")
# The machine learning algorithms in Spark MLlib expect the features to be collected into
# a single column. So we use
# [VectorAssembler](http://spark.apache.org/docs/latest/api/python/pyspark.ml.html#pyspark.ml.feature.VectorAssembler)
# to assemble our feature vector:
from pyspark.ml.feature import VectorAssembler
assembler = VectorAssembler(inputCols=feature_selected, outputCol="Features")
df_assembled = assembler.transform(df_selected)
# **Note:** `features` is stored in sparse format.
# ## Create train and test datasets for machine learning (classification).
# Fit our model on the train DataFrame and evaluate our model on the test DataFrame:
# We want both train and test dataset to have equal proportion of normal and fraud transactions.
df_norm = df_assembled.filter(df_assembled.Class == 0)
df_fraud = df_assembled.filter(df_assembled.Class == 1)
(norm_train, norm_test) = df_norm.randomSplit([0.7, 0.3], 12345)
(fraud_train, fraud_test) = df_fraud.randomSplit([0.7, 0.3], 12345)
df_train = norm_train.union(fraud_train).orderBy("Time")
df_test = norm_test.union(fraud_test).orderBy("Time")
# ## Specify Random Forest model
from pyspark.ml.classification import RandomForestClassifier
rf = RandomForestClassifier(featuresCol="Features", labelCol="Class", numTrees=param_numTrees)
# ## Fit the Random Forest model
# Use the `fit` method to fit the linear regression model on the train DataFrame:
%time rf_model = rf.fit(df_train)
# ## Evaluate model performance on the test dataset.
# Use the `evaluate` method of the
# [BinaryClassificationEvaluator](http://spark.apache.org/docs/latest/api/python/pyspark.ml.html#pyspark.ml.evaluation.BinaryClassificationEvaluator)
# class
# Generate predictions on the test DataFrame:
test_with_prediction = rf_model.transform(df_test)
# **Note:** The resulting DataFrame includes three types of predictions. The
# `rawPrediction` is a vector of log-odds, `prediction` is a vector or
# probabilities `prediction` is the predicted class based on the probability
# vector.
# Create an instance of `BinaryClassificationEvaluator` class:
from pyspark.ml.evaluation import BinaryClassificationEvaluator
evaluator = BinaryClassificationEvaluator(rawPredictionCol="rawPrediction", labelCol="Class",
metricName="areaUnderROC")
auroc=evaluator.evaluate(test_with_prediction)
auroc
cdsw.track_metric("auroc", auroc)
# ## Cleanup
# Stop the SparkSession:
# spark.stop() | 37.988764 | 149 | 0.762496 | 457 | 3,381 | 5.551422 | 0.407002 | 0.024832 | 0.020102 | 0.006307 | 0.134017 | 0.115097 | 0.073315 | 0.073315 | 0.073315 | 0.073315 | 0 | 0.021922 | 0.123041 | 3,381 | 89 | 150 | 37.988764 | 0.833727 | 0.487134 | 0 | 0 | 0 | 0 | 0.111905 | 0.016667 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.275862 | null | null | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adc85f255e45341493cae7d856bac908be193a3e | 6,347 | py | Python | common/utils/utils.py | hvsuchitra/tv_tracker | 5415d177fe9a4e16ec39d9812e9502840bba5b12 | [
"MIT"
] | null | null | null | common/utils/utils.py | hvsuchitra/tv_tracker | 5415d177fe9a4e16ec39d9812e9502840bba5b12 | [
"MIT"
] | null | null | null | common/utils/utils.py | hvsuchitra/tv_tracker | 5415d177fe9a4e16ec39d9812e9502840bba5b12 | [
"MIT"
] | null | null | null | import smtplib
def get_binary(src_file):
with open(src_file, 'rb') as f:
return f.read()
def send_mail(to, username, password, message_type='account_creation'):
server = 'smtp.mail.me.com'
port = 587
email = 'mailid'
_password = 'password'
if message_type == 'account_creation':
message = f'''Subject: Welcome to TV Tracker
From: TV Tracker Dev<{email}>
To: {to}
Thank You for registering. Your username is {username} and password is {password}.
Have a nice day 0:)'''
elif message_type == 'password_change':
message = f'''Subject: TV Track Password Change
From: TV Tracker Dev<{email}>
To: {to}
The password to your TV Tracker account {username} was changed to {password}.
If you have not made this change, reply to this email to deactivate your account.
Have a nice day 0:)'''
elif message_type == 'reset_password':
message = f'''Subject: TV Track Password Change
From: TV Tracker Dev<{email}>
To: {to}
The password to your TV Tracker account {username} was reset to {password}.
Use this password the next time to login.
Have a nice day 0:)'''
    else:
        raise ValueError(f'unknown message_type: {message_type!r}')
    with smtplib.SMTP(server, port) as smtp:
        smtp.starttls()
        smtp.login(email, _password)
        smtp.sendmail(from_addr=email, to_addrs=to, msg=message)
from pathlib import Path
def get_path(path, to_str=True):
app_root = Path('../common').resolve()
return f'{app_root / path}' if to_str else app_root / path
from PyQt5 import QtCore
from PyQt5.QtGui import QImage, QPainter, QBrush, QColor
def make_trans(image, opaque_factor):
temp = QImage(image.size(), QImage.Format_ARGB32)
temp.fill(QtCore.Qt.transparent)
painter = QPainter(temp)
painter.setOpacity(opaque_factor)
painter.drawImage(QtCore.QRect(0, 0, image.width(), image.height()), image)
return temp
def make_dark(image, dark_factor):
painter = QPainter(image)
brush = QBrush(QColor(0, 0, 0, dark_factor))
painter.setBrush(brush)
painter.drawRect(0, 0, image.width(), image.height())
return image
from random import choice
from json import load
def random_thought():
with open(get_path('resources/misc/quotes.json')) as file_obj:
random_quote = choice(load(file_obj))
return random_quote['text'], random_quote['author']
from string import ascii_lowercase, ascii_uppercase, digits, punctuation
from secrets import choice as secret_choice
from random import shuffle, randint
def generate_password():
characters = [ascii_lowercase, ascii_uppercase, digits, punctuation]
shuffle(characters)
random_password = [*map(secret_choice, characters)]
random_password.extend(secret_choice(secret_choice(characters)) for _ in range(randint(4, 12)))
shuffle(random_password)
return ''.join(random_password)
from PyQt5.QtCore import pyqtSignal, Qt, QThread
from PyQt5.QtWidgets import QLabel
from PyQt5.QtGui import QPixmap
class ClickableLabel(QLabel):
clicked = pyqtSignal(str)
def __init__(self, name=None, src=None):
super().__init__()
self.setObjectName(name)
if name is not None and src is not None:
if name != 'profile':
self.setPixmap(QPixmap.fromImage(QImage(get_path(src))).scaled(100, 100, Qt.KeepAspectRatio))
else:
self.setPixmap(circle_crop(src).scaled(100, 100, Qt.KeepAspectRatio))
def mousePressEvent(self, event):
self.clicked.emit(self.objectName())
class SendMailThread(QThread):
signal = pyqtSignal('PyQt_PyObject')
    def __init__(self, to, username, password, message_type='account_creation'):
        super().__init__()
        self.to = to
        self.username = username
        self.password = password
        self.message_type = message_type  # run() reads this; default matches send_mail()
def run(self):
send_mail(self.to, self.username, self.password, self.message_type)
from PyQt5.QtCore import Qt, QRect
from PyQt5.QtGui import QBrush, QImage, QPainter, QPixmap, QWindow
from PyQt5.QtWidgets import QLabel, QVBoxLayout, QWidget
def circle_crop(image):
size = 100
image = QImage.fromData(image)
image.convertToFormat(QImage.Format_ARGB32)
imgsize = min(image.width(), image.height())
rect = QRect((image.width() - imgsize) / 2, (image.height() - imgsize) / 2, imgsize, imgsize)
image = image.copy(rect)
out_img = QImage(image.size(), QImage.Format_ARGB32)
out_img.fill(Qt.transparent)
brush = QBrush(image)
painter = QPainter(out_img)
painter.setBrush(brush)
painter.setPen(Qt.NoPen)
painter.setRenderHint(QPainter.Antialiasing, True)
painter.drawEllipse(0, 0, imgsize, imgsize)
painter.end()
pr = QWindow().devicePixelRatio()
pm = QPixmap.fromImage(out_img)
# pm.setDevicePixelRatio(pr)
# size*=pr
# pm=pm.scaled(size,size,Qt.KeepAspectRatio,Qt.SmoothTransformation)
return pm
from PyQt5.QtCore import QTimeLine
from PyQt5.QtWidgets import QCalendarWidget, QGridLayout, QStackedWidget, QTextEdit
class FaderWidget(QWidget):
def __init__(self, old_widget, new_widget):
QWidget.__init__(self, new_widget)
self.pixmap_opacity = 1.0
self.old_pixmap = QPixmap(new_widget.size())
old_widget.render(self.old_pixmap)
self.timeline = QTimeLine()
self.timeline.valueChanged.connect(self.animate)
self.timeline.finished.connect(self.close)
self.timeline.setDuration(333)
self.timeline.start()
self.resize(new_widget.size())
self.show()
def paintEvent(self, event):
painter = QPainter(self)
painter.setOpacity(self.pixmap_opacity)
painter.drawPixmap(0, 0, self.old_pixmap)
def animate(self, value):
self.pixmap_opacity = 1.0 - value
self.update()
class StackedWidget(QStackedWidget):
clicked = pyqtSignal(str)
def __init__(self, name):
super().__init__()
self.setEnabled(True)
self.setObjectName(name)
def setCurrentIndex(self, index):
if self.currentIndex() != index:
self.fader_widget = FaderWidget(self.currentWidget(), self.widget(index))
super().setCurrentIndex(index)
def enterEvent(self, event):
self.setCurrentIndex(1)
def leaveEvent(self, event):
self.setCurrentIndex(0)
def mousePressEvent(self, QMouseEvent):
self.clicked.emit(self.objectName())
| 27.595652 | 109 | 0.688357 | 804 | 6,347 | 5.300995 | 0.288557 | 0.021117 | 0.010324 | 0.011262 | 0.186767 | 0.147114 | 0.084702 | 0.062412 | 0.049273 | 0.049273 | 0 | 0.012234 | 0.201513 | 6,347 | 229 | 110 | 27.716157 | 0.828729 | 0.016071 | 0 | 0.144737 | 0 | 0 | 0.130909 | 0.004166 | 0 | 0 | 0 | 0 | 0 | 1 | 0.131579 | false | 0.125 | 0.111842 | 0 | 0.335526 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
adcb4497ce1ae012191c1c8f35996119cb7174cf | 1,059 | py | Python | python/python_challenge/25/25.py | yunyu2019/blog | e4dce66504ad9b9c16d8e40ef6dff92e17ad0af0 | [
"Apache-2.0"
] | null | null | null | python/python_challenge/25/25.py | yunyu2019/blog | e4dce66504ad9b9c16d8e40ef6dff92e17ad0af0 | [
"Apache-2.0"
] | null | null | null | python/python_challenge/25/25.py | yunyu2019/blog | e4dce66504ad9b9c16d8e40ef6dff92e17ad0af0 | [
"Apache-2.0"
] | null | null | null | #!/usr/bin/python
# -*- coding: utf-8 -*-
# @Date : 2016-05-17 16:36:18
# @Author : Yunyu2019 (yunyu2010@yeah.net)
# @Link : http://www.pythonchallenge.com/pc/hex/lake.html
import os
import wave
import time
import Image
import requests
def download(urls):
filename=os.path.basename(urls)
try:
req=requests.get(urls,auth=('butter','fly'))
fp=open(filename,'wb')
fp.write(req.content)
fp.close()
print 'download:%s' % filename
    except Exception:
print 'fail download:%s' % filename
def getdatas(files):
fp=open(files,'rb')
data=fp.read()[44:]
fp.close()
img=Image.new('RGB',(60,60))
img.fromstring(data)
return img
"""
for i in range(1,26):
urls='http://www.pythonchallenge.com/pc/hex/lake%s.wav' % i
download(urls)
time.sleep(1)
"""
img=Image.new('RGB',(300,300))
for i in range(25):
y,x=divmod(i,5)
files='lake{0}.wav'.format(i+1)
pices=getdatas(files)
img.paste(pices,(x*60,y*60))
img.save('lake.jpg') | 24.068182 | 64 | 0.586402 | 156 | 1,059 | 3.980769 | 0.557692 | 0.022544 | 0.070853 | 0.080515 | 0.109501 | 0.109501 | 0.109501 | 0 | 0 | 0 | 0 | 0.058968 | 0.23135 | 1,059 | 44 | 65 | 24.068182 | 0.703931 | 0.160529 | 0 | 0.068966 | 0 | 0 | 0.091292 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.172414 | null | null | 0.068966 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
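The final loop above places 25 tiles on a 5×5 grid; `divmod(i, 5)` converts a flat index into (row, column), which is worth seeing in isolation:

```python
# Flat index -> (row, col) on a 5-wide grid, as used for the 60x60 tiles above.
# In the script, y (row) is the quotient and x (col) the remainder, and the
# paste box is (x * 60, y * 60).
positions = [divmod(i, 5) for i in range(25)]
print(positions[0])   # (0, 0) -> paste at (0, 0)
print(positions[7])   # (1, 2) -> paste at (120, 60)
print(positions[24])  # (4, 4) -> paste at (240, 240)
```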
add2a5d0a4f10c67d63cf8daaaf3c6b9766d80f2 | 2,743 | py | Python | caloric_balance/test_main_getUserString.py | ankitsumitg/python-projects | 34a3df6fcd8544bf83aa9f3d47ec160e3838b1d1 | [
"MIT"
] | 1 | 2021-03-22T20:45:06.000Z | 2021-03-22T20:45:06.000Z | caloric_balance/test_main_getUserString.py | ankitsumitg/python-projects | 34a3df6fcd8544bf83aa9f3d47ec160e3838b1d1 | [
"MIT"
] | null | null | null | caloric_balance/test_main_getUserString.py | ankitsumitg/python-projects | 34a3df6fcd8544bf83aa9f3d47ec160e3838b1d1 | [
"MIT"
] | null | null | null | """
Do Not Edit this file. You may and are encouraged to look at it for reference.
"""
import sys
if sys.version_info.major != 3:
print('You must use Python 3.x version to run this unit test')
sys.exit(1)
import unittest
import main
class TestGetUserString(unittest.TestCase):
def input_replacement(self, prompt):
self.assertFalse(self.too_many_inputs)
self.input_given_prompt = prompt
r = self.input_response_list[self.input_response_index]
self.input_response_index += 1
if self.input_response_index >= len(self.input_response_list):
self.input_response_index = 0
self.too_many_inputs = True
return r
def print_replacement(self, *args, **kargs):
return
def setUp(self):
self.too_many_inputs = False
self.input_given_prompt = None
self.input_response_index = 0
self.input_response_list = [""]
main.input = self.input_replacement
main.print = self.print_replacement
return
def test001_getUserStringExists(self):
self.assertTrue('getUserString' in dir(main),
'Function "getUserString" is not defined, check your spelling')
return
def test002_getUserStringSendsCorrectPrompt(self):
from main import getUserString
expected_prompt = "HELLO"
expected_response = "WORLD"
self.input_response_list = [expected_response]
actual_response = getUserString(expected_prompt)
self.assertEqual(self.input_given_prompt, expected_prompt)
return
def test003_getUserStringGetsInput(self):
from main import getUserString
expected_prompt = "HELLO"
expected_response = "WORLD"
self.input_response_list = [expected_response]
actual_response = getUserString(expected_prompt)
self.assertEqual(actual_response, expected_response)
return
def test004_getUserStringStripsWhitespace(self):
from main import getUserString
expected_prompt = "HELLO"
expected_response = "WORLD"
self.input_response_list = [" \t\n" + expected_response + " \t\n"]
actual_response = getUserString(expected_prompt)
self.assertEqual(actual_response, expected_response)
return
def test005_getUserStringIgnoresBlankLines(self):
from main import getUserString
expected_prompt = "HELLO"
expected_response = "WORLD"
self.input_response_list = ["", "\n", " \t\n" + expected_response + " \t\n"]
actual_response = getUserString(expected_prompt)
self.assertEqual(actual_response, expected_response)
return
if __name__ == '__main__':
unittest.main()
| 33.048193 | 87 | 0.677725 | 303 | 2,743 | 5.867987 | 0.29703 | 0.08099 | 0.114736 | 0.082677 | 0.503937 | 0.503937 | 0.485939 | 0.485939 | 0.43757 | 0.43757 | 0 | 0.010125 | 0.243894 | 2,743 | 82 | 88 | 33.45122 | 0.847155 | 0.028436 | 0 | 0.46875 | 1 | 0 | 0.073767 | 0 | 0 | 0 | 0 | 0 | 0.09375 | 1 | 0.125 | false | 0 | 0.109375 | 0.015625 | 0.375 | 0.046875 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
add36b7db37e26004fe848c8316ce3ebda2a90b6 | 660 | py | Python | linked_list_reversal.py | Nikhilxavier/Linked-List | b934985d937edd2cd4a683d751a930aacd7f96bf | [
"BSD-3-Clause"
] | null | null | null | linked_list_reversal.py | Nikhilxavier/Linked-List | b934985d937edd2cd4a683d751a930aacd7f96bf | [
"BSD-3-Clause"
] | null | null | null | linked_list_reversal.py | Nikhilxavier/Linked-List | b934985d937edd2cd4a683d751a930aacd7f96bf | [
"BSD-3-Clause"
] | null | null | null | """
Implementation of Linked List reversal.
"""
# Author: Nikhil Xavier <nikhilxavier@yahoo.com>
# License: BSD 3 clause
class Node:
"""Node class for Singly Linked List."""
def __init__(self, value):
self.value = value
self.next_node = None
def reverse_linked_list(head):
"""Reverse linked list.
Returns reversed linked list head.
"""
current_node = head
previous_node = None
next_node = None
while current_node:
next_node = current_node.next_node
current_node.next_node = previous_node
previous_node = current_node
current_node = next_node
return previous_node
| 20 | 48 | 0.668182 | 82 | 660 | 5.109756 | 0.402439 | 0.114558 | 0.143198 | 0.181384 | 0.190931 | 0.136038 | 0.136038 | 0.136038 | 0 | 0 | 0 | 0.002028 | 0.25303 | 660 | 32 | 49 | 20.625 | 0.84787 | 0.304545 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.285714 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
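The iterative reversal in linked_list_reversal.py above can be exercised with a short self-contained demo (the Node class and function are restated here so the snippet runs on its own):

```python
class Node:
    """Node class for a singly linked list (mirrors the class above)."""
    def __init__(self, value):
        self.value = value
        self.next_node = None


def reverse_linked_list(head):
    """Reverse the list in place and return the new head."""
    current_node, previous_node = head, None
    while current_node:
        next_node = current_node.next_node      # remember the rest of the list
        current_node.next_node = previous_node  # flip the pointer
        previous_node = current_node
        current_node = next_node
    return previous_node


# Build 1 -> 2 -> 3, reverse it, and walk the result.
head = Node(1)
head.next_node = Node(2)
head.next_node.next_node = Node(3)

new_head = reverse_linked_list(head)
values = []
node = new_head
while node:
    values.append(node.value)
    node = node.next_node
print(values)  # [3, 2, 1]
```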
add8774dbb279519f1395cc74a9dc7957550b2a2 | 7,520 | py | Python | edg_core/test_simple_const_prop.py | tengisd/PolymorphicBlocks | 240a11f813762c4eb5a97c9d9766a0af19cd8f3a | [
"BSD-3-Clause"
] | null | null | null | edg_core/test_simple_const_prop.py | tengisd/PolymorphicBlocks | 240a11f813762c4eb5a97c9d9766a0af19cd8f3a | [
"BSD-3-Clause"
] | null | null | null | edg_core/test_simple_const_prop.py | tengisd/PolymorphicBlocks | 240a11f813762c4eb5a97c9d9766a0af19cd8f3a | [
"BSD-3-Clause"
] | null | null | null | import unittest
from . import *
from edg_core.ScalaCompilerInterface import ScalaCompiler
class TestConstPropInternal(Block):
def __init__(self) -> None:
super().__init__()
self.float_param = self.Parameter(FloatExpr())
self.range_param = self.Parameter(RangeExpr())
class TestParameterConstProp(Block):
def __init__(self) -> None:
super().__init__()
self.float_const = self.Parameter(FloatExpr())
self.float_param = self.Parameter(FloatExpr())
self.range_const = self.Parameter(RangeExpr())
self.range_param = self.Parameter(RangeExpr())
def contents(self):
self.assign(self.float_const, 2.0)
self.assign(self.float_param, self.float_const)
self.assign(self.range_const, Range(1.0, 42.0))
self.assign(self.range_param, self.range_const)
self.block = self.Block(TestConstPropInternal())
self.assign(self.block.float_param, self.float_param)
self.assign(self.block.range_param, self.range_param)
class ConstPropTestCase(unittest.TestCase):
def setUp(self) -> None:
self.compiled = ScalaCompiler.compile(TestParameterConstProp)
def test_float_prop(self) -> None:
self.assertEqual(self.compiled.get_value(['float_const']), 2.0)
self.assertEqual(self.compiled.get_value(['block', 'float_param']), 2.0)
def test_range_prop(self) -> None:
self.assertEqual(self.compiled.get_value(['range_const']), Range(1.0, 42.0))
self.assertEqual(self.compiled.get_value(['block', 'range_param']), Range(1.0, 42.0))
class TestPortConstPropLink(Link):
def __init__(self) -> None:
super().__init__()
self.a = self.Port(TestPortConstPropPort())
self.b = self.Port(TestPortConstPropPort())
self.assign(self.b.float_param, self.a.float_param) # first connected is source
class TestPortConstPropPort(Port[TestPortConstPropLink]):
def __init__(self) -> None:
super().__init__()
self.link_type = TestPortConstPropLink
self.float_param = self.Parameter(FloatExpr())
class TestPortConstPropInnerBlock(Block):
def __init__(self) -> None:
super().__init__()
self.port = self.Port(TestPortConstPropPort(), optional=True)
class TestPortConstPropOuterBlock(Block):
def __init__(self) -> None:
super().__init__()
self.inner = self.Block(TestPortConstPropInnerBlock())
self.port = self.Port(TestPortConstPropPort())
self.connect(self.inner.port, self.port)
class TestPortConstPropTopBlock(Block):
def __init__(self) -> None:
super().__init__()
self.block1 = self.Block(TestPortConstPropInnerBlock())
self.block2 = self.Block(TestPortConstPropOuterBlock())
self.link = self.connect(self.block1.port, self.block2.port)
self.assign(self.block1.port.float_param, 3.5)
class ConstPropPortTestCase(unittest.TestCase):
def setUp(self) -> None:
self.compiled = ScalaCompiler.compile(TestPortConstPropTopBlock)
def test_port_param_prop(self) -> None:
self.assertEqual(self.compiled.get_value(['block1', 'port', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['link', 'a', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['link', 'b', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['block2', 'port', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['block2', 'inner', 'port', 'float_param']), 3.5)
def test_connected_link(self) -> None:
self.assertEqual(self.compiled.get_value(['block1', 'port', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['block2', 'port', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['block2', 'inner', 'port', edgir.IS_CONNECTED]), True)
class TestDisconnectedTopBlock(Block):
def __init__(self) -> None:
super().__init__()
self.block1 = self.Block(TestPortConstPropInnerBlock())
self.assign(self.block1.port.float_param, 3.5)
class DisconnectedPortTestCase(unittest.TestCase):
def setUp(self) -> None:
self.compiled = ScalaCompiler.compile(TestDisconnectedTopBlock)
def test_disconnected_link(self) -> None:
self.assertEqual(self.compiled.get_value(['block1', 'port', edgir.IS_CONNECTED]), False)
class TestPortConstPropBundleLink(Link):
def __init__(self) -> None:
super().__init__()
self.a = self.Port(TestPortConstPropBundle())
self.b = self.Port(TestPortConstPropBundle())
self.elt1_link = self.connect(self.a.elt1, self.b.elt1)
self.elt2_link = self.connect(self.a.elt2, self.b.elt2)
class TestPortConstPropBundle(Bundle[TestPortConstPropBundleLink]):
def __init__(self) -> None:
super().__init__()
self.link_type = TestPortConstPropBundleLink
self.elt1 = self.Port(TestPortConstPropPort())
self.elt2 = self.Port(TestPortConstPropPort())
class TestPortConstPropBundleInnerBlock(Block):
def __init__(self) -> None:
super().__init__()
self.port = self.Port(TestPortConstPropBundle())
class TestPortConstPropBundleTopBlock(Block):
def __init__(self) -> None:
super().__init__()
def contents(self) -> None:
self.block1 = self.Block(TestPortConstPropBundleInnerBlock())
self.block2 = self.Block(TestPortConstPropBundleInnerBlock())
self.link = self.connect(self.block1.port, self.block2.port)
self.assign(self.block1.port.elt1.float_param, 3.5)
self.assign(self.block1.port.elt2.float_param, 6.0)
class ConstPropBundleTestCase(unittest.TestCase):
def setUp(self) -> None:
self.compiled = ScalaCompiler.compile(TestPortConstPropBundleTopBlock)
def test_port_param_prop(self) -> None:
self.assertEqual(self.compiled.get_value(['block1', 'port', 'elt1', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['block1', 'port', 'elt2', 'float_param']), 6.0)
self.assertEqual(self.compiled.get_value(['link', 'a', 'elt1', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['link', 'a', 'elt2', 'float_param']), 6.0)
self.assertEqual(self.compiled.get_value(['link', 'elt1_link', 'a', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['link', 'elt2_link', 'a', 'float_param']), 6.0)
self.assertEqual(self.compiled.get_value(['link', 'elt1_link', 'b', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['link', 'elt2_link', 'b', 'float_param']), 6.0)
self.assertEqual(self.compiled.get_value(['link', 'b', 'elt1', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['link', 'b', 'elt2', 'float_param']), 6.0)
self.assertEqual(self.compiled.get_value(['block2', 'port', 'elt1', 'float_param']), 3.5)
self.assertEqual(self.compiled.get_value(['block2', 'port', 'elt2', 'float_param']), 6.0)
def test_connected_link(self) -> None:
self.assertEqual(self.compiled.get_value(['block1', 'port', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['block2', 'port', edgir.IS_CONNECTED]), True)
# Note: inner ports IS_CONNECTED is not defined
self.assertEqual(self.compiled.get_value(['link', 'a', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['link', 'b', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['link', 'elt1_link', 'a', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['link', 'elt1_link', 'b', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['link', 'elt2_link', 'a', edgir.IS_CONNECTED]), True)
self.assertEqual(self.compiled.get_value(['link', 'elt2_link', 'b', edgir.IS_CONNECTED]), True)
| 38.367347 | 100 | 0.71742 | 937 | 7,520 | 5.528282 | 0.085379 | 0.085714 | 0.121042 | 0.172008 | 0.671236 | 0.637066 | 0.611004 | 0.600386 | 0.540154 | 0.489961 | 0 | 0.017411 | 0.121676 | 7,520 | 195 | 101 | 38.564103 | 0.766843 | 0.009441 | 0 | 0.350746 | 0 | 0 | 0.07588 | 0 | 0 | 0 | 0 | 0 | 0.246269 | 1 | 0.186567 | false | 0 | 0.022388 | 0 | 0.328358 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
addded900080d658ae63eb6fa0ec1ba46ebc0d17 | 492 | py | Python | setup.py | my-old-projects/syspy | f870adfa6d2839fa8c8c8d3a6c3bdcfb0c863c1d | [
"MIT"
] | null | null | null | setup.py | my-old-projects/syspy | f870adfa6d2839fa8c8c8d3a6c3bdcfb0c863c1d | [
"MIT"
] | null | null | null | setup.py | my-old-projects/syspy | f870adfa6d2839fa8c8c8d3a6c3bdcfb0c863c1d | [
"MIT"
] | 1 | 2020-11-21T10:09:30.000Z | 2020-11-21T10:09:30.000Z | from distutils.core import setup
setup(
name = 'syspy',
version = '0.2',
url = 'https://github.com/aligoren/syspy',
download_url = 'https://github.com/aligoren/syspy/archive/master.zip',
author = 'Ali GOREN <goren.ali@yandex.com>',
author_email = 'goren.ali@yandex.com',
license = 'Apache v2.0 License',
packages = ['syspy'],
description = 'Windows System Informations',
keywords = ['sys', 'util', 'system', 'info', 'information', 'windows', 'os'],
)
| 32.8 | 81 | 0.636179 | 59 | 492 | 5.271186 | 0.644068 | 0.051447 | 0.090032 | 0.109325 | 0.192926 | 0.192926 | 0 | 0 | 0 | 0 | 0 | 0.009926 | 0.180894 | 492 | 14 | 82 | 35.142857 | 0.761787 | 0 | 0 | 0 | 0 | 0 | 0.473577 | 0.044715 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.076923 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adde9423551c7dda2a8e059cbe593f3c2faacded | 1,638 | py | Python | DjangoECom/products/migrations/0003_auto_20210109_2256.py | MostafaSamyFayez/E-Commerce-Sys | 95ed3cb65b238866e336d43422dfb1737bfd7993 | [
"Unlicense"
] | 2 | 2021-04-01T00:23:44.000Z | 2021-04-01T00:23:48.000Z | DjangoECom/products/migrations/0003_auto_20210109_2256.py | MostafaSamyFayez/E-Commerce-Sys | 95ed3cb65b238866e336d43422dfb1737bfd7993 | [
"Unlicense"
] | null | null | null | DjangoECom/products/migrations/0003_auto_20210109_2256.py | MostafaSamyFayez/E-Commerce-Sys | 95ed3cb65b238866e336d43422dfb1737bfd7993 | [
"Unlicense"
] | 1 | 2021-01-23T13:06:25.000Z | 2021-01-23T13:06:25.000Z | # Generated by Django 3.1.4 on 2021-01-09 20:56
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('products', '0002_auto_20210102_1247'),
]
operations = [
migrations.RemoveField(
model_name='product',
name='review',
),
migrations.AddField(
model_name='product',
name='total_review',
field=models.FloatField(default=0),
),
migrations.CreateModel(
name='Review',
fields=[
('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')),
('comment', models.TextField(blank=True, max_length=250)),
('status', models.CharField(choices=[('True', 'True'), ('False', 'False'), ('New', 'New')], default='New', max_length=20)),
('subject', models.CharField(blank=True, max_length=50)),
('ip', models.CharField(blank=True, max_length=20)),
('rate', models.IntegerField(default=1)),
('create_at', models.DateTimeField(auto_now_add=True)),
('update_at', models.DateTimeField(auto_now=True)),
('product', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='products.product')),
('user', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)),
],
),
]
| 39.95122 | 139 | 0.59707 | 172 | 1,638 | 5.540698 | 0.465116 | 0.033578 | 0.044071 | 0.069255 | 0.243442 | 0.18468 | 0.115425 | 0.115425 | 0.115425 | 0.115425 | 0 | 0.034539 | 0.257631 | 1,638 | 40 | 140 | 40.95 | 0.749178 | 0.027473 | 0 | 0.147059 | 1 | 0 | 0.10748 | 0.014456 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.088235 | 0 | 0.176471 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ade1b9af12ca82c1c26ca0644c6d5b378d6445ab | 5,496 | py | Python | build.py | niklas2902/py4godot | bf50624d1fc94b55faf82a3a4d322e33fbba60ce | [
"MIT"
] | 2 | 2021-12-10T21:17:57.000Z | 2021-12-17T18:54:49.000Z | build.py | niklas2902/py4godot | bf50624d1fc94b55faf82a3a4d322e33fbba60ce | [
"MIT"
] | 9 | 2021-12-21T18:35:28.000Z | 2022-03-27T20:03:50.000Z | build.py | niklas2902/py4godot | bf50624d1fc94b55faf82a3a4d322e33fbba60ce | [
"MIT"
] | 1 | 2022-03-07T08:06:57.000Z | 2022-03-07T08:06:57.000Z | import argparse
import os
import subprocess
import time
from Cython.Build import cythonize
import generate_bindings
from meson_scripts import copy_tools, download_python, generate_init_files, \
locations, platform_check, generate_godot, \
download_godot
generate_bindings.build()
def cythonize_files():
module = cythonize('py4godot/core/*/*.pyx', language_level=3)
module += cythonize("py4godot/classes/*.pyx", language_level=3)
module += cythonize("py4godot/utils/*.pyx", language_level=3)
module += cythonize("py4godot/pluginscript_api/*.pyx", language_level=3)
module += cythonize("py4godot/pluginscript_api/*/*.pyx", language_level=3)
module += cythonize("py4godot/pluginscript_api/*/*/*.pyx", language_level=3)
module += cythonize("py4godot/pluginscript_api/*/*/*/*.pyx", language_level=3)
module += cythonize("py4godot/gdnative_api/*.pyx", language_level=3)
module += cythonize("py4godot/enums/*.pyx", language_level=3)
module += cythonize("py4godot/events/*.pyx", language_level=3)
def compile_python_ver_file(platform):
"""compile python file, to find the matching python version"""
python_dir = locations.get_python_dir(platform)
godot_dir = locations.get_godot_dir(platform)
with open("platforms/binary_dirs/python_ver_temp.cross", "r") as python_temp:
file_string = python_temp.read()
# Replacing things like in a template
file_string = file_string.replace("{python_ver}", python_dir)
file_string = file_string.replace("{godot}", godot_dir)
with open("platforms/binary_dirs/python_ver_compile.cross", "w") as python_compile:
python_compile.write(file_string)
def get_compiler():
compiler_res = subprocess.run("vcvarsall", shell=True, stdout=subprocess.DEVNULL,
stderr=subprocess.STDOUT)
if compiler_res.returncode == 0:
return "msvc"
compiler_res = subprocess.run("gcc --version", shell=True, stdout=subprocess.DEVNULL, stderr=subprocess.STDOUT)
if compiler_res.returncode == 0:
return "gcc"
raise Exception("No compiler found")
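get_compiler above probes for a toolchain by running a command and inspecting its exit status. The same pattern in isolation (the command strings here are just examples):

```python
import subprocess

def command_available(cmd: str) -> bool:
    """Return True if `cmd` exits with status 0, discarding all of its output."""
    result = subprocess.run(cmd, shell=True,
                            stdout=subprocess.DEVNULL,
                            stderr=subprocess.STDOUT)
    return result.returncode == 0

print(command_available("echo ok"))  # True wherever a shell provides echo
print(command_available("surely-not-a-real-command-xyz"))  # False
```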
current_platform = platform_check.get_platform()
command_separator = "&"
if "linux" in current_platform:
command_separator = ";"
my_parser = argparse.ArgumentParser(fromfile_prefix_chars='@')
my_parser.add_argument('--compiler',
help='specify the compiler, you want to use to compile')
my_parser.add_argument('--target_platform',
help='specify the platform, you want to go build for')
my_parser.add_argument("-run_tests", help="should tests be run", default="False")
my_parser.add_argument("-download_godot", help="should the godot binary be downloaded", default="False")
# Execute parse_args()
args = my_parser.parse_args()
# Determining if tests should be run
should_run_tests = args.run_tests.lower() == "true"
# Determining if godot binary should be downloaded
should_download_godot = args.download_godot.lower() == "true"
build_dir = f"build_meson/{args.target_platform}"
start = time.time()
if args.compiler is None:
print("Checking for compilers")
args.compiler = get_compiler()
print(f"Got compiler:{args.compiler}")
cythonize_files()
# loading the needed python files for the target platform
download_python.download_file(args.target_platform, allow_copy=True)
# download the needed python files for the current platform
download_python.download_file(current_platform, allow_copy=False)
compile_python_ver_file(current_platform)
# initialize the MSVC environment when it is the chosen compiler (TODO: should be improved at some point)
msvc_init = f"vcvarsall.bat x86_amd64 {command_separator} cl {command_separator} " if "msvc" in args.compiler else ""
res = subprocess.Popen(msvc_init +
f"meson {build_dir} --cross-file platforms/{args.target_platform}.cross "
f"--cross-file platforms/compilers/{args.compiler}_compiler.native "
f"--cross-file platforms/binary_dirs/python_ver_compile.cross "
f"--buildtype=release {'--wipe' if os.path.isdir(build_dir) else ''}"
f"{command_separator} ninja -C build_meson/{args.target_platform}",
shell=True)
res.wait()
copy_tools.run(args.target_platform)
generate_init_files.create_init_file(args.target_platform)
copy_tools.copy_main(args.target_platform)
generate_godot.generate_lib(args.target_platform)
generate_godot.generate_gdignore()
print("=================================Build finished==================================")
print("Build took:", time.time() - start, "seconds")
if should_download_godot:
print("=================================Start download==================================")
download_godot.run(current_platform)
    print("=================================Finish download==================================")
# running tests
if should_run_tests:
print("=================================Start tests==================================")
start = time.time()
copy_tools.copy_tests(args.target_platform)
res = subprocess.Popen(
f"ninja -C build_meson/{args.target_platform} test", shell=True)
res.wait()
    rc = res.returncode
print("=================================Build finished==================================")
print("Running tests took:", time.time() - start, "seconds")
if rc != 0:
raise Exception("Tests failed")
| 41.014925 | 121 | 0.670852 | 660 | 5,496 | 5.365152 | 0.248485 | 0.047444 | 0.064953 | 0.048009 | 0.331262 | 0.275911 | 0.239198 | 0.133013 | 0.12087 | 0.12087 | 0 | 0.006046 | 0.157387 | 5,496 | 133 | 122 | 41.323308 | 0.758583 | 0.072962 | 0 | 0.083333 | 1 | 0 | 0.319166 | 0.196379 | 0 | 0 | 0 | 0.007519 | 0 | 1 | 0.03125 | false | 0 | 0.072917 | 0 | 0.125 | 0.09375 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
ade5f5e2cfd5b36a0ae20ab962012600912b9fde | 1,337 | py | Python | backoffice/web/companies/serializers.py | uktrade/trade-access-program | 8fb565e96de7d7bb0bde31255aef0f291063e93c | [
"MIT"
] | 1 | 2021-03-04T15:24:12.000Z | 2021-03-04T15:24:12.000Z | backoffice/web/companies/serializers.py | uktrade/trade-access-program | 8fb565e96de7d7bb0bde31255aef0f291063e93c | [
"MIT"
] | 7 | 2020-08-24T13:27:02.000Z | 2021-06-09T18:42:31.000Z | backoffice/web/companies/serializers.py | uktrade/trade-access-program | 8fb565e96de7d7bb0bde31255aef0f291063e93c | [
"MIT"
] | 1 | 2021-05-20T07:40:00.000Z | 2021-05-20T07:40:00.000Z | from rest_framework import serializers
from web.companies.models import Company, DnbGetCompanyResponse
class DnbGetCompanyResponseSerializer(serializers.ModelSerializer):
class Meta:
model = DnbGetCompanyResponse
fields = ['id', 'company', 'dnb_data', 'registration_number', 'company_address']
class CompanyReadSerializer(serializers.ModelSerializer):
dnb_get_company_responses = DnbGetCompanyResponseSerializer(many=True)
class Meta:
model = Company
fields = '__all__'
class CompanyWriteSerializer(serializers.ModelSerializer):
class Meta:
model = Company
fields = '__all__'
class SearchCompaniesSerializer(serializers.Serializer):
search_term = serializers.CharField(min_length=2, max_length=60, required=False)
primary_name = serializers.CharField(min_length=2, max_length=60, required=False)
registration_numbers = serializers.ListField(
child=serializers.CharField(min_length=1, max_length=60), min_length=1, required=False
)
duns_number = serializers.CharField(required=False)
def validate(self, attrs):
attrs = super().validate(attrs)
if not any(field in attrs for field in self.fields):
raise serializers.ValidationError(f"One of: {', '.join(self.fields)} is required.")
return attrs
| 32.609756 | 95 | 0.732236 | 140 | 1,337 | 6.807143 | 0.478571 | 0.083945 | 0.044071 | 0.091291 | 0.256034 | 0.186779 | 0.186779 | 0.113326 | 0.113326 | 0.113326 | 0 | 0.009099 | 0.17801 | 1,337 | 40 | 96 | 33.425 | 0.858053 | 0 | 0 | 0.259259 | 0 | 0 | 0.082274 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.037037 | false | 0 | 0.074074 | 0 | 0.592593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
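The at-least-one-field rule in SearchCompaniesSerializer.validate above can be checked in isolation; a framework-free sketch of the same guard (the helper name is ours, not part of the original module):

```python
def require_any_field(attrs: dict, allowed: set) -> dict:
    """Raise ValueError unless at least one of the allowed fields is present."""
    if not any(field in attrs for field in allowed):
        raise ValueError(f"One of: {', '.join(sorted(allowed))} is required.")
    return attrs

allowed = {'search_term', 'primary_name', 'registration_numbers', 'duns_number'}
print(require_any_field({'search_term': 'acme'}, allowed))  # {'search_term': 'acme'}
```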
adf16cd52d01a8a986221348b55fd5f12bd7c761 | 826 | py | Python | test_day01.py | clfs/aoc2019 | 940fdbdd7bbb69c4a3a6c947c37bf7b60a201e88 | [
"MIT"
] | null | null | null | test_day01.py | clfs/aoc2019 | 940fdbdd7bbb69c4a3a6c947c37bf7b60a201e88 | [
"MIT"
] | null | null | null | test_day01.py | clfs/aoc2019 | 940fdbdd7bbb69c4a3a6c947c37bf7b60a201e88 | [
"MIT"
] | null | null | null | def fuel_required(weight: int) -> int:
return weight // 3 - 2
def fuel_required_accurate(weight: int) -> int:
fuel = 0
while weight > 0:
weight = max(0, weight // 3 - 2)
fuel += weight
return fuel
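The loop above terminates because the weight strictly shrinks on every pass; a self-contained worked example (restating the function so the snippet runs on its own):

```python
def fuel_required_accurate(weight: int) -> int:
    """Keep applying weight // 3 - 2 until no positive fuel is left."""
    fuel = 0
    while weight > 0:
        weight = max(0, weight // 3 - 2)
        fuel += weight
    return fuel

# For a module of mass 1969: 654 + 216 + 70 + 21 + 5 = 966
print(fuel_required_accurate(1969))  # 966
```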
def test_fuel_required() -> None:
cases = [(12, 2), (14, 2), (1969, 654), (100756, 33583)]
for x, y in cases:
assert fuel_required(x) == y
def test_fuel_required_accurate() -> None:
cases = [(14, 2), (1969, 966), (100756, 50346)]
for x, y in cases:
assert fuel_required_accurate(x) == y
def test_solutions() -> None:
with open("input/01.txt") as f:
modules = [int(line) for line in f]
part_1 = sum(map(fuel_required, modules))
part_2 = sum(map(fuel_required_accurate, modules))
assert part_1 == 3375962
assert part_2 == 5061072
| 25.030303 | 60 | 0.605327 | 121 | 826 | 3.975207 | 0.363636 | 0.199584 | 0.16632 | 0.079002 | 0.12474 | 0.12474 | 0.12474 | 0.12474 | 0 | 0 | 0 | 0.117264 | 0.256659 | 826 | 32 | 61 | 25.8125 | 0.666124 | 0 | 0 | 0.086957 | 0 | 0 | 0.014528 | 0 | 0 | 0 | 0 | 0 | 0.173913 | 1 | 0.217391 | false | 0 | 0 | 0.043478 | 0.304348 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adf4bfc1de4b0c89ad4020a842811facae16d88a | 499 | py | Python | Easy/After 157/175.Modified Kaprekar Numbers.py | sherryx080/CPTango | c7491156202fa7517c96b96dab27c867b949bb63 | [
"MIT"
] | null | null | null | Easy/After 157/175.Modified Kaprekar Numbers.py | sherryx080/CPTango | c7491156202fa7517c96b96dab27c867b949bb63 | [
"MIT"
] | null | null | null | Easy/After 157/175.Modified Kaprekar Numbers.py | sherryx080/CPTango | c7491156202fa7517c96b96dab27c867b949bb63 | [
"MIT"
] | null | null | null | import sys
p = int(sys.stdin.readline())
q = int(sys.stdin.readline())
result = []
for i in range(p, q+1):
    square = i * i
    digits = str(square)
    if square >= 10:
        l_num = int(digits[:len(digits)//2])
        r_num = int(digits[len(digits)//2:])
    else:
        # single-digit squares (i = 1, 2, 3): the left half is empty, so treat it as 0
        l_num = 0
        r_num = square
    if l_num + r_num == i:
        result.append(i)
if len(result)==0:
print("INVALID RANGE")
else:
for i in result:
print(i,end=" ") | 20.791667 | 49 | 0.54509 | 83 | 499 | 3.216867 | 0.373494 | 0.104869 | 0.164794 | 0.179775 | 0.307116 | 0.164794 | 0.164794 | 0 | 0 | 0 | 0 | 0.02381 | 0.242485 | 499 | 24 | 50 | 20.791667 | 0.68254 | 0.104208 | 0 | 0 | 0 | 0 | 0.03139 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0.111111 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
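The half-split check in the script above can be factored into a standalone predicate, which makes the empty-left-half case explicit (a sketch for illustration, not part of the original script):

```python
def is_kaprekar(n: int) -> bool:
    """True if the two halves of n*n (right half keeping the extra digit) sum to n."""
    digits = str(n * n)
    split = len(digits) // 2
    left = int(digits[:split]) if digits[:split] else 0
    right = int(digits[split:])
    return left + right == n

print([n for n in range(1, 100) if is_kaprekar(n)])  # [1, 9, 45, 55, 99]
```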
adf797c04c5912626a52449a6704952383c10673 | 451 | py | Python | login/migrations/0003_user_token.py | yuxiaoYX/xiaoshuo | 5652703521aa99774e8e0667c5e6b9f24a6d90ac | [
"MIT"
] | null | null | null | login/migrations/0003_user_token.py | yuxiaoYX/xiaoshuo | 5652703521aa99774e8e0667c5e6b9f24a6d90ac | [
"MIT"
] | null | null | null | login/migrations/0003_user_token.py | yuxiaoYX/xiaoshuo | 5652703521aa99774e8e0667c5e6b9f24a6d90ac | [
"MIT"
] | null | null | null | # Generated by Django 2.2.1 on 2019-07-28 08:12
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('login', '0002_auto_20190720_1846'),
]
operations = [
migrations.AddField(
model_name='user',
name='token',
            field=models.CharField(default=1, max_length=100, verbose_name='token verification'),
preserve_default=False,
),
]
| 22.55 | 86 | 0.605322 | 50 | 451 | 5.32 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107692 | 0.279379 | 451 | 19 | 87 | 23.736842 | 0.710769 | 0.099778 | 0 | 0 | 1 | 0 | 0.108911 | 0.056931 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.076923 | 0 | 0.307692 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
adfa006844d8a11fe82113a8c42d8b2a11a4d2bc | 10,777 | py | Python | usersystem/views.py | sergioruizdavila/asanni-backend | b2da3f3a97dbd1ef46d65f13ee9b2098124d4fc4 | [
"MIT"
] | 8 | 2018-05-24T04:46:58.000Z | 2021-06-11T04:41:49.000Z | usersystem/views.py | sergioruizdavila/asanni-backend | b2da3f3a97dbd1ef46d65f13ee9b2098124d4fc4 | [
"MIT"
] | null | null | null | usersystem/views.py | sergioruizdavila/asanni-backend | b2da3f3a97dbd1ef46d65f13ee9b2098124d4fc4 | [
"MIT"
] | 4 | 2020-01-24T13:35:42.000Z | 2021-06-15T07:38:06.000Z | from allauth.account.utils import setup_user_email, send_email_confirmation
from rest_framework.response import Response
from usersystem.serializers import UserSerializer, UserRegisterSerializer
from rest_framework.views import APIView
from rest_framework.status import HTTP_200_OK, HTTP_400_BAD_REQUEST, HTTP_201_CREATED, HTTP_404_NOT_FOUND
from rest_framework.permissions import AllowAny
from django.contrib.auth.models import User
from usersystem.settings import PASSWORD_MAX_LENGTH, PASSWORD_MIN_LENGTH, LOCAL_OAUTH2_KEY
import requests as makerequest
from usersystem.secrets import SOCIAL_AUTH_GOOGLE_OAUTH2_KEY, SOCIAL_AUTH_GOOGLE_OAUTH2_SECRET
from social.apps.django_app.default.models import UserSocialAuth
class AccountView(APIView):
"""
An API endpoint for managing the current user.
GET returns basic information about the current user.
POST expects at least one of 'email', 'first_name' or 'last_name' fields.
DELETE deletes the current user.
"""
def get(self, request):
serializer = UserSerializer(request.user, context={'request': request})
return Response(serializer.data)
def post(self, request):
if not request.data:
return Response(status=HTTP_400_BAD_REQUEST)
serializer = UserSerializer(data=request.data, partial=True)
# Return a 400 response if the data was invalid.
serializer.is_valid(raise_exception=True)
request.user.email = serializer.validated_data.get(
'email', request.user.email)
request.user.first_name = serializer.validated_data.get(
'first_name', request.user.first_name)
request.user.last_name = serializer.validated_data.get(
'last_name', request.user.last_name)
request.user.save()
return Response(status=HTTP_200_OK)
def delete(self, request):
# If this is a Google social account, revoke its Google tokens
socAuth = next(
iter(UserSocialAuth.get_social_auth_for_user(request.user)), None)
if socAuth and socAuth.provider == 'google-oauth2':
refresh_token = socAuth.extra_data.get(
'refresh_token', socAuth.extra_data['access_token'])
makerequest.post(
'https://accounts.google.com/o/oauth2/revoke?token=' + refresh_token)
request.user.delete()
return Response(status=HTTP_200_OK)
class AccountUsernameView(APIView):
"""
    A simple API endpoint for looking up a username by email.
    POST must contain an 'email' field; returns 400 if the field is missing, otherwise 200 with 'userExist' (and 'username' when a match exists).
"""
permission_classes = (AllowAny,)
def post(self, request):
email = request.data.get('email', None)
if email is None:
return Response({"message": "'email' field is missing"}, status=HTTP_400_BAD_REQUEST)
try:
data = User.objects.get(email=email)
except User.DoesNotExist:
data = None
if data:
return Response({"userExist": True, "username": data.username}, status=HTTP_200_OK)
return Response({"userExist": False}, status=HTTP_200_OK)
class AccountPasswordView(APIView):
"""
An API endpoint for password management (for the current user)
GET returns 200 if user has a password or 404 otherwise
    POST must contain a 'newPassword' field (and 'oldPassword' if the user already has a password)
"""
def post(self, request):
newpass = request.data.get('newPassword', None)
if newpass is None:
return Response({"message": "Missing 'newPassword' field"}, status=HTTP_400_BAD_REQUEST)
if len(newpass) < PASSWORD_MIN_LENGTH or len(newpass) > PASSWORD_MAX_LENGTH:
return Response({"message": "New password doesn't match length requirements"}, status=HTTP_400_BAD_REQUEST)
if request.user.has_usable_password():
oldpass = request.data.get('oldPassword', None)
if oldpass is None:
return Response({"message": "Missing 'oldPassword' field"}, status=HTTP_400_BAD_REQUEST)
if not request.user.check_password(oldpass):
return Response({"message": "'oldPassword' is invalid"}, status=HTTP_400_BAD_REQUEST)
if oldpass == newpass:
return Response({"message": "oldPassword and newPassword are identical"}, status=HTTP_400_BAD_REQUEST)
request.user.set_password(newpass)
request.user.save()
return Response(status=HTTP_200_OK)
def get(self, request):
if request.user.has_usable_password():
return Response(status=HTTP_200_OK)
return Response(status=HTTP_404_NOT_FOUND)
class AccountSocialView(APIView):
"""
    A simple API endpoint for checking whether the user has a connected social account.
    GET returns 200 and the name of the social auth provider if the user has a connected social account, or 404 otherwise.
"""
def get(self, request):
socAuth = next(
iter(UserSocialAuth.get_social_auth_for_user(request.user)), None)
if not socAuth:
return Response(status=HTTP_404_NOT_FOUND)
else:
return Response({"social_provider": socAuth.provider}, status=HTTP_200_OK)
class RegisterView(APIView):
"""
An API endpoint for user registration.
POST must contain 'username', 'email', 'first_name', 'last_name' and 'password' fields.
"""
permission_classes = (AllowAny,)
def post(self, request):
serializer = UserRegisterSerializer(
data=request.data, context={'request': request})
# Return a 400 response if the data was invalid.
serializer.is_valid(raise_exception=True)
validated_data = serializer.validated_data
user = User.objects.create(
username=validated_data['username'],
email=validated_data['email'],
first_name=validated_data['first_name'],
last_name=validated_data['last_name']
)
user.set_password(validated_data['password'])
user.save()
setup_user_email(request, user, [])
# send_email_confirmation(request, user, signup=True)
return Response(status=HTTP_201_CREATED)
class RegisterCheckEmailView(APIView):
"""
    A simple API endpoint for checking if a user with a given email exists.
POST must contain 'email' field. Server returns 400 if email is already used or 200 otherwise.
"""
permission_classes = (AllowAny,)
def post(self, request):
email = request.data.get('email', None)
if email is None:
return Response({"message": "'email' field is missing"}, status=HTTP_400_BAD_REQUEST)
if User.objects.filter(email=email):
return Response({"emailExist": True}, status=HTTP_400_BAD_REQUEST)
return Response(status=HTTP_200_OK)
class RegisterCheckUsernameView(APIView):
"""
    An API endpoint for checking if a username is taken.
    POST must contain a 'username' field. Server returns 400 if the username is already used, or 200 if it is available.
"""
permission_classes = (AllowAny,)
def post(self, request):
username = request.data.get('username', None)
if username is None:
return Response({"message": "'username' field is missing"}, status=HTTP_400_BAD_REQUEST)
if User.objects.filter(username=username).exists():
return Response(status=HTTP_400_BAD_REQUEST)
return Response(status=HTTP_200_OK)
class GoogleAuthCodeView(APIView):
"""
An API endpoint which expects a google auth code, which is then used for social login.
POST must contain a 'code' field with the authorization code. This code is
exchanged for google's access and refresh tokens, which are stored on server.
Afterwards local access and refresh tokens are generated and returned, which are
then used to communicate with our API.
Go to https://developers.google.com/identity/sign-in/web/server-side-flow
for more information on the google server-side auth flow implemented here.
"""
permission_classes = (AllowAny,)
def post(self, request):
code = request.data.get('code', None)
if not code:
return Response({"message": "Authorization code missing"}, status=HTTP_400_BAD_REQUEST)
# Exchange auth code for tokens
googleurl = 'https://accounts.google.com/o/oauth2/token'
exchangeCodeRequest = makerequest.post(
googleurl,
data={
'code': code,
'redirect_uri': 'postmessage',
'client_id': SOCIAL_AUTH_GOOGLE_OAUTH2_KEY,
'client_secret': SOCIAL_AUTH_GOOGLE_OAUTH2_SECRET,
'grant_type': 'authorization_code'
})
# We can now exchange the external token for a token linked to *OUR*
# OAuth2 provider
exchangeExternalTokenUrl = 'http://' + \
request.META['HTTP_HOST'] + '/social-auth/convert-token'
externalToken = exchangeCodeRequest.json().get('access_token', None)
if externalToken is None:
return Response({"message": "Server could not retrieve external tokens"}, status=HTTP_400_BAD_REQUEST)
exchangeExternalTokenRequest = makerequest.post(exchangeExternalTokenUrl, data={
'grant_type': 'convert_token',
'client_id': LOCAL_OAUTH2_KEY,
'backend': 'google-oauth2',
'token': externalToken}
)
# Get user and add exchangeCodeRequest's (Google's) refresh token to UserSocialAuth extra_data
# This is a bit hacky, @TODO use python-social-auth's pipeline
# mechanism instead
if exchangeExternalTokenRequest.status_code != makerequest.codes.ok:
# If the social account's email is already used in another account,
# throw an error
return Response({"message": "User with that email already exists!"}, status=HTTP_400_BAD_REQUEST)
getUserUrl = 'http://' + request.META['HTTP_HOST'] + '/account/'
getUserRequest = makerequest.get(getUserUrl, data={}, headers={
'Authorization': 'Bearer ' + exchangeExternalTokenRequest.json()['access_token']})
refreshToken = exchangeCodeRequest.json().get('refresh_token', None)
if refreshToken is not None:
user = User.objects.all().filter(
username=getUserRequest.json()['username'])[0]
userSocial = user.social_auth.get(provider='google-oauth2')
userSocial.extra_data['refresh_token'] = refreshToken
userSocial.save()
return Response(exchangeExternalTokenRequest.json())
| 39.47619 | 119 | 0.675049 | 1,284 | 10,777 | 5.516355 | 0.195483 | 0.055344 | 0.021177 | 0.036002 | 0.354087 | 0.26373 | 0.211351 | 0.152337 | 0.141607 | 0.141607 | 0 | 0.016842 | 0.234202 | 10,777 | 272 | 120 | 39.621324 | 0.841391 | 0.216943 | 0 | 0.262821 | 0 | 0 | 0.12315 | 0.003155 | 0 | 0 | 0 | 0.003676 | 0 | 1 | 0.070513 | false | 0.115385 | 0.070513 | 0 | 0.403846 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
adfb427ed9c8aea967913aed0f52cd6bbf4bc8fe | 1,661 | py | Python | tests/libtests/geocoords/data/ConvertDataApp.py | jedbrown/spatialdata | f18d34d92253986e8018f393201bf901e9667c2a | [
"MIT"
] | null | null | null | tests/libtests/geocoords/data/ConvertDataApp.py | jedbrown/spatialdata | f18d34d92253986e8018f393201bf901e9667c2a | [
"MIT"
] | null | null | null | tests/libtests/geocoords/data/ConvertDataApp.py | jedbrown/spatialdata | f18d34d92253986e8018f393201bf901e9667c2a | [
"MIT"
] | null | null | null | #!/usr/bin/env python
#
# ======================================================================
#
# Brad T. Aagaard, U.S. Geological Survey
#
# This code was developed as part of the Computational Infrastructure
# for Geodynamics (http://geodynamics.org).
#
# Copyright (c) 2010-2017 University of California, Davis
#
# See COPYING for license information.
#
# ======================================================================
#
## @file geocoords/tests/libtests/data/ConvertDataApp.py
## @brief Python application to generate data for coordinate conversion tests.
from pyre.applications.Script import Script
# ConvertDataApp class
class ConvertDataApp(Script):
"""Python application to generate data for coordinate conversion tests."""
def main(self, *args, **kwds):
"""Run application."""
data = self.inventory.data
data.calculate()
data.dump(self.inventory.dumper)
return
def __init__(self):
"""Constructor."""
Script.__init__(self, 'convertdataapp')
return
class Inventory(Script.Inventory):
## @class Inventory
## Python object for managing ConvertDataApp facilities and properties.
##
## \b Properties
## @li None
##
## \b Facilities
## @li \b data Data generator for coordinate transformation test
## @li \b dumper Dump data to file
import pyre.inventory
from spatialdata.utils.CppData import CppData
from ConvertData import ConvertData
data = pyre.inventory.facility('data', factory=ConvertData)
dumper = pyre.inventory.facility('dumper', factory=CppData)
# main
if __name__ == '__main__':
app = ConvertDataApp()
app.run()
# End of file
| 27.229508 | 78 | 0.64419 | 178 | 1,661 | 5.921348 | 0.5 | 0.037002 | 0.036053 | 0.051233 | 0.111954 | 0.111954 | 0.111954 | 0.111954 | 0.111954 | 0 | 0 | 0.00581 | 0.170981 | 1,661 | 60 | 79 | 27.683333 | 0.759622 | 0.53522 | 0 | 0.105263 | 0 | 0 | 0.044693 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.105263 | false | 0 | 0.210526 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
adfb5713c6c3ab922ee55b1c8b7f49f69297f607 | 381 | py | Python | index.py | FunctionX/validator_queries | 842d0a75ee07f48b972d1bb18d292cadc730fa8b | [
"MIT"
] | null | null | null | index.py | FunctionX/validator_queries | 842d0a75ee07f48b972d1bb18d292cadc730fa8b | [
"MIT"
] | null | null | null | index.py | FunctionX/validator_queries | 842d0a75ee07f48b972d1bb18d292cadc730fa8b | [
"MIT"
] | null | null | null | import subprocess
import json
import csv
from csv import DictWriter
import datetime
import pandas as pd
import Cmd
import Data
from Report import Report
import File
def main():
Data.val_earnings_w_sum_columns()
dataframe = Data.get_val_token_info()
dataframe.to_csv(File._generate_file_name("fxcored_status"), index=False)
if __name__ == '__main__':
main()
| 14.111111 | 77 | 0.76378 | 55 | 381 | 4.927273 | 0.6 | 0.088561 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.165354 | 381 | 26 | 78 | 14.653846 | 0.852201 | 0 | 0 | 0 | 0 | 0 | 0.057743 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.625 | 0 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
adfd589bdaa5f2bed3080ab8be5e570ad08ea48a | 1,457 | py | Python | sistemas_lineares.py | lucaspompeun/metodos-matematicos-aplicados-nas-engenharias-via-sistemas-computacionais | 008d397f76a935af1aba530cc0134b9dd326d3ac | [
"MIT"
] | 16 | 2019-09-27T03:08:44.000Z | 2020-10-16T18:43:45.000Z | primeira-edicao/sistemas_lineares.py | gm2sc-ifpa/metodos-matematicos-aplicados-nas-engenharias-via-sistemas-computacionais-master | f435c366e08dc14b0557f2172ad3b841ddb7ef2e | [
"MIT"
] | null | null | null | primeira-edicao/sistemas_lineares.py | gm2sc-ifpa/metodos-matematicos-aplicados-nas-engenharias-via-sistemas-computacionais-master | f435c366e08dc14b0557f2172ad3b841ddb7ef2e | [
"MIT"
] | 5 | 2019-09-13T20:00:38.000Z | 2020-09-19T03:04:00.000Z | # -*- coding: utf-8 -*-
"""
Created on Wed Mar 27 18:19:25 2019
INSTITUTO FEDERAL DE EDUCAÇÃO, CIÊNCIA E TECNOLOGIA DO PÁRA - IFPA ANANINDEUA
@author:
Prof. Dr. Denis C. L. Costa
Students:
Heictor Alves de Oliveira Costa
Lucas Pompeu Neves
Research Group:
Gradiente de Modelagem Matemática e
Simulação Computacional - GM²SC
Subject:
Solving Linear Systems
Script name: sistemas_lineares
Available at:
https://github.com/GM2SC/DEVELOPMENT-OF-MATHEMATICAL-METHODS-IN-
COMPUTATIONAL-ENVIRONMENT/blob/master/SINEPEM_2019/sistemas_lineares.py
"""
# Library: numpy
import numpy as np
print('')
print('=======================================')
# Solving Linear Systems
print('Solving Linear Systems')
print('')
# Declaring the coefficient matrix: A
A = np.array([[1,1,1], [1,-1,-1], [2,-1,1]])
print('Coefficient Matrix:')
print('A =',"\n", A,"\n")
# Declaring the matrix of independent terms: B
B = np.array([[6], [-4], [1]])
print('Matrix of Independent Terms:')
print('B =',"\n", B,"\n")
# Solution matrix: X = inv(A)*B
X = np.linalg.solve(A, B)
print('Solution matrix:')
print('X =')
print(X)
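# --- Added check (not part of the original script): verify the solution ---
# Substituting X back into the system; np.allclose confirms that A @ X
# reproduces B up to floating-point tolerance.
print('Verification A @ X == B:', np.allclose(A @ X, B))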
print('')
print('=======================================')
print(' ---> Fim do Programa sistemas_lineares <---') | 26.981481 | 78 | 0.572409 | 175 | 1,457 | 4.742857 | 0.537143 | 0.115663 | 0.014458 | 0.09759 | 0.090361 | 0.083133 | 0 | 0 | 0 | 0 | 0 | 0.028105 | 0.242965 | 1,457 | 54 | 79 | 26.981481 | 0.724388 | 0.589568 | 0 | 0.277778 | 0 | 0 | 0.451128 | 0.146617 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.055556 | 0 | 0.055556 | 0.777778 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
adfdf98fe1afda40d1e086b86ceaf5056842822c | 1,956 | py | Python | observations/r/unemp_dur.py | hajime9652/observations | 2c8b1ac31025938cb17762e540f2f592e302d5de | [
"Apache-2.0"
] | 199 | 2017-07-24T01:34:27.000Z | 2022-01-29T00:50:55.000Z | observations/r/unemp_dur.py | hajime9652/observations | 2c8b1ac31025938cb17762e540f2f592e302d5de | [
"Apache-2.0"
] | 46 | 2017-09-05T19:27:20.000Z | 2019-01-07T09:47:26.000Z | observations/r/unemp_dur.py | hajime9652/observations | 2c8b1ac31025938cb17762e540f2f592e302d5de | [
"Apache-2.0"
] | 45 | 2017-07-26T00:10:44.000Z | 2022-03-16T20:44:59.000Z | # -*- coding: utf-8 -*-
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import csv
import numpy as np
import os
import sys
from observations.util import maybe_download_and_extract
def unemp_dur(path):
"""Unemployment Duration
Journal of Business Economics and Statistics web site :
http://amstat.tandfonline.com/loi/ubes20
*number of observations* : 3343
A time serie containing :
spell
length of spell in number of two-week intervals
censor1
= 1 if re-employed at full-time job
censor2
= 1 if re-employed at part-time job
censor3
1 if re-employed but left job: pt-ft status unknown
censor4
1 if still jobless
age
age
ui
= 1 if filed UI claim
reprate
eligible replacement rate
disrate
eligible disregard rate
logwage
log weekly earnings in lost job (1985\\$)
tenure
years tenure in lost job
McCall, B.P. (1996) “Unemployment Insurance Rules, Joblessness, and
Part-time Work”, *Econometrica*, **64**, 647–682.
Args:
path: str.
Path to directory which either stores file or otherwise file will
be downloaded and extracted there.
Filename is `unemp_dur.csv`.
Returns:
Tuple of np.ndarray `x_train` with 3343 rows and 11 columns and
dictionary `metadata` of column headers (feature names).
"""
import pandas as pd
path = os.path.expanduser(path)
filename = 'unemp_dur.csv'
if not os.path.exists(os.path.join(path, filename)):
url = 'http://dustintran.com/data/r/Ecdat/UnempDur.csv'
maybe_download_and_extract(path, url,
save_file_name='unemp_dur.csv',
resume=False)
data = pd.read_csv(os.path.join(path, filename), index_col=0,
parse_dates=True)
x_train = data.values
metadata = {'columns': data.columns}
return x_train, metadata
| 22.744186 | 71 | 0.674847 | 273 | 1,956 | 4.721612 | 0.589744 | 0.011637 | 0.037238 | 0.030256 | 0.057409 | 0 | 0 | 0 | 0 | 0 | 0 | 0.026549 | 0.248978 | 1,956 | 85 | 72 | 23.011765 | 0.850238 | 0.554192 | 0 | 0 | 0 | 0 | 0.101266 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.045455 | false | 0 | 0.409091 | 0 | 0.5 | 0.045455 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
bc0762cc61dc9010fbb5ce8a4cca396976aadaf0 | 800 | py | Python | tests/strangenames.py | DasSkelett/AVC-VersionFileValidator | f31bab0cf5e273cbb675ffaf921741c32b3a2e15 | [
"MIT"
] | 2 | 2019-12-18T16:34:06.000Z | 2020-03-13T03:31:26.000Z | tests/strangenames.py | DasSkelett/AVC-VersionFileValidator | f31bab0cf5e273cbb675ffaf921741c32b3a2e15 | [
"MIT"
] | 4 | 2019-12-22T18:40:31.000Z | 2020-05-07T00:52:48.000Z | tests/strangenames.py | DasSkelett/AVC-VersionFileValidator | f31bab0cf5e273cbb675ffaf921741c32b3a2e15 | [
"MIT"
] | null | null | null | import os
from pathlib import Path
from unittest import TestCase
import validator.validator as validator
from .test_utils import schema, build_map
class TestStrangeNames(TestCase):
old_cwd = os.getcwd()
@classmethod
def setUpClass(cls):
os.chdir('./tests/workspaces/strange-names')
@classmethod
def tearDownClass(cls):
os.chdir(cls.old_cwd)
def test_findsAll(self):
(status, successful, failed, ignored) = validator.validate_cwd('', schema, build_map)
self.assertEqual(status, 1)
self.assertSetEqual(successful, {Path('CAPS.VERSION')})
self.assertSetEqual(failed, {Path('camelCaseVersionMissing.Version')})
# Make sure 'not-detected.version.json' has not been detected.
self.assertSetEqual(ignored, set())
| 29.62963 | 93 | 0.70125 | 92 | 800 | 6.021739 | 0.543478 | 0.097473 | 0.050542 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001543 | 0.19 | 800 | 26 | 94 | 30.769231 | 0.853395 | 0.075 | 0 | 0.105263 | 0 | 0 | 0.101626 | 0.085366 | 0 | 0 | 0 | 0 | 0.210526 | 1 | 0.157895 | false | 0 | 0.263158 | 0 | 0.526316 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
bc0cf17e742ee5f0e9c7dd32d45b474ef9bbc8ae | 15,924 | py | Python | util/approximate/embedding_interpolator/old/v1_numpy.py | tchlux/util | eff37464c7e913377398025adf76b057f9630b35 | [
"MIT"
] | 4 | 2021-04-22T20:19:40.000Z | 2022-01-30T18:57:23.000Z | util/approximate/embedding_interpolator/old/v1_numpy.py | tchlux/util | eff37464c7e913377398025adf76b057f9630b35 | [
"MIT"
] | 1 | 2022-01-24T14:10:27.000Z | 2022-01-30T16:42:53.000Z | util/approximate/embedding_interpolator/old/v1_numpy.py | tchlux/util | eff37464c7e913377398025adf76b057f9630b35 | [
"MIT"
] | 2 | 2019-05-19T07:44:28.000Z | 2021-04-22T20:20:40.000Z | from numpy import zeros, ones, dot, sum, abs, max, argmax, clip, \
random, prod, asarray, set_printoptions, unravel_index
# Helpers for generating zero, random normal, random uniform [0,1], and random integer arrays.
def zero(*shape): return zeros(shape)
def randnorm(*shape): return random.normal(size=shape)
def randuni(*shape): return random.random(size=shape)
def randint(*shape, min=-3, max=9):
data = asarray(random.randint(min+1,max+1,size=shape), dtype=float)
data[data <= 0] -= 1
return data
# Build a model given four integers:
# di - dimension of input
# ds - dimension for each internal state
# ns - number of internal states
# do - dimension of output
def build_model(di, ds, ns, do):
# Use random normal vectors.
input_params = randnorm(di+1, ds)
internal_params = randnorm(ds+1, ds, ns-1)
output_params = randnorm(ds+1, do)
# Normalize the length of all random normal vectors for input.
input_params[:,:] /= ((input_params[:,:]**2).sum(axis=0))**(1/2)
internal_params[:,:,:] /= ((internal_params[:,:,:]**2).sum(axis=0))**(1/2)
# Set the bias values.
input_params[-1,:] = 1
internal_params[-1,:,:] = 0
# Set the scratch space for storing internal values to zero.
internal_values = zero(ds, ns)
return input_params, internal_params, output_params, internal_values
# Get the shape of a model (when provided the arrays).
def get_shape(*model):
di, ds = model[0].shape
di -= 1
ns = model[1].shape[-1] + 1
do = model[2].shape[-1]
return di, ds, ns, do
# Function for pushing values forward through a dense MLP.
def forward(inputs, input_params, internal_params, output_params, internal_values, display=False):
di, ds, ns, do = get_shape(input_params, internal_params, output_params)
# Compute the input layer.
internal_values[:,0] = clip(dot(inputs, input_params[:di,:]) +
input_params[di,:], 0.0, float('inf'))
if display:
print("^"*70)
print("input: ",inputs)
print()
for n in range(ds):
print(f"0.{n} ", input_params[:di,n], '+', input_params[di,n], '=', internal_values[n,0])
print(" 0 out ", internal_values[:,0])
# Compute the next set of internal values with a rectified activation.
for i in range(ns-1):
internal_values[:,i+1] = internal_params[ds,:,i] + \
dot(internal_values[:,i],
internal_params[:ds,:,i])
if display:
print()
for n in range(ds):
print(f"{i+1}.{n} ", internal_params[:ds,n,i], '+', internal_params[ds:ds+1,n,i], '=', internal_values[n,i+1])
internal_values[:,i+1] = clip(internal_values[:,i+1], 0.0, float('inf'))
if display: print(f" {i+1} out ", internal_values[:,i+1])
# Compute the output.
output = dot(internal_values[:,ns-1], output_params[:ds]) + output_params[ds]
if display:
print()
for n in range(do):
print(f"{ns}.{n} ", output_params[:ds,n],'+', output_params[ds,n], '=', output[n])
print(f" {ns} out ", output[:])
print()
print("output:", output)
print("_"*70)
return output
# Compute the gradient with respect to all parameters via backpropagation.
def gradient(grad, inputs, *model, display=False):
# Get the model shape.
di, ds, ns, do = get_shape(*model)
# Initialize storage for the gradients.
input_grad = zeros(model[0].shape)
internal_grad = zeros(model[1].shape)
output_grad = ones(model[2].shape)
# Retrieve the model parameters.
internal_params = model[1]
output_params = model[2]
# Retrieve the internal values of the model (after executing forwards).
internal_values = model[-1]
# Compute the gradient of the last parameters.
nonzero = internal_values[:,-1].nonzero()
output_grad[ds,:] = grad[:]
for i in range(do):
output_grad[:ds,i] = internal_values[:,-1] * grad[i]
internal_values[nonzero,-1] = dot(output_params[:ds,:][nonzero], grad)
if display:
print("^"*70)
print("Output grad:")
print("",output_grad.T)
print("",nonzero, internal_values[:,-1])
# Compute the gradient of all internal parameters.
for i in range(ns-2,-1,-1):
# Compute the gradient for all weights.
# set the bias gradient.
internal_grad[ds,:,i] = internal_values[:,i+1]
# set the gradient for each column of connections
# (to a single output in next layer).
nonzero = internal_values[:,i].nonzero()
for j in range(ds):
if (internal_values[j,i+1] == 0): continue
internal_grad[:ds,j,i][nonzero] = internal_values[nonzero,i] * internal_values[j,i+1]
if display:
print(f"layer {i} -> {i+1}, output node {j}")
print(" ",internal_grad[:,j,i])
# Compute the next preceding layer of internal values.
internal_values[nonzero,i] = dot(internal_params[:ds,:,i][nonzero], internal_values[:,i+1])
if display:
print("Grads for next layer:")
print("",nonzero, internal_values[:,i])
# Compute the gradient for the input parameters.
input_grad[di,:] = internal_values[:,0]
for i in range(ds):
input_grad[:di,i] = inputs[:] * internal_values[i,0]
if display:
print("Input grad:")
print(input_grad.T)
print("_"*70)
# Return the gradients.
return input_grad, internal_grad, output_grad
# Compute the gradient with respect to all parameters using finite differences.
def finite_difference(inputs, *model, diff=0.0001, display=False):
# Shift matrices (used for computing finite differences).
input_shift = zeros(model[0].shape)
internal_shift = zeros(model[1].shape)
output_shift = zeros(model[2].shape)
# Function for producing the shifted model.
shifted_model = lambda: (model[0]+input_shift, model[1]+internal_shift, model[2]+output_shift, model[3])
# Gradient matrices.
input_grad = zeros(model[0].shape)
internal_grad = zeros(model[1].shape)
output_grad = zeros(model[2].shape)
# Total number of outputs.
output_shape = forward(inputs, *model).shape
num_outputs = prod(output_shape)
# Compute the expected set of nonzero internal activations.
forward(inputs, *model)
expected_nonzero = tuple(model[-1].nonzero()[0])
# Function for measuring the effect that a shift in one layer's parameters has on the output.
def measure_layer(layer, grad, shift, name):
for j in range(layer.size):
curr_idx = unravel_index(j, layer.shape)
shift[curr_idx] = diff/2
out_high = forward(inputs, *shifted_model())[out_index]
nonzero_high = tuple(model[3].nonzero()[0])
shift[curr_idx] = -diff/2
out_low = forward(inputs, *shifted_model())[out_index]
nonzero_low = tuple(model[3].nonzero()[0])
shift[curr_idx] = 0
# If a zero became nonzero (or vice versa), then the
# finite difference approximation is unstable.
if ((len(nonzero_high) <= len(expected_nonzero)) and
(len(nonzero_low) <= len(expected_nonzero))):
# Compute the gradient
grad[curr_idx] += sum(out_high - out_low) / diff
if display:
print(f"{name:14s}{str(curr_idx):10s} {grad[curr_idx]: .3f}")
print(f" {float(out_high)}")
print(f" {float(out_low)}")
print(f" {float(diff)}")
# Display information.
if display:
print("^"*70)
print("shifted_model: ",[v.shape for v in shifted_model()])
print("output shape, size: ", output_shape, num_outputs)
# Cycle over each output.
for i in range(num_outputs):
out_index = unravel_index(i, output_shape)
if display: print("out_index: ",out_index)
# Cycle over all model parameters, testing effect on output.
# input layer
measure_layer(model[0], input_grad, input_shift, "input idx:")
# internal layers
measure_layer(model[1], internal_grad, internal_shift, "internal idx:")
# output layer
measure_layer(model[2], output_grad, output_shift, "output idx:")
if display: print("_"*70)
# Done computing finite difference gradient!
return input_grad, internal_grad, output_grad
def test():
print("Testing..")
di_vals = (1,2,3)
ds_vals = (1,2,3)
ns_vals = (1,2,3)
do_vals = (1,2,3)
seeds = list(range(5))
# Cycle all combination of tests.
from itertools import product
for (di, ds, ns, do, seed) in product(di_vals, ds_vals, ns_vals, do_vals, seeds):
# --------------------------------------------------------------------
# di - dimension of input
# ds - dimension for each internal state
# ns - number of internal states
# do - dimension of output
# --------------------------------------------------------------------
random.seed(seed)
# Create the model.
#
model = build_model(di, ds, ns, do)
# Call the "forward" function.
# inputs = randuni(di)
inputs = randuni(di)
# inputs = randint(di)
# Run the model forward to compute internal values.
output = forward(inputs, *model, display=False)
# Compute the gradients with a finite difference.
approx_model_grad = finite_difference(inputs, *model, display=False)
# Run the model again (fresh) to get the "internal values".
output = forward(inputs, *model, display=False)
# Print the model gradients that were directly computed.
model_grad = gradient(ones(do), inputs, *model)
# Check the correctness of the gradient function.
for i,(app, true) in enumerate(zip(approx_model_grad, model_grad)):
diff = (abs(app - true) / (abs(true) + 1)).T
# Skip "internal params" if that is empty.
if (len(diff) == 0): continue
# Check for the difference.
if (max(diff) > .01):
set_printoptions(precision=3, sign=" ")
print()
print("ERROR ON TEST")
print(" seed =",seed)
print()
print("di, ds, ns, do: ",di, ds, ns, do)
print("input_params: ",model[0].shape)
print("internal_params: ",model[1].shape)
print("output_params: ",model[2].shape)
print("internal_values: ",model[3].shape)
print()
# forward(inputs, *model, display=True)
finite_difference(inputs, *model, display=True)
print()
print("model[0]:")
print(model[0].T)
print()
print("model[1]:")
print(model[1].T)
print()
print("model[2]:")
print(model[2].T)
print()
print("internals:")
print(model[-1].T)
print()
print()
print("approx_model_grad[0]:")
print(approx_model_grad[0].T)
print()
print("approx_model_grad[1]:")
print(approx_model_grad[1].T)
print()
print("approx_model_grad[2]:")
print(approx_model_grad[2].T)
print()
print()
print("model_grad[0]:")
print(model_grad[0].T)
print()
print("model_grad[1]:")
print(model_grad[1].T)
print()
print("model_grad[2]:")
print(model_grad[2].T)
print()
print()
print("Phase",i,"(0 = input, 1 = internal, 2 = output)")
print("",max(diff))
print("",unravel_index(argmax(diff), diff.shape))
print()
print("Finite differene gradient:")
print(app.T)
print()
print("Directly computed gradient:")
print(true.T)
print()
print("Difference")
print(diff)
print()
print("ERROR ON TEST")
exit()
print(" all passed!")
if __name__ == "__main__":
test()
class NN:
def __init__(self, di, do, ds=16, ns=4):
self.di = di
self.ds = ds
self.ns = ns
self.do = do
self.model = list(build_model(di, ds, ns, do))
def fit(self, x, y, steps=1000, step_factor=0.01, display=False,
show=False, **kwargs):
# Make sure that the given data is the right shape.
assert (self.di == x.shape[-1])
assert (self.do == y.shape[-1])
if (show and (self.do == 1) and (self.di == 1)):
show_interval = max([1, steps // 100])
from util.plot import Plot
p = Plot()
p.add("Data", *(x.T), y.flatten(), group='d', frame=-1)
p.add_func("Model", self, [x.min(), x.max()], group='m', frame=-1)
loss_values = []
# For the number of training steps..
for s in range(steps):
if (not s%10): print(s, end="\r")
if (show): loss_values.append( ((y - self(x))**2).sum()**(1/2) )
grads = [zeros(l.shape) for l in self.model]
# Average gradient from all data points.
for i, (d_in, d_out) in enumerate(zip(x,y)):
m_out = forward(d_in, *self.model, display=False)
loss_grad = m_out - d_out
grad_step = gradient(loss_grad, d_in, *self.model, display=False)
# Dynamically update the average (of the gradients).
for j in range(len(grad_step)):
grads[j] += (grad_step[j] - grads[j]) / (i+1)
if display:
yhat = self(x).reshape(y.shape)
loss = ((y - yhat)**2).sum(axis=-1).mean()
# Take a step in the gradient direction.
for j in range(len(grads)):
self.model[j] -= grads[j] * step_factor
# Display progress.
if display:
print()
print("Step:", s)
print("loss:", loss)
print("model:")
for l in self.model[:-1]:
print("",l.T)
print("grads: ")
for l in grads[:-1]:
print("",-l.T)
print()
# Update the model plot, if appropriate.
if (show and (s%show_interval == 0)):
p.add("Data", *(x.T), y.flatten(), group='d', frame=s)
p.add_func("Model", self, [x.min(), x.max()], group='m', frame=s)
# Add the last frame, if it wasn't already added.
if (show):
print(" showing plot..")
# Show the plot of the model.
p.show(show=False)
p = Plot("","Step","Loss value")
p.add("Loss", list(range(len(loss_values))), loss_values,
mode="markers+lines", color=1)
p.show(append=True, show_legend=False)
# Return predictions for new data.
def predict(self, x):
if (len(x.shape) == 2):
outputs = []
for x_in in x:
outputs.append( forward(x_in, *self.model)[0] )
return asarray(outputs)
else: return forward(x, *self.model)
# Wrapper for the "__call__".
def __call__(self, *args):
return self.predict(*args)
| 41.46875 | 129 | 0.543582 | 1,997 | 15,924 | 4.216825 | 0.140711 | 0.058188 | 0.021613 | 0.00855 | 0.266002 | 0.190714 | 0.143926 | 0.105688 | 0.068638 | 0.068638 | 0 | 0.017682 | 0.314557 | 15,924 | 383 | 130 | 41.577024 | 0.753825 | 0.191723 | 0 | 0.193662 | 1 | 0 | 0.068866 | 0.007191 | 0 | 0 | 0 | 0 | 0.007042 | 1 | 0.052817 | false | 0.003521 | 0.010563 | 0.014085 | 0.09507 | 0.373239 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc101f9477ced02fc73e4aa485f705e9bede840d | 1,429 | py | Python | sample-apps/data-loader/app.py | jkylling/fdb-kubernetes-operator | b6c2d18d7841e789bd33bc1e05ed449e7468fabd | [
"Apache-2.0"
] | null | null | null | sample-apps/data-loader/app.py | jkylling/fdb-kubernetes-operator | b6c2d18d7841e789bd33bc1e05ed449e7468fabd | [
"Apache-2.0"
] | null | null | null | sample-apps/data-loader/app.py | jkylling/fdb-kubernetes-operator | b6c2d18d7841e789bd33bc1e05ed449e7468fabd | [
"Apache-2.0"
] | null | null | null | #! /usr/bin/python
'''
This file provides a sample app for loading data into FDB.
To use it to load data into one of the sample clusters in this repo,
you can build the image by running `docker build -t fdb-data-loader sample-apps/data-loader`,
and then run the data loader by running `kubectl apply -f sample-apps/data-loader/job.yaml`
'''
import argparse
import random
import uuid
import fdb
fdb.api_version(600)
@fdb.transactional
def write_batch(tr, batch_size, value_size):
prefix = uuid.uuid4()
for index in range(1, batch_size+1):
key = fdb.tuple.pack((prefix, index))
value = []
for _ in range(0, value_size):
value.append(random.randint(0, 255))
tr[key] = bytes(value)
pass
def load_data(keys, batch_size, value_size):
batch_count = int(keys / batch_size)
db = fdb.open()
for batch in range(1, batch_count+1):
print('Writing batch %d' % batch)
write_batch(db, batch_size, value_size)
pass
if __name__ == '__main__':
parser = argparse.ArgumentParser(description="Load random data into FDB")
parser.add_argument('--keys', type=int, help='Number of keys to generate', default=100000)
parser.add_argument('--batch-size', type=int, help='Number of keys to write in each transaction', default=10)
parser.add_argument('--value-size', type=int, help='Number of bytes to include in each value', default=1000)
args = parser.parse_args()
load_data(args.keys, args.batch_size, args.value_size)
| 30.404255 | 110 | 0.73478 | 234 | 1,429 | 4.354701 | 0.423077 | 0.061825 | 0.041217 | 0.052993 | 0.075564 | 0.075564 | 0.049068 | 0 | 0 | 0 | 0 | 0.020425 | 0.143457 | 1,429 | 46 | 111 | 31.065217 | 0.812092 | 0.23233 | 0 | 0.068966 | 0 | 0 | 0.172635 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.068966 | false | 0.068966 | 0.137931 | 0 | 0.206897 | 0.034483 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
bc10b3a754b1e07f5391ee346cb0afda2918b4d2 | 1,580 | py | Python | data_files/PROGRAMS/MEDIUM/0018_4Sum.py | sudhirrd007/LeetCode-scraper | de87ff17fff2c73e67392321df1107cce7cbf883 | [
"MIT"
] | null | null | null | data_files/PROGRAMS/MEDIUM/0018_4Sum.py | sudhirrd007/LeetCode-scraper | de87ff17fff2c73e67392321df1107cce7cbf883 | [
"MIT"
] | null | null | null | data_files/PROGRAMS/MEDIUM/0018_4Sum.py | sudhirrd007/LeetCode-scraper | de87ff17fff2c73e67392321df1107cce7cbf883 | [
"MIT"
] | null | null | null | # ID : 18
# Title : 4Sum
# Difficulty : MEDIUM
# Acceptance_rate : 35.2%
# Runtime : 72 ms
# Memory : 12.7 MB
# Tags : Array , Hash Table , Two Pointers
# Language : python3
# Problem_link : https://leetcode.com/problems/4sum
# Premium : 0
# Notes : -
###
from typing import List

def fourSum(self, nums: List[int], target: int) -> List[List[int]]:
nums.sort()
L = len(nums)
ans = []
if(L > 2):
last = nums[-1]
else:
return []
for i in range(L-3):
if(i>0 and nums[i]==nums[i-1] or nums[i] + 3*last < target):
continue
if(4*nums[i] > target):
break
for j in range(i+1, L-2):
if(j>i+1 and nums[j] == nums[j-1] or nums[i]+nums[j]+2*last < target):
continue
if(nums[i]+3*nums[j] > target):
break
temp = nums[i] + nums[j]
start,end = j+1, L-1
while(start < end):
t = temp + nums[start] + nums[end]
if(t < target):
start += 1
elif(t > target):
end -= 1
else:
if([nums[i], nums[j], nums[start], nums[end]] not in ans):
ans.append([nums[i], nums[j], nums[start], nums[end]])
while(start < end and nums[start]==nums[start+1]):
start += 1
start += 1
end -= 1
return ans
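# --- Added example (not part of the original solution): a quick check ---
# on a classic input; the hypothetical None stands in for `self` since the
# method is defined at module level here.
if __name__ == '__main__':
    print(fourSum(None, [1, 0, -1, 0, -2, 2], 0))
    # -> [[-2, -1, 1, 2], [-2, 0, 0, 2], [-1, 0, 0, 1]]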
| 31.6 | 86 | 0.401899 | 192 | 1,580 | 3.296875 | 0.359375 | 0.07109 | 0.07109 | 0.063191 | 0.044234 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04108 | 0.460759 | 1,580 | 49 | 87 | 32.244898 | 0.701878 | 0.144937 | 0 | 0.323529 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0 | null | null | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc16f12ef99fb925404a57659bcb97a3596fb611 | 886 | py | Python | django/store/views.py | brickfaced/django-ecommerce | e06c69f4cc1f57ac2acffa0baf1c48fa99fede13 | [
"MIT"
] | 24 | 2021-03-10T17:04:46.000Z | 2022-02-28T20:09:52.000Z | nextdrf/django/store/views.py | RJ-0605/YT_NextJS_DRF_Ecommerce_2021_Part1 | 4416c2168143aa71ba863d61efb71c5712dbd215 | [
"MIT"
] | 1 | 2021-04-07T19:57:10.000Z | 2021-04-07T20:37:03.000Z | nextdrf/django/store/views.py | RJ-0605/YT_NextJS_DRF_Ecommerce_2021_Part1 | 4416c2168143aa71ba863d61efb71c5712dbd215 | [
"MIT"
] | 18 | 2021-03-21T09:03:40.000Z | 2022-01-31T10:08:24.000Z | from django.shortcuts import render
from rest_framework import generics
from . import models
from .models import Category, Product
from .serializers import CategorySerializer, ProductSerializer
class ProductListView(generics.ListAPIView):
queryset = Product.objects.all()
serializer_class = ProductSerializer
class Product(generics.RetrieveAPIView):
lookup_field = "slug"
queryset = Product.objects.all()
serializer_class = ProductSerializer
class CategoryItemView(generics.ListAPIView):
serializer_class = ProductSerializer
def get_queryset(self):
return models.Product.objects.filter(
category__in=Category.objects.get(slug=self.kwargs["slug"]).get_descendants(include_self=True)
)
class CategoryListView(generics.ListAPIView):
queryset = Category.objects.filter(level=1)
serializer_class = CategorySerializer
| 27.6875 | 106 | 0.772009 | 91 | 886 | 7.395604 | 0.428571 | 0.089153 | 0.142645 | 0.074294 | 0.18425 | 0.18425 | 0.18425 | 0.18425 | 0 | 0 | 0 | 0.00133 | 0.151242 | 886 | 31 | 107 | 28.580645 | 0.893617 | 0 | 0 | 0.238095 | 0 | 0 | 0.009029 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.047619 | false | 0 | 0.238095 | 0.047619 | 0.904762 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
bc1a89242c0c01e6a959ddfab279551c0ad333eb | 831 | py | Python | 6. Algorithms - Graph Traversal/3 - GraphTraversal-DFS.py | PacktPublishing/Data-Structures-and-Algorithms-The-Complete-Masterclass | a4f34ecaa23682df62763e3b6b54709383f45c0b | [
"MIT"
] | 25 | 2021-01-20T19:09:05.000Z | 2022-02-02T01:29:46.000Z | 6. Algorithms - Graph Traversal/3 - GraphTraversal-DFS.py | EmmaMuhleman1/Data-Structures-and-Algorithms-The-Complete-Masterclass | a4f34ecaa23682df62763e3b6b54709383f45c0b | [
"MIT"
] | null | null | null | 6. Algorithms - Graph Traversal/3 - GraphTraversal-DFS.py | EmmaMuhleman1/Data-Structures-and-Algorithms-The-Complete-Masterclass | a4f34ecaa23682df62763e3b6b54709383f45c0b | [
"MIT"
] | 23 | 2021-01-20T19:09:12.000Z | 2022-03-23T02:50:05.000Z | class Node():
def __init__(self, value):
self.value = value
self.adjacentlist = []
self.visited = False
class Graph():
def DFS(self, node, traversal):
node.visited = True
traversal.append(node.value)
for element in node.adjacentlist:
if element.visited is False:
self.DFS(element, traversal)
return traversal
node1 = Node("A")
node2 = Node("B")
node3 = Node("C")
node4 = Node("D")
node5 = Node("E")
node6 = Node("F")
node7 = Node("G")
node8 = Node("H")
node1.adjacentlist.append(node2)
node1.adjacentlist.append(node3)
node1.adjacentlist.append(node4)
node2.adjacentlist.append(node5)
node2.adjacentlist.append(node6)
node4.adjacentlist.append(node7)
node6.adjacentlist.append(node8)
graph = Graph()
print(graph.DFS(node1, [])) | 21.868421 | 44 | 0.649819 | 102 | 831 | 5.254902 | 0.362745 | 0.235075 | 0.128731 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.035061 | 0.21059 | 831 | 38 | 45 | 21.868421 | 0.782012 | 0 | 0 | 0 | 0 | 0 | 0.009615 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.066667 | false | 0 | 0 | 0 | 0.166667 | 0.033333 | 0 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
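For comparison, the recursive DFS above can be written iteratively with an explicit stack, which sidesteps Python's recursion limit on deep graphs — a sketch reusing the same `Node` shape:

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.adjacentlist = []
        self.visited = False

def dfs_iterative(start):
    """Depth-first traversal with an explicit stack instead of recursion."""
    traversal, stack = [], [start]
    while stack:
        node = stack.pop()
        if node.visited:
            continue
        node.visited = True
        traversal.append(node.value)
        # push neighbours reversed so the first neighbour is expanded first
        stack.extend(reversed(node.adjacentlist))
    return traversal

a, b, c = Node("A"), Node("B"), Node("C")
a.adjacentlist = [b, c]
print(dfs_iterative(a))  # ['A', 'B', 'C']
```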
bc1e78e2d936299d31ed164ae95b1d514a32eb51 | 221 | py | Python | homeassistant/components/ridwell/const.py | MrDelik/core | 93a66cc357b226389967668441000498a10453bb | [
"Apache-2.0"
] | 30,023 | 2016-04-13T10:17:53.000Z | 2020-03-02T12:56:31.000Z | homeassistant/components/ridwell/const.py | MrDelik/core | 93a66cc357b226389967668441000498a10453bb | [
"Apache-2.0"
] | 31,101 | 2020-03-02T13:00:16.000Z | 2022-03-31T23:57:36.000Z | homeassistant/components/ridwell/const.py | MrDelik/core | 93a66cc357b226389967668441000498a10453bb | [
"Apache-2.0"
] | 11,956 | 2016-04-13T18:42:31.000Z | 2020-03-02T09:32:12.000Z | """Constants for the Ridwell integration."""
import logging
DOMAIN = "ridwell"
LOGGER = logging.getLogger(__package__)
DATA_ACCOUNT = "account"
DATA_COORDINATOR = "coordinator"
SENSOR_TYPE_NEXT_PICKUP = "next_pickup"
| 18.416667 | 44 | 0.778281 | 25 | 221 | 6.48 | 0.72 | 0.123457 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117647 | 221 | 11 | 45 | 20.090909 | 0.830769 | 0.171946 | 0 | 0 | 0 | 0 | 0.20339 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc1fee8e7b2dd05b18c6fd6cd75205806000b3bf | 4,716 | py | Python | neuron_simulator_service/SAC_network/stimulus.py | jpm343/RetinaX | 63f84209b4f8bdcdc88f35c54a03e7b7c56f4ab3 | [
"MIT"
] | 2 | 2020-12-06T23:04:39.000Z | 2020-12-29T18:28:20.000Z | neuron_simulator_service/SAC_network/stimulus.py | jpm343/RetinaX | 63f84209b4f8bdcdc88f35c54a03e7b7c56f4ab3 | [
"MIT"
] | null | null | null | neuron_simulator_service/SAC_network/stimulus.py | jpm343/RetinaX | 63f84209b4f8bdcdc88f35c54a03e7b7c56f4ab3 | [
"MIT"
] | 1 | 2020-10-23T19:40:59.000Z | 2020-10-23T19:40:59.000Z | from __future__ import division
import numpy as np
# Set bar stimuli speed
def update_bar_speed(BPsyn, delay, width, speed, d_init,
synapse_type="alphaCSyn", angle=0):
    print("Updating bar speed to: %f mm/s" % speed)
angrad = angle * np.pi / 180.0
angcos = np.cos(angrad)
angsin = np.sin(angrad)
for BPi in BPsyn:
# Note that initial bar position, d_init, should be as far as to
# ensure all BP are activated
# (xprime, yprime): rotated and translated axes centered on the bar
# xprime: axis of bar, yprime: normal to bar
# xprime = BPi[1] * angsin - BPi[2] * angcos
yprime = BPi[1] * angcos + BPi[2] * angsin + d_init
synapse_onset = delay + yprime / speed
if ((speed > 0 and yprime < (0 - width)) or
(speed < 0 and yprime > 0)):
# Bar won't pass over BP location (yprime)
deactivate_BP_synapse(BPi, synapse_type, synapse_onset)
continue
duration = None
if synapse_type == "BPexc":
duration = abs(width / speed)
activate_BP_synapse(BPi, synapse_type, synapse_onset, duration)
return BPsyn
def set_stimulus(BPsyn, stimulus_type, delay, synapse_type, **kwargs):
if stimulus_type == "bar":
width = kwargs['bar_width']
speed = kwargs['bar_speed']
x_init = kwargs['bar_x_init']
if 'bar_angle' in kwargs:
update_bar_speed(BPsyn, delay, width, speed, x_init, synapse_type,
kwargs['bar_angle'])
else:
update_bar_speed(BPsyn, delay, width, speed, x_init, synapse_type)
elif stimulus_type == "annulus":
center = kwargs['center']
ri = kwargs['inner_diam']
ro = kwargs['outer_diam']
dur = kwargs['duration']
        print("Setting up annulus stimulus with delay: %1.1f (ms)" % delay)
for BPi in BPsyn:
if in_annulus((BPi[1], BPi[2]), center, ri, ro):
activate_BP_synapse(BPi, synapse_type, delay, dur)
elif stimulus_type == "grating":
width = kwargs['bar_width']
speed = kwargs['bar_speed']
x_init = kwargs['bar_x_init']
x_freq = kwargs['spatial_freq']
N_bars = kwargs['N_bars']
dur = width / speed
period = x_freq / speed
for BPi in BPsyn:
            if BPi[1] < (x_init - width):  # Grating won't pass over BP location
continue
synapse_onset = delay + (BPi[1] - x_init) / speed
activate_BP_synapse(BPi, synapse_type, synapse_onset, dur, period,
N_bars)
elif stimulus_type == "bar_with_circular_mask":
width = kwargs['bar_width']
speed = kwargs['bar_speed']
x_init = kwargs['bar_x_init']
mask_center = kwargs['mask_center']
mask_diam = kwargs['mask_diam']
dur = width / speed
for BPi in BPsyn:
if BPi[1] < (x_init - width) or not in_annulus((BPi[1], BPi[2]),
mask_center, 0, mask_diam / 2):
continue
synapse_onset = delay + (BPi[1] - x_init) / speed
activate_BP_synapse(BPi, synapse_type, synapse_onset, dur)
return BPsyn
def in_annulus(point, center, inner_diam, outer_diam):
dist = np.linalg.norm(np.array(point) - np.array(center))
return dist >= inner_diam and dist <= outer_diam
def activate_BP_synapse(BPsynapse, synapse_type, synapse_onset, dur=None,
period=0, n_events=1):
if synapse_type in ("alphaCSyn", "expCSyn"):
BPsynapse[0].onset = synapse_onset
BPsynapse[0].dur = BPsynapse[0].default_dur
if dur is not None:
BPsynapse[0].dur = dur
elif synapse_type in ("Exp2Syn", "BPexc"):
BPsynapse[-2].number = n_events
BPsynapse[-2].interval = period
BPsynapse[-2].start = synapse_onset
        BPsynapse[-2].noise = 0  # Default should be 0 anyway
if synapse_type == "BPexc":
BPsynapse[0].dur = dur
def deactivate_BP_synapse(BPsynapse, synapse_type, synapse_onset):
if synapse_type in ("alphaCSyn", "expCSyn"):
BPsynapse[0].onset = synapse_onset
BPsynapse[0].dur = 0
elif synapse_type in ("Exp2Syn", "BPexc"):
BPsynapse[-2].number = 0
BPsynapse[-2].interval = 0
BPsynapse[-2].start = synapse_onset
        BPsynapse[-2].noise = 0  # Default should be 0 anyway
if synapse_type == "BPexc":
BPsynapse[0].dur = 0
def insert_voltage_clamp(nrnobj, nrnsec, xsec, voltage_amp, dur):
vclamp = nrnobj.SEClamp(xsec, sec=nrnsec)
vclamp.amp1 = voltage_amp
vclamp.dur1 = dur
return vclamp
| 38.975207 | 79 | 0.599237 | 614 | 4,716 | 4.410423 | 0.224756 | 0.073117 | 0.039882 | 0.05096 | 0.460487 | 0.442024 | 0.418021 | 0.36226 | 0.36226 | 0.3113 | 0 | 0.016517 | 0.293893 | 4,716 | 120 | 80 | 39.3 | 0.796697 | 0.083545 | 0 | 0.39 | 0 | 0 | 0.086523 | 0.005103 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.02 | null | null | 0.02 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
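A note on `in_annulus` above: despite the `inner_diam`/`outer_diam` names, the distance is compared to the arguments directly, so they behave as radii. A dependency-free sketch of the same membership test:

```python
import math

def in_annulus(point, center, inner, outer):
    # Euclidean distance from center, then inclusive band test
    dist = math.hypot(point[0] - center[0], point[1] - center[1])
    return inner <= dist <= outer

print(in_annulus((3, 4), (0, 0), 2, 6))  # distance 5 -> True
```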
bc24a0ca761d939b80b88836c40772f1d9ffb66b | 3,055 | py | Python | panel/layout/spacer.py | sthagen/holoviz-panel | 9abae5ac78e55857ed209de06feae3439f2f533b | [
"BSD-3-Clause"
] | 601 | 2018-08-25T20:01:22.000Z | 2019-11-19T19:37:08.000Z | panel/layout/spacer.py | sthagen/holoviz-panel | 9abae5ac78e55857ed209de06feae3439f2f533b | [
"BSD-3-Clause"
] | 626 | 2018-08-27T16:30:33.000Z | 2019-11-20T17:02:00.000Z | panel/layout/spacer.py | sthagen/holoviz-panel | 9abae5ac78e55857ed209de06feae3439f2f533b | [
"BSD-3-Clause"
] | 73 | 2018-09-28T07:46:05.000Z | 2019-11-18T22:45:36.000Z | """
Spacer components to add horizontal or vertical space to a layout.
"""
import param
from bokeh.models import Div as BkDiv, Spacer as BkSpacer
from ..reactive import Reactive
class Spacer(Reactive):
"""
The `Spacer` layout is a very versatile component which makes it easy to
put fixed or responsive spacing between objects.
Like all other components spacers support both absolute and responsive
sizing modes.
Reference: https://panel.holoviz.org/user_guide/Customization.html#spacers
:Example:
>>> pn.Row(
... 1, pn.Spacer(width=200),
... 2, pn.Spacer(width=100),
... 3
... )
"""
_bokeh_model = BkSpacer
def _get_model(self, doc, root=None, parent=None, comm=None):
properties = self._process_param_change(self._init_params())
model = self._bokeh_model(**properties)
if root is None:
root = model
self._models[root.ref['id']] = (model, parent)
return model
class VSpacer(Spacer):
"""
The `VSpacer` layout provides responsive vertical spacing.
Using this component we can space objects equidistantly in a layout and
allow the empty space to shrink when the browser is resized.
Reference: https://panel.holoviz.org/user_guide/Customization.html#spacers
:Example:
>>> pn.Column(
... pn.layout.VSpacer(), 'Item 1',
... pn.layout.VSpacer(), 'Item 2',
... pn.layout.VSpacer()
... )
"""
sizing_mode = param.Parameter(default='stretch_height', readonly=True)
class HSpacer(Spacer):
"""
    The `HSpacer` layout provides responsive horizontal spacing.
Using this component we can space objects equidistantly in a layout and
allow the empty space to shrink when the browser is resized.
Reference: https://panel.holoviz.org/user_guide/Customization.html#spacers
:Example:
>>> pn.Row(
... pn.layout.HSpacer(), 'Item 1',
... pn.layout.HSpacer(), 'Item 2',
... pn.layout.HSpacer()
... )
"""
sizing_mode = param.Parameter(default='stretch_width', readonly=True)
class Divider(Reactive):
"""
A `Divider` draws a horizontal rule (a `<hr>` tag in HTML) to separate
multiple components in a layout. It automatically spans the full width of
the container.
Reference: https://panel.holoviz.org/reference/layouts/Divider.html
:Example:
>>> pn.Column(
... '# Lorem Ipsum',
... pn.layout.Divider(),
... 'A very long text... '
>>> )
"""
width_policy = param.ObjectSelector(default="fit", readonly=True)
_bokeh_model = BkDiv
def _get_model(self, doc, root=None, parent=None, comm=None):
properties = self._process_param_change(self._init_params())
properties['style'] = {'width': '100%', 'height': '100%'}
model = self._bokeh_model(text='<hr style="margin: 0px">', **properties)
if root is None:
root = model
self._models[root.ref['id']] = (model, parent)
return model
| 27.035398 | 80 | 0.636661 | 376 | 3,055 | 5.087766 | 0.343085 | 0.029273 | 0.039728 | 0.054365 | 0.493466 | 0.478306 | 0.438578 | 0.438578 | 0.438578 | 0.438578 | 0 | 0.008573 | 0.236334 | 3,055 | 112 | 81 | 27.276786 | 0.811402 | 0.527987 | 0 | 0.444444 | 0 | 0 | 0.066884 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.074074 | false | 0 | 0.111111 | 0 | 0.592593 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
bc30514100a65cd97885d46696267a630becc87d | 266 | py | Python | Algorithms/Maximum_Number_of_Coins_You_Can_Get/main.py | ugurcan-sonmez-95/LeetCode | e463a424c2d781f67be31ae42bc3c5c896db6017 | [
"MIT"
] | 1 | 2020-10-01T09:12:09.000Z | 2020-10-01T09:12:09.000Z | Algorithms/Maximum_Number_of_Coins_You_Can_Get/main.py | ugurcan-sonmez-95/LeetCode | e463a424c2d781f67be31ae42bc3c5c896db6017 | [
"MIT"
] | null | null | null | Algorithms/Maximum_Number_of_Coins_You_Can_Get/main.py | ugurcan-sonmez-95/LeetCode | e463a424c2d781f67be31ae42bc3c5c896db6017 | [
"MIT"
] | null | null | null | ### Maximum Number of Coins You Can Get - Solution
class Solution:
def maxCoins(self, piles: List[int]) -> int:
piles.sort()
max_coin, n = 0, len(piles)
for i in range(n//3, n, 2):
max_coin += piles[i]
return max_coin | 29.555556 | 50 | 0.56391 | 40 | 266 | 3.675 | 0.7 | 0.142857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016393 | 0.31203 | 266 | 9 | 51 | 29.555556 | 0.786885 | 0.172932 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0 | 0 | 0.428571 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc3513cb203b5eb746d7ca1948d927c26402a7f1 | 390 | py | Python | palsbet/migrations/0003_auto_20180323_0018.py | denis254/palsbetc | d70d0fadaa661ff36c046a4f0a87a88d890c0dc4 | [
"BSD-3-Clause"
] | null | null | null | palsbet/migrations/0003_auto_20180323_0018.py | denis254/palsbetc | d70d0fadaa661ff36c046a4f0a87a88d890c0dc4 | [
"BSD-3-Clause"
] | 11 | 2020-03-24T16:11:23.000Z | 2021-12-13T19:47:29.000Z | palsbet/migrations/0003_auto_20180323_0018.py | denis254/overtimebet | 063af2fc263580d96e396e953ef8658a75ac38a5 | [
"BSD-3-Clause"
] | null | null | null | # Generated by Django 2.0.2 on 2018-03-22 21:18
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('palsbet', '0002_viptipsgames'),
]
operations = [
migrations.AlterField(
model_name='viptipsgames',
name='cathegory',
field=models.CharField(max_length=100),
),
]
| 20.526316 | 51 | 0.602564 | 40 | 390 | 5.8 | 0.8 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078853 | 0.284615 | 390 | 18 | 52 | 21.666667 | 0.752688 | 0.115385 | 0 | 0 | 1 | 0 | 0.131195 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.083333 | 0 | 0.333333 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
bc3d021b03a2c4dec39faa861c772e80b36950d7 | 1,311 | py | Python | test.py | nerdingitout/STT-- | 4544f89abd6c2a8c11c1cf3e309ceb6f5d9b6b15 | [
"Apache-2.0"
] | null | null | null | test.py | nerdingitout/STT-- | 4544f89abd6c2a8c11c1cf3e309ceb6f5d9b6b15 | [
"Apache-2.0"
] | null | null | null | test.py | nerdingitout/STT-- | 4544f89abd6c2a8c11c1cf3e309ceb6f5d9b6b15 | [
"Apache-2.0"
] | null | null | null | import pandas as pd
import json
import csv
# Opening JSON file
with open('response.json') as json_file:
data = json.load(json_file)
# for reading nested data [0] represents
# the index value of the list
    print(data['results'][0]['alternatives'][0]['transcript'])
# for printing the key-value pair of
    # nested dictionary, a for loop can be used
    print("\nPrinting nested dictionary as a key-value pair\n")
for i in data['people1']:
print("Name:", i['name'])
print("Website:", i['website'])
print("From:", i['from'])
print()
def json_csv(filename):
    with open(filename) as data_file:  # opening json file
        data = json.load(data_file)  # loading json data
    normalized_df = pd.json_normalize(data)
    print(normalized_df['results'][0])
    normalized_df.to_csv('my_csv_file.csv', index=False)
    return pd.DataFrame(data['results'])

json_csv('response.json')  # calling the json_csv function, parameter is the source json file
#file = open('response.json')
#obj = json.load(file)
#for element in obj['results']:
# for alternative in element['alternatives']:
# for stamp in alternative['timestamps']:
# name, value1, value2 = stamp
# print(stamp)
| 29.133333 | 91 | 0.650648 | 179 | 1,311 | 4.687151 | 0.391061 | 0.047676 | 0.035757 | 0.038141 | 0.047676 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005871 | 0.220442 | 1,311 | 44 | 92 | 29.795455 | 0.815068 | 0.601068 | 0 | 0.142857 | 0 | 0 | 0.290581 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.285714 | 0 | 0.285714 | 0.428571 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
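The commented-out traversal at the end of the file suggests the response layout: `results` → `alternatives` → `timestamps`. A self-contained sketch with an illustrative payload (field names assumed from the snippet, not taken from an actual Watson response):

```python
import json

# Illustrative payload shaped like the STT response walked above
payload = json.loads("""
{"results": [{"alternatives": [
    {"transcript": "hello world",
     "timestamps": [["hello", 0.0, 0.4], ["world", 0.4, 0.9]]}
]}]}
""")

for result in payload["results"]:
    for alternative in result["alternatives"]:
        print(alternative["transcript"])
        for word, start, end in alternative["timestamps"]:
            print(word, start, end)
```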
70b0ec3b0f20b624360319ad2024474c4a16cdca | 928 | py | Python | Learning/python_data_analysis11.py | VictoriaGuXY/MCO-Menu-Checker-Online | 706e2e1bf7395cc344f382ea2ac53d964d459f86 | [
"MIT"
] | null | null | null | Learning/python_data_analysis11.py | VictoriaGuXY/MCO-Menu-Checker-Online | 706e2e1bf7395cc344f382ea2ac53d964d459f86 | [
"MIT"
] | null | null | null | Learning/python_data_analysis11.py | VictoriaGuXY/MCO-Menu-Checker-Online | 706e2e1bf7395cc344f382ea2ac53d964d459f86 | [
"MIT"
] | null | null | null | import pandas as pd
from scipy.stats import ttest_rel
"""
output
"""
# Note: some output is shortened to save spaces.
# This file discusses statistical analysis (Part II).
# ------------------------------------------------------------------------------
# Data stored in form of xlsx with contents:
"""
group data
0 1 34
1 1 37
2 1 28
3 1 36
4 1 30
5 2 43
6 2 45
7 2 47
8 2 49
9 2 39
"""
# Assume these data are paired sample.
# ------------------------------------------------------------------------------
IS_t_test = pd.read_excel('E:\\IS_t_test.xlsx')
Group1 = IS_t_test[IS_t_test['group']==1]['data']
Group2 = IS_t_test[IS_t_test['group']==2]['data']
print (ttest_rel(Group1,Group2))
"""
(-5.6873679190073361, 0.00471961872448184)
"""
# The first element from output is the value of t
# The second element from output is p-value
| 21.090909 | 80 | 0.516164 | 130 | 928 | 3.569231 | 0.569231 | 0.038793 | 0.090517 | 0.038793 | 0.081897 | 0.081897 | 0.081897 | 0 | 0 | 0 | 0 | 0.112971 | 0.227371 | 928 | 43 | 81 | 21.581395 | 0.53417 | 0.459052 | 0 | 0 | 0 | 0 | 0.140078 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.333333 | 0 | 0.333333 | 0.166667 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
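The `ttest_rel` result quoted above can be cross-checked by computing the paired t statistic by hand — the mean of the pairwise differences divided by its standard error:

```python
import math

# Paired-sample t statistic for the two groups in the worksheet above
g1 = [34, 37, 28, 36, 30]
g2 = [43, 45, 47, 49, 39]
diffs = [a - b for a, b in zip(g1, g2)]
n = len(diffs)
mean = sum(diffs) / n
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
t = mean / (sd / math.sqrt(n))
print(round(t, 4))  # -5.6874, matching scipy's ttest_rel
```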
70bcf6e650bbb3a26c3ae8d0c7700d683160246d | 3,881 | py | Python | fabrun/views.py | agepoly/azimut-gestion | 76da15a55086fdfacf605c139a1a64884ab0de40 | [
"MIT"
] | null | null | null | fabrun/views.py | agepoly/azimut-gestion | 76da15a55086fdfacf605c139a1a64884ab0de40 | [
"MIT"
] | null | null | null | fabrun/views.py | agepoly/azimut-gestion | 76da15a55086fdfacf605c139a1a64884ab0de40 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from django.shortcuts import get_object_or_404, render_to_response, redirect
from django.template import RequestContext
from django.core.context_processors import csrf
from django.views.decorators.csrf import csrf_exempt
from django.http import Http404, HttpResponse, HttpResponseForbidden, HttpResponseNotFound
from django.utils.encoding import smart_str
from django.conf import settings
from django.contrib.admin.views.decorators import staff_member_required
from django.contrib.auth.decorators import login_required
from django.http import HttpResponseRedirect
from django.db import connections
from django.core.paginator import InvalidPage, EmptyPage, Paginator
from django.core.cache import cache
from django.core.urlresolvers import reverse
from django.contrib import messages
import subprocess
from django.utils import timezone
from servers.models import Server
from fabrun.models import Task
from fabrun.tasks import run_task
import datetime
KEYWORDS = ('[$AG:NeedGestion]', '[$AG:NeedKM]', '[$AG:NeedUser]', '[$AG:NeedKomUser]', '[$AG:NeedSudo]', '[$AG:NeedMysqlPassword]', '[$AG:NeedSrvIp]')
@login_required
@staff_member_required
def home(request):
"""Show the page to execute scripts"""
if request.method == 'POST':
task = request.POST.get('script')
if task:
for spk in request.POST.getlist('server'):
server = get_object_or_404(Server, pk=spk)
t = Task(creation_date=timezone.now(), server=server, command=task)
t.save()
run_task.delay(t.pk)
messages.success(request, "Created task for server " + str(server))
liste = []
out, __ = subprocess.Popen(['fab', '--shortlist'], stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=settings.FABRIC_FOLDER).communicate()
# for command in out.split('\n'):
# if command:
# out2, __ = subprocess.Popen(['fab', '-d', command], stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=settings.FABRIC_FOLDER).communicate()
# description = out2.split('\n')[2]
# for keyword in KEYWORDS:
# description = description.replace(keyword, '')
# description = description.strip()
# liste.append((command, description))
liste = out.split('\n')
servers = Server.objects.exclude(ssh_connection_string_from_gestion=None).order_by('name').all()
tasks = Task.objects.order_by('-creation_date').all()
return render_to_response('fabrun/home.html', {'liste': liste, 'tasks': tasks, 'servers': servers}, context_instance=RequestContext(request))
@login_required
@staff_member_required
def show_run(request, pk):
"""Show output for a run"""
task = get_object_or_404(Task, pk=pk)
return render_to_response('fabrun/show_run.html', {'task': task}, context_instance=RequestContext(request))
@login_required
@staff_member_required
def clean_up(request):
Task.objects.filter(creation_date__lt=timezone.now() - datetime.timedelta(days=1)).delete()
messages.success(request, "Old fabric runs have been deleted")
return HttpResponseRedirect(reverse('fabrun.views.home'))
@login_required
@staff_member_required
def get_description(request):
command = request.GET.get('task')
out, __ = subprocess.Popen(['fab', '--shortlist'], stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=settings.FABRIC_FOLDER).communicate()
if command in out.split('\n'):
out2, __ = subprocess.Popen(['fab', '-d', command], stdout=subprocess.PIPE, stderr=subprocess.PIPE, cwd=settings.FABRIC_FOLDER).communicate()
description = out2.split('\n')[2]
for keyword in KEYWORDS:
description = description.replace(keyword, '')
description = description.strip()
return HttpResponse(description)
raise Http404
| 31.552846 | 155 | 0.707034 | 467 | 3,881 | 5.738758 | 0.319058 | 0.059701 | 0.035448 | 0.035821 | 0.347015 | 0.312687 | 0.286567 | 0.286567 | 0.286567 | 0.286567 | 0 | 0.007138 | 0.169802 | 3,881 | 122 | 156 | 31.811475 | 0.824643 | 0.127802 | 0 | 0.15625 | 0 | 0 | 0.095068 | 0.006833 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0.015625 | 0.328125 | 0 | 0.453125 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
70c846852061594e0e9914e39b2035cd28c6dd88 | 2,167 | py | Python | jetson/ballrunner.py | Reslix/Lohbot | 4920c0d0fcda64ec7b6438dd848c789e4fd15cc4 | [
"MIT"
] | 1 | 2018-02-21T03:49:54.000Z | 2018-02-21T03:49:54.000Z | jetson/ballrunner.py | Reslix/Lohbot | 4920c0d0fcda64ec7b6438dd848c789e4fd15cc4 | [
"MIT"
] | null | null | null | jetson/ballrunner.py | Reslix/Lohbot | 4920c0d0fcda64ec7b6438dd848c789e4fd15cc4 | [
"MIT"
] | null | null | null | from newcamera import TrackingCameraRunner
from serial_io import SerialIO
from show import imshow
import cv2
import math
print("Initializing serial connection with Arduino")
ard = SerialIO()
ard.start()
print("Initializing camera")
c = TrackingCameraRunner(0)
print("Tracking Ball...")
tcenterx = 640
tradius = 40
speed = 40
im = None
last = (0, cv2.getTickCount())
try:
while True:
c.step_frame()
center, radius = c.track_tennis_ball()
#im = imshow(c.frame, im=im)
if center:
            # This should all be in cm...
distance = 2131/(radius ** 1.02)
horizontal = 3.5/radius * (tcenterx - center[0])
oh = horizontal/distance
if oh < -1:
oh = -1
elif oh > 1:
oh = 1
angle = math.asin(oh)
print('distance: {}\nhorizontal: {}\nangle: {}'.format(distance, horizontal, angle))
            # TODO: angle is in radians here (math.asin); convert if degrees are needed downstream
'''
if angle < -.1:
differential = 89.1 * angle + 43.7
elif angle > .1:
differential = 126 * angle - 28.9
else:
differential = 0
differential *= 1
'''
differential = angle * 130
deriv = (angle - last[0])/((cv2.getTickCount() - last[1])/cv2.getTickFrequency())
print('deriv: {}'.format(deriv))
last = (angle, cv2.getTickCount())
differential = differential + 0.01 * deriv
translate = (distance - 30)/6
left = + differential
right = - differential
if angle > 0:
left = max(-speed, min(speed, left))
right = max(-speed, min(speed, right))
else:
left = max(-speed, min(speed, left))*3
right = max(-speed, min(speed, right))*3
left += translate
right += translate
print(left,right)
ard.direct(int(right), int(left))
else:
ard.stop()
print("stop")
except KeyboardInterrupt:
ard.stop()
c.close()
pass
| 30.957143 | 97 | 0.513152 | 229 | 2,167 | 4.838428 | 0.406114 | 0.01083 | 0.039711 | 0.057762 | 0.090253 | 0.090253 | 0 | 0 | 0 | 0 | 0 | 0.04246 | 0.369635 | 2,167 | 69 | 98 | 31.405797 | 0.768668 | 0.041071 | 0 | 0.072727 | 0 | 0 | 0.071547 | 0 | 0 | 0 | 0 | 0.014493 | 0 | 1 | 0 | false | 0.018182 | 0.090909 | 0 | 0.090909 | 0.127273 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
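The ball-tracking geometry in the loop above (distance from apparent radius, lateral offset, then `asin` for the bearing) can be isolated as a pure function — the constants are the calibration values from the script, and the clamp mirrors the `oh` bounds check:

```python
import math

def steering_angle(center_x, radius, frame_center_x=640):
    """Bearing to the ball in radians, from its pixel position and radius."""
    distance = 2131 / (radius ** 1.02)                       # range to ball, cm
    horizontal = 3.5 / radius * (frame_center_x - center_x)  # lateral offset, cm
    ratio = max(-1.0, min(1.0, horizontal / distance))       # clamp for asin
    return math.asin(ratio)

print(steering_angle(640, 40))  # ball dead centre -> 0.0
```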
70c959da6388e937e02687ad9571d92c479badd1 | 8,247 | py | Python | simple_rl/tasks/taxi/TaxiOOMDPClass.py | KorlaMarch/simple_rl | 30086b5cf4fd3e9dee76ddfb5ae4f565593ce191 | [
"Apache-2.0"
] | 10 | 2021-11-22T12:29:30.000Z | 2022-03-28T10:23:16.000Z | simple_rl/tasks/taxi/TaxiOOMDPClass.py | samlobel/simple_rl_mbrl | ed868916d06dbf68f4af23bea83b0e852e88df6e | [
"Apache-2.0"
] | null | null | null | simple_rl/tasks/taxi/TaxiOOMDPClass.py | samlobel/simple_rl_mbrl | ed868916d06dbf68f4af23bea83b0e852e88df6e | [
"Apache-2.0"
] | 2 | 2022-03-19T07:42:56.000Z | 2022-03-28T10:36:33.000Z | '''
TaxiMDPClass.py: Contains the TaxiMDP class.
From:
Dietterich, Thomas G. "Hierarchical reinforcement learning with the
MAXQ value function decomposition." J. Artif. Intell. Res.(JAIR) 13
(2000): 227-303.
Author: David Abel (cs.brown.edu/~dabel/)
'''
# Python imports.
from __future__ import print_function
import random
import copy
import sys
# Other imports.
from simple_rl.mdp.oomdp.OOMDPClass import OOMDP
from simple_rl.mdp.oomdp.OOMDPObjectClass import OOMDPObject
from simple_rl.tasks.taxi.TaxiStateClass import TaxiState
from simple_rl.tasks.taxi import taxi_helpers
class TaxiOOMDP(OOMDP):
''' Class for a Taxi OO-MDP '''
# Static constants.
ACTIONS = ["up", "down", "left", "right", "pickup", "dropoff"]
ATTRIBUTES = ["x", "y", "has_passenger", "in_taxi", "dest_x", "dest_y"]
CLASSES = ["agent", "wall", "passenger"]
def __init__(self, width, height, agent, walls, passengers, slip_prob=0, gamma=0.99):
self.height = height
self.width = width
agent_obj = OOMDPObject(attributes=agent, name="agent")
wall_objs = self._make_oomdp_objs_from_list_of_dict(walls, "wall")
pass_objs = self._make_oomdp_objs_from_list_of_dict(passengers, "passenger")
init_state = self._create_state(agent_obj, wall_objs, pass_objs)
OOMDP.__init__(self, TaxiOOMDP.ACTIONS, self._taxi_transition_func, self._taxi_reward_func, init_state=init_state, gamma=gamma)
self.slip_prob = slip_prob
def _create_state(self, agent_oo_obj, walls, passengers):
'''
Args:
agent_oo_obj (OOMDPObjects)
walls (list of OOMDPObject)
passengers (list of OOMDPObject)
Returns:
(OOMDP State)
        TODO: Make this more general and put it in OOMDPClass.
'''
objects = {c : [] for c in TaxiOOMDP.CLASSES}
objects["agent"].append(agent_oo_obj)
# Make walls.
for w in walls:
objects["wall"].append(w)
# Make passengers.
for p in passengers:
objects["passenger"].append(p)
return TaxiState(objects)
def _taxi_reward_func(self, state, action):
'''
Args:
state (OOMDP State)
action (str)
Returns
(float)
'''
_error_check(state, action)
# Stacked if statements for efficiency.
if action == "dropoff":
# If agent is dropping off.
agent = state.get_first_obj_of_class("agent")
# Check to see if all passengers at destination.
if agent.get_attribute("has_passenger"):
for p in state.get_objects_of_class("passenger"):
if p.get_attribute("x") != p.get_attribute("dest_x") or p.get_attribute("y") != p.get_attribute("dest_y"):
return 0 - self.step_cost
return 1 - self.step_cost
return 0 - self.step_cost
def _taxi_transition_func(self, state, action):
'''
Args:
state (State)
action (str)
Returns
(State)
'''
_error_check(state, action)
if self.slip_prob > random.random():
# Flip dir.
if action == "up":
action = "down"
elif action == "down":
action = "up"
elif action == "left":
action = "right"
elif action == "right":
action = "left"
if action == "up" and state.get_agent_y() < self.height:
next_state = self.move_agent(state, self.slip_prob, dy=1)
elif action == "down" and state.get_agent_y() > 1:
next_state = self.move_agent(state, self.slip_prob, dy=-1)
elif action == "right" and state.get_agent_x() < self.width:
next_state = self.move_agent(state, self.slip_prob, dx=1)
elif action == "left" and state.get_agent_x() > 1:
next_state = self.move_agent(state, self.slip_prob, dx=-1)
elif action == "dropoff":
next_state = self.agent_dropoff(state)
elif action == "pickup":
next_state = self.agent_pickup(state)
else:
next_state = state
# Make terminal.
if taxi_helpers.is_taxi_terminal_state(next_state):
next_state.set_terminal(True)
# All OOMDP states must be updated.
next_state.update()
return next_state
    def __str__(self):
        return "taxi_h-" + str(self.height) + "_w-" + str(self.width)

    def visualize_agent(self, agent):
        from ...utils.mdp_visualizer import visualize_agent
        from taxi_visualizer import _draw_state
        visualize_agent(self, agent, _draw_state)
        _ = input("Press anything to quit ")
        sys.exit(1)

    def visualize_interaction(self):
        from simple_rl.utils.mdp_visualizer import visualize_interaction
        from taxi_visualizer import _draw_state
        visualize_interaction(self, _draw_state)
        _ = input("Press anything to quit ")  # was raw_input, which is Python 2 only
        sys.exit(1)
    # ----------------------------
    # -- Action Implementations --
    # ----------------------------

    def move_agent(self, state, slip_prob=0, dx=0, dy=0):
        '''
        Args:
            state (TaxiState)
            dx (int) [optional]
            dy (int) [optional]

        Returns:
            (TaxiState)
        '''
        if taxi_helpers._is_wall_in_the_way(state, dx=dx, dy=dy):
            # There's a wall in the way.
            return state

        next_state = copy.deepcopy(state)

        # Move Agent.
        agent_att = next_state.get_first_obj_of_class("agent").get_attributes()
        agent_att["x"] += dx
        agent_att["y"] += dy

        # Move passenger.
        taxi_helpers._move_pass_in_taxi(next_state, dx=dx, dy=dy)

        return next_state
    def agent_pickup(self, state):
        '''
        Args:
            state (TaxiState)
        '''
        next_state = copy.deepcopy(state)

        agent = next_state.get_first_obj_of_class("agent")

        # update = False
        if agent.get_attribute("has_passenger") == 0:
            # If the agent does not have a passenger.
            for i, passenger in enumerate(next_state.get_objects_of_class("passenger")):
                if agent.get_attribute("x") == passenger.get_attribute("x") and agent.get_attribute("y") == passenger.get_attribute("y"):
                    # Pick up passenger at agent location.
                    agent.set_attribute("has_passenger", 1)
                    passenger.set_attribute("in_taxi", 1)

        return next_state

    def agent_dropoff(self, state):
        '''
        Args:
            state (TaxiState)

        Returns:
            (TaxiState)
        '''
        next_state = copy.deepcopy(state)

        # Get Agent, Walls, Passengers.
        agent = next_state.get_first_obj_of_class("agent")
        # agent = OOMDPObject(attributes=agent_att, name="agent")
        passengers = next_state.get_objects_of_class("passenger")

        if agent.get_attribute("has_passenger") == 1:
            # Update if the agent has a passenger.
            for i, passenger in enumerate(passengers):
                if passenger.get_attribute("in_taxi") == 1:
                    # Drop off the passenger.
                    passengers[i].set_attribute("in_taxi", 0)
                    agent.set_attribute("has_passenger", 0)

        return next_state
def _error_check(state, action):
    '''
    Args:
        state (State)
        action (str)

    Summary:
        Checks to make sure the received state and action are of the right type.
    '''
    if action not in TaxiOOMDP.ACTIONS:
        raise ValueError("Error: the action provided (" + str(action) + ") was invalid.")

    if not isinstance(state, TaxiState):
        raise ValueError("Error: the given state (" + str(state) + ") was not of the correct class.")

def main():
    agent = {"x":1, "y":1, "has_passenger":0}
    passengers = [{"x":8, "y":4, "dest_x":2, "dest_y":2, "in_taxi":0}]
    taxi_world = TaxiOOMDP(10, 10, agent=agent, walls=[], passengers=passengers)

if __name__ == "__main__":
    main()
| 31.477099 | 137 | 0.587486 | 993 | 8,247 | 4.640483 | 0.209466 | 0.044922 | 0.015625 | 0.013889 | 0.319878 | 0.221137 | 0.182075 | 0.121094 | 0.10026 | 0.06901 | 0 | 0.008456 | 0.29732 | 8,247 | 261 | 138 | 31.597701 | 0.786713 | 0.187583 | 0 | 0.141667 | 0 | 0 | 0.084165 | 0 | 0 | 0 | 0 | 0.003831 | 0 | 1 | 0.1 | false | 0.208333 | 0.091667 | 0.008333 | 0.308333 | 0.008333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
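The `_taxi_transition_func` above flips the commanded action with probability `slip_prob`. A minimal standalone sketch of just that slip step (the `OPPOSITE` table and `slip` helper are illustrative names, not part of simple_rl):

```python
import random

# Each direction maps to its opposite, mirroring the if/elif chain in the MDP.
OPPOSITE = {"up": "down", "down": "up", "left": "right", "right": "left"}

def slip(action, slip_prob, rng=random):
    """Return the action actually executed: flipped with probability slip_prob."""
    if slip_prob > rng.random():
        return OPPOSITE.get(action, action)  # non-movement actions are unaffected
    return action

# With slip_prob=0 the action is never flipped; with slip_prob=1 it always is.
assert slip("up", 0.0) == "up"
assert slip("left", 1.0) == "right"
```

With an intermediate `slip_prob` the flip happens on a per-step basis, which is why the transition function re-draws `random.random()` on every call.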
70c96909cdb1ba2109525248284a595c630feada | 2,176 | py | Python | tests/bugs/core_2923_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2022-02-05T11:37:13.000Z | 2022-02-05T11:37:13.000Z | tests/bugs/core_2923_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2021-09-03T11:47:00.000Z | 2021-09-03T12:42:10.000Z | tests/bugs/core_2923_test.py | FirebirdSQL/firebird-qa | 96af2def7f905a06f178e2a80a2c8be4a4b44782 | [
"MIT"
] | 1 | 2021-06-30T14:14:16.000Z | 2021-06-30T14:14:16.000Z | #coding:utf-8
#
# id: bugs.core_2923
# title: Problem with dependencies between a procedure and a view using that procedure
# decription:
# tracker_id: CORE-2923
# min_versions: ['2.5.0']
# versions: 3.0
# qmid: None
import pytest
from firebird.qa import db_factory, isql_act, Action
# version: 3.0
# resources: None
substitutions_1 = []
init_script_1 = """"""
db_1 = db_factory(sql_dialect=3, init=init_script_1)
test_script_1 = """
set term ^;
create procedure sp_test returns (i smallint) as
begin
i = 32767;
suspend;
end
^
create view v0 as
select i
from sp_test
^
alter procedure sp_test returns (i int) as
begin
i = 32768;
suspend;
end
^
set term ;^
commit;
---
create table t1 (n1 smallint);
insert into t1(n1) values(32767);
commit;
create view v1 as
select *
from t1;
alter table t1 alter n1 type integer;
commit;
insert into t1(n1) values(32768);
commit;
---
create table t2 (n2 smallint);
insert into t2(n2) values(32767);
commit;
create domain d2 integer;
create view v2 as
select * from t2;
alter table t2 alter n2 type d2;
insert into t2(n2) values(32768);
commit;
---
set list on;
select '0' as test_no, v.* from v0 v
union all
select '1', v.* from v1 v
union all
select '2', v.* from v2 v
;
"""
act_1 = isql_act('db_1', test_script_1, substitutions=substitutions_1)
expected_stdout_1 = """
TEST_NO 0
I 32768
TEST_NO 1
I 32767
TEST_NO 1
I 32768
TEST_NO 2
I 32767
TEST_NO 2
I 32768
"""
@pytest.mark.version('>=3.0')
def test_1(act_1: Action):
    act_1.expected_stdout = expected_stdout_1
    act_1.execute()
    assert act_1.clean_stdout == act_1.clean_expected_stdout
| 19.428571 | 93 | 0.529412 | 280 | 2,176 | 3.957143 | 0.328571 | 0.032491 | 0.016245 | 0.021661 | 0.113718 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093985 | 0.388787 | 2,176 | 111 | 94 | 19.603604 | 0.739098 | 0.120404 | 0 | 0.383562 | 0 | 0 | 0.747634 | 0 | 0 | 0 | 0 | 0 | 0.013699 | 1 | 0.013699 | false | 0 | 0.027397 | 0 | 0.041096 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
70c98f390066819db18b2eabe5323591400b47e2 | 5,582 | py | Python | day_09.py | bob-white/advent_2018 | 4139359b322eb42ac6cfef9e574e43bc91caa62d | [
"MIT"
] | 1 | 2018-12-02T05:41:56.000Z | 2018-12-02T05:41:56.000Z | day_09.py | bob-white/advent_2018 | 4139359b322eb42ac6cfef9e574e43bc91caa62d | [
"MIT"
] | null | null | null | day_09.py | bob-white/advent_2018 | 4139359b322eb42ac6cfef9e574e43bc91caa62d | [
"MIT"
] | null | null | null | """
--- Day 9: Marble Mania ---
You talk to the Elves while you wait for your navigation system to initialize. To pass the time, they introduce you to their favorite marble game.
The Elves play this game by taking turns arranging the marbles in a circle according to very particular rules. The marbles are numbered starting with 0 and increasing by 1 until every marble has a number.
First, the marble numbered 0 is placed in the circle. At this point, while it contains only a single marble, it is still a circle: the marble is both clockwise from itself and counter-clockwise from itself. This marble is designated the current marble.
Then, each Elf takes a turn placing the lowest-numbered remaining marble into the circle between the marbles that are 1 and 2 marbles clockwise of the current marble. (When the circle is large enough, this means that there is one marble between the marble that was just placed and the current marble.) The marble that was just placed then becomes the current marble.
However, if the marble that is about to be placed has a number which is a multiple of 23, something entirely different happens. First, the current player keeps the marble they would have placed, adding it to their score. In addition, the marble 7 marbles counter-clockwise from the current marble is removed from the circle and also added to the current player's score. The marble located immediately clockwise of the marble that was removed becomes the new current marble.
For example, suppose there are 9 players. After the marble with value 0 is placed in the middle, each player (shown in square brackets) takes a turn. The result of each of those turns would produce circles of marbles like this, where clockwise is to the right and the resulting current marble is in parentheses:
[-] (0)
[1] 0 (1)
[2] 0 (2) 1
[3] 0 2 1 (3)
[4] 0 (4) 2 1 3
[5] 0 4 2 (5) 1 3
[6] 0 4 2 5 1 (6) 3
[7] 0 4 2 5 1 6 3 (7)
[8] 0 (8) 4 2 5 1 6 3 7
[9] 0 8 4 (9) 2 5 1 6 3 7
[1] 0 8 4 9 2(10) 5 1 6 3 7
[2] 0 8 4 9 2 10 5(11) 1 6 3 7
[3] 0 8 4 9 2 10 5 11 1(12) 6 3 7
[4] 0 8 4 9 2 10 5 11 1 12 6(13) 3 7
[5] 0 8 4 9 2 10 5 11 1 12 6 13 3(14) 7
[6] 0 8 4 9 2 10 5 11 1 12 6 13 3 14 7(15)
[7] 0(16) 8 4 9 2 10 5 11 1 12 6 13 3 14 7 15
[8] 0 16 8(17) 4 9 2 10 5 11 1 12 6 13 3 14 7 15
[9] 0 16 8 17 4(18) 9 2 10 5 11 1 12 6 13 3 14 7 15
[1] 0 16 8 17 4 18 9(19) 2 10 5 11 1 12 6 13 3 14 7 15
[2] 0 16 8 17 4 18 9 19 2(20)10 5 11 1 12 6 13 3 14 7 15
[3] 0 16 8 17 4 18 9 19 2 20 10(21) 5 11 1 12 6 13 3 14 7 15
[4] 0 16 8 17 4 18 9 19 2 20 10 21 5(22)11 1 12 6 13 3 14 7 15
[5] 0 16 8 17 4 18(19) 2 20 10 21 5 22 11 1 12 6 13 3 14 7 15
[6] 0 16 8 17 4 18 19 2(24)20 10 21 5 22 11 1 12 6 13 3 14 7 15
[7] 0 16 8 17 4 18 19 2 24 20(25)10 21 5 22 11 1 12 6 13 3 14 7 15
The goal is to be the player with the highest score after the last marble is used up. Assuming the example above ends after the marble numbered 25, the winning score is 23+9=32 (because player 5 kept marble 23 and removed marble 9, while no other player got any points in this very short example game).
Here are a few more examples:
10 players; last marble is worth 1618 points: high score is 8317
13 players; last marble is worth 7999 points: high score is 146373
17 players; last marble is worth 1104 points: high score is 2764
21 players; last marble is worth 6111 points: high score is 54718
30 players; last marble is worth 5807 points: high score is 37305
What is the winning Elf's score?
Your puzzle answer was 398242.
--- Part Two ---
Amused by the speed of your answer, the Elves are curious:
What would the new winning Elf's score be if the number of the last marble were 100 times larger?
Your puzzle answer was 3273842452.
Both parts of this puzzle are complete! They provide two gold stars: **
"""
import re
from itertools import cycle
from typing import List, Tuple, Dict
from collections import deque, defaultdict

# Part 1 and 2 have been placed in the input file, so running once will work.
with open('day_09.input', 'r') as f:
    data = f.read()

# data = """9 players; last marble is worth 25 points: high score is 32
# 10 players; last marble is worth 1618 points: high score is 8317
# 13 players; last marble is worth 7999 points: high score is 146373
# 17 players; last marble is worth 1104 points: high score is 2764
# 21 players; last marble is worth 6111 points: high score is 54718
# 30 players; last marble is worth 5807 points: high score is 37305"""

games: List[Tuple[int, ...]] = [tuple(map(int, re.findall(r'\d+', line))) for line in data.split('\n')]

for game in games:
    # Turns out rotating a deque is way faster than inserting nodes into a list.
    circle: deque = deque()
    # high_score gets dropped when we use our actual data.
    players, marbles, *high_score = game
    scores: Dict[int, int] = defaultdict(int)
    for marble, player in zip(range(marbles + 1), cycle(range(players))):
        if marble and not marble % 23:
            # Rotate the circle back by 7, and remove that marble.
            circle.rotate(-7)
            scores[player] += marble + circle.pop()
        else:
            # Rotate the circle by 2, and then place a new marble
            circle.rotate(2)
            circle.append(marble)
    if high_score:
        print(max(scores.values()) == high_score[0])
    else:
        print(max(scores.values()))
| 52.168224 | 473 | 0.67431 | 1,111 | 5,582 | 3.383438 | 0.240324 | 0.034052 | 0.018622 | 0.022346 | 0.286512 | 0.267624 | 0.252195 | 0.24581 | 0.239159 | 0.233041 | 0 | 0.162695 | 0.263346 | 5,582 | 106 | 474 | 52.660377 | 0.751459 | 0.843246 | 0 | 0.090909 | 0 | 0 | 0.021028 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.181818 | 0 | 0.181818 | 0.090909 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
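The `deque.rotate` trick in the solution above (rotate by 2 to place a marble, rotate by -7 and pop to score) can be checked in isolation against the worked example from the puzzle text, where 9 players and a last marble of 25 give a high score of 32:

```python
from collections import defaultdict, deque
from itertools import cycle

def play(players, last_marble):
    """Same deque-based marble game as the solution, packaged as a function."""
    circle = deque()
    scores = defaultdict(int)
    for marble, player in zip(range(last_marble + 1), cycle(range(players))):
        if marble and not marble % 23:
            circle.rotate(-7)                    # step back to the marble 7 counter-clockwise
            scores[player] += marble + circle.pop()
        else:
            circle.rotate(2)                     # move two marbles clockwise
            circle.append(marble)                # new marble becomes the current one
    return max(scores.values())

# Worked example from the puzzle text: player 5 keeps marble 23 and removes marble 9.
assert play(9, 25) == 32
```

Another example listed above, `play(10, 1618)`, should come out to 8317; both rotations are O(k) in the rotation distance, which is why this beats list insertion for the 100x part-two input.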
70cd06f1a4e2360da81c2eda5df37352f1b549eb | 762 | py | Python | qctrl_api/control_api/models.py | bibek-Neupane/back-end-challenge | d5a7b33adaa59e5ad566ac7435132c990f80a740 | [
"Apache-2.0"
] | null | null | null | qctrl_api/control_api/models.py | bibek-Neupane/back-end-challenge | d5a7b33adaa59e5ad566ac7435132c990f80a740 | [
"Apache-2.0"
] | null | null | null | qctrl_api/control_api/models.py | bibek-Neupane/back-end-challenge | d5a7b33adaa59e5ad566ac7435132c990f80a740 | [
"Apache-2.0"
] | null | null | null | from django.db import models
from django.core.validators import MinValueValidator, MaxValueValidator
class Control(models.Model):
    objects = models.Manager()
    TYPE_CHOICES = (
        ('Primitive', 'Primitive'),
        ('Corpse', 'CORPSE'),
        ('Gaussian', 'Gaussian'),
        ('CinBB', 'CinBB'),
    )
    # pk i.e. id --> Referred to as pk while we use it as a lookup variable
    name = models.CharField(max_length=200)
    type = models.CharField(max_length=200, choices=TYPE_CHOICES, default='Primitive')
    maximum_rabi_rate = models.FloatField(validators=[MinValueValidator(0), MaxValueValidator(100)])
    polar_angle = models.FloatField(validators=[MinValueValidator(0), MaxValueValidator(1)])

    def __str__(self):
        return self.name
| 34.636364 | 102 | 0.692913 | 87 | 762 | 5.942529 | 0.609195 | 0.038685 | 0.069633 | 0.092843 | 0.340426 | 0.235977 | 0 | 0 | 0 | 0 | 0 | 0.019231 | 0.181102 | 762 | 21 | 103 | 36.285714 | 0.809295 | 0.087927 | 0 | 0 | 0 | 0 | 0.093795 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.0625 | false | 0 | 0.125 | 0.0625 | 0.6875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
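The `MinValueValidator`/`MaxValueValidator` pair above bounds `maximum_rabi_rate` to [0, 100] when the model is validated (e.g. via `full_clean()` or a `ModelForm`). A framework-free sketch of the check those validators perform — `validate_range` is an illustrative name, not a Django API:

```python
def validate_range(value, lo, hi):
    """Mimics MinValueValidator(lo) + MaxValueValidator(hi): raise on out-of-range."""
    if value < lo or value > hi:
        raise ValueError("%r not in [%r, %r]" % (value, lo, hi))
    return value

assert validate_range(50.0, 0, 100) == 50.0   # in range: passes through
try:
    validate_range(101.0, 0, 100)             # out of range: rejected
except ValueError:
    pass
else:
    raise AssertionError("expected ValueError")
```

Note that Django only runs field validators during model validation, not on a bare `save()`, so views that accept user input should call `full_clean()` before saving.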
70d1f8b647267479a9bcde7a989424cc5e099952 | 3,651 | py | Python | engine/engine.py | farouqzaib/Personify | ae7c837b5539ccc1f0d4a8f327783814228fe4d7 | [
"MIT"
] | 1 | 2018-05-09T21:56:08.000Z | 2018-05-09T21:56:08.000Z | engine/engine.py | farouqzaib/Personify | ae7c837b5539ccc1f0d4a8f327783814228fe4d7 | [
"MIT"
] | null | null | null | engine/engine.py | farouqzaib/Personify | ae7c837b5539ccc1f0d4a8f327783814228fe4d7 | [
"MIT"
] | 1 | 2018-05-09T21:56:46.000Z | 2018-05-09T21:56:46.000Z | from algorithms.factorization_machine import FactorizationMachine
from algorithms.latent_dirichlet_allocation import LatentDirichletAllocation
from app.config import db
from app.config.config import engine
import datetime
import numpy as np
import pandas as pd
import pickle
class Engine:
    def __init__(self):
        self.users_to_int = {}
        self.items_to_int = {}
        self.users_to_int_counter = 0
        self.items_to_int_counter = 0

    def load_data(self):
        '''
        Loads the data from the data source
        '''
        today = datetime.datetime.today()
        query = {
            "range": {
                "created_at": {
                    "gte": (today - datetime.timedelta(days=engine["training_data_age"])).strftime("%Y-%m-%d"),
                    "lte": today.strftime('%Y-%m-%d')
                }
            }
        }
        #TODO investigate performance of scroll instead of fetching large records in one go
        res = db.es.search(index="events", doc_type="event", body={"query": query}, size=100)
        events = []
        data = res["hits"]["hits"]

        # #check the dtype of the essential features
        # first_record = record[0]["_source"]
        # if type(first_record["user"]) == "string":
        #     users_to_int = self.transform_feature_to_int()
        # if type(first_record["item"]) == "string":
        #     items_to_int = self.transform_feature_to_int()

        for record in data:
            event = record["_source"]
            events.append((self.transform_feature_to_int(event["user"]), self.transform_feature_to_int(event["item"], "item"), event["rating"])
                          + self.apply_feature_engineering(event["created_at"], "date"))
        print np.array(events).shape
        return np.array(events)
    def transform_feature_to_int(self, feature, type='user'):
        '''
        Assigns a unique integer to an element in a feature
        '''
        if type == 'user':
            if feature not in self.users_to_int:
                self.users_to_int[feature] = self.users_to_int_counter
                self.users_to_int_counter += 1
            # Return the id assigned to this feature, not the running counter.
            return self.users_to_int[feature]
        else:
            if feature not in self.items_to_int:
                self.items_to_int[feature] = self.items_to_int_counter
                self.items_to_int_counter += 1
            return self.items_to_int[feature]

    def apply_feature_engineering(self, feature, type="date"):
        if type == "date":
            transformed_feature = pd.to_datetime(feature)
            return (transformed_feature.year, transformed_feature.month, transformed_feature.day)
    def train(self):
        self.fm = FactorizationMachine()
        events = self.load_data()
        features = events[:, 0:events.shape[1] - 1] #selects every row, every column except the last
        target = events[:, -1] #selects the last column of every row
        self.fm.fit(features, target)
        self.save_model()

    def get_recommendations(self, features):
        # features must be passed in; self.fm is set by train().
        return self.fm.predict(features)

    def save_recommendations(self):
        pass

    def save_model(self):
        '''
        Saves the trained model to disk
        '''
        today = datetime.datetime.today()
        file_name = 'models/model-{0}.fm'.format(today)
        f = open('engine/{0}'.format(file_name), 'wb')  # pickle needs a binary-mode file
        pickle.dump(self.fm, f)
        res = db.es.index(index="events", doc_type='models', id='', body={'model': file_name, 'created_at': today})

    def load_model(self):
        pass
70d940d807d775cf236b4b2decb10d2c5a7f64f8 | 15,391 | py | Python | Snake.py | RodneyTheProgrammer/snkgame | 4fdeb1a1c2c871f307ce0984949693506292ec1c | [
"BSD-2-Clause"
] | null | null | null | Snake.py | RodneyTheProgrammer/snkgame | 4fdeb1a1c2c871f307ce0984949693506292ec1c | [
"BSD-2-Clause"
] | null | null | null | Snake.py | RodneyTheProgrammer/snkgame | 4fdeb1a1c2c871f307ce0984949693506292ec1c | [
"BSD-2-Clause"
] | null | null | null | #!/usr/bin/env python
import curses
import time
import random
from operator import getitem,attrgetter
stdscr = curses.initscr()
L,W= stdscr.getmaxyx()
curses.start_color()
curses.noecho()
curses.cbreak()
curses.curs_set(0)
stdscr.keypad(1)
L,W = stdscr.getmaxyx()
normal,infiltrate,get2goal,runaway=0,1,2,3
def start_page(window):
    maxy,maxx = window.getmaxyx()
    s = 'Snake'
    paktc = 'Press any key to continue'
    window.addstr(maxy/2,maxx/2-len(s)/2,s,curses.A_BOLD)
    window.addstr(maxy/2+1,maxx/2-len(paktc)/2,paktc)
    window.refresh()
    window.getch()
    window.erase()

def MID(L,D): #move in direction
    R = L.copy()
    if D == 360:
        return R
    if 0 < D < 180:
        R.y = R.y-1
    elif 180 < D < 360:
        R.y = R.y+1
    if 90 < D < 270:
        R.x = R.x-1
    elif 270 < D or D < 90:
        R.x = R.x+1
    return R
#point-related stuff
class point(object):
    def __init__(self,y,x):
        self.y = y
        self.x = x
    def __cmp__(self,other):
        if self.x == other.x and self.y == other.y:
            return 0
        else:
            return -1
    def __str__(self):
        return str((self.y,self.x))
    def __repr__(self):
        return str(self)
    def copy(self):
        return point(self.y,self.x)

def pointgen(L,W):
    x = point(0,0)
    x.y = random.randint(0+1,L-2)
    x.x = random.randint(0+1,W-2)
    return x

class dummy(object):
    def __init__(self,char,location=point(1,1)):
        self.location = location
        self.char = char
    def draw(self,window):
        window.addstr(self.location.y,self.location.x,self.char)
#living animals
class man(object):
    def __init__(self):
        self.score = 0
        self.location = point(1,1)
    def move(self,D):
        X = MID(self.location,D)
        if X.y != 0 and X.x != 0:
            if X.y != L-1 and X.x != W-1:
                self.location = X
    def live(self):
        pass
    def chance(self):
        return chance((500-self.score)/5)
    def draw(self,window):
        window.addstr(self.location.y,self.location.x,'@')
class rat(object):
    def __init__(self,coin,dude):
        self.master = dude
        self.coin = coin
        self.location = pointgen(L,W)
        self.attached = False
        self.previous_escape = 45
        self.phase = normal
    def move(self,D):
        X = MID(self.location,D)
        if X.y != 0 and X.x != 0:
            if X.y != L-1 and X.x != W-1:
                self.location = X
                return True
        return False
    def draw(self,window):
        window.addstr(self.location.y,self.location.x,'r',curses.A_REVERSE)
    def live(self,snakes):
        rat = self.location
        snk = nearest_snake(snakes,rat).pieces[-1]
        threats = danger_directions(snakes,rat)
        sdistance = distance(rat,snk)
        isPhase = self.isPhase
        self.isAttached()
        escape = False
        if isPhase(infiltrate):
            if not self.goal() in threats:  # goal is a method: call it to get a direction
                self.phase = normal
            else:
                escape = True
                d = closest(self.goal())[0]
        if isPhase(runaway):
            if sdistance > 8:
                self.phase = normal
        if sdistance <= 8:
            if isPhase(runaway):
                escape = True
                if self.previous_escape not in threats:
                    d = self.previous_escape
                else:
                    d = opposite(direction(rat,snk))
            elif sdistance <= 4:
                escape = True
                if isPhase(infiltrate):
                    d = closest(self.previous_escape)[0]
                else:
                    if self.previous_escape not in threats:
                        d = self.previous_escape
                    d = opposite(direction(rat,snk))
                self.phase = runaway
        if isPhase(normal):
            self.phase = get2goal
        if self.phase == get2goal and self.goal() == direction(rat,snk):
            self.phase = infiltrate
            escape = True
            d = closest(self.goal())[0]
        if self.phase == get2goal:
            d = self.goal()
        if self.goal() == 360:
            self.attached = True
        if not self.move(d):
            s = closest(d)
            for n in s:
                d = n
                if self.move(d):
                    break
        if self.attached:
            self.coin.location = MID(self.location,d)
        if escape:
            self.previous_escape = d
    def isAttached(self):
        if distance(self.location,self.coin.location) > 1:
            self.attached = False
    def isPhase(self,phase):
        if self.phase == phase:
            return True
        else:
            return False
    def goal(self):
        if self.attached:
            return direction(self.location,self.master.location)
        else:
            return direction(self.location,self.coin.location)
    def chance(self):
        return chance((500-self.master.score)/5)
class snake(object):
    def __init__(self):
        self.lenght = random.randint(3,7)
        self.pieces = [point(1,1)]
    def move(self,D):
        X = MID(self.pieces[-1],D)
        if X.y != 0 and X.x != 0:
            if X.y != L-1 and X.x != W-1:
                self.pieces.append(X)
                if len(self.pieces) > self.lenght:
                    del(self.pieces[0])
                return True
    def draw(self,window):
        for p in self.pieces:
            window.addstr(p.y,p.x,'s')
        p = self.pieces[-1]
        window.addstr(p.y,p.x,'S')
    def live(self,dude,pact,rats):
        m = dude.location
        snk = self.pieces[-1]
        l = pact[0]
        ldng = l.pieces[-1]
        h = distance(ldng,m)
        k = distance(snk,ldng)
        if self != l and h < 5 and k < 14:
            d = opposite(direction(snk,ldng))
        elif dude.chance():
            d = random.randint(0,8)*45
        else:
            d = direction(snk,m)
        if not self.move(d):
            for n in closest(d):
                if self.move(n):
                    break
        if m in self.pieces:
            snakeeaten(m,stdscr)
            self.lenght = self.lenght+1  # grow after catching the target (was dead code after the returns)
            if type(dude) == rat:
                del(rats[rats.index(dude)])
                return False
            else:
                return True
def danger_directions(snakes,target):
    ret = []
    for s in snakes:
        ret.append(direction(target,s.pieces[-1]))
    return ret

def nearest_snake(_snakes,target):
    snakes = _snakes[:]
    a = map(getitem,map(attrgetter('pieces'),snakes),[-1]*len(snakes))
    b = map(distance,a,[target]*len(snakes))
    nearest = b.index(sorted(b)[0])
    return snakes[nearest]

def direction(point,target):
    if point.x > target.x:
        if point.y > target.y:
            return 135
        if point.y == target.y:
            return 180
        if point.y < target.y:
            return 225
    if point.x == target.x:
        if point.y > target.y:
            return 90
        if point.y == target.y:
            return 360
        if point.y < target.y:
            return 270
    if point.x < target.x:
        if point.y > target.y:
            return 45
        if point.y == target.y:
            return 0
        if point.y < target.y:
            return 315

def angle_delta(angle1,angle2):
    return abs(angle1-angle2)

def opposite(drctn):
    if drctn == 360:
        return 360
    else:
        return closest(drctn)[-1]

def distance(point,target):
    return int(((point.x-target.x)**2+(point.y-target.y)**2)**0.5)

def closest(drctn):
    if drctn == 360:
        return [360]
    l = []
    append = l.append
    a,b = drctn,drctn
    x = 1
    while a != b or x:
        if x:
            x = not x
        a = a + 45
        b = b - 45
        if b < 0:
            b = 315
        if a == 360:
            a = 0
        append(a)
        append(b)
    return l
def drw_ln(r,D,L,win,m='#'):
    h = L
    c = 0
    while c != r:
        try:
            win.addstr(h.y,h.x,m)
        except:
            pass
        h = MID(h,D)
        c = c+1

def drw_sqr(r,location,window):
    _1 = point(location.y-r/2,location.x-r/2)
    drw_ln(r,0,_1,window)
    drw_ln(r,270,_1,window)
    _2 = point(location.y+r/2,location.x+r/2)
    drw_ln(r,180,_2,window)
    drw_ln(r,90,_2,window)

def gatelight(location,window):
    c = 0
    while c != 6:
        time.sleep(0.05)
        drw_sqr(c,location,window)
        window.refresh()
        c = c+1
def snakefoo(location,lenght,window):
    window.addstr(location.y-1,location.x-1,'0-0')
    window.addstr(location.y,location.x-1,'\\_/')
    window.refresh()
    time.sleep(0.5)
    window.addstr(location.y-1,location.x-1,'>-<')
    window.addstr(location.y,location.x-1, '\\_/')
    window.refresh()
    time.sleep(0.2)
    c = 1
    while c <= lenght:
        window.addstr(location.y+c,location.x,'^')
        if c >= 2:
            window.addstr(location.y+c-1,location.x,'|')
        window.refresh()
        c = c+1
        time.sleep(0.2)
    while 1 <= c:
        window.addstr(location.y+c,location.x,' ')
        if c == 2:
            window.addstr(location.y+c-1,location.x,'^')
        window.refresh()
        c = c-1
        time.sleep(0.2)
    window.addstr(location.y-1,location.x-1,'0-0')
    window.addstr(location.y,location.x-1, '\\_/')
    window.refresh()
    time.sleep(2.5)

def sparkle(location,window):
    window.addstr(location.y,location.x,'*')
    window.refresh()
    time.sleep(0.2)
    window.addstr(location.y-1,location.x-1,'\\|/')
    window.addstr(location.y,location.x-1,'- -')
    window.addstr(location.y+1,location.x-1,'/|\\')
    window.refresh()

def snakeeaten(location,window):
    window.addstr(location.y-1,location.x-1,'/-\\')
    window.addstr(location.y,location.x-1, '.@.')
    try:
        window.addstr(location.y+1,location.x-1,'\\-/')
    except:
        pass
    window.refresh()
    c = 0
    while c != 10:
        time.sleep(0.1)
        if c % 2 == 1:
            window.addstr(location.y,location.x,'@')
            window.refresh()
        else:
            window.addstr(location.y,location.x,' ')
            window.refresh()
        c = c+1
    window.addstr(location.y-1,location.x-1, '0_0')
    window.addstr(location.y,location.x-1, '\\_/')
    window.addstr(location.y+1,location.x-1,' ')
    window.refresh()
    time.sleep(0.5)
    window.addstr(location.y-1,location.x-1,'-')
    window.refresh()
    time.sleep(0.5)
    window.addstr(location.y-1,location.x-1,'0')
    window.refresh()
    time.sleep(0.5)
def snakestare(snakes,window):
    locations = []
    for s in snakes:
        locations.append(s.pieces[-1])
    for l in locations:
        window.addstr(l.y-1,l.x-1,'0_0!!')
        window.addstr(l.y,l.x-1,'\\_/')
    window.refresh()
    time.sleep(1)
    c = 0
    while c != 22:
        for l in locations:
            if c%2:
                window.addstr(l.y-1,l.x-1,'0_0 ')
            else:
                window.addstr(l.y-1,l.x-1,'-_-')
        window.refresh()
        time.sleep(0.1)
        c = c+1
    time.sleep(1.5)

def snakehappy(snakes,target,window):
    ls = []
    for s in snakes:
        ls.append(s.pieces[-1])
    for l in ls:
        window.addstr(l.y-1,l.x-1,'0-0')
        window.addstr(l.y,l.x-1,'\\_/')
    window.refresh()
    time.sleep(0.2)
    for l in ls:
        window.addstr(l.y-1,l.x-1,'^-^')
    window.refresh()
    time.sleep(1)

def chance(percent):
    if percent < 4:
        percent = 4
    x = random.randint(1,100)
    if x <= percent:
        return True
    else:
        return False
def border(window):
    maxy,maxx = L,W
    maxx = maxx-1
    maxy = maxy-1
    _0 = point(0,0)
    drw_ln(maxy,270,_0,window,'|')
    drw_ln(maxx,0,_0,window,'-')
    _M = point(maxy,maxx)
    drw_ln(maxy,90,_M,window,'|')
    drw_ln(maxx,180,_M,window,'-')
    window.addstr(0,0,'+')
    window.addstr(0,maxx,'+')
    window.addstr(maxy,0,'+')
    window.addstr(maxy-1,maxx,'+')

def windowsave(window):
    while curses.is_term_resized(L,W):
        maxy,maxx = stdscr.getmaxyx()  # getmaxyx returns (rows, cols); unpacking was swapped
        if maxx < W or maxy < L:
            stdscr.addstr(0,0,'Please resize the window',curses.A_BOLD)
            stdscr.refresh()
        time.sleep(0.5)
def TheGame(window):
    tb = dummy('',point(0,0))
    dude = man()
    dude.location = pointgen(L,W)
    snakes = [snake(),snake(),snake()]
    dude.location = pointgen(L,W)
    rats = []
    reached300 = False
    Q = map(ord,('Q','q'))
    R = map(ord,('R','r'))
    for s in snakes:
        s.pieces[0] = pointgen(L,W)
        while s.pieces[0] == dude.location:
            s.pieces[0] = pointgen(L,W)
    coin = dummy('$',pointgen(L,W))
    gate = dummy('#',pointgen(L,W))
    while gate.location == coin.location:
        gate.location = pointgen(L,W)
    input = ''
    while 1:
        if dude.score >= 300:
            reached300 = True
        window.erase()
        there_are_rats = len(rats) > 0
        try:
            border(window)
            for s in snakes:
                s.draw(window)
            if there_are_rats:
                for r in rats:
                    r.draw(window)
            dude.draw(window)
            gate.draw(window)
            coin.draw(window)
            tb.draw(window)
            window.refresh()
        except:
            windowsave(window)
        if len(rats) >= 1:
            rats[-1].live(snakes)
            for r in rats[:-1]:
                r.move(random.randint(0,8)*45)
        input = window.getch()
        if input == curses.KEY_UP:
            dude.move(90)
        if input == curses.KEY_DOWN:
            dude.move(270)
        if input == curses.KEY_LEFT:
            dude.move(180)
        if input == curses.KEY_RIGHT:
            dude.move(0)
        if input in Q:
            tb.char = 'chicken heart!'
            tb.draw(window)
            window.refresh()
            snakestare(snakes,window)
            return (0,dude.score)
        if input in R:
            if reached300 and dude.score >= 200:
                dude.score = dude.score-200
                tb.char = 'real food!'
                tb.draw(window)
                window.refresh()
                rats.append(rat(coin,dude))
                sparkle(rats[-1].location,window)
                window.addstr(rats[-1].location.y,rats[-1].location.x,'r')
                snakehappy(snakes,rats[-1].location,window)
        for s in snakes:
            if len(rats) > 0:
                h = rats[-1]
            else:
                h = dude
            if s.live(h,snakes,rats) == True:
                return (-1,dude.score)
        if dude.location == coin.location:
            dude.score = dude.score+int(3000*(1.0/(W+L)))
            tb.char = str(dude.score)+'$'
            coin.location = pointgen(L,W)
        if dude.location == gate.location:
            gatelight(gate.location,window)
            snakestare(snakes,window)
            return (1,dude.score)
            break  # unreachable leftover after the return
start_page(stdscr)
x=TheGame(stdscr)
stdscr.erase()
curses.nocbreak(),curses.echo(),curses.endwin(),curses.endwin()
if x[0] == -1:
    print "You\'ve been eaten with %s grams of gold you found"%x[1]
if x[0] == 0:
    print "You left the game embarrassingly while having only %s grams of gold"%x[1]
if x[0] == 1:
    print "You escaped with %s grams of gold"%x[1]
| 27.681655 | 84 | 0.523553 | 2,139 | 15,391 | 3.7223 | 0.106124 | 0.064808 | 0.060286 | 0.063301 | 0.383321 | 0.310852 | 0.261869 | 0.244913 | 0.216654 | 0.214896 | 0 | 0.038495 | 0.333312 | 15,391 | 555 | 85 | 27.731532 | 0.737452 | 0.004548 | 0 | 0.336614 | 0 | 0 | 0.022459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0.005906 | 0.007874 | null | null | 0.005906 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
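The snake and rat AI above steer using 45-degree headings, with 360 meaning "stay put", and fall back through `closest(d)` when a move is blocked: try the heading's nearest neighbours first and its opposite last. A simplified re-implementation of that neighbour ordering, for illustration only (the game's own `closest` also appends a duplicate opposite):

```python
def neighbours(heading):
    """45-degree headings ordered by angular distance; the opposite comes last."""
    out = []
    for step in (45, 90, 135, 180):
        out.append((heading + step) % 360)
        if step != 180:                      # the opposite heading appears only once
            out.append((heading - step) % 360)
    return out

assert neighbours(0) == [45, 315, 90, 270, 135, 225, 180]
assert neighbours(90)[-1] == 270  # the opposite of "up" (90) is "down" (270)
```

This is why a blocked snake slides along a wall rather than stopping: the first neighbour that yields a legal move wins, and a full reversal is only chosen when every other heading is blocked.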
70d993efbc1f1e5a1f92a001568acf76b80283cd | 266 | py | Python | Jawaban/4.py | Rakhid16/pibiti-himatifa-2020 | 734ae59d0ffb3cf41d14fe4fbcaba7b317f5c870 | [
"Apache-2.0"
] | null | null | null | Jawaban/4.py | Rakhid16/pibiti-himatifa-2020 | 734ae59d0ffb3cf41d14fe4fbcaba7b317f5c870 | [
"Apache-2.0"
] | null | null | null | Jawaban/4.py | Rakhid16/pibiti-himatifa-2020 | 734ae59d0ffb3cf41d14fe4fbcaba7b317f5c870 | [
"Apache-2.0"
] | null | null | null | kamus = {"elephant" : "gajah", "zebra" : "zebra", "dog" : "anjing", "camel" : "unta"}
kata = input("Masukan kata berbahasa inggris : ")
if kata in kamus:
print("Terjemahan dari " + kata + " adalah " + kamus[kata])
else:
print("Kata tersebt belum ada di kamus") | 33.25 | 85 | 0.631579 | 34 | 266 | 4.941176 | 0.705882 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.184211 | 266 | 8 | 86 | 33.25 | 0.774194 | 0 | 0 | 0 | 0 | 0 | 0.483146 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
70dec0077e310c765f9ecb7dbeebc0c662a8b197 | 9,621 | py | Python | gui.py | TheAlienBee/Search_and_Apply | a62dcad45df09a38ab41c2f450a3202f3def6b3b | [
"MIT"
] | null | null | null | gui.py | TheAlienBee/Search_and_Apply | a62dcad45df09a38ab41c2f450a3202f3def6b3b | [
"MIT"
] | null | null | null | gui.py | TheAlienBee/Search_and_Apply | a62dcad45df09a38ab41c2f450a3202f3def6b3b | [
"MIT"
] | null | null | null | import tkinter as tk #tkinter is our gui lib.
import webbrowser #webbrowser allows us to open user's default web browser. good for clicking on links.
import jsonlines
import io
import genCoverLetter as gcl
from Search_and_Apply.Search_and_Apply.spiders.IndeedSpider import searchFor
from Search_and_Apply.Search_and_Apply.IndeedExpressApply import ExpressApply
if __name__ == '__main__':#not sure what this does. might delete later.
root = tk.Tk()#initialization of root.
#what follows are some global variables.
keywords = []
links = ["link1","link2","link3","link4","link5"]
name = "empty"
email = "empty"
position = "A fine position"
company = "A fine company"
phone = "777"
def BuildMenu(): #buildmenu is called each time the main window is changed. just builds the menu.
mb = tk.Menubutton(root,text="Menu")
mb.grid(row=0,column=0)
mb.menu = tk.Menu (mb, tearoff = 0)
mb["menu"] = mb.menu
mb.menu.add_command(label="Search",command=Keywords)
mb.menu.add_command(label="Resume",command=Resume)
mb.menu.add_command(label="Cover Letter",command=CoverLetter)
mb.menu.add_command(label="Profile",command=Profile)
mb.menu.add_command(label="Job Listings",command=Listings)
mb.menu.add_command(label="Additional Information",command=AdditionalInfo)
mb.menu.add_command(label="Quit",command=Quit)
def Keywords(): #keywords page.
for widget in root.winfo_children(): #eliminates all widgets. clears the window.
widget.destroy()
with open("URLs_from_IndeedSpider.json", mode ='r') as reader:
file_data = reader.read().strip('\n []{}')
if hyperlinks == [] and file_data != '':
display_links()
BuildMenu()
tk.Message(root,text="Search",width=3000).grid(row=1,column=1) #message.
tk.Label(root,text="Search Keywords").grid(row=2,column=1) #label
key_ent = tk.Entry(root) #text entry
key_ent.grid(row=2,column=2)
tk.Button(root,text="Search",command=lambda:search(key_ent.get())).grid(row=2,column=3) #search button
def search(new_keywords): #run one search per keyword. needs work. shouldn't take long.
global keywords #imports global keywords list
keywords = new_keywords #saves input from text entry field
searchFor([keywords])
print("searching... %s" % keywords)
display_links()
def clettergen():
global position
global company
global name
global email
global phone
to_csv()
gcl.write_cover_letter()
def apply(L): #given name and email and a list of relevant links, call applyTo. needs some work. shouldn't take long.
global applyBot
try:
    applyBot.applyTo(L)
except AttributeError as e:
print(e)
print(L)
popup = tk.Toplevel()
tk.Message(popup,text="Error: Please create a profile.",width=3000).grid(row=0,column=0)
tk.Button(popup,text="OK",command=popup.destroy).grid(row=1,column=1)
def open_link(event): #simply opens the links.
webbrowser.open_new_tab(event)
def Resume(): #resume page. needs some work. shouldn't take long.
for widget in root.winfo_children(): #eliminates all widgets. clears the window.
widget.destroy()
BuildMenu()
tk.Message(root,text="Resume",width=3000).grid(row=1,column=1)
def CoverLetter(): #coverletter page. needs some work. shouldn't take long.
for widget in root.winfo_children(): #eliminates all widgets. clears the window.
widget.destroy()
BuildMenu()
tk.Message(root,text="Cover Letter",width=3000).grid(row=1,column=1)
tk.Label(root,text="Name").grid(row=3,column=1)
name_ent = tk.Entry(root)
name_ent.grid(row=3,column=2)
tk.Button(root,text="Save Name",command=lambda: save_name(name_ent.get())).grid(row=3,column=3)
tk.Label(root,text="Email").grid(row=4,column=1)
email_ent = tk.Entry(root)
email_ent.grid(row=4,column=2)
tk.Button(root,text="Save Email",command=lambda: save_email(email_ent.get())).grid(row=4,column=3)
tk.Label(root,text="Position").grid(row=5,column=1)
position_ent = tk.Entry(root)
position_ent.grid(row=5,column=2)
tk.Button(root,text="Save Position",command=lambda: save_position(position_ent.get())).grid(row=5,column=3)
tk.Label(root,text="Company").grid(row=6,column=1)
company_ent= tk.Entry(root)
company_ent.grid(row=6,column=2)
tk.Button(root,text="Save Company",command=lambda: save_company(company_ent.get())).grid(row=6,column=3)
tk.Label(root,text="Phone").grid(row=7,column=1)
phone_ent= tk.Entry(root)
phone_ent.grid(row=7,column=2)
tk.Button(root,text="Save Phone",command=lambda: save_phone(phone_ent.get())).grid(row=7,column=3)
tk.Button(root,text="Generate Cover Letter",command=clettergen).grid(row=9,column=2)
#takes keywords, generates pdf.
def save_phone(new_phone):
global phone
phone = new_phone
popup = tk.Toplevel()
tk.Message(popup,text="Profile updated with phone number.",width=3000).grid(row=0,column=0)
tk.Button(popup,text="OK",command=popup.destroy).grid(row=1,column=1)
def save_position(new_position):
global position
position = new_position
popup = tk.Toplevel()
tk.Message(popup,text="Profile updated with job position.",width=3000).grid(row=0,column=0)
tk.Button(popup,text="OK",command=popup.destroy).grid(row=1,column=1)
def save_company(new_company):
global company
company = new_company
popup = tk.Toplevel()
tk.Message(popup,text="Profile updated with company name.",width=3000).grid(row=0,column=0)
tk.Button(popup,text="OK",command=popup.destroy).grid(row=1,column=1)
def to_csv():
#to comma-separated string
step1 = [position+","+company+","+name+","+email+","+phone]
step2 = ",".join(step1)
s=io.StringIO(step2)
with open('info.csv','w') as f:
for line in s:
f.write(line)
def Profile(): #profile page. essentially finished.
for widget in root.winfo_children():
widget.destroy()
BuildMenu()
tk.Message(root,text="Profile",width=3000).grid(row=1,column=1)
tk.Label(root,text="Name").grid(row=2,column=1)
name_ent = tk.Entry(root)
name_ent.grid(row=2,column=2)
tk.Button(root,text="Save Name",command=lambda: save_name(name_ent.get())).grid(row=2,column=3)
tk.Label(root,text="Email").grid(row=3,column=1)
email_ent = tk.Entry(root)
email_ent.grid(row=3,column=2)
tk.Button(root,text="Save Email",command=lambda: save_email(email_ent.get())).grid(row=3,column=3)
#we could add things here. profile1, profile2, etc. would take some time.
def save_name(new_name): #simply saves name from text field to global var.
global name
name = new_name
applyBot.addName(name)
popup = tk.Toplevel()
tk.Message(popup,text="Profile updated with your name.",width=3000).grid(row=0,column=0)
tk.Button(popup,text="OK",command=popup.destroy).grid(row=1,column=1)
def save_email(new_email): #simply saves email from text field to global var.
global email
email = new_email
applyBot.addEmail(email)
popup = tk.Toplevel()
tk.Message(popup,text="Profile updated with your email.",width=3000).grid(row=0,column=0)
tk.Button(popup,text="OK",command=popup.destroy).grid(row=1,column=1)
def Listings(): #listings page. functional.
for widget in root.winfo_children():
widget.destroy()
BuildMenu()
tk.Message(root,text="Job Listings",width=3000).grid(row=1,column=1)
display_links()
def display_links(): #fun function. turns global links list into hyperlinks in the window.
with jsonlines.open("URLs_from_IndeedSpider.json", mode ='r') as reader:
distros_dict = reader.iter(type = dict)
i=0
global hyperlinks #rebind the module-level list so Keywords() can see the displayed links.
hyperlinks=[]
for link in distros_dict:
hyperlinks.append(tk.Label(root,text=link['Title'],fg="blue",cursor="hand2"))
hyperlinks[i].grid(row=i+10,column=1)
hyperlinks[i].bind("<Button-1>", lambda e, url=link['Link']: open_link(url)) #bind the current link via a default arg so each label opens its own URL; button-1 means left-click.
tk.Button(root,text="Apply",command=lambda url=link['Link']: apply(url)).grid(row=i+10,column=2)
i+=1
def AdditionalInfo(): #additional info page.
for widget in root.winfo_children():
widget.destroy()
BuildMenu()
tk.Message(root,text="Additional Information",width=3000).grid(row=1,column=1)
#not sure what goes here, if we're doing this, etc.
def Quit(): #simply quits.
root.destroy()
exit()
hyperlinks = []
print("loading...")
applyBot = ExpressApply()
BuildMenu()
root.geometry("640x480")
tk.Message(root,text="Welcome to Search and Apply.",width=3000).grid(row=1,column=1)
root.mainloop() #starts the engine.
| 41.291845 | 122 | 0.624571 | 1,311 | 9,621 | 4.509535 | 0.189931 | 0.05565 | 0.028586 | 0.035183 | 0.475981 | 0.433356 | 0.412382 | 0.344892 | 0.344892 | 0.31952 | 0 | 0.023852 | 0.241763 | 9,621 | 232 | 123 | 41.469828 | 0.786566 | 0.135537 | 0 | 0.276243 | 0 | 0 | 0.092807 | 0.006709 | 0 | 0 | 0 | 0 | 0 | 0 | null | null | 0 | 0.038674 | null | null | 0.022099 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
70e3213acde89557a8953f051282e36ffe6c2163 | 5,868 | py | Python | vp_suite/models/precipitation_nowcasting/ef_conv_lstm.py | AIS-Bonn/vp-suite | 479bd48185b9a93a6bc6bff2dfe226e9c65800d8 | [
"MIT"
] | 3 | 2022-03-05T15:27:25.000Z | 2022-03-25T18:47:35.000Z | vp_suite/models/precipitation_nowcasting/ef_conv_lstm.py | Flunzmas/vp-suite | 391570121b5bd9e3fd23aca9a0945a63c4173a24 | [
"MIT"
] | 16 | 2022-01-06T08:38:26.000Z | 2022-02-23T19:19:28.000Z | vp_suite/models/precipitation_nowcasting/ef_conv_lstm.py | AIS-Bonn/vp-suite | 479bd48185b9a93a6bc6bff2dfe226e9c65800d8 | [
"MIT"
] | 3 | 2022-02-07T22:34:45.000Z | 2022-02-22T11:14:37.000Z | from collections import OrderedDict
from vp_suite.model_blocks import ConvLSTM
from vp_suite.models.precipitation_nowcasting.ef_blocks import Encoder_Forecaster
class EF_ConvLSTM(Encoder_Forecaster):
r"""
This is a reimplementation of the Encoder-Forecaster model based on ConvLSTMs, as introduced in
"Convolutional LSTM Network: A Machine Learning Approach for Precipitation Nowcasting" by Shi et al.
(https://arxiv.org/abs/1506.04214). This implementation is based on the PyTorch implementation on
https://github.com/Hzzone/Precipitation-Nowcasting which implements the encoder-forecaster structure from
"Deep Learning for Precipitation Nowcasting: A Benchmark and A New Model" by Shi et al.
(https://arxiv.org/abs/1706.03458).
The Encoder-Forecaster Network stacks multiple convolutional/up-/downsampling and recurrent layers
that operate on different spatial scales.
Note:
The default hyperparameter configuration is intended for input frames of size (64, 64).
For considerably larger or smaller image sizes, you might want to adjust the architecture.
"""
# model-specific constants
NAME = "EF-ConvLSTM (Shi et al.)"
PAPER_REFERENCE = "https://arxiv.org/abs/1506.04214"
CODE_REFERENCE = "https://github.com/Hzzone/Precipitation-Nowcasting"
MATCHES_REFERENCE = "Yes"
# model hyperparameters (c=channels, h=height, w=width, k=kernel_size, s=stride, p=padding)
num_layers = 3 #: Number of recurrent cell layers
enc_c = [16, 64, 64, 96, 96, 96] #: Channels for conv and rnn; Length should be 2*num_layers
dec_c = [96, 96, 96, 96, 64, 16] #: Channels for conv and rnn; Length should be 2*num_layers
# convs
enc_conv_names = ["conv1_leaky_1", "conv2_leaky_1", "conv3_leaky_1"] #: Encoder conv block layer names (for internal initialization)
enc_conv_k = [3, 3, 3] #: Encoder conv block kernel sizes per layer
enc_conv_s = [1, 2, 2] #: Encoder conv block strides per layer
enc_conv_p = [1, 1, 1] #: Encoder conv block paddings per layer
dec_conv_names = ["deconv1_leaky_1", "deconv2_leaky_1", "deconv3_leaky_1"] #: Decoder conv block layer names (for internal initialization)
dec_conv_k = [4, 4, 3] #: Decoder conv block kernel sizes per layer
dec_conv_s = [2, 2, 1] #: Decoder conv block strides per layer
dec_conv_p = [1, 1, 1] #: Decoder conv block paddings per layer
# rnns
enc_rnn_k = [3, 3, 3] #: Encoder recurrent block kernel sizes per layer
enc_rnn_s = [1, 1, 1] #: Encoder recurrent block strides per layer
enc_rnn_p = [1, 1, 1] #: Encoder recurrent block paddings per layer
dec_rnn_k = [3, 3, 3] #: Decoder recurrent block kernel sizes per layer
dec_rnn_s = [1, 1, 1] #: Decoder recurrent block strides per layer
dec_rnn_p = [1, 1, 1] #: Decoder recurrent block paddings per layer
# final convs
final_conv_1_name = "identity" #: Final conv block 1 name
final_conv_1_c = 16 #: Final conv block 1 out channels
final_conv_1_k = 3 #: Final conv block 1 kernel size
final_conv_1_s = 1 #: Final conv block 1 stride
final_conv_1_p = 1 #: Final conv block 1 padding
final_conv_2_name = "conv3_3" #: Final conv block 2 name
final_conv_2_k = 1 #: Final conv block 2 kernel size
final_conv_2_s = 1 #: Final conv block 2 stride
final_conv_2_p = 0 #: Final conv block 2 padding
def __init__(self, device, **model_kwargs):
super(EF_ConvLSTM, self).__init__(device, **model_kwargs)
def _build_encoder_decoder(self):
# build enc layers and encoder
layer_in_c = self.img_c
enc_convs, enc_rnns = [], []
for n in range(self.num_layers):
layer_mid_c = self.enc_c[2 * n]
layer_out_c = self.enc_c[2 * n + 1]
enc_convs.append(OrderedDict(
{self.enc_conv_names[n]: [layer_in_c, layer_mid_c, self.enc_conv_k[n],
self.enc_conv_s[n], self.enc_conv_p[n]]}
))
enc_rnns.append(ConvLSTM(device=self.device, in_channels=layer_mid_c, enc_channels=layer_out_c,
state_h=self.enc_rnn_state_h[n], state_w=self.enc_rnn_state_w[n],
kernel_size=self.enc_rnn_k[n], stride=self.enc_rnn_s[n],
padding=self.enc_rnn_p[n]))
layer_in_c = layer_out_c
# build dec layers and decoder, including final convs
dec_convs, dec_rnns = [], []
for n in range(self.num_layers):
layer_mid_c = self.dec_c[2 * n]
layer_out_c = self.dec_c[2 * n + 1]
dec_rnns.append(ConvLSTM(device=self.device, in_channels=layer_in_c, enc_channels=layer_mid_c,
state_h=self.dec_rnn_state_h[n], state_w=self.dec_rnn_state_w[n],
kernel_size=self.dec_rnn_k[n], stride=self.dec_rnn_s[n],
padding=self.dec_rnn_p[n]))
dec_conv_dict = {
self.dec_conv_names[n]: [layer_mid_c, layer_out_c, self.dec_conv_k[n],
self.dec_conv_s[n], self.dec_conv_p[n]]
}
if n == self.num_layers - 1:
dec_conv_dict[self.final_conv_1_name] = [layer_out_c, self.final_conv_1_c, self.final_conv_1_k,
self.final_conv_1_s, self.final_conv_1_p]
dec_conv_dict[self.final_conv_2_name] = [self.final_conv_1_c, self.img_c, self.final_conv_2_k,
self.final_conv_2_s, self.final_conv_2_p]
dec_convs.append(OrderedDict(dec_conv_dict))
layer_in_c = layer_out_c
return enc_convs, enc_rnns, dec_convs, dec_rnns
| 53.834862 | 143 | 0.644853 | 880 | 5,868 | 4.022727 | 0.190909 | 0.071186 | 0.031073 | 0.023729 | 0.410452 | 0.287006 | 0.149153 | 0.090395 | 0.076271 | 0.048588 | 0 | 0.035116 | 0.267212 | 5,868 | 108 | 144 | 54.333333 | 0.78814 | 0.361282 | 0 | 0.055556 | 0 | 0 | 0.056831 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.027778 | false | 0 | 0.041667 | 0 | 0.513889 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 |
70e52df4dc9953cd6ed3848558c135eb3f637b16 | 1,984 | py | Python | nbpkg/pkginspect/nbpkgdescr.py | kiaderouiche/nbpkgquery | 17c996398fad922276c2dc250392959e7bfc31f0 | [
"MIT"
] | 1 | 2017-05-27T13:30:41.000Z | 2017-05-27T13:30:41.000Z | nbpkg/pkginspect/nbpkgdescr.py | kiaderouiche/nbpkgquery | 17c996398fad922276c2dc250392959e7bfc31f0 | [
"MIT"
] | null | null | null | nbpkg/pkginspect/nbpkgdescr.py | kiaderouiche/nbpkgquery | 17c996398fad922276c2dc250392959e7bfc31f0 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
'''
nbpkg defspec
'''
NBPKG_MAGIC_NUMBER = b'\x1f\x8b'
NBPKG_HEADER_MAGIC_NUMBER = '\037\213'
NBPKGINFO_MIN_NUMBER = 1000
NBPKGINFO_MAX_NUMBER = 1146
# data types definition
NBPKG_DATA_TYPE_NULL = 0
NBPKG_DATA_TYPE_CHAR = 1
NBPKG_DATA_TYPE_INT8 = 2
NBPKG_DATA_TYPE_INT16 = 3
NBPKG_DATA_TYPE_INT32 = 4
NBPKG_DATA_TYPE_INT64 = 5
NBPKG_DATA_TYPE_STRING = 6
NBPKG_DATA_TYPE_BIN = 7
NBPKG_DATA_TYPE_STRING_ARRAY = 8
NBPKG_DATA_TYPE_I18NSTRING_TYPE = 9
NBPKG_DATA_TYPES = (NBPKG_DATA_TYPE_NULL,
NBPKG_DATA_TYPE_CHAR,
NBPKG_DATA_TYPE_INT8,
NBPKG_DATA_TYPE_INT16,
NBPKG_DATA_TYPE_INT32,
NBPKG_DATA_TYPE_INT64,
NBPKG_DATA_TYPE_STRING,
NBPKG_DATA_TYPE_BIN,
NBPKG_DATA_TYPE_STRING_ARRAY,)
NBPKGINFO_DISTNAME = 1000
NBPKGINFO_PKGNAME = 1000
NBPKGINFO_CATEGORY = 1000
NBPKGINFO_MAINTAINER = 1000
NBPKGINFO_HOMEPAGE = 1020
NBPKGINFO_COMMENT = 1000
NBPKGINFO_LICENSE = 1000
NBPKGINFO_VERSION = 1001
NBPKGINFO_RELEASE = 1002
NBPKGINFO_DESCRIPTION = 1005
NBPKGINFO_LONG_DESCRIPTION = 1005
NBPKGINFO_OS_VERSION = 1000
NBPKGINFO_COPYRIGHT = 1014
NBPKGINFO_SIZE_PKG = 1000
NBPKGINFO_MACHINE_ARCH = 1022
NBPKGINFOS = (
NBPKGINFO_DISTNAME,
NBPKGINFO_PKGNAME,
NBPKGINFO_CATEGORY,
NBPKGINFO_MAINTAINER,
NBPKGINFO_HOMEPAGE,
NBPKGINFO_COMMENT,
NBPKGINFO_LICENSE,
NBPKGINFO_VERSION,
NBPKGINFO_RELEASE,
NBPKGINFO_LONG_DESCRIPTION,
NBPKGINFO_OS_VERSION,
NBPKGINFO_SIZE_PKG,
NBPKGINFO_MACHINE_ARCH,
)
NBPKG_HEADER_BASIC_FILES = {
'NBPKG_BUILD_INFO':'+BUILD_INFO',
'NBPKG_BUILD_VERSION':'+BUILD_VERSION',
'NBPKG_COMMENT':'+COMMENT',
'NBPKG_CONTENTS':'+CONTENTS',
'NBPKG_DESC':'+DESC',
'NBPKG_SIZE_ALL':'+SIZE_ALL',
'NBPKG_SIZE_PKG':'+SIZE_PKG',
}
| 25.435897 | 48 | 0.703629 | 238 | 1,984 | 5.327731 | 0.306723 | 0.141956 | 0.194795 | 0.059937 | 0.037855 | 0 | 0 | 0 | 0 | 0 | 0 | 0.067232 | 0.227823 | 1,984 | 77 | 49 | 25.766234 | 0.760444 | 0.029234 | 0 | 0 | 0 | 0 | 0.094418 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
70e6d36077cbded389cedfadb9bbd0925029daf4 | 7,286 | py | Python | tensorflow/python/keras/distribute/mnist_multi_worker.py | 6paklata/tensorflow | d6464431256192d2cf1c9b20271792309af7a354 | [
"Apache-2.0"
] | 2 | 2019-05-08T10:02:57.000Z | 2019-05-08T10:02:59.000Z | tensorflow/python/keras/distribute/mnist_multi_worker.py | gurkangokdemir/tensorflow | 2e6446f8a9be31c59080b643d04f9b463c4201cf | [
"Apache-2.0"
] | null | null | null | tensorflow/python/keras/distribute/mnist_multi_worker.py | gurkangokdemir/tensorflow | 2e6446f8a9be31c59080b643d04f9b463c4201cf | [
"Apache-2.0"
] | 2 | 2020-03-25T12:52:20.000Z | 2020-08-11T09:31:43.000Z | # Copyright 2019 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""An example training a Keras Model using MirroredStrategy and native APIs."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from absl import flags
# pylint: disable=g-direct-tensorflow-import
from tensorflow.python import keras
from tensorflow.python.data.ops import dataset_ops
from tensorflow.python.distribute import collective_all_reduce_strategy as collective_strategy
from tensorflow.python.distribute import multi_worker_util
from tensorflow.python.distribute.cluster_resolver import TFConfigClusterResolver
from tensorflow.python.framework import dtypes
from tensorflow.python.framework import ops
from tensorflow.python.keras import backend
from tensorflow.python.keras import utils
from tensorflow.python.keras.datasets import mnist
from tensorflow.python.keras.optimizer_v2 import rmsprop
from tensorflow.python.ops import math_ops
from tensorflow.python.platform import app
from tensorflow.python.platform import tf_logging as logging
NUM_CLASSES = 10
flags.DEFINE_boolean(name='enable_eager', default=False, help='Enable eager?')
flags.DEFINE_enum('distribution_strategy', None, ['multi_worker_mirrored'],
'The Distribution Strategy to use.')
flags.DEFINE_string('model_dir', None, 'Directory for TensorBoard/Checkpoint.')
# TODO(rchao): Use multi_worker_util.maybe_shard_dataset() once that is provided
# there.
def maybe_shard_dataset(dataset):
"""Shard the dataset if running in multi-node environment."""
cluster_resolver = TFConfigClusterResolver()
cluster_spec = cluster_resolver.cluster_spec().as_dict()
if cluster_spec:
dataset = dataset.shard(
multi_worker_util.worker_count(cluster_spec,
cluster_resolver.task_type),
multi_worker_util.id_in_cluster(
cluster_spec, cluster_resolver.task_type, cluster_resolver.task_id))
return dataset
def get_data_shape():
# input image dimensions
img_rows, img_cols = 28, 28
if backend.image_data_format() == 'channels_first':
return 1, img_rows, img_cols
else:
return img_rows, img_cols, 1
def get_input_datasets(use_bfloat16=False):
"""Downloads the MNIST dataset and creates train and eval dataset objects.
Args:
use_bfloat16: Boolean to determine if input should be cast to bfloat16
Returns:
Train dataset and eval dataset. The dataset doesn't include batch dim.
"""
cast_dtype = dtypes.bfloat16 if use_bfloat16 else dtypes.float32
# the data, split between train and test sets
(x_train, y_train), (x_test, y_test) = mnist.load_data()
train_data_shape = (x_train.shape[0],) + get_data_shape()
test_data_shape = (x_test.shape[0],) + get_data_shape()
if backend.image_data_format() == 'channels_first':
x_train = x_train.reshape(train_data_shape)
x_test = x_test.reshape(test_data_shape)
else:
x_train = x_train.reshape(train_data_shape)
x_test = x_test.reshape(test_data_shape)
x_train = x_train.astype('float32')
x_test = x_test.astype('float32')
x_train /= 255
x_test /= 255
# convert class vectors to binary class matrices
y_train = utils.to_categorical(y_train, NUM_CLASSES)
y_test = utils.to_categorical(y_test, NUM_CLASSES)
# train dataset
train_ds = dataset_ops.Dataset.from_tensor_slices((x_train, y_train))
# TODO(rchao): Remove maybe_shard_dataset() once auto-sharding is done.
train_ds = maybe_shard_dataset(train_ds)
train_ds = train_ds.repeat()
train_ds = train_ds.map(lambda x, y: (math_ops.cast(x, cast_dtype), y))
train_ds = train_ds.batch(64, drop_remainder=True)
# eval dataset
eval_ds = dataset_ops.Dataset.from_tensor_slices((x_test, y_test))
# TODO(rchao): Remove maybe_shard_dataset() once auto-sharding is done.
eval_ds = maybe_shard_dataset(eval_ds)
eval_ds = eval_ds.repeat()
eval_ds = eval_ds.map(lambda x, y: (math_ops.cast(x, cast_dtype), y))
eval_ds = eval_ds.batch(64, drop_remainder=True)
return train_ds, eval_ds
def get_model(index=0):
"""Builds a Sequential CNN model to recognize MNIST digits.
Args:
index: The worker index. Defaults to 0.
Returns:
a CNN Keras model used for MNIST
"""
# Define a CNN model to recognize MNIST digits.
model = keras.models.Sequential()
model.add(
keras.layers.Conv2D(
32,
kernel_size=(3, 3),
activation='relu',
input_shape=get_data_shape()))
model.add(keras.layers.Conv2D(64, (3, 3), activation='relu'))
model.add(keras.layers.MaxPooling2D(pool_size=(2, 2)))
model.add(keras.layers.Dropout(0.25, name='dropout_worker%s_first' % index))
model.add(keras.layers.Flatten())
model.add(keras.layers.Dense(128, activation='relu'))
model.add(keras.layers.Dropout(0.5, name='dropout_worker%s_second' % index))
model.add(keras.layers.Dense(NUM_CLASSES, activation='softmax'))
return model
def main(_):
if flags.FLAGS.enable_eager:
ops.enable_eager_execution()
logging.info('Eager execution enabled for MNIST Multi-Worker.')
else:
logging.info('Eager execution not enabled for MNIST Multi-Worker.')
# Build the train and eval datasets from the MNIST data.
train_ds, eval_ds = get_input_datasets()
if flags.FLAGS.distribution_strategy == 'multi_worker_mirrored':
# MultiWorkerMirroredStrategy for multi-worker distributed MNIST training.
strategy = collective_strategy.CollectiveAllReduceStrategy()
else:
raise ValueError('Only `multi_worker_mirrored` is supported strategy '
'in Keras MNIST example at this time. Strategy passed '
'in is %s' % flags.FLAGS.distribution_strategy)
# Create and compile the model under Distribution strategy scope.
# `fit`, `evaluate` and `predict` will be distributed based on the strategy
# model was compiled with.
with strategy.scope():
model = get_model()
optimizer = rmsprop.RMSProp(learning_rate=0.001)
model.compile(
loss=keras.losses.categorical_crossentropy,
optimizer=optimizer,
metrics=['accuracy'])
# Train the model with the train dataset.
tensorboard_callback = keras.callbacks.TensorBoard(
log_dir=flags.FLAGS.model_dir)
model.fit(
x=train_ds,
epochs=20,
steps_per_epoch=468,
callbacks=[tensorboard_callback])
# Evaluate the model with the eval dataset.
score = model.evaluate(eval_ds, steps=10, verbose=0)
logging.info('Test loss:{}'.format(score[0]))
logging.info('Test accuracy:{}'.format(score[1]))
if __name__ == '__main__':
logging.set_verbosity(logging.INFO)
app.run()
| 36.79798 | 94 | 0.734971 | 1,020 | 7,286 | 5.038235 | 0.286275 | 0.03814 | 0.054485 | 0.029578 | 0.237011 | 0.142246 | 0.087566 | 0.073166 | 0.059155 | 0.059155 | 0 | 0.013416 | 0.161131 | 7,286 | 197 | 95 | 36.984772 | 0.827389 | 0.275323 | 0 | 0.086207 | 0 | 0 | 0.101268 | 0.029593 | 0 | 0 | 0 | 0.010152 | 0 | 1 | 0.043103 | false | 0.008621 | 0.155172 | 0 | 0.241379 | 0.008621 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
70ef08ce99b2902cbf44d027970fd4e38a0d4f18 | 712 | py | Python | src/tests/test_loading.py | danmysak/ipa-parser | bb4f5fc1a8f95ef87793d2ffd79430a9a0ffbeaf | [
"MIT"
] | null | null | null | src/tests/test_loading.py | danmysak/ipa-parser | bb4f5fc1a8f95ef87793d2ffd79430a9a0ffbeaf | [
"MIT"
] | null | null | null | src/tests/test_loading.py | danmysak/ipa-parser | bb4f5fc1a8f95ef87793d2ffd79430a9a0ffbeaf | [
"MIT"
] | null | null | null | from timeit import timeit
from unittest import TestCase
from ..ipaparser import IPA, load
__all__ = [
'TestLoading',
]
FACTOR = 10.0
def is_much_larger(a: float, b: float) -> bool:
return a > b * FACTOR
def are_roughly_equal(a: float, b: float) -> bool:
return not is_much_larger(a, b) and not is_much_larger(b, a)
class TestLoading(TestCase):
def test_loading_time(self) -> None:
loading_time = timeit(load, number=1)
first_parse = timeit(lambda: IPA('/abc/'), number=1)
second_parse = timeit(lambda: IPA('/def/'), number=1)
self.assertTrue(is_much_larger(loading_time, first_parse))
self.assertTrue(are_roughly_equal(first_parse, second_parse))
| 25.428571 | 69 | 0.688202 | 103 | 712 | 4.514563 | 0.398058 | 0.051613 | 0.103226 | 0.055914 | 0.094624 | 0.094624 | 0 | 0 | 0 | 0 | 0 | 0.010453 | 0.19382 | 712 | 27 | 70 | 26.37037 | 0.799652 | 0 | 0 | 0 | 0 | 0 | 0.029494 | 0 | 0 | 0 | 0 | 0 | 0.111111 | 1 | 0.166667 | false | 0 | 0.166667 | 0.111111 | 0.5 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
70efcdc28513f6b7a89674d1c7aad0fab1e49771 | 511 | py | Python | Newbies/namedtuple.py | Fernal73/LearnPython3 | 5288017c0dbf95633b84f1e6324f00dec6982d36 | [
"MIT"
] | 1 | 2021-12-17T11:03:13.000Z | 2021-12-17T11:03:13.000Z | Newbies/namedtuple.py | Fernal73/LearnPython3 | 5288017c0dbf95633b84f1e6324f00dec6982d36 | [
"MIT"
] | 1 | 2020-02-05T00:14:43.000Z | 2020-02-06T09:22:49.000Z | Newbies/namedtuple.py | Fernal73/LearnPython3 | 5288017c0dbf95633b84f1e6324f00dec6982d36 | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
"""Named tuple example."""
from collections import namedtuple
Car = namedtuple('Car', 'color mileage')
# Our new "Car" class works as expected:
MY_CAR = Car('red', 3812.4)
print(MY_CAR.color)
print(MY_CAR.mileage)
# We get a nice string repr for free:
print(MY_CAR)
try:
MY_CAR.color = 'blue'
except AttributeError as inst:
print(type(inst)) # the exception instance
print(inst.args) # arguments stored in .args
print(inst)
finally:
print("Into finally")
| 22.217391 | 53 | 0.690802 | 76 | 511 | 4.578947 | 0.631579 | 0.071839 | 0.086207 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.014388 | 0.183953 | 511 | 22 | 54 | 23.227273 | 0.820144 | 0.324853 | 0 | 0 | 0 | 0 | 0.104478 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.071429 | 0 | 0.071429 | 0.5 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
70f12857a34e97687f3b8515a3ecdf1b8631fbee | 779 | py | Python | zeus/brewery/models.py | sdivakarrajesh/Zeus | 7a6ddd3d0375f3a2f131f6fa46539faafbd73766 | [
"MIT"
] | null | null | null | zeus/brewery/models.py | sdivakarrajesh/Zeus | 7a6ddd3d0375f3a2f131f6fa46539faafbd73766 | [
"MIT"
] | 5 | 2021-03-19T01:10:37.000Z | 2021-09-22T18:47:10.000Z | zeus/brewery/models.py | sdivakarrajesh/Zeus | 7a6ddd3d0375f3a2f131f6fa46539faafbd73766 | [
"MIT"
] | null | null | null | from django.db import models
# Create your models here.
class DrinkType(models.Model):
created = models.DateTimeField(auto_now_add=True, blank=True, null=True)
updated = models.DateTimeField(auto_now=True, blank=True, null=True)
title = models.CharField(max_length=100, null=True, blank=True)
def __str__(self):
return self.title or ''
class Drink(models.Model):
created = models.DateTimeField(auto_now_add=True, blank=True, null=True)
updated = models.DateTimeField(auto_now=True, blank=True, null=True)
name = models.CharField(max_length=300, null=True, blank=True)
image = models.URLField(blank=True, null=True)
drink_type = models.ManyToManyField(DrinkType, blank=True)
def __str__(self):
return self.name or '' | 32.458333 | 76 | 0.722721 | 107 | 779 | 5.102804 | 0.345794 | 0.131868 | 0.142857 | 0.155678 | 0.553114 | 0.553114 | 0.553114 | 0.446886 | 0.446886 | 0.446886 | 0 | 0.009188 | 0.161746 | 779 | 24 | 77 | 32.458333 | 0.826953 | 0.030809 | 0 | 0.4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.133333 | false | 0 | 0.066667 | 0.133333 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
70f1cf6cbe1d0d3ea9ded5e933650823891cd848 | 1,015 | py | Python | main.py | kylecorry31/lifx_effects | ea97e3274233fb844c416d4d5a95d06a8a1c7cde | [
"MIT"
] | null | null | null | main.py | kylecorry31/lifx_effects | ea97e3274233fb844c416d4d5a95d06a8a1c7cde | [
"MIT"
] | 1 | 2021-11-13T20:35:21.000Z | 2021-11-13T20:35:21.000Z | main.py | kylecorry31/lifx_effects | ea97e3274233fb844c416d4d5a95d06a8a1c7cde | [
"MIT"
] | null | null | null | from effects.keyboard_effect import KeyboardEffect
from utils.lights import get_lights
from effects.candle_effect import CandleEffect
from effects.phasma_hunt_effect import PhasmaHuntEffect
from effects.audio_spectrum_effect import AudioSpectrumEffect
from effects.audio_amplitude_effect import AudioAmplitudeEffect
from effects.midi_effect import MidiEffect
import time
lights = get_lights(3)
try:
# PhasmaHuntEffect(250).run(lights)
# CandleEffect(250, 45).run(lights)
# KeyboardEffect(['a', 'd'], 200).run(lights)
# MidiEffect('music/2.mid', [0, 0], 2).run(lights)
# MidiEffect('music/6.mid', [4, 9], 1, True).run(lights)
# AudioSpectrumEffect("music/1.wav", 1, bins=[2, 2], num_bins=1024).run(lights)
# AudioSpectrumEffect("music/2.wav", 1, bins=[14, 15, 16], num_bins=1024).run(lights)
AudioAmplitudeEffect("microphone", 3).run(lights)
finally:
time.sleep(0.2)
for light in lights:
light.on(255)
time.sleep(1)
for light in lights:
light.on(255) | 37.592593 | 89 | 0.727094 | 140 | 1,015 | 5.178571 | 0.385714 | 0.09931 | 0.044138 | 0.066207 | 0.126897 | 0.071724 | 0.071724 | 0 | 0 | 0 | 0 | 0.057604 | 0.144828 | 1,015 | 27 | 90 | 37.592593 | 0.77765 | 0.371429 | 0 | 0.222222 | 0 | 0 | 0.015848 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.444444 | 0 | 0.444444 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 |
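The try/finally in the script above guarantees that the lights are switched back on whether the chosen effect finishes or crashes (or is interrupted). A self-contained sketch of that guarantee, with list appends standing in for the `light.on(255)` calls:

```python
log = []

def run_effect(fail: bool) -> None:
    try:
        if fail:
            raise RuntimeError("effect crashed")
        log.append("effect done")
    finally:
        log.append("lights restored")  # runs on success *and* on failure

run_effect(fail=False)
try:
    run_effect(fail=True)
except RuntimeError:
    pass
assert log == ["effect done", "lights restored", "lights restored"]
```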
cb0378636f9363b9345438112d17636d553ee87d | 1,975 | py | Python | src/ClusterManager/cluster_manager.py | nautilusshell/QianJiangYuan | 262a88ca559f62d6e8633b1596481515d32f7907 | [
"MIT"
] | null | null | null | src/ClusterManager/cluster_manager.py | nautilusshell/QianJiangYuan | 262a88ca559f62d6e8633b1596481515d32f7907 | [
"MIT"
] | null | null | null | src/ClusterManager/cluster_manager.py | nautilusshell/QianJiangYuan | 262a88ca559f62d6e8633b1596481515d32f7907 | [
"MIT"
] | 1 | 2019-12-27T07:57:48.000Z | 2019-12-27T07:57:48.000Z | #!/usr/bin/python
# -*- coding: UTF-8 -*-
import json
import os
import time
import argparse
import uuid
import subprocess
import sys
import datetime
import yaml
from jinja2 import Environment, FileSystemLoader, Template
import base64
import re
import thread
import threading
import random
import textwrap
import logging
import logging.config

import job_manager
import user_manager
import node_manager
import joblog_manager
import command_manager

from multiprocessing import Process, Manager


def create_log(logdir='/var/log/dlworkspace'):
    if not os.path.exists(logdir):
        os.system("mkdir -p " + logdir)
    with open('logging.yaml') as f:
        logging_config = yaml.load(f)
    logging_config["handlers"]["file"]["filename"] = logdir + "/clustermanager.log"
    logging.config.dictConfig(logging_config)


def Run():
    create_log()

    logging.info("Starting job manager... ")
    proc_job = Process(target=job_manager.Run)
    proc_job.start()

    logging.info("Starting user manager... ")
    proc_user = Process(target=user_manager.Run)
    proc_user.start()

    logging.info("Starting node manager... ")
    proc_node = Process(target=node_manager.Run)
    proc_node.start()

    logging.info("Starting joblogging manager... ")
    proc_joblog = Process(target=joblog_manager.Run)
    proc_joblog.start()

    logging.info("Starting command manager... ")
    proc_command = Process(target=command_manager.Run)
    proc_command.start()

    proc_job.join()
    proc_user.join()
    proc_node.join()
    proc_joblog.join()
    proc_command.join()


if __name__ == '__main__':
    #parser = argparse.ArgumentParser( prog='cluster_manager.py',
    #        formatter_class=argparse.RawDescriptionHelpFormatter,
    #        description=textwrap.dedent('''\
    #        ''') )
    #parser.add_argument("help",
    #        help = "Show the usage of this program" )
    #args = parser.parse_args()
    Run()

# netmiko/ssh_exception.py — repo: dalekirkman1/netmiko (MIT license)
from paramiko.ssh_exception import SSHException
from paramiko.ssh_exception import AuthenticationException


class NetmikoTimeoutException(SSHException):
    """SSH session timed out trying to connect to the device."""

    pass


class NetmikoAuthenticationException(AuthenticationException):
    """SSH authentication exception based on Paramiko AuthenticationException."""

    pass


class ConfigInvalidException(Exception):
    """Exception raised for invalid configuration error."""

    pass


NetMikoTimeoutException = NetmikoTimeoutException
NetMikoAuthenticationException = NetmikoAuthenticationException
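For illustration, the hierarchy above lets callers catch Netmiko errors through their paramiko base classes. A self-contained sketch (the paramiko classes are stubbed here so the snippet runs without the real library):

```python
# Stubs standing in for paramiko.ssh_exception — not the real library.
class SSHException(Exception):
    """Stub for paramiko.ssh_exception.SSHException."""

class AuthenticationException(SSHException):
    """Stub for paramiko.ssh_exception.AuthenticationException."""

class NetmikoTimeoutException(SSHException):
    """SSH session timed out trying to connect to the device."""

class NetmikoAuthenticationException(AuthenticationException):
    """SSH authentication exception based on Paramiko AuthenticationException."""

# A Netmiko timeout can be caught via the paramiko base class.
try:
    raise NetmikoTimeoutException("connection timed out")
except SSHException as exc:
    caught = type(exc).__name__

print(caught)  # NetmikoTimeoutException
```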

# tools/Vitis-AI-Quantizer/vai_q_tensorflow2.x/setup.py — repo: hito0512/Vitis-AI (Apache-2.0 license)
# Copyright 2019 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Install tensorflow_model_optimization."""
import datetime
import os
import sys

from setuptools import find_packages
from setuptools import setup
from setuptools.command.install import install as InstallCommandBase
from setuptools.dist import Distribution

# To enable importing version.py directly, we add its path to sys.path.
version_path = os.path.join(
    os.path.dirname(__file__), 'tensorflow_model_optimization', 'python/core')
sys.path.append(version_path)
from version import __version__  # pylint: disable=g-import-not-at-top

# TODO(alanchiao): add explicit Tensorflow requirement once Tensorflow
# moves from a tf and tf-gpu packaging approach (where a user installs
# one of the two) to one where a user installs the tf package and then
# also installs the gpu package if they need gpu support. The latter allows
# us (and our dependents) to maintain a single package instead of two.
REQUIRED_PACKAGES = [
    'numpy~=1.14',
    'six~=1.10',
    'enum34~=1.1;python_version<"3.4"',
    'dm-tree~=0.1.1',
]

if '--release' in sys.argv:
    release = True
    sys.argv.remove('--release')
else:
    # Build a nightly package by default.
    release = False

if release:
    project_name = 'vai-q-tensorflow2'
else:
    # Nightly releases use date-based versioning of the form
    # '0.0.1.dev20180305'
    project_name = 'vai-q-tensorflow2-nightly'
    datestring = datetime.datetime.now().strftime('%Y%m%d')
    __version__ += datestring


class BinaryDistribution(Distribution):
    """This class is needed in order to create OS specific wheels."""

    def has_ext_modules(self):
        return False


setup(
    name=project_name,
    version=__version__,
    description='Xilinx Vitis AI Quantizer for Tensorflow 2.x. '
                'This is customized based on tensorflow-model-optimization '
                '(https://github.com/tensorflow/model-optimization). '
                'A suite of tools that users, both novice and advanced, '
                'can use to optimize machine learning models for deployment '
                'and execution.',
    author='Xiao Sheng',
    author_email='kylexiao@xilinx.com',
    license='Apache 2.0',
    packages=find_packages(),
    install_requires=REQUIRED_PACKAGES,
    # Add in any packaged data.
    include_package_data=True,
    package_data={'': ['*.so', '*.json']},
    exclude_package_data={'': ['BUILD', '*.h', '*.cc']},
    zip_safe=False,
    distclass=BinaryDistribution,
    cmdclass={
        'pip_pkg': InstallCommandBase,
    },
    classifiers=[
        'Intended Audience :: Developers',
        'Intended Audience :: Education',
        'Intended Audience :: Science/Research',
        'License :: OSI Approved :: Apache Software License',
        'Topic :: Scientific/Engineering',
        'Topic :: Scientific/Engineering :: Artificial Intelligence',
    ],
    keywords='tensorflow model optimization machine learning',
)

# bdd/contact_stepts.py — repo: SvetlanaPopova/python_1 (Apache-2.0 license)
__author__ = 'User'
from pytest_bdd import given, when, then
from model.contact import Contact
import random


@given('a contact list')
def contact_list(db):
    return db.get_contact_list()


@given('a contact with <firstname>, <lastname>, <address> and <mobilephone>')
def new_contact(firstname, lastname, address, mobilephone):
    return Contact(firstname=firstname, lastname=lastname, address=address, mobilephone=mobilephone)


@when('I add the contact to the list')
def add_new_contact(app, new_contact):
    app.contact.add_new(new_contact)


@then('the new contact list is equal to the old contact list with the added contact')
def verify_contact_added(db, contact_list, new_contact, app, check_ui):
    app.contact.check_add_new_success(db, new_contact, contact_list, check_ui)


@given('a non-empty contact list')
def non_empty_contact_list(app, db):
    # Create a contact first if the list is empty (the original checked "< 0",
    # which can never be true for a list length, and created a group instead).
    if len(db.get_contact_list()) == 0:
        app.contact.add_new(Contact(firstname='some firstname'))
    return db.get_contact_list()


@given('a random contact from the list')
def random_contact(non_empty_contact_list):
    return random.choice(non_empty_contact_list)


@when('I delete the contact from the list')
def delete_contact(app, random_contact):
    app.contact.delete_contact_by_id(random_contact.id)


@then('the new contact list is equal to the old contact list without the contact')
def verify_contact_deleted(db, non_empty_contact_list, random_contact, app, check_ui):
    app.contact.check_delete_success(db, random_contact, non_empty_contact_list, check_ui)


@when('I modify the contact from the list')
def modify_contact(app, new_contact, random_contact):
    new_contact.id = random_contact.id
    app.contact.modify_contact_by_id(new_contact)


@then('the new contact list is equal to the old contact list with the modified contact')
def verify_contact_modified(db, non_empty_contact_list, new_contact, random_contact, app, check_ui):
    # Renamed from a duplicate "verify_contact_deleted" definition, which would
    # have shadowed the delete verification step above.
    non_empty_contact_list.remove(random_contact)
    random_contact.firstname = new_contact.firstname
    random_contact.lastname = new_contact.lastname
    random_contact.address = new_contact.address
    random_contact.mobilephone = new_contact.mobilephone
    non_empty_contact_list.append(random_contact)
    app.contact.check_modify_contact_success(db, non_empty_contact_list, check_ui)

# src/django_perf_rec/settings.py — repo: adamchainz/django-perf-rec (MIT license)
import sys
from typing import Any

from django.conf import settings

if sys.version_info >= (3, 8):
    from typing import Literal

    ModeType = Literal["once", "none", "all"]
else:
    ModeType = str


class Settings:
    defaults = {"HIDE_COLUMNS": True, "MODE": "once"}

    def get_setting(self, key: str) -> Any:
        try:
            return settings.PERF_REC[key]
        except (AttributeError, KeyError):
            return self.defaults.get(key, None)

    @property
    def HIDE_COLUMNS(self) -> bool:
        return self.get_setting("HIDE_COLUMNS")

    @property
    def MODE(self) -> ModeType:
        return self.get_setting("MODE")


perf_rec_settings = Settings()
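The `Settings` class above uses a common override-with-defaults lookup. A self-contained sketch of the same pattern, with the Django `settings.PERF_REC` mapping replaced by a plain dict so no Django setup is needed:

```python
# Sketch of the settings-with-defaults lookup; "overrides" stands in for
# the Django settings.PERF_REC mapping.
class Settings:
    defaults = {"HIDE_COLUMNS": True, "MODE": "once"}

    def __init__(self, overrides):
        self.overrides = overrides

    def get_setting(self, key):
        try:
            return self.overrides[key]
        except KeyError:
            # Fall back to the class-level default (None if unknown).
            return self.defaults.get(key)

s = Settings({"MODE": "all"})
assert s.get_setting("MODE") == "all"          # explicit override wins
assert s.get_setting("HIDE_COLUMNS") is True   # falls back to the default
```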

# truncande/cli.py — repo: Ricyteach/truncande (MIT license)
import pathlib
import click

from . import candeout


@click.group()
@click.argument("ifile", type=click.Path(exists=True, dir_okay=False), required=True)
@click.argument(
    "ofile",
    type=click.Path(exists=False, dir_okay=False, writable=True),
    required=False,
)
@click.pass_context
def main(ctx, ifile, ofile=""):
    ifile = pathlib.Path(ifile)
    ofile = (
        pathlib.Path(ofile) if ofile else ifile.parent / (ifile.stem + " truncated.txt")
    )
    ctx.ensure_object(dict)
    ctx.obj["ifile"] = ifile
    ctx.obj["ofile"] = ofile
    ctx.obj["candeout"] = candeout.CandeOut(ifile.read_text().split("\n"))


@main.command()
@click.argument("steps", nargs=-1, type=int)
@click.pass_context
def steps(ctx, steps=(-1,)):
    cout: candeout.CandeOut = ctx.obj["candeout"]
    ctx.obj["out"] = candeout.remove_steps(cout, steps)


@main.resultcallback()
@click.pass_context
def write_file(ctx, *args, **kwargs):
    ofile: pathlib.Path = ctx.obj["ofile"]
    ofile.write_text("\n".join(ctx.obj["candeout"].lines))
cb2a679f6bf1c9fc3be6603b4c524bc925b318db | 756 | py | Python | conference/decorators.py | ethancarlsson/epcon | 10ae259ad75271651506d44cc5e71cf089349ea3 | [
"BSD-2-Clause"
] | 40 | 2015-03-03T22:14:58.000Z | 2022-02-15T22:27:48.000Z | conference/decorators.py | ethancarlsson/epcon | 10ae259ad75271651506d44cc5e71cf089349ea3 | [
"BSD-2-Clause"
] | 699 | 2015-01-21T10:13:29.000Z | 2022-02-08T09:26:36.000Z | conference/decorators.py | ethancarlsson/epcon | 10ae259ad75271651506d44cc5e71cf089349ea3 | [
"BSD-2-Clause"
] | 96 | 2015-01-22T11:03:13.000Z | 2022-01-31T05:35:34.000Z | import functools
from django.contrib import messages
from django.urls import reverse
from django.shortcuts import redirect
def full_profile_required(func):
@functools.wraps(func)
def wrapper(request, *args, **kwargs):
if (request.user
and request.user.id # FIXME test mocks mess with the above object so we have to check the id
and (not request.user.attendeeprofile or not request.user.attendeeprofile.gender)):
messages.warning(
request,
"Please update your profile to continue using the EuroPython website."
)
return redirect(reverse('user_panel:profile_settings'))
return func(request, *args, **kwargs)
return wrapper
| 31.5 | 109 | 0.661376 | 90 | 756 | 5.511111 | 0.577778 | 0.08871 | 0.068548 | 0.116935 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.268519 | 756 | 23 | 110 | 32.869565 | 0.896926 | 0.092593 | 0 | 0 | 0 | 0 | 0.138889 | 0.039474 | 0 | 0 | 0 | 0.043478 | 0 | 1 | 0.117647 | false | 0 | 0.235294 | 0 | 0.529412 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 |
cb397006da0e1fab13bd9fe5696085b9a41dee97 | 1,365 | py | Python | converter.py | Supercip971/convertisseur-python | 5e8e6c150fcc3c79e4902971ffd6227b32253450 | [
"MIT"
] | null | null | null | converter.py | Supercip971/convertisseur-python | 5e8e6c150fcc3c79e4902971ffd6227b32253450 | [
"MIT"
] | null | null | null | converter.py | Supercip971/convertisseur-python | 5e8e6c150fcc3c79e4902971ffd6227b32253450 | [
"MIT"
] | null | null | null | str_xdigits = [
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
"a",
"b",
"c",
"d",
"e",
"f",
]
def convert_digit(value: int, base: int) -> str:
return str_xdigits[value % base]
def convert_to_val(value: int, base: int) -> str:
if value == None:
return "Error"
current = int(value)
result = ""
while current != 0:
result = result + convert_digit(current, base)
current = current // base
if len(result) == 0:
return "0"
return result[::-1] # reverse string
def val_to_hex(value: int) -> str:
return "0x" + convert_to_val(value, 16)
def val_to_bin(value: int) -> str:
return "0b" + convert_to_val(value, 2)
def val_to_dec(value: int) -> str:
return convert_to_val(value, 10)
def val_from_str(value: str, base: int) -> int:
value = value.lower()
result = 0
for c in value:
if c not in str_xdigits or int(str_xdigits.index(c)) >= base:
return None
result = result * base + str_xdigits.index(c)
return result
def val_from_hex(value: str) -> int:
return val_from_str(value.removeprefix("0x"), 16)
def val_from_bin(value: str) -> int:
return val_from_str(value.removeprefix("0b"), 2)
def val_from_dec(value: str) -> int:
return val_from_str(value, 10)
| 17.727273 | 69 | 0.570696 | 198 | 1,365 | 3.757576 | 0.257576 | 0.056452 | 0.064516 | 0.091398 | 0.209677 | 0.16129 | 0.16129 | 0.16129 | 0.11828 | 0 | 0 | 0.029532 | 0.280586 | 1,365 | 76 | 70 | 17.960526 | 0.728106 | 0.010256 | 0 | 0 | 0 | 0 | 0.022239 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.176471 | false | 0 | 0 | 0.137255 | 0.411765 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 |
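The hand-rolled base conversion above can be cross-checked against Python's built-in `format()` and `int()`, which implement the same positional-notation algorithm. A standalone sketch (independent of the module, so it runs on its own):

```python
# Minimal re-implementation of the digit loop for comparison with built-ins.
def to_base(value, base):
    digits = "0123456789abcdef"
    if value == 0:
        return "0"
    out = ""
    while value:
        out = digits[value % base] + out  # prepend least-significant digit
        value //= base
    return out

# format() / int() perform the same conversions in the standard library.
assert to_base(255, 16) == format(255, "x")  # 'ff'
assert to_base(10, 2) == format(10, "b")     # '1010'
assert int("ff", 16) == 255
```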

# backend/test/test_api.py — repo: solevis/pixyship2 (MIT license)
from pixelstarshipsapi import PixelStarshipsApi
from run import push_context


def test_login():
    pixel_starships_api = PixelStarshipsApi()
    device_key, device_checksum = pixel_starships_api.generate_device()
    token = pixel_starships_api.get_device_token(device_key, device_checksum)

    assert isinstance(token, str)
    assert len(token) == 36


def test_settings():
    pixel_starships_api = PixelStarshipsApi()
    settings = pixel_starships_api.get_api_settings()

    assert 'ProductionServer' in settings
    assert 'MaintenanceMessage' in settings


def test_inspect_ship():
    # avoid Flask RuntimeError: No application found
    push_context()

    pixel_starships_api = PixelStarshipsApi()
    user_id = 6635604  # Solevis
    inspect_ship = pixel_starships_api.inspect_ship(user_id)

    # Player
    user = inspect_ship['User']
    assert 'Id' in user
    assert 'Name' in user
    assert 'IconSpriteId' in user
    assert 'AllianceName' in user
    assert 'AllianceSpriteId' in user
    assert 'Trophy' in user
    assert 'LastAlertDate' in user

    # Ship
    ship = inspect_ship['Ship']
    assert 'ShipDesignId' in ship
    assert 'ImmunityDate' in ship
    assert 'ShipStatus' in ship
    assert 'OriginalRaceId' in ship

    # Room
    room = inspect_ship['Ship']['Rooms'][0]
    assert 'RoomId' in room
    assert 'Row' in room
    assert 'Column' in room
    assert 'ConstructionStartDate' in room


def test_dailies():
    pixel_starships_api = PixelStarshipsApi()
    dailies = pixel_starships_api.get_dailies()
    assert len(dailies) > 0

    # Shop
    assert 'LimitedCatalogCurrencyAmount' in dailies
    assert 'LimitedCatalogType' in dailies
    assert 'LimitedCatalogArgument' in dailies
    assert 'LimitedCatalogCurrencyType' in dailies
    assert 'LimitedCatalogQuantity' in dailies
    assert 'LimitedCatalogMaxTotal' in dailies
    assert 'LimitedCatalogExpiryDate' in dailies

    # Blue cargo
    assert 'CommonCrewId' in dailies
    assert 'HeroCrewId' in dailies

    # Green cargo
    assert 'CargoItems' in dailies
    assert 'CargoPrices' in dailies

    # Daily reward
    assert 'DailyRewardArgument' in dailies
    assert 'DailyRewardType' in dailies
    assert 'DailyItemRewards' in dailies

    # Sale
    assert 'SaleType' in dailies
    assert 'SaleArgument' in dailies
    assert 'SaleItemMask' in dailies

    # News messages
    assert 'News' in dailies
    assert 'NewsUpdateDate' in dailies
    assert 'TournamentNews' in dailies
    assert 'NewsSpriteId' in dailies


def test_sprites():
    pixel_starships_api = PixelStarshipsApi()
    sprites = pixel_starships_api.get_sprites()
    assert len(sprites) > 0

    sprite = sprites[0]
    assert 'SpriteId' in sprite
    assert 'ImageFileId' in sprite
    assert 'X' in sprite
    assert 'Y' in sprite
    assert 'Width' in sprite
    assert 'Height' in sprite
    assert 'SpriteKey' in sprite


def test_ships():
    pixel_starships_api = PixelStarshipsApi()
    ships = pixel_starships_api.get_ships()
    assert len(ships) > 0

    ship = ships[0]
    assert 'ShipDesignName' in ship
    assert 'ShipDescription' in ship
    assert 'ShipLevel' in ship
    assert 'Hp' in ship
    assert 'RepairTime' in ship
    assert 'InteriorSpriteId' in ship
    assert 'ExteriorSpriteId' in ship
    assert 'LogoSpriteId' in ship
    assert 'MiniShipSpriteId' in ship
    assert 'RoomFrameSpriteId' in ship
    assert 'DoorFrameLeftSpriteId' in ship
    assert 'DoorFrameRightSpriteId' in ship
    assert 'Rows' in ship
    assert 'Columns' in ship
    assert 'RaceId' in ship
    assert 'Mask' in ship
    assert 'MineralCost' in ship
    assert 'StarbuxCost' in ship
    assert 'MineralCapacity' in ship
    assert 'GasCapacity' in ship
    assert 'EquipmentCapacity' in ship
    assert 'ShipType' in ship


def test_researches():
    pixel_starships_api = PixelStarshipsApi()
    researches = pixel_starships_api.get_researches()
    assert len(researches) > 0

    research = researches[0]
    assert 'ResearchName' in research
    assert 'ResearchDescription' in research
    assert 'GasCost' in research
    assert 'StarbuxCost' in research
    assert 'RequiredLabLevel' in research
    assert 'ResearchTime' in research
    assert 'LogoSpriteId' in research
    assert 'ImageSpriteId' in research
    assert 'RequiredResearchDesignId' in research
    assert 'ResearchDesignType' in research


def test_rooms():
    pixel_starships_api = PixelStarshipsApi()
    rooms = pixel_starships_api.get_rooms()
    assert len(rooms) > 0

    room = rooms[0]
    assert 'RoomName' in room
    assert 'RoomShortName' in room
    assert 'RoomType' in room
    assert 'Level' in room
    assert 'Capacity' in room
    assert 'Rows' in room
    assert 'Columns' in room
    assert 'ImageSpriteId' in room
    assert 'ConstructionSpriteId' in room
    assert 'MaxSystemPower' in room
    assert 'MaxPowerGenerated' in room
    assert 'MinShipLevel' in room
    assert 'UpgradeFromRoomDesignId' in room
    assert 'DefaultDefenceBonus' in room
    assert 'ReloadTime' in room
    assert 'RefillUnitCost' in room
    assert 'PriceString' in room
    assert 'ConstructionTime' in room
    assert 'RoomDescription' in room
    assert 'ManufactureType' in room

    room_with_missile_design = None
    for room in rooms:
        if room['MissileDesign']:
            room_with_missile_design = room
            break

    assert room_with_missile_design
    assert 'SystemDamage' in room_with_missile_design['MissileDesign']
    assert 'HullDamage' in room_with_missile_design['MissileDesign']
    assert 'CharacterDamage' in room_with_missile_design['MissileDesign']

    room_with_purchase = None
    for room in rooms:
        if room['AvailabilityMask']:
            room_with_purchase = room
            break

    assert room_with_purchase
    assert 'AvailabilityMask' in room_with_purchase


def test_characters():
    pixel_starships_api = PixelStarshipsApi()
    characters = pixel_starships_api.get_characters()
    assert len(characters) > 0

    character = characters[0]
    assert 'CharacterDesignName' in character
    assert 'ProfileSpriteId' in character
    assert 'Rarity' in character
    assert 'Hp' in character
    assert 'FinalHp' in character
    assert 'Pilot' in character
    assert 'FinalPilot' in character
    assert 'Attack' in character
    assert 'FinalAttack' in character
    assert 'Repair' in character
    assert 'FinalRepair' in character
    assert 'Weapon' in character
    assert 'FinalWeapon' in character
    assert 'Engine' in character
    assert 'FinalEngine' in character
    assert 'Research' in character
    assert 'FinalResearch' in character
    assert 'Science' in character
    assert 'FinalScience' in character
    assert 'SpecialAbilityArgument' in character
    assert 'SpecialAbilityFinalArgument' in character
    assert 'SpecialAbilityType' in character
    assert 'FireResistance' in character
    assert 'WalkingSpeed' in character
    assert 'RunSpeed' in character
    assert 'TrainingCapacity' in character
    assert 'ProgressionType' in character
    assert 'CollectionDesignId' in character
    assert 'EquipmentMask' in character

    parts = character['CharacterParts']
    assert 'StandardSpriteId' in parts['Head']
    assert 'StandardSpriteId' in parts['Body']
    assert 'StandardSpriteId' in parts['Leg']


def test_collections():
    pixel_starships_api = PixelStarshipsApi()
    collections = pixel_starships_api.get_collections()
    assert len(collections) > 0

    collection = collections[0]
    assert 'CollectionName' in collection
    assert 'MinCombo' in collection
    assert 'MaxCombo' in collection
    assert 'BaseEnhancementValue' in collection
    assert 'SpriteId' in collection
    assert 'StepEnhancementValue' in collection
    assert 'IconSpriteId' in collection


def test_items():
    pixel_starships_api = PixelStarshipsApi()
    items = pixel_starships_api.get_items()
    assert len(items) > 0

    item = items[0]
    assert 'ItemDesignName' in item
    assert 'ItemDesignDescription' in item
    assert 'ImageSpriteId' in item
    assert 'ItemSubType' in item
    assert 'EnhancementType' in item
    assert 'Ingredients' in item
    assert 'Content' in item
    assert 'MarketPrice' in item
    assert 'FairPrice' in item
    assert 'ItemDesignId' in item
    assert 'ItemType' in item
    assert 'Rarity' in item
    assert 'EnhancementValue' in item


def test_alliances():
    pixel_starships_api = PixelStarshipsApi()
    alliances = pixel_starships_api.get_alliances(42)
    assert len(alliances) == 42

    alliance = alliances[0]
    assert 'AllianceId' in alliance
    assert 'AllianceName' in alliance


def test_sales():
    pixel_starships_api = PixelStarshipsApi()
    sales = pixel_starships_api.get_sales(131, 0, 1)  # Scratchy
    assert len(sales) == 1

    sale = sales[0]
    assert 'SaleId' in sale
    assert 'StatusDate' in sale
    assert 'Quantity' in sale
    assert 'CurrencyType' in sale
    assert 'CurrencyValue' in sale
    assert 'BuyerShipId' in sale
    assert 'BuyerShipName' in sale
    assert 'SellerShipId' in sale
    assert 'SellerShipName' in sale
    assert 'ItemId' in sale


def test_users():
    # avoid Flask RuntimeError: No application found
    push_context()

    pixel_starships_api = PixelStarshipsApi()
    users = pixel_starships_api.get_users()  # top 100
    assert len(users) == 100

    user = users[0]
    assert 'Id' in user
    assert 'Name' in user
    assert 'Trophy' in user
    assert 'AllianceId' in user
    assert 'LastLoginDate' in user
    assert 'AllianceName' in user
    assert 'AllianceSpriteId' in user


def test_alliance_users():
    # avoid Flask RuntimeError: No application found
    push_context()

    pixel_starships_api = PixelStarshipsApi()
    alliance_id = 9343  # Trek Federation
    users = pixel_starships_api.get_alliance_users(alliance_id)
    assert len(users) > 0

    user = users[0]
    assert 'Id' in user
    assert 'Name' in user
    assert 'Trophy' in user
    assert 'AllianceId' in user
    assert 'LastLoginDate' in user
    assert 'AllianceName' in user
    assert 'AllianceSpriteId' in user


def test_prestiges_character_to():
    pixel_starships_api = PixelStarshipsApi()
    character_id = 196  # PinkZilla
    prestiges = pixel_starships_api.get_prestiges_character_to(character_id)
    assert len(prestiges) > 0

    prestige = prestiges[0]
    assert 'CharacterDesignId1' in prestige
    assert 'CharacterDesignId2' in prestige


def test_prestiges_character_from():
    pixel_starships_api = PixelStarshipsApi()
    character_id = 338  # Zongzi-Man
    prestiges = pixel_starships_api.get_prestiges_character_from(character_id)
    assert len(prestiges) > 0

    prestige = prestiges[0]
    assert 'CharacterDesignId1' in prestige
    assert 'CharacterDesignId2' in prestige


def test_rooms_purchase():
    pixel_starships_api = PixelStarshipsApi()
    rooms_purchase = pixel_starships_api.get_rooms_purchase()
    assert len(rooms_purchase) > 0

    room_purchase = rooms_purchase[0]
    assert 'RoomDesignId' in room_purchase
    assert 'AvailabilityMask' in room_purchase


def test_search_users():
    # avoid Flask RuntimeError: No application found
    push_context()

    pixel_starships_api = PixelStarshipsApi()
    user_name_to_search = 'Solevis'
    users = pixel_starships_api.search_users(user_name_to_search, True)
    assert len(users) == 1

    user = users[0]
    assert 'Name' in user
    assert user['Name'] == user_name_to_search
    assert 'PVPAttackWins' in user
    assert 'PVPAttackLosses' in user
    assert 'PVPAttackDraws' in user
    assert 'PVPDefenceDraws' in user
    assert 'PVPDefenceWins' in user
    assert 'PVPDefenceLosses' in user
    assert 'HighestTrophy' in user
    assert 'CrewDonated' in user
    assert 'CrewReceived' in user
    assert 'AllianceJoinDate' in user
    assert 'CreationDate' in user


def test_trainings():
    # avoid Flask RuntimeError: No application found
    push_context()

    pixel_starships_api = PixelStarshipsApi()
    trainings = pixel_starships_api.get_trainings()
    assert len(trainings) > 0

    training = trainings[0]
    assert 'TrainingDesignId' in training
    assert 'TrainingSpriteId' in training
    assert 'HpChance' in training
    assert 'AttackChance' in training
    assert 'PilotChance' in training
    assert 'RepairChance' in training
    assert 'WeaponChance' in training
    assert 'ScienceChance' in training
    assert 'EngineChance' in training
    assert 'StaminaChance' in training
    assert 'AbilityChance' in training
    assert 'XpChance' in training
    assert 'Fatigue' in training
    assert 'MinimumGuarantee' in training


def test_achievements():
    # avoid Flask RuntimeError: No application found
    push_context()

    pixel_starships_api = PixelStarshipsApi()
    achievements = pixel_starships_api.get_achievements()
    assert len(achievements) > 0

    achievement = achievements[0]
    assert 'AchievementDesignId' in achievement
    assert 'AchievementTitle' in achievement
    assert 'AchievementDescription' in achievement
    assert 'SpriteId' in achievement
    assert 'RewardString' in achievement
    assert 'ParentAchievementDesignId' in achievement


def test_situations():
    # avoid Flask RuntimeError: No application found
    push_context()

    pixel_starships_api = PixelStarshipsApi()
    situations = pixel_starships_api.get_situations()
    assert len(situations) > 0

    situation = situations[0]
    assert 'SituationDesignId' in situation
    assert 'SituationName' in situation
    assert 'SituationDescription' in situation
    assert 'FromDate' in situation
    assert 'EndDate' in situation
    assert 'IconSpriteId' in situation

# yamlfred/alfred_object.py — repo: uchida/yamlfred (CC0-1.0 license)
# -*- coding: utf-8 -*-
from __future__ import unicode_literals, print_function
import os.path
import uuid
from yamlfred.utils import remove_default, merge_dicts
from yamlfred.utils import Include
defaults = {
    'alfred.workflow.output.notification': {
        'config': {'removeextension': False, 'output': 0, 'lastpathcomponent': False, 'onlyshowifquerypopulated': False, 'sticky': False},
        'version': 0,
    },
    'alfred.workflow.trigger.hotkey': {
        'config': {'leftcursor': False, 'argument': 0, 'relatedAppsMode': 0, 'action': 0, 'hotkey': 0, 'hotstring': '', 'hotmod': 0, 'modsmode': 0},
        'version': 1,
    },
    'alfred.workflow.action.openfile': {
        'config': {},
        'version': 1,
    },
    'alfred.workflow.input.keyword': {
        'config': {'argumenttype': 0, 'withspace': True},
        'version': 0,
    },
    'alfred.workflow.trigger.external': {
        'config': {},
        'version': 0,
    },
    'alfred.workflow.output.largetype': {
        'version': 0,
    },
    'alfred.workflow.action.revealfile': {
        'version': 0,
    },
    'alfred.workflow.input.filefilter': {
        'config': {'scopes': [], 'includesystem': False, 'withspace': True, 'anchorfields': True, 'daterange': 0, 'types': []},
        'version': 0,
    },
    'alfred.workflow.input.scriptfilter': {
        'config': {'withspace': True, 'escaping': 102, 'script': '', 'argumenttype': 0, 'type': 0,
                   'queuedelaycustom': 3, 'queuedelayimmediatelyinitially': True, 'queuedelaymode': 0, 'queuemode': 1},
        'version': 0,
    },
    'alfred.workflow.action.browseinalfred': {
        'config': {},
        'version': 0,
    },
    'alfred.workflow.trigger.action': {
        'config': {'filetypes': [], 'acceptsmulti': False},
        'version': 0,
    },
    'alfred.workflow.output.clipboard': {
        'config': {'clipboardtext': '', 'autopaste': False},
        'version': 0,
    },
    'alfred.workflow.output.script': {
        'config': {'escaping': 102, 'type': 0, 'script': '', 'concurrently': False},
        'version': 0,
    },
    'alfred.workflow.action.launchfiles': {
        'config': {'paths': [], 'toggle': False},
        'version': 0,
    },
    'alfred.workflow.trigger.contact': {
        'config': {},
        'version': 0,
    },
    'alfred.workflow.action.systemwebsearch': {
        'config': {},
        'version': 0,
    },
    'alfred.workflow.trigger.fallback': {
        'config': {},
        'version': 0,
    },
    'alfred.workflow.action.openurl': {
        'config': {'utf8': True, 'plusspaces': False},
        'version': 0,
    },
    'alfred.workflow.action.systemcommand': {
        'config': {'command': 0, 'confirm': False},
        'version': 1,
    },
    'alfred.workflow.action.itunescommand': {
        'config': {'command': 0},
        'version': 0,
    },
    'alfred.workflow.action.script': {
        'config': {'escaping': 102, 'type': 0, 'script': '', 'concurrently': False},
        'version': 0,
    },
    'alfred.workflow.action.applescript': {
        'config': {'cachescript': False, 'applescript': ''},
        'version': 0,
    },
    'alfred.workflow.action.terminalcommand': {
        'config': {'escaping': 0},
        'version': 0,
    },
    'alfred.workflow.trigger.remote': {
        'config': {'argumenttype': 0, 'workflowonly': False},
        'version': 0,
    },
}
class AlfredObject(object):
    def __init__(self, dic):
        self.type = dic['type']
        default = defaults[self.type] if self.type in defaults else {}
        self.prop = merge_dicts(default, dic)
        if 'uid' not in self.prop:
            self.prop['uid'] = uuid.uuid4()
        self.script_type = None
        if self.type == 'alfred.workflow.action.applescript':
            self.script_type = 'applescript'
        elif self.type in ['alfred.workflow.input.scriptfilter',
                           'alfred.workflow.output.script',
                           'alfred.workflow.action.script']:
            self.script_type = 'script'

    def dump(self, script_dir='.'):
        default = defaults[self.type] if self.type in defaults else {}
        prop = remove_default(self.prop, default)
        if self.script_type:
            path = os.path.join(script_dir, self.prop['uid'])
            with open(path, 'w') as f:
                script = self.prop['config'].get(self.script_type)
                f.write(script)
            prop['config'][self.script_type] = Include(path)
        return prop
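`AlfredObject` leans on `merge_dicts` from `yamlfred.utils`, which this excerpt does not include. A plausible minimal implementation (an assumption about the helper's behavior, not the library's actual code) merges nested dicts recursively with the caller's values winning, which is what overlaying a user-supplied dict onto the per-type defaults above requires:

```python
def merge_dicts(base, override):
    """Recursively merge two dicts; values from `override` win.

    Hypothetical stand-in for yamlfred.utils.merge_dicts, which is
    imported above but not shown in this excerpt.
    """
    result = dict(base)
    for key, value in override.items():
        if key in result and isinstance(result[key], dict) and isinstance(value, dict):
            result[key] = merge_dicts(result[key], value)
        else:
            result[key] = value
    return result
```

With this behavior, a user dict only needs to name the config keys it wants to change; everything else falls through from `defaults`.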
# === bflib/sizes.py (repo: ChrisLR/BasicDungeonRL, license: MIT) ===
from enum import Enum
class Size(Enum):
    VerySmall = "Very Small"
    Small = "Small"
    Medium = "Medium"
    Large = "Large"
    Huge = "Huge"


feet_map = {
    Size.VerySmall: 1,
    Size.Small: 3,
    Size.Medium: 5,
    Size.Large: 10,
    Size.Huge: 20,
}


def size_in_feet(size):
    return feet_map.get(size, 0)
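For reference, the lookup above behaves like this (the definitions are repeated so the snippet stands alone):

```python
from enum import Enum

class Size(Enum):
    VerySmall = "Very Small"
    Small = "Small"
    Medium = "Medium"
    Large = "Large"
    Huge = "Huge"

feet_map = {Size.VerySmall: 1, Size.Small: 3, Size.Medium: 5,
            Size.Large: 10, Size.Huge: 20}

def size_in_feet(size):
    # Unknown or non-Size values fall back to 0 via dict.get's default.
    return feet_map.get(size, 0)

print(size_in_feet(Size.Medium))  # 5
```

The `dict.get(size, 0)` default means callers never have to guard against an unmapped size.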
# === SAplatform/SAcore/migrations/0009_auto_20190529_2231.py (repo: ThankPan/SA_Backend, license: MIT) ===
# Generated by Django 2.0.6 on 2019-05-29 14:31
from django.db import migrations, models


class Migration(migrations.Migration):

    dependencies = [
        ('SAcore', '0008_user_avator'),
    ]

    operations = [
        migrations.AlterField(
            model_name='user',
            name='avator',
            field=models.CharField(default='author_avator/default.jpg', max_length=255),
        ),
    ]
# === niaaml_gui/progress_bar.py (repo: zStupan/NiaAML-GUI, license: MIT) ===
from PyQt5 import QtCore
from PyQt5.QtWidgets import QProgressBar


class ProgressBar(QProgressBar):
    def __init__(self):
        super(ProgressBar, self).__init__()
        self.setTextVisible(True)
        self.setMaximum(100)
        self.setAlignment(QtCore.Qt.AlignCenter)
# === bindings/java/gen_jni.py (repo: protocols-comnet/openwebrtc, license: BSD-2-Clause) ===
#!/usr/bin/env python -B
# Copyright (c) 2014, Ericsson AB. All rights reserved.
#
# Redistribution and use in source and binary forms, with or without modification,
# are permitted provided that the following conditions are met:
#
# 1. Redistributions of source code must retain the above copyright notice, this
# list of conditions and the following disclaimer.
#
# 2. Redistributions in binary form must reproduce the above copyright notice, this
# list of conditions and the following disclaimer in the documentation and/or other
# materials provided with the distribution.
#
# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
# ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
# WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
# IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
# INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
# NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
# PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
# WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
# ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
# OF SUCH DAMAGE.
import xml.etree.ElementTree as ET
from gir_parser import parse_gir_file
import re
import os
import sys
import errno
import copy
import argparse
###### ####### ## ## ###### ########
## ## ## ## ### ## ## ## ##
## ## ## #### ## ## ##
## ## ## ## ## ## ###### ##
## ## ## ## #### ## ##
## ## ## ## ## ### ## ## ##
###### ####### ## ## ###### ##
parser = argparse.ArgumentParser()
parser.add_argument('--gir', dest = 'gir', metavar = 'FILE', help = '.gir file')
parser.add_argument('--c-out', dest = 'c_dir', metavar = 'DIR', help = '.c output directory')
parser.add_argument('--j-out', dest = 'j_dir', metavar = 'DIR', help = '.java base output directory')
args = parser.parse_args()
if args.gir:
    print 'reading from gir file "{}"'.format(args.gir)
else:
    print 'missing gir input file (--gir)'

if args.c_dir:
    print 'generating C source to "{}"'.format(args.c_dir)
else:
    print 'missing C output directory (--c-out)'

if args.j_dir:
    print 'generating Java source to "{}"'.format(args.j_dir)
else:
    print 'missing Java output directory (--j-out)'

if not all(args.__dict__.values()):
    print "all arguments must be set"
    sys.exit(-1)
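`parse_args` accepts an explicit argv list, so the flag handling above can be exercised in isolation (the script itself is Python 2; this sketch is version-neutral, and the `Owr.gir`/`jni`/`java` values are only illustrative):

```python
import argparse

# Mirror of the script's CLI, so the flag wiring can be checked standalone.
parser = argparse.ArgumentParser()
parser.add_argument('--gir', dest='gir', metavar='FILE', help='.gir file')
parser.add_argument('--c-out', dest='c_dir', metavar='DIR', help='.c output directory')
parser.add_argument('--j-out', dest='j_dir', metavar='DIR', help='.java base output directory')

args = parser.parse_args(['--gir', 'Owr.gir', '--c-out', 'jni', '--j-out', 'java'])
```

With all three flags supplied, `all(args.__dict__.values())` is true and the generator proceeds; a missing flag leaves its `dest` as `None` and trips the exit path above.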
OUT_FILE = "owr_jni.c"
PACKAGE_ROOT = 'com.ericsson.research'
C_HEAD = """ \
/*
* Copyright (c) 2014, Ericsson AB. All rights reserved.
*
* Redistribution and use in source and binary forms, with or without modification,
* are permitted provided that the following conditions are met:
*
* 1. Redistributions of source code must retain the above copyright notice, this
* list of conditions and the following disclaimer.
*
* 2. Redistributions in binary form must reproduce the above copyright notice, this
* list of conditions and the following disclaimer in the documentation and/or other
* materials provided with the distribution.
*
* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
* ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
* WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
* IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT,
* INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
* NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
* PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY,
* WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
* ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
* OF SUCH DAMAGE.
*/
#include <android/log.h>
#include <android/native_window_jni.h>
#include <jni.h>
#include <owr.h>
#include <owr_audio_payload.h>
#include <owr_audio_renderer.h>
#include <owr_candidate.h>
#include <owr_image_renderer.h>
#include <owr_image_server.h>
#include <owr_local.h>
#include <owr_local_media_source.h>
#include <owr_media_renderer.h>
#include <owr_media_session.h>
#include <owr_media_source.h>
#include <owr_payload.h>
#include <owr_remote_media_source.h>
#include <owr_session.h>
#include <owr_transport_agent.h>
#include <owr_types.h>
#include <owr_video_payload.h>
#include <owr_video_renderer.h>
#include <owr_window_registry.h>
#define android_assert(st) if (!(st)) { __android_log_write(ANDROID_LOG_ERROR, "OpenWebRTC", "Assertion failed at "G_STRINGIFY(__LINE__));}
#undef g_assert
#define g_assert android_assert
#define log_verbose(st, ...) __android_log_print(ANDROID_LOG_VERBOSE, "OpenWebRTC", "["G_STRINGIFY(__LINE__)"]: "st, ##__VA_ARGS__);
#define log_debug(st, ...) __android_log_print(ANDROID_LOG_DEBUG, "OpenWebRTC", "["G_STRINGIFY(__LINE__)"]: "st, ##__VA_ARGS__);
#define log_info(st, ...) __android_log_print(ANDROID_LOG_INFO, "OpenWebRTC", "["G_STRINGIFY(__LINE__)"]: "st, ##__VA_ARGS__);
#define log_warning(st, ...) __android_log_print(ANDROID_LOG_WARN, "OpenWebRTC", "["G_STRINGIFY(__LINE__)"]: "st, ##__VA_ARGS__);
#define log_error(st, ...) __android_log_print(ANDROID_LOG_ERROR, "OpenWebRTC", "["G_STRINGIFY(__LINE__)"]: "st, ##__VA_ARGS__);
static GHashTable* class_cache_table;
"""
TYPE_TABLE_GIR_TO_JAVA = {
    'none': 'void',
    'utf8': 'java.lang.String',
    'gchar': 'char',
    'guchar': 'char',
    'gint': 'int',
    'guint': 'int',
    'gint64': 'long',
    'guint64': 'long',
    'gboolean': 'boolean',
    'gdouble': 'double',
    'guintptr': 'long',
    'gpointer': 'long',
    'GLib.List': 'java.util.List<>',
    'GLib.HashTable': 'java.util.Map<>'
}

TYPE_TABLE_GIR_TO_JNI = {
    'none': 'void',
    'utf8': 'jstring',
    'gchar': 'jbyte',
    'guchar': 'jbyte',
    'gint': 'jint',
    'guint': 'jint',
    'gint64': 'jlong',
    'guint64': 'jlong',
    'gboolean': 'jboolean',
    'gfloat': 'jfloat',
    'gdouble': 'jdouble',
    'guintptr': 'jlong',
    'gpointer': 'jlong',
    'GLib.List': 'jobject',
    'GLib.HashTable': 'jobject'
}

JAVA_TYPE_SIGNATURES = {
    'void': 'V',
    'java.lang.String': 'Ljava/lang/String;',
    'java.lang.Object': 'Ljava/lang/Object;',
    'boolean': 'Z',
    'byte': 'B',
    'char': 'C',
    'short': 'S',
    'int': 'I',
    'long': 'J',
    'float': 'F',
    'double': 'D',
    'java.util.List<>': 'Ljava/util/List;',
    'java.util.Map<>': 'Ljava/util/Map;',
    'java.util.ArrayList<>': 'Ljava/util/ArrayList;',
    'java.util.HashMap<>': 'Ljava/util/HashMap;'
}
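The tables above drive JNI method descriptors of the form `(args)ret`. The mapping can be sketched independently of the writer classes below (the fallback to `L<package>/<Name>;` mirrors `CWriter.str_java_type`; the `Session` class name is only illustrative):

```python
JAVA_TYPE_SIGNATURES = {
    'void': 'V', 'boolean': 'Z', 'int': 'I', 'long': 'J',
    'java.lang.String': 'Ljava/lang/String;',
}

def java_type_signature(name, slash_package='com/ericsson/research/owr'):
    # Primitives and known classes come from the table; anything else is
    # treated as a generated class in the bindings package.
    sig = JAVA_TYPE_SIGNATURES.get(name)
    if sig is None:
        return 'L{}/{};'.format(slash_package, name)
    return sig

def method_signature(param_types, return_type):
    args = ''.join(java_type_signature(t) for t in param_types)
    return '({}){}'.format(args, java_type_signature(return_type))
```

For example, a `void f(String, int)` method yields `(Ljava/lang/String;I)V`, which is exactly the string `GetMethodID` expects.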
quot = '"{}"'.format
# boolean jboolean unsigned 8 bits
# byte jbyte signed 8 bits
# char jchar unsigned 16 bits
# short jshort signed 16 bits
# int jint signed 32 bits
# long jlong signed 64 bits
# float jfloat 32 bits
# double jdouble 64 bits
# void void N/A
## ## ######## #### ######## ######## ########
## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ##
## ## ## ######## ## ## ###### ########
## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ##
### ### ## ## #### ## ######## ## ##
class Writer:
    indentation_size = 4

    def __init__(self, path, filename):
        self.indentation = 0
        self.indented = True
        try:
            os.makedirs(path)
        except OSError as e:
            if e.errno == errno.EEXIST and os.path.isdir(path):
                pass
            else:
                raise
        self.file = open(path + os.sep + filename, 'w')

    def out(self, st):
        self.file.write(st)

    def outln(self, st):
        self.out(st + '\n')

    def indent(self):
        if not self.indented:
            self.out(' ' * self.indentation_size * self.indentation)
            self.indented = True

    def push(self):
        if not self.indented:
            self.indent()
        else:
            self.out(' ')
        self.out('{')
        self.line()
        self.indentation += 1

    def pop(self, newlines = 1):
        self.indentation -= 1
        if self.indentation < 0:
            self.file.close()
        else:
            self.indent()
            self.out('}')
            self.out('\n' * newlines)
            if (newlines > 0):
                self.indented = False

    def line(self, st = None, push = False):
        if st is not None:
            self.indent()
            self.out(st)
        if push:
            self.push()
        else:
            self.out('\n')
            self.indented = False

    def state(self, st, push = False):
        self.indent()
        self.out(st)
        if push:
            self.push()
        else:
            self.out(';')
            self.line()

    def assignment(self):
        self.out(' = ')

    def par(self, st):
        self.out('({})'.format(st))

    def semi(self):
        self.out(';')
        self.line()

    def ret(self, st = None):
        self.indent()
        if st is not None:
            self.out('return ' + st)
            self.semi()
            self.pop()
        else:
            self.out('return ')

    def lval(self, name):
        self.previous_lval = name
        self.indent()
        self.out(name)
        self.assignment()

    def rval(self, st):
        self.out(st)
        self.semi()

    def cast(self, name):
        self.indent()
        self.par(name)
        self.out(' ')

    def comment(self, comment):
        self.outln('/* {} */'.format(comment))

    def call(self, name, *arguments):
        self.indent()
        self.out(name)
        self.par(', '.join(arguments))
        self.semi()

    def declare(self, typename, name):
        self.indent()
        self.out(typename)
        self.out(' ')
        self.out(name)
        self.semi()

    def case(self, st):
        self.indentation -= 1
        self.line('case {}:'.format(st))
        self.indentation += 1
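The push/pop pair brackets a C block in braces while tracking the indentation level. A stripped-down, string-backed sketch of the same idea (not the Writer class itself, which targets a file on disk):

```python
import io

class BlockWriter(object):
    """String-backed sketch of the push/pop indentation scheme above.

    Captures only the brace/indentation bookkeeping so the idea can be
    seen in isolation.
    """
    def __init__(self, indent_size=4):
        self.buf = io.StringIO()
        self.level = 0
        self.indent_size = indent_size

    def line(self, text):
        self.buf.write(' ' * self.indent_size * self.level + text + '\n')

    def push(self, header):
        self.line(header + ' {')  # open a brace block, then indent
        self.level += 1

    def pop(self):
        self.level -= 1
        self.line('}')

w = BlockWriter()
w.push('void f(void)')
w.line('return;')
w.pop()
print(w.buf.getvalue())
```

Nested `push` calls simply stack, which is how the generator emits nested `if`/`for` blocks in the output C.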
## ## ## ######## #### ######## ######## ########
## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ######## ## ## ###### ########
## ## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ## ## ##
###### ### ### ## ## #### ## ######## ## ##
class JavaWriter(Writer):
    def __init__(self, fqcn):
        split = fqcn.split('.')
        self.class_name = split[-1]
        self.packages = split[0:-1]
        self.package_name = '.'.join(split[0:-1])
        path = os.sep.join([args.j_dir] + split[0:-1])
        Writer.__init__(self, path, self.class_name + '.java')
        self.state('package ' + self.package_name)
        self.line()

    def class_declaration(self,
                          typename = 'class',
                          extends = None,
                          interfaces = [],
                          name = None,
                          visibility = 'public',
                          static = False
                          ):
        self.class_type = typename
        self.indent()
        self.out(visibility)
        if static:
            self.out(' static')
        self.out(' ' + typename + ' ' + (name or self.class_name))
        if extends is not None:
            self.out(' extends ' + extends)
        if interfaces != []:
            self.out(' implements ' + ', '.join(interfaces))
        self.push()

    def constructor(self,
                    parameters = [],
                    visibility = 'public'
                    ):
        self.indent()
        if visibility is not None:
            self.out(visibility + ' ')
        self.out(self.class_name)
        self.parameters(parameters)
        self.push()

    def method(self,
               obj = None,
               name = None,
               types = None,
               parameters = None,
               visibility = 'public',
               native = True,
               abstract = False,
               static = False
               ):
        name = name or obj and obj['camel_name']
        types = types or obj and obj['types'] or dict(java = 'void')
        parameters = parameters or obj and obj.get('parameters') or []
        self.indent()
        if visibility is not None:
            self.out(visibility + ' ')
        if static:
            self.out('static ')
        if native:
            self.out('native ')
        if abstract:
            self.out('abstract ')
        self.out(self.typename(types) + ' ')
        self.out(name)
        self.parameters(parameters)
        if self.class_type == 'interface':
            self.out(';')
            self.line()
            return
        if not native and not abstract:
            self.push()
        else:
            self.out(';')
            self.line()
    def typename(self, types):
        if types['java'][-2:] == '<>':
            return '%s<%s>' % (types['java'][:-2], ', '.join(map(self.typename, types['inner'])))
        elif types['java'] == '[]':
            return types['inner'][0]['java'] + '[]'
        else:
            return types['java']
    def parameter(self, parameter):
        return self.typename(parameter['types']) + ' ' + parameter['camel_name']

    def filter_user_data(self, parameters):
        return [p for p in parameters if p.get('c_name') != 'user_data']

    def argument(self, arg):
        return arg['camel_name']

    def parameters(self, parameters = []):
        self.out('(' + ', '.join(map(self.parameter, self.filter_user_data(parameters))) + ')')

    def arguments(self, arguments = []):
        self.out('(' + ', '.join(map(self.argument, self.filter_user_data(arguments))) + ')')
###### ## ## ######## #### ######## ######## ########
## ## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ######## ## ## ###### ########
## ## ## ## ## ## ## ## ## ## ##
## ## ## ## ## ## ## ## ## ## ## ##
###### ### ### ## ## #### ## ######## ## ##
class CWriter(Writer):
    indentation_size = 4

    def __init__(self, namespace, root):
        Writer.__init__(self, args.c_dir, OUT_FILE)
        self.class_name = ''
        self.package = "{}.{}".format(root, namespace['symbol_prefix'])
        self.underscore_package = self.package.replace('.', '_')
        self.slash_package = self.package.replace('.', '/')
        self.enums = namespace['enums']
        self.callbacks = namespace['callbacks']
        self.symbol_prefix = namespace['symbol_prefix']
        self.identifier_prefix = namespace['identifier_prefix']
        self.out(C_HEAD)

    def setClassname(self, class_name):
        self.class_name = class_name
        self.line()
        self.outln('/**{}**/'.format(len(class_name) * '*'))
        self.outln('/* {} */'.format(class_name))
        self.outln('/**{}**/'.format(len(class_name) * '*'))
        self.line()

    def method(self, method, static = False):
        self.jni_function(
            return_value = method['types']['jni'],
            name = method['camel_name'],
            parameters = method['parameters'],
            static = static
        )

    def jni_function(self,
                     return_value = 'void',
                     name = None,
                     parameters = [],
                     static = False
                     ):
        self.out('JNIEXPORT ')
        self.out(self.str_jni_type(return_value))
        self.set_return_type(self.str_jni_type(return_value))
        self.out(' Java_' + self.underscore_package + '_')
        self.outln(self.class_name + '_' + name)
        self.indentation += 1
        self.indent()
        str_parameters = ['JNIEnv* env', 'jclass jclazz' if static else 'jobject jself']
        str_parameters += [p['types']['jni'] + ' ' + CWriter.str_jni_name(p)
                           for p in parameters if p.get('c_name') != 'user_data']
        self.par(', '.join(str_parameters))
        self.indentation -= 1
        self.line()
        self.push()

    def get_self(self):
        self.lval('self')
        self.cast(self.str_self_type())
        self.call('jobject_to_GObject', 'env', 'jself')

    def g_object_unref(self, name):
        self.line('if (%s)' % name, push = True)
        self.state('g_object_unref({})'.format(name))
        self.pop()

    def env(self, name, *args):
        self.indent()
        self.out('(*env)->{}(env, {})'.format(name, ', '.join(args)))
        self.semi()

    def make_global_ref(self):
        self.lval(self.previous_lval)
        self.env('NewGlobalRef', self.previous_lval)
        self.g_assert()

    def g_assert(self):
        self.call('android_assert', self.previous_lval)

    def set_return_type(self, return_type):
        self.return_type = return_type

    def check_exception(self):
        if self.return_type and self.return_type != 'void':
            self.line('if ((*env)->ExceptionCheck(env)) return (%s) 0;' % self.return_type)
        else:
            self.line('if ((*env)->ExceptionCheck(env)) return;')
## ## ## ####
## ### ## ##
## #### ## ##
## ## ## ## ##
## ## ## #### ##
## ## ## ### ##
###### ## ## ####
    def declare_self(self):
        self.declare(self.str_self_type(), 'self')

    def jni_declare(self, param):
        typename = self.str_jni_type(param)
        name = self.str_jni_name(param)
        self.declare(typename, name)
        if param['types']['java'] == 'java.util.List<>':
            self.declare(self.str_jni_type(param['types']['inner'][0]), '%s_item' % name)

    def jni_to_c(self, obj):
        jni_name = self.str_jni_name(obj)
        camel_name = obj['camel_name']
        c_type = obj['types']['c']
        jni_type = obj['types']['jni']
        java_type = obj['types']['java']
        if obj.get('c_name') == 'user_data':
            return
        if java_type in self.callbacks:
            self.lval(camel_name)
            self.rval('callback_{}'.format(java_type))
            self.lval('userData')
            self.call('user_data_create', jni_name)
            self.g_assert()
            return
        if java_type == 'java.util.List<>':
            self.line('NOT IMPLEMENTED: {} to {}'.format(java_type, c_type))
        elif java_type == 'java.util.Map<>':
            self.line('NOT IMPLEMENTED: {} to {}'.format(java_type, c_type))
        elif java_type == 'java.lang.String':
            self.lval('%s_jstring' % camel_name)
            self.cast(c_type)
            self.call('(*env)->GetStringUTFChars', 'env', jni_name, 'NULL')
            self.check_exception()
            self.lval(camel_name)
            self.call('g_strdup', '{}_jstring'.format(camel_name))
            self.g_assert()
        elif java_type == 'android.view.Surface':
            self.lval(camel_name)
            self.cast(c_type)
            self.call('ANativeWindow_fromSurface', 'env', jni_name)
            self.g_assert()
        elif java_type in self.enums:
            self.lval(camel_name)
            self.cast(c_type)
            self.call('{}_to_c_enum'.format(c_type), 'env', jni_name)
        elif jni_type == 'jobject':
            self.lval(camel_name)
            self.cast(c_type)
            self.call('jobject_to_GObject', 'env', jni_name)
            self.g_assert()
        else:
            self.lval(camel_name)
            self.cast(c_type)
            self.rval(jni_name)

    def cleanup_jni(self, obj):
        jni_name = self.str_jni_name(obj)
        c_name = obj['camel_name']
        c_type = obj['types']['c']
        jni_type = obj['types']['jni']
        java_type = obj['types']['java']
        if jni_type == 'jstring':
            self.call('(*env)->ReleaseStringUTFChars', 'env', jni_name, '%s_jstring' % c_name)
            self.check_exception()

    def jni_return_declare(self, obj):
        self.declare(self.str_c_type(obj), 'result')
        self.declare(self.str_jni_type(obj), 'jResult')

    def return_jni_result(self, obj):
        ret = copy.copy(obj)
        ret['camel_name'] = 'result'
        ret['title_name'] = 'Result'
        self.c_to_jni(ret)
        ret['camel_name'] = 'result'
        self.cleanup_c(ret, skip_transfer = 'none')
        self.line()
        self.ret('jResult')
######
## ##
##
##
##
## ##
######
    def c_declare(self, param):
        jni_type = self.str_jni_type(param)
        if jni_type == 'jstring':
            self.declare(self.str_c_type(param), '%s_jstring' % param['camel_name'])
        self.declare(self.str_c_type(param), param['camel_name'])

    def c_to_jni(self, obj):
        jni_name = self.str_jni_name(obj)
        c_name = obj['camel_name']
        c_type = obj['types']['c']
        jni_type = obj['types']['jni']
        java_type = obj['types']['java']
        if java_type == 'java.util.List<>':
            obj_copy = copy.deepcopy(obj)
            obj_copy['types'] = obj_copy['types']['inner'][0]
            obj_copy['camel_name'] += '->data'
            obj_copy['title_name'] += '_item'
            self.lval(jni_name)
            self.call('create_jList', 'env')
            self.check_exception()
            self.g_assert()
            self.line('for (; {0} != NULL; {0} = {0}->next)'.format(c_name), push = True)
            self.c_to_jni(obj_copy)
            self.call('jList_add_item', 'env', jni_name, jni_name + '_item')
            self.cleanup_c(obj_copy)
            self.check_exception()
            self.pop()
        elif java_type == 'java.util.Map<>':
            pass
        elif java_type == 'java.lang.String':
            self.lval(jni_name)
            self.cast(c_type)
            self.call('(*env)->NewStringUTF', 'env', c_name)
            self.check_exception()
        elif java_type in self.enums:
            self.lval(jni_name)
            self.call('{}_to_java_enum'.format(c_type), 'env', c_name)
            self.g_assert()
        elif jni_type == 'jobject':
            self.lval(jni_name)
            self.call('GObject_to_jobject', 'env', c_name)
            self.g_assert()
        else:
            self.lval(jni_name)
            self.cast(jni_type)
            self.rval(c_name)

    def cleanup_c(self, obj, skip_transfer = 'full'):
        jni_name = self.str_jni_name(obj)
        c_name = obj['camel_name']
        c_type = obj['types']['c']
        jni_type = obj['types']['jni']
        java_type = obj['types']['java']
        if obj['transfer'] == skip_transfer:
            return
        if java_type == 'java.lang.String':
            if c_type in ['gchar*', 'const gchar*', 'char*', 'const char*']:
                self.call('g_free', '(void*) {}'.format(c_name))
            else:
                self.line('NOT IMPLEMENTED: cleanup {}'.format(c_type))
        elif jni_type == 'jobject':
            if java_type not in self.enums:
                if c_type == 'GList*':
                    self.call('g_list_free_full', c_name, 'g_object_unref')
                else:
                    self.g_object_unref(c_name)

    def c_return_declare(self, obj):
        self.declare(self.str_c_type(obj), 'ret')
        self.declare(self.str_c_type(obj), 'result')
        self.declare(self.str_jni_type(obj), 'jResult')

    def return_c_result(self, obj):
        ret = copy.copy(obj)
        ret['camel_name'] = 'result'
        ret['title_name'] = 'Result'
        self.jni_to_c(ret)
        self.lval('ret')
        if ret['types']['jni'] == 'jstring' and ret['transfer'] == 'none':
            if ret['types']['c'] in ['gchar*', 'const gchar*', 'char*', 'const char*']:
                self.call('g_strdup', 'result')
            else:
                self.rval('result')
        else:
            self.rval('result')
        self.cleanup_jni(ret)
        self.line()
        self.ret('ret')
###### ######## ########
## ## ## ## ##
## ## ## ##
###### ## ########
## ## ## ##
## ## ## ## ##
###### ## ## ##
    def str_self_type(self):
        return self.str_ptr(self.identifier_prefix + self.class_name)

    @staticmethod
    def str_c_type(obj):
        if type(obj) == str:
            return obj
        elif obj.get('types') is not None:
            return obj['types']['c']
        elif obj.get('c') is not None:
            return obj['c']

    @staticmethod
    def str_jni_type(obj):
        if type(obj) == str:
            return obj
        elif obj.get('types') is not None:
            return obj['types']['jni']
        elif obj.get('jni') is not None:
            return obj['jni']

    def str_java_type(self, obj):
        name = None
        if type(obj) == str:
            name = obj
        else:
            name = (obj.get('types') or obj)['java']
            if obj.get('c_name') == 'user_data':
                return ''
        signature = JAVA_TYPE_SIGNATURES.get(name)
        if signature is None:
            return 'L{}/{};'.format(self.slash_package, name)
        if signature == '[]':
            return '[' + self.str_java_type(obj['types']['inner'][0])
        return signature

    def str_gobject_type(self, obj):
        if obj['types']['c'] is not None:
            return obj['types']['c']
        java_type = obj['types']['java']
        if java_type not in self.enums:
            java_type += '*'
        return '{}{}'.format(self.identifier_prefix, java_type)

    def str_class_signature(self, obj, inner = None):
        name = obj if type(obj) == str else obj.get('name')
        inner = inner and (inner if type(inner) == str else inner.get('title_name'))
        if inner:
            return '{}/{}${}'.format(self.slash_package, name, inner)
        else:
            return '{}/{}'.format(self.slash_package, name)

    def str_method_signature(self, obj):
        args = ''.join(map(self.str_java_type, obj['parameters']))
        sig = '({}){}'.format(args or '', self.str_java_type(obj))
        return sig

    @staticmethod
    def str_jni_call_name(obj):
        jni_type = CWriter.str_jni_type(obj)
        if jni_type in ['jarray']:
            jni_type = 'jobject'
        if jni_type == 'void':
            jni_type = '_void'
        if jni_type == 'jstring':
            jni_type = 'jobject'
        return jni_type[1:].title()

    @staticmethod
    def str_jni_name(obj):
        return obj.get('jni_name') or 'j{}'.format(obj['title_name'])

    @staticmethod
    def str_c_name(obj):
        return obj['camel_name']

    @staticmethod
    def str_ptr(st):
        return '{}*'.format(st)

    def str_linebreak(self):
        return '\n' + ' ' * self.indentation_size * (self.indentation + 1)
###### ###### ######## ########
## ## ## ## ## ## ##
## ## ## ## ##
## ###### ## ########
## ## ## ## ##
## ## ## ## ## ## ##
###### ###### ## ## ##
JNI_VERSION = 'JNI_VERSION_1_6'
CACHE_CLASS = 'class_{}'.format
CACHE_FIELD = 'field_{}_{}'.format
CACHE_STATIC_FIELD = 'field_static_{}_{}'.format
CACHE_METHOD = 'method_{}_{}'.format
CACHE_STATIC_METHOD = 'method_static_{}_{}'.format
TEMPL_TO_JAVA_ENUM = '{c_name}_to_java_enum'.format
TEMPL_TO_C_ENUM = '{c_name}_to_c_enum'.format
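The bound `str.format` methods above act as tiny name templates; calling one fills its placeholders (the `MediaSession`/`OwrCodecType` names are only illustrative):

```python
# The generator binds str.format to get cheap name templates:
CACHE_CLASS = 'class_{}'.format
CACHE_METHOD = 'method_{}_{}'.format
TEMPL_TO_JAVA_ENUM = '{c_name}_to_java_enum'.format

print(CACHE_CLASS('MediaSession'))                    # class_MediaSession
print(CACHE_METHOD('MediaSession', 'init'))           # method_MediaSession_init
print(TEMPL_TO_JAVA_ENUM(c_name='OwrCodecType'))      # OwrCodecType_to_java_enum
```

This keeps every generated C identifier following one naming scheme without scattering format strings through the emitter code.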
STATIC_HELPER_METHODS = """\
static JNIEnv* get_jni_env()
{{
    JNIEnv* env = NULL;
    int ret;

    ret = (*jvm)->GetEnv(jvm, (void**)&env, {jni_version});
    if (ret == JNI_EDETACHED) {{
        if ((*jvm)->AttachCurrentThread(jvm, (JNIEnv**) &env, NULL) != 0) {{
            log_error("JNI: failed to attach thread");
        }} else {{
            log_info("JNI: successfully attached to thread");
        }}
    }} else if (ret == JNI_EVERSION) {{
        log_error("JNI: version not supported");
    }}
    g_assert(env);
    return env;
}}
""".format(
    jni_version = JNI_VERSION
) + """\
typedef struct {
    jobject self;
} UserData;

static UserData* user_data_create(jobject jself)
{
    JNIEnv* env;
    UserData* data;

    env = get_jni_env();
    data = g_slice_new0(UserData);
    g_assert(data);
    data->self = (*env)->NewGlobalRef(env, jself);
    if ((*env)->ExceptionCheck(env)) return NULL;
    log_info("created global ref: %p", data->self);
    return data;
}

static void user_data_destroy(gpointer data_pointer)
{
    JNIEnv* env;
    UserData* data;

    env = get_jni_env();
    data = (UserData*) data_pointer;
    log_info("finalizing global ref: %p", data->self);
    g_assert(data);
    (*env)->DeleteGlobalRef(env, data->self);
    if ((*env)->ExceptionCheck(env)) return;
    g_slice_free(UserData, data);
}

static void user_data_closure_notify(gpointer data_pointer, GClosure* ignored)
{
    (void) ignored;
    user_data_destroy(data_pointer);
}

static jobject GObject_to_jobject(JNIEnv* env, gpointer self_pointer)
{
    GObject* self = G_OBJECT(self_pointer);
    UserData* data;

    if (!self) {
        log_debug("got jobject[null] from GObject data[NULL]");
        return NULL;
    }
    g_object_ref(self);
    data = (UserData*) g_object_get_data(self, "java_instance");
    if (data) {
        log_debug("got jobject[%p] from GObject data[%p]", data->self, self);
        return data->self;
    } else {
        jobject jself;
        jobject native_pointer;
        jclass* class_pointer;
        jclass clazz;
        jfieldID fieldId;
        jmethodID methodId;
        gchar* classname;

        classname = g_strdup_printf("com/ericsson/research/owr/%s", &(G_OBJECT_TYPE_NAME(self)[3]));
        log_verbose("searching for class: %s\\n", classname);
        class_pointer = (jclass*) g_hash_table_lookup(class_cache_table, classname);
        g_assert(class_pointer);
        clazz = *class_pointer;
        g_assert(clazz);
        g_free(classname);
        native_pointer = (*env)->NewObject(env, class_NativePointer, method_NativePointer__constructor, (jlong) self);
        if ((*env)->ExceptionCheck(env)) return NULL;
        fieldId = (*env)->GetFieldID(env, clazz, "nativeInstance", "J");
        if ((*env)->ExceptionCheck(env)) return NULL;
        methodId = (*env)->GetMethodID(env, clazz, "<init>", "(Lcom/ericsson/research/owr/NativePointer;)V");
        if ((*env)->ExceptionCheck(env)) return NULL;
        jself = (*env)->NewObject(env, clazz, methodId, native_pointer);
        if ((*env)->ExceptionCheck(env)) return NULL;
        data = user_data_create(jself);
        g_assert(data);
        g_object_set_data(self, "java_instance", data);
        log_debug("got jobject[%p] from GObject[%p]", jself, self);
        return jself;
    }
}

static gpointer jobject_to_GObject(JNIEnv* env, jobject jself)
{
    jclass clazz;
    jfieldID fieldId;
    gpointer self;

    if (!jself) {
        log_debug("got GObject[NULL] from jobject[null]");
        return NULL;
    }
    clazz = (*env)->GetObjectClass(env, jself);
    if ((*env)->ExceptionCheck(env)) return NULL;
    fieldId = (*env)->GetFieldID(env, clazz, "nativeInstance", "J");
    if ((*env)->ExceptionCheck(env)) return NULL;
    self = (gpointer) (*env)->GetLongField(env, jself, fieldId);
    if ((*env)->ExceptionCheck(env)) return NULL;
    g_object_ref(self);
    log_debug("got GObject[%p] from jobject[%p]", self, jself);
    return self;
}

static jobject create_jList(JNIEnv* env)
{
    jobject list;
    list = (*env)->NewObject(env, class_ArrayList, method_ArrayList__constructor);
    if ((*env)->ExceptionCheck(env)) return NULL;
    return list;
}

static void jList_add_item(JNIEnv* env, jobject list, jobject item)
{
    (*env)->CallBooleanMethod(env, list, method_ArrayList_add, item);
    if ((*env)->ExceptionCheck(env)) return;
}

/*
static jobject create_jMap(JNIEnv* env)
{
    jobject map;
    map = (*env)->NewObject(env, class_HashMap, method_HashMap__constructor);
    if ((*env)->ExceptionCheck(env)) return NULL;
    return map;
}

static void jMap_add_item(JNIEnv* env, jobject map, jobject key, jobject value)
{
    (*env)->CallObjectMethod(env, map, method_HashMap_put, key, value);
    if ((*env)->ExceptionCheck(env)) return;
}
*/
"""
def cify_namespace(namespace):
    w = CWriter(namespace, PACKAGE_ROOT)
    enums = namespace['enums']
    enum_names = set()
    for enum in enums.values():
        enum_names.add(enum['name'])
    enums = [enums[name] for name in enum_names]

    # cache declarations
    w.state('static JavaVM* jvm')

    # cache helpers
    jni_onload_writers = []
    current_class = dict()

    def cache_class(name, *sigargs):
        w.line()
        w.declare('static jclass', CACHE_CLASS(name))
        current_class['name'] = name

        def jni_onload_writer():
            w.line()
            current_class['name'] = name
            if len(sigargs) == 1 and '/' in sigargs[0]:
                signature = quot(sigargs[0])
            else:
                signature = quot(w.str_class_signature(*sigargs))
            classname = CACHE_CLASS(name)
            w.lval(classname)
            w.env('FindClass', signature)
            w.check_exception()
            w.make_global_ref()
            w.call('g_hash_table_insert', 'class_cache_table', signature, '&' + w.previous_lval)
        jni_onload_writers.append(jni_onload_writer)

    def cache_method(method):
        name = method['camel_name']
        if name == '<init>':
            name = '_constructor'
        w.declare('static jmethodID', CACHE_METHOD(current_class['name'], name))

        def jni_onload_writer():
            w.lval(CACHE_METHOD(current_class['name'], name))
            w.env('GetMethodID', CACHE_CLASS(current_class['name']), quot(method['camel_name']),
                  quot(w.str_method_signature(method)))
            w.check_exception()
        jni_onload_writers.append(jni_onload_writer)

    def cache_field(java_name, signature, static = False):
        name_gen = (CACHE_STATIC_FIELD if static else CACHE_FIELD)
        w.declare('static jfieldID', name_gen(current_class['name'], java_name))

        def jni_onload_writer():
            w.lval(name_gen(current_class['name'], java_name))
            w.env('GetStaticFieldID' if static else 'GetFieldID',
                  CACHE_CLASS(current_class['name']), quot(java_name), quot(signature))
            w.check_exception()
        jni_onload_writers.append(jni_onload_writer)

    # declare classes that should be cached
    cache_class('NativePointer', 'NativePointer')
    cache_field('pointer', 'J')
    cache_method(dict(
        camel_name='<init>',
        parameters = ['long'],
        types = dict(java='void')
    ))
    cache_class('ArrayList', 'java/util/ArrayList')
    cache_method(dict(
        camel_name = '<init>',
        parameters = [],
        types = dict(java = 'void')
    ))
    cache_method(dict(
        camel_name = 'add',
        parameters = [dict(java = 'java.lang.Object')],
        types = dict(java = 'boolean')
    ))
    cache_class('HashMap', 'java/util/HashMap')
    cache_method(dict(
        camel_name = '<init>',
        parameters = [],
        types = dict(java = 'void')
    ))
    cache_method(dict(
        camel_name = 'put',
        parameters = [
            dict(java = 'java.lang.Object'),
            dict(java = 'java.lang.Object')
        ],
        types = dict(java = 'java.lang.Object')
    ))
    for callback in namespace['callbacks'].values():
        cache_class(callback['name'], callback)
        cache_method(callback)
    for clazz in namespace['classes']:
        cache_class(clazz['name'], clazz)
        cache_field('nativeInstance', 'J')
        for signal in clazz['signals']:
            classname = '%s_%s' % (clazz['name'], signal['title_name'])
            cache_class(classname, clazz, signal['title_name'] + 'Listener')
            cache_method(signal)
    for enum in (e for e in enums if not e['bitfield']):
        cache_class(enum['name'], enum['name'])
        cache_field('value', 'I')
        for member in enum['members']:
            cache_field(member['name'], w.str_java_type(enum['name']), static = True)

    w.out(STATIC_HELPER_METHODS)

    w.set_return_type('jint')
    w.line('jint JNI_OnLoad(JavaVM* vm, void* reserved)')
    w.push()
    w.declare('JNIEnv*', 'env')
    w.line()
    w.lval('class_cache_table')
    w.call('g_hash_table_new', 'g_str_hash', 'g_str_equal')
    w.line()
    w.lval('jvm')
    w.rval('vm')
    w.lval('env')
    w.call('get_jni_env')
    [writer() for writer in jni_onload_writers]
    w.line()
    w.ret(JNI_VERSION)
    w.line()
    ######   ##        #######  ########     ###    ##
    ##    ##  ##       ##     ## ##     ##   ## ##   ##
    ##        ##       ##     ## ##     ##  ##   ##  ##
    ##   #### ##       ##     ## ########  ##     ## ##
    ##    ##  ##       ##     ## ##     ## ######### ##
    ##    ##  ##       ##     ## ##     ## ##     ## ##
    ######   ########  #######  ########  ##     ## ########

    # w.line(STATIC_C_IMPLEMENTATIONS(**w.__dict__))

    # enum accessors
    for enum in (e for e in enums if not e['bitfield']):
        w.setClassname(enum['name'])
        name = TEMPL_TO_JAVA_ENUM(**enum)
        w.set_return_type('jobject')
        w.line('static jobject %s(JNIEnv* env, %s value)' % (name, enum['c_name']))
        w.push()
        w.line('jfieldID fieldId;')
        w.line()
        w.line('switch (value)', push = True)
        for member in enum['members']:
            w.case(format(member['c_name']))
            w.lval('fieldId')
            w.rval(CACHE_STATIC_FIELD(enum['name'], member['name']))
            w.state('break')
        w.pop()
        w.line()
        w.g_assert()
        w.ret()
        w.env('GetStaticObjectField', CACHE_CLASS(enum['name']), 'fieldId')
        w.check_exception()
        w.pop()
        w.line()

        name = TEMPL_TO_C_ENUM(**enum)
        w.set_return_type('guint')
        w.line('static guint %s(JNIEnv* env, jobject jenum)' % name)
        w.push()
        w.declare('guint', 'value')
        w.lval('value')
        w.cast('guint')
        w.env('GetIntField', 'jenum', 'field_%s_value' % enum['name'])
        w.check_exception()
        w.ret('value')
        w.line()

    for callback in namespace['callbacks'].values():
        void = callback['types']['c'] == 'void'

        # callback handler
        w.set_return_type('void')
        w.line('static {return_type} {name}({parameters})'.format(
            return_type = callback['types']['c'],
            name = 'callback_{}'.format(callback['name']),
            parameters = ', '.join(
                w.str_gobject_type(p) + ' ' + p['camel_name'] for p in callback['parameters']
            )
        ))
        w.push()
        parameters = [p for p in callback['parameters'] if p['c_name'] != 'user_data']
        # declare
        w.declare('JNIEnv*', 'env')
        w.declare('UserData*', 'data')
        map(w.jni_declare, parameters)
        if not void:
            w.c_return_declare(callback)
        w.line()
        # convert c arguments to jni
        w.lval('env')
        w.call('get_jni_env')
        w.lval('data')
        w.cast('UserData*')
        w.rval('userData')
        for parameter in parameters:
            w.c_to_jni(parameter)
        # cleanup c arguments
        map(w.cleanup_c, parameters)
        w.line()
        # call listener with jni arguments
        if not void:
            w.lval('jResult')
        call_name = 'Call{}Method'.format(w.str_jni_call_name(callback))
        w.env(call_name, 'data->self', CACHE_METHOD(callback['name'], callback['camel_name']), *map(w.str_jni_name, parameters))
        w.check_exception()
        w.call('user_data_destroy', 'data')
        # convert result if there is one
        if not void:
            w.line()
            w.return_c_result(callback)
        else:
            w.pop()
        w.line()

    w.setClassname(namespace['name'])
    for function in namespace['functions']:
        # break
        void = function['types']['c'] == 'void'
        w.method(function, static = True)
        # declarations
        map(w.c_declare, function['parameters'])
        if not void:
            w.jni_return_declare(function)
        if not void or function['parameters']:
            w.line()
        # jni to c
        map(w.jni_to_c, function['parameters'])
        if function['parameters']:
            w.line()
        # call
        if not void:
            w.lval('result')
        w.call(function['c_name'], *map(w.str_c_name, function['parameters']))
        # c to jni
        if not void:
            w.line()
            w.return_jni_result(function)
        else:
            w.pop()
        w.line()
     ######  ##          ###     ######   ######
    ##    ## ##         ## ##   ##    ## ##    ##
    ##       ##        ##   ##  ##       ##
    ##       ##       ##     ##  ######   ######
    ##       ##       #########       ##       ##
    ##    ## ##       ##     ## ##    ## ##    ##
     ######  ######## ##     ##  ######   ######

    # classes
    for clazz in namespace['classes']:
        # break
        w.setClassname(clazz['name'])

        # constructor
        if clazz['constructor']:
            w.comment('constructor')
            constructor = clazz['constructor']
            w.jni_function(
                name = constructor['camel_name'],
                parameters = constructor['parameters']
            )
            w.declare(constructor['types']['c'], 'self')
            map(w.c_declare, constructor['parameters'])
            w.declare('UserData*', 'data')
            w.line()
            if constructor['parameters']:
                map(w.jni_to_c, constructor['parameters'])
                w.line()
            w.lval('self')
            w.call(constructor['c_name'], *map(w.str_c_name, constructor['parameters']))
            w.line()
            w.lval('data')
            w.call('user_data_create', 'jself')
            w.g_assert()
            w.call('g_object_set_data', 'G_OBJECT(self)', quot('java_instance'), 'data')
            w.line()
            w.env('SetLongField', 'data->self', CACHE_FIELD(clazz['name'], 'nativeInstance'), '(jlong) self')
            w.check_exception()
            map(w.cleanup_jni, constructor['parameters'])
            w.pop()
            w.line()

        # methods
        if clazz['methods']:
            w.comment('methods')
        for method in clazz['methods']:
            void = method['types']['c'] == 'void'
            w.method(method)
            w.declare_self()
            map(w.c_declare, method['parameters'])
            if not void:
                w.jni_return_declare(method)
            w.line()
            w.get_self()
            if method.get('parameters'):
                map(w.jni_to_c, method['parameters'])
                w.line()
                map(w.cleanup_jni, method['parameters'])
                w.line()
            if not void:
                w.lval('result')
            w.call(method['c_name'], *map(w.str_c_name, [clazz] + method['parameters']))
            if method.get('parameters'):
                w.line()
                map(w.cleanup_c, method['parameters'])
            w.g_object_unref('self')
            if not void:
                w.line()
                w.return_jni_result(method)
            else:
                w.pop()
            w.line()

        for function in clazz['functions']:
            void = function['types']['c'] == 'void'
            w.method(function)
            map(w.c_declare, function['parameters'])
            if not void:
                w.jni_return_declare(function)
            if not void or function['parameters']:
                w.line()
            if function.get('parameters'):
                map(w.jni_to_c, function['parameters'])
                w.line()
                map(w.cleanup_jni, function['parameters'])
                w.line()
            if not void:
                w.lval('result')
            w.call(function['c_name'], *map(w.str_c_name, function['parameters']))
            if function.get('parameters'):
                w.line()
                map(w.cleanup_c, function['parameters'])
            if not void:
                w.line()
                w.return_jni_result(function)
            else:
                w.pop()
            w.line()

        # properties
        if clazz['properties']:
            w.comment('properties')
        for prop in clazz['properties']:
            if prop['writable']:
                w.jni_function(name = 'set' + prop['title_name'], parameters = [prop])
                w.declare_self()
                w.c_declare(prop)
                w.line()
                w.get_self()
                w.jni_to_c(prop)
                w.call('g_object_set', 'self', '"{c_name}"'.format(**prop), prop['camel_name'], 'NULL')
                w.cleanup_jni(prop)
                w.g_object_unref('self')
                w.pop()
                w.line()
            if prop['readable']:
                w.jni_function(return_value = prop, name = 'get' + prop['title_name'])
                w.declare_self()
                w.jni_declare(prop)
                w.c_declare(prop)
                w.line()
                w.get_self()
                w.call('g_object_get', 'self', '"{c_name}"'.format(**prop), '&{camel_name}'.format(**prop), 'NULL')
                w.c_to_jni(prop)
                w.line()
                w.cleanup_c(prop)
                w.g_object_unref('self')
                w.ret(w.str_jni_name(prop))
                w.line()

        for signal in clazz['signals']:
            void = signal['types']['c'] == 'void'
            handler_name = 'signal_{}_{}'.format(clazz['name'], signal['camel_name'])
            listener_param = dict(
                title_name = 'Listener',
                types = dict(jni = 'jobject')
            )
            w.set_return_type(signal['types']['c'])
            # signal handler
            w.line('static {return_type} {name}({parameters})'.format(
                return_type = signal['types']['c'],
                name = handler_name,
                parameters = ''.join([
                    '{}* {},'.format(clazz['c_name'], 'self'),
                    w.str_linebreak() if signal['parameters'] else ' ',
                    ', '.join(['{} {}'.format(w.str_gobject_type(p), p['camel_name'])
                               for p in signal['parameters']] + ['UserData* data']),
                ])
            ))
            w.push()
            # declare
            w.declare('JNIEnv*', 'env')
            for parameter in signal['parameters']:
                w.jni_declare(parameter)
            if not void:
                w.c_return_declare(signal)
            w.line()
            w.cast('void')
            w.rval('self')
            w.line()
            # convert c arguments to jni
            w.lval('env')
            w.call('get_jni_env')
            for parameter in signal['parameters']:
                w.c_to_jni(parameter)
            # cleanup c arguments
            map(w.cleanup_c, signal['parameters'])
            w.line()
            # call listener with jni arguments
            if not void:
                w.lval('jResult')
            call_name = 'Call{}Method'.format(w.str_jni_call_name(signal))
            w.env(call_name, 'data->self', CACHE_METHOD('{[name]}_{[title_name]}'.format(clazz, signal), signal['camel_name']), *map(w.str_jni_name, signal['parameters']))
            w.check_exception()
            # convert result if there is one
            if not void:
                w.line()
                w.return_c_result(signal)
            else:
                w.pop()
            w.line()

            # jni implementation
            w.jni_function(name = 'add' + signal['title_name'] + 'Listener', parameters = [listener_param])
            w.declare_self()
            w.declare('gulong', 'handler_id')
            w.declare('UserData*', 'data')
            w.line()
            w.get_self()
            w.line()
            w.lval('data')
            w.call('user_data_create', w.str_jni_name(listener_param))
            w.lval('handler_id')
            w.call('g_signal_connect_data', 'G_OBJECT(self)', quot(signal['c_name']),
                   'G_CALLBACK({})'.format(handler_name), 'data', 'user_data_closure_notify', '0')
            w.cast('void')
            w.rval('handler_id')
            w.pop()
            w.line()

    w.pop()
## ###### ######## ########
## ## ## ## ## ##
## ## ## ## ##
## ###### ## ########
## ## ## ## ## ##
## ## ## ## ## ## ##
###### ###### ## ## ##
STATIC_JAVA_IMPLEMENTATIONS = {
    'NativePointer': """\
public class NativePointer {
    final long pointer;

    private NativePointer(long pointer) {
        this.pointer = pointer;
    }
}
"""
}
def javify_callback(callback, package):
    w = JavaWriter(package + '.' + callback['title_name'])
    w.class_declaration(typename = 'interface')
    w.method(callback, native = False)
    w.pop()


def javify_class(clazz, package):
    w = JavaWriter(package + '.' + clazz['name'])
    w.class_declaration(extends = clazz['parent'])

    # debug tag
    w.line('public static final String TAG = "' + clazz['name'] + '";')

    # constructor
    constructor = clazz.get('constructor')
    if constructor:
        w.line()
        w.constructor(constructor['parameters'])
        w.call('nativeConstructor', *map(w.argument, constructor['parameters']))
        w.pop()
        w.line()
        w.method(constructor, visibility = 'private')

    w.line()
    w.constructor([dict(
        types = dict(java = 'NativePointer'),
        camel_name = 'nativePointer'
    )], visibility = None)
    if clazz.get('parent'):
        w.call('super', 'nativePointer')
    else:
        w.lval('nativeInstance')
        w.rval('nativePointer.pointer')
    w.pop()

    if not constructor or constructor.get('parameters'):
        w.line()
        w.constructor(visibility = None)
        w.pop()

    if not clazz.get('parent'):
        w.line()
        w.state('long nativeInstance')

    # methods
    for method in clazz['methods']:
        w.line()
        w.method(method)

    # functions
    for function in clazz['functions']:
        w.line()
        w.method(function, static = True)

    # properties
    for prop in clazz['properties']:
        w.line()
        w.method(prop, name = 'get' + prop['title_name'])
        if prop['writable']:
            w.method(name = 'set' + prop['title_name'], parameters = [prop])

    # signals
    for signal in clazz['signals']:
        interface = signal['title_name'] + 'Listener'
        w.line()
        w.method(name = 'add' + interface, parameters = [dict(
            camel_name = 'listener',
            types = dict(java = interface)
        )])
        w.line()
        w.class_declaration(static = True, typename = 'interface', name = interface)
        # w.outI('public static interface ' + interface)
        w.method(signal, native = False)
        w.pop()

    # end class
    w.pop()


def javify_bitfield(enum, package):
    w = JavaWriter(package + '.' + enum['name'])
    w.class_declaration(typename = 'class')
    # members
    for member in enum['members']:
        w.state('public static final int {name} = {value}'.format(**member))
    w.pop()


def javify_enum(enum, package):
    w = JavaWriter(package + '.' + enum['name'])
    w.class_declaration(typename = 'enum')
    # members
    w.indent()
    for member in enum['members'][0:-1]:
        w.line('{name}({value}),'.format(**member))
    w.state('{name}({value})'.format(**enum['members'][-1]))
    w.line()
    # value
    w.state('public final int value')
    w.line()
    w.constructor(visibility = 'private', parameters = [dict(
        types = dict(java = 'int'),
        camel_name = 'value',
        c_name = '_value',
    )])
    w.state('this.value = value')
    w.pop()
    w.pop()


def javify_functions(namespace, functions, package):
    w = JavaWriter(package + '.' + namespace['name'])
    w.class_declaration()
    # TODO: proper solution
    w.line('static')
    w.push()
    w.call('System.loadLibrary', quot("openwebrtc_jni"))
    w.pop()
    for function in functions:
        w.method(function, static = True)
        w.line()
    w.pop()


def javify_namespace(namespace):
    package = PACKAGE_ROOT + '.' + namespace['symbol_prefix']
    for callback in namespace['callbacks'].values():
        javify_callback(callback, package)
    for clazz in namespace['classes']:
        javify_class(clazz, package)
    for name, enum in namespace['enums'].items():
        if enum['bitfield']:
            javify_bitfield(enum, package)
        else:
            javify_enum(enum, package)
    javify_functions(namespace, namespace['functions'], package)
    for name, source in STATIC_JAVA_IMPLEMENTATIONS.items():
        w = JavaWriter(package + '.' + name)
        w.out(source)


 ######    #######
##    ##  ##     ##
##        ##     ##
##   #### ##     ##
##    ##  ##     ##
##    ##  ##     ##
 ######    #######

namespaces = parse_gir_file(args.gir)
namespace = namespaces[0]

javify_namespace(namespace)
cify_namespace(namespace)
] | 28 | 2021-04-08T04:39:18.000Z | 2022-03-24T05:56:00.000Z | #!/usr/bin/env python3
import subprocess
import os
import sys
sys.path.append("../")
sys.path.append("../../system/lib/")
sys.path.append("../array/")
import json_parser
import pos
import pos_util
import cli
import api
import json
import time
import CREATE_ARRAY_BASIC
ARRAYNAME = CREATE_ARRAY_BASIC.ARRAYNAME
def execute():
    CREATE_ARRAY_BASIC.execute()
    api.detach_ssd(CREATE_ARRAY_BASIC.ANY_DATA)
    time.sleep(5)
    out = cli.mount_array(CREATE_ARRAY_BASIC.ARRAYNAME)
    timeout = 80000  # 80s
    if api.wait_situation(ARRAYNAME, "REBUILDING", timeout) == True:
        return "pass"
    return "fail"


if __name__ == "__main__":
    if len(sys.argv) >= 2:
        pos.set_addr(sys.argv[1])
    api.clear_result(__file__)
    result = execute()
    ret = api.set_result_manually(cli.array_info(ARRAYNAME), result, __file__)
    pos.flush_and_kill_pos()
    exit(ret)
cb6bfa1a351710813d824ef3686ff4624c0467f4 | 70,114 | py | Python | SublimeText3_3176/Data/Packages/SublimeCodeIntel-master/libs/codeintel2/pythoncile1.py | xiexie1993/Tool_Sublime_Text3_for_Windows | 51b11ac2d7df36242d68b3b5f85af5f2a8c550e2 | [
"RSA-MD"
] | 1 | 2018-06-23T08:07:39.000Z | 2018-06-23T08:07:39.000Z | SublimeText3_3176/Data/Packages/SublimeCodeIntel-master/libs/codeintel2/pythoncile1.py | xiexie1993/Tool_Sublime_Text3_for_Windows | 51b11ac2d7df36242d68b3b5f85af5f2a8c550e2 | [
"RSA-MD"
] | null | null | null | SublimeText3_3176/Data/Packages/SublimeCodeIntel-master/libs/codeintel2/pythoncile1.py | xiexie1993/Tool_Sublime_Text3_for_Windows | 51b11ac2d7df36242d68b3b5f85af5f2a8c550e2 | [
"RSA-MD"
] | null | null | null | #!/usr/bin/env python
# Copyright (c) 2004-2006 ActiveState Software Inc.
#
# Contributors:
# Trent Mick (TrentM@ActiveState.com)
"""
pythoncile - a Code Intelligence Language Engine for the Python language

Module Usage:
    from pythoncile import scan
    mtime = os.stat("foo.py")[stat.ST_MTIME]
    content = open("foo.py", "r").read()
    scan(content, "foo.py", mtime=mtime)

Command-line Usage:
    pythoncile.py [<options>...] [<Python files>...]

Options:
    -h, --help          dump this help and exit
    -V, --version       dump this script's version and exit
    -v, --verbose       verbose output, use twice for more verbose output
    -f, --filename <path>
                        specify the filename of the file content
                        passed in on stdin, this is used for the "path"
                        attribute of the emitted <file> tag.
    --md5=<string>      md5 hash for the input
    --mtime=<secs>      modification time for output info, in #secs since
                        1/1/70.
    -L, --language <name>
                        the language of the file being scanned
    -c, --clock         print timing info for scans (CIX is not printed)

One or more Python files can be specified as arguments or content can be
passed in on stdin. A directory can also be specified, in which case
all .py files in that directory are scanned.

This is a Language Engine for the Code Intelligence (codeintel) system.
Code Intelligence XML format. See:
    http://specs.activestate.com/Komodo_3.0/func/code_intelligence.html

The command-line interface will return non-zero iff the scan failed.
"""
# Dev Notes:
# <none>
#
# TODO:
# - type inferencing: asserts
# - type inferencing: return statements
# - type inferencing: calls to isinstance
# - special handling for None may be required
# - Comments and doc strings. What format?
# - JavaDoc - type hard to parse and not reliable
# (http://java.sun.com/j2se/javadoc/writingdoccomments/).
# - PHPDoc? Possibly, but not that rigorous.
# - Grouch (http://www.mems-exchange.org/software/grouch/) -- dunno yet.
# - Don't like requirement for "Instance attributes:" landmark in doc
# strings.
# - This can't be a full solution because the requirement to repeat
# the argument name doesn't "fit" with having a near-by comment when
# variable is declared.
# - Two space indent is quite rigid
# - Only allowing attribute description on the next line is limiting.
# - Seems focussed just on class attributes rather than function
# arguments.
# - Perhaps what PerlCOM POD markup uses?
# - Home grown? My own style? Dunno
# - make type inferencing optional (because it will probably take a long
# time to generate), this is tricky though b/c should the CodeIntel system
# re-scan a file after "I want type inferencing now" is turned on? Hmmm.
# - [lower priority] handle staticmethod(methname) and
# classmethod(methname). This means having to delay emitting XML until
# end of class scope and adding .visitCallFunc().
# - [lower priority] look for associated comments for variable
# declarations (as per VS.NET's spec, c.f. "Supplying Code Comments" in
# the VS.NET user docs)
import os
import sys
import getopt
from hashlib import md5
import re
import logging
import pprint
import glob
import time
import stat
import types
from cStringIO import StringIO
from functools import partial
# this particular ET is different from xml.etree and is expected
# to be returned from scan_et() by the clients of this module
import ciElementTree as et
import compiler
from compiler import ast
from compiler.visitor import dumpNode, ExampleASTVisitor
import parser
from codeintel2.common import CILEError
from codeintel2 import util
from codeintel2 import tdparser
#---- exceptions
class PythonCILEError(CILEError):
    pass
#---- global data
_version_ = (0, 3, 0)
log = logging.getLogger("pythoncile")
# log.setLevel(logging.DEBUG)
util.makePerformantLogger(log)
_gClockIt = 0 # if true then we are gathering timing data
_gClock = None # if gathering timing data this is set to time retrieval fn
_gStartTime = None # start time of current file being scanned
#---- internal routines and classes
def _isclass(namespace):
    return (len(namespace["types"]) == 1
            and "class" in namespace["types"])


def _isfunction(namespace):
    return (len(namespace["types"]) == 1
            and "function" in namespace["types"])
def getAttrStr(attrs):
    """Construct an XML-safe attribute string from the given attributes

    "attrs" is a dictionary of attributes

    The returned attribute string includes a leading space, if necessary,
    so it is safe to use the string right after a tag name. Any Unicode
    attributes will be encoded into UTF8 encoding as part of this process.
    """
    from xml.sax.saxutils import quoteattr
    s = ''
    for attr, value in attrs.items():
        if not isinstance(value, basestring):
            value = str(value)
        elif isinstance(value, unicode):
            value = value.encode("utf-8")
        s += ' %s=%s' % (attr, quoteattr(value))
    return s
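# For illustration, the same " attr=value" pairing that getAttrStr() emits can
# be sketched with quoteattr alone. The attribute dict below is made up, and
# this sketch is Python-3 style (no basestring/unicode branches):

```python
from xml.sax.saxutils import quoteattr

# Build the leading-space attribute string for a made-up attribute dict;
# sorted() makes the output deterministic for the demo.
attrs = {'name': 'foo', 'line': 42}
attr_str = ''.join(' %s=%s' % (k, quoteattr(str(v)))
                   for k, v in sorted(attrs.items()))
print(attr_str)  # ' line="42" name="foo"'
```

`quoteattr` picks the quoting character and escapes embedded quotes, which is
why the function does not need to escape values itself.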
# match 0x00-0x1f except TAB(0x09), LF(0x0A), and CR(0x0D)
_encre = re.compile('([\x00-\x08\x0b\x0c\x0e-\x1f])')
# XXX: this is not used anywhere, is it needed at all?
if sys.version_info >= (2, 3):
    charrefreplace = 'xmlcharrefreplace'
else:
    # Python 2.2 doesn't have 'xmlcharrefreplace'. Fallback to a
    # literal '?' -- this is better than failing outright.
    charrefreplace = 'replace'
def xmlencode(s):
    """Encode the given string for inclusion in a UTF-8 XML document.

    Note: s must *not* be Unicode, it must be encoded before being passed in.

    Specifically, illegal or unpresentable characters are encoded as
    XML character entities.
    """
    # As defined in the XML spec some of the characters from 0x00 to 0x19
    # are not allowed in well-formed XML. We replace those with entity
    # references here.
    #   http://www.w3.org/TR/2000/REC-xml-20001006#charsets
    #
    # Dev Notes:
    # - It would be nice if Python had a codec for this. Perhaps we
    #   should write one.
    # - Eric, at one point, had this change to '_xmlencode' for rubycile:
    #    p4 diff2 -du \
    #        //depot/main/Apps/Komodo-devel/src/codeintel/ruby/rubycile.py#7 \
    #        //depot/main/Apps/Komodo-devel/src/codeintel/ruby/rubycile.py#8
    #   but:
    #        My guess is that there was a bug here, and explicitly
    #        utf-8-encoding non-ascii characters fixed it. This was a year
    #        ago, and I don't recall what I mean by "avoid shuffling the data
    #        around", but it must be related to something I observed without
    #        that code.

    # replace with XML decimal char entity, e.g. '&#7;'
    return _encre.sub(lambda m: '&#%d;' % ord(m.group(1)), s)
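# A standalone sketch of the replacement above (not part of the scanner): the
# same character class as _encre maps each disallowed C0 control character to
# a decimal XML character reference, while TAB, LF and CR pass through:

```python
import re

# C0 controls except TAB (0x09), LF (0x0A) and CR (0x0D), as in _encre.
ctrl = re.compile('([\x00-\x08\x0b\x0c\x0e-\x1f])')

def encode_ctrl(s):
    # Replace each disallowed character with '&#<codepoint>;'.
    return ctrl.sub(lambda m: '&#%d;' % ord(m.group(1)), s)

print(encode_ctrl('bell:\x07 tab:\t'))  # bell becomes &#7;, TAB is kept
```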
def cdataescape(s):
    """Return the string escaped for inclusion in an XML CDATA section.

    Note: Any Unicode will be encoded to UTF8 encoding as part of this process.

    A CDATA section is terminated with ']]>', therefore this token in the
    content must be escaped. To my knowledge the XML spec does not define
    how to do that. My chosen escape (courtesy of EricP) is to split
    that token into multiple CDATA sections, so that, for example:

        blah...]]>...blah

    becomes:

        blah...]]]]><![CDATA[>...blah

    and the resulting content should be copacetic:

        <b><![CDATA[blah...]]]]><![CDATA[>...blah]]></b>
    """
    if isinstance(s, unicode):
        s = s.encode("utf-8")
    parts = s.split("]]>")
    return "]]]]><![CDATA[>".join(parts)
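# The split-and-rejoin trick above is easy to see in isolation. This is a
# minimal Python-3 sketch of the same escape, separate from the function:

```python
def cdata_escape(s):
    # The CDATA terminator ']]>' is broken across two CDATA sections:
    # ']]' ends the first section, '>' starts inside the next one.
    return "]]]]><![CDATA[>".join(s.split("]]>"))

print(cdata_escape("blah...]]>...blah"))  # blah...]]]]><![CDATA[>...blah
```

Strings without the terminator come back unchanged, since `split` then
produces a single part.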
def _unistr(x):
    if isinstance(x, unicode):
        return x
    elif isinstance(x, str):
        return x.decode('utf8')
    else:
        return unicode(x)


def _et_attrs(attrs):
    return dict((_unistr(k), xmlencode(_unistr(v))) for k, v in attrs.items()
                if v is not None)


def _et_data(data):
    return xmlencode(_unistr(data))


def _node_attrs(node, **kw):
    return dict(name=node["name"],
                line=node.get("line"),
                doc=node.get("doc"),
                attributes=node.get("attributes") or None,
                **kw)


def _node_citdl(node):
    max_type = None
    max_score = -1
    # 'guesses' is a types dict: {<type guess>: <score>, ...}
    guesses = node.get("types", {})
    for type, score in guesses.items():
        if ' ' in type:
            # XXX Drop the <start-scope> part of CITDL for now.
            type = type.split(None, 1)[0]
        # Don't emit None types, it does not help us. Fix for bug:
        #   http://bugs.activestate.com/show_bug.cgi?id=71989
        if type != "None":
            if score > max_score:
                max_type = type
                max_score = score
    return max_type
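# The selection rule in _node_citdl() (drop any '<start-scope>' suffix, skip
# 'None', keep the highest-scored guess) can be sketched on its own. This is
# an illustrative Python-3 standalone, not the scanner's code:

```python
def best_guess(guesses):
    # guesses maps a CITDL type string to a confidence score.
    best, best_score = None, -1
    for citdl, score in guesses.items():
        # Keep only the type part, as the '<start-scope>' tail is dropped.
        citdl = citdl.split(None, 1)[0]
        # 'None' guesses are ignored; otherwise keep the top scorer.
        if citdl != "None" and score > best_score:
            best, best_score = citdl, score
    return best

print(best_guess({"int": 2, "None": 5, "str some-scope": 1}))  # int
```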
class AST2CIXVisitor:
"""Generate Code Intelligence XML (CIX) from walking a Python AST tree.
This just generates the CIX content _inside_ of the <file/> tag. The
prefix and suffix have to be added separately.
Note: All node text elements are encoded in UTF-8 format by the Python AST
tree processing, no matter what encoding is used for the file's
original content. The generated CIX XML will also be UTF-8 encoded.
"""
DEBUG = 0
def __init__(self, moduleName=None, content=None, lang="Python"):
self.lang = lang
if self.DEBUG is None:
self.DEBUG = log.isEnabledFor(logging.DEBUG)
self.moduleName = moduleName
if content:
self.lines = content.splitlines(0)
else:
self.lines = None
# Symbol Tables (dicts) are built up for each scope. The namespace
# stack to the global-level is maintain in self.nsstack.
self.st = { # the main module symbol table
# <scope name>: <namespace dict>
}
self.nsstack = []
self.cix = et.TreeBuilder()
def emit_start(self, s, attrs={}):
self.cix.start(s, _et_attrs(attrs))
def emit_data(self, data):
self.cix.data(_et_data(data))
def emit_end(self, s):
self.cix.end(s)
def emit_tag(self, s, attrs={}, data=None):
self.emit_start(s, _et_attrs(attrs))
if data is not None:
self.emit_data(data)
self.emit_end(s)
def cix_module(self, node):
"""Emit CIX for the given module namespace."""
# log.debug("cix_module(%s, level=%r)", '.'.join(node["nspath"]),
# level)
assert len(node["types"]) == 1 and "module" in node["types"]
attrs = _node_attrs(node, lang=self.lang, ilk="blob")
module = self.emit_start('scope', attrs)
for import_ in node.get("imports", []):
self.cix_import(import_)
self.cix_symbols(node["symbols"])
self.emit_end('scope')
def cix_import(self, node):
# log.debug("cix_import(%s, level=%r)", node["module"], level)
attrs = node
self.emit_tag('import', attrs)
def cix_symbols(self, node, parentIsClass=0):
# Sort variables by line order. This provide the most naturally
# readable comparison of document with its associate CIX content.
vars = sorted(node.values(), key=lambda v: v.get("line"))
for var in vars:
self.cix_symbol(var, parentIsClass)
def cix_symbol(self, node, parentIsClass=0):
if _isclass(node):
self.cix_class(node)
elif _isfunction(node):
self.cix_function(node)
else:
self.cix_variable(node, parentIsClass)
def cix_variable(self, node, parentIsClass=0):
# log.debug("cix_variable(%s, level=%r, parentIsClass=%r)",
# '.'.join(node["nspath"]), level, parentIsClass)
attrs = _node_attrs(node, citdl=_node_citdl(node))
if parentIsClass and "is-class-var" not in node:
# Special CodeIntel <variable> attribute to distinguish from the
# usual class variables.
if attrs["attributes"]:
attrs["attributes"] += " __instancevar__"
else:
attrs["attributes"] = "__instancevar__"
self.emit_tag('variable', attrs)
def cix_class(self, node):
# log.debug("cix_class(%s, level=%r)", '.'.join(node["nspath"]), level)
if node["classrefs"]:
citdls = (t for t in (_node_citdl(n) for n in node["classrefs"])
if t is not None)
classrefs = " ".join(citdls)
else:
classrefs = None
attrs = _node_attrs(node,
lineend=node.get("lineend"),
signature=node.get("signature"),
ilk="class",
classrefs=classrefs)
self.emit_start('scope', attrs)
for import_ in node.get("imports", []):
self.cix_import(import_)
self.cix_symbols(node["symbols"], parentIsClass=1)
self.emit_end('scope')
def cix_argument(self, node):
# log.debug("cix_argument(%s, level=%r)", '.'.join(node["nspath"]),
# level)
attrs = _node_attrs(node, citdl=_node_citdl(node), ilk="argument")
self.emit_tag('variable', attrs)
def cix_function(self, node):
# log.debug("cix_function(%s, level=%r)", '.'.join(node["nspath"]), level)
# Determine the best return type: the CITDL type guessed most often.
best_citdl = None
max_count = 0
for citdl, count in node["returns"].items():
    if count > max_count:
        max_count = count
        best_citdl = citdl
attrs = _node_attrs(node,
lineend=node.get("lineend"),
returns=best_citdl,
signature=node.get("signature"),
ilk="function")
self.emit_start("scope", attrs)
for import_ in node.get("imports", []):
self.cix_import(import_)
argNames = []
for arg in node["arguments"]:
argNames.append(arg["name"])
self.cix_argument(arg)
symbols = {} # don't re-emit the function arguments
for symbolName, symbol in node["symbols"].items():
if symbolName not in argNames:
symbols[symbolName] = symbol
self.cix_symbols(symbols)
# XXX <returns/> if one is defined
self.emit_end('scope')
def getCIX(self, path):
"""Return CIX content for parsed data."""
log.debug("getCIX")
moduleNS = self.st[()]
self.emit_start('file', dict(lang=self.lang, path=path))
self.cix_module(moduleNS)
self.emit_end('file')
file = self.cix.close()
return file
def visitModule(self, node):
log.info("visitModule")
nspath = ()
namespace = {"name": self.moduleName,
"nspath": nspath,
"types": {"module": 1},
"symbols": {}}
if node.doc:
summarylines = util.parseDocSummary(node.doc.splitlines(0))
namespace["doc"] = "\n".join(summarylines)
if node.lineno:
namespace["line"] = node.lineno
self.st[nspath] = namespace
self.nsstack.append(namespace)
self.visit(node.node)
self.nsstack.pop()
def visitReturn(self, node):
log.info("visitReturn: %r", node.value)
citdl_types = self._guessTypes(node.value)
for citdl in citdl_types:
if citdl:
citdl = citdl.split(None, 1)[0]
if citdl and citdl not in ("None", "NoneType"):
if citdl in ("False", "True"):
citdl = "bool"
func_node = self.nsstack[-1]
t = func_node["returns"]
t[citdl] = t.get(citdl, 0) + 1
def visitClass(self, node):
log.info("visitClass:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
locals = self.nsstack[-1]
name = node.name
nspath = locals["nspath"] + (name,)
namespace = {
"nspath": nspath,
"name": name,
"types": {"class": 1},
# XXX Example of a base class that might surprise: the
# __metaclass__ class in
# c:\python22\lib\site-packages\ctypes\com\automation.py
# Should this be self._getCITDLExprRepr()???
"classrefs": [],
"symbols": {},
}
namespace["declaration"] = namespace
if node.lineno:
namespace["line"] = node.lineno
lastNode = node
while lastNode.getChildNodes():
lastNode = lastNode.getChildNodes()[-1]
if lastNode.lineno:
namespace["lineend"] = lastNode.lineno
attributes = []
if name.startswith("__") and name.endswith("__"):
pass
elif name.startswith("__"):
attributes.append("private")
elif name.startswith("_"):
attributes.append("protected")
namespace["attributes"] = ' '.join(attributes)
if node.bases:
for baseNode in node.bases:
baseName = self._getExprRepr(baseNode)
classref = {"name": baseName, "types": {}}
for t in self._guessTypes(baseNode):
if t not in classref["types"]:
classref["types"][t] = 0
classref["types"][t] += 1
namespace["classrefs"].append(classref)
if node.doc:
siglines, desclines = util.parsePyFuncDoc(node.doc)
if siglines:
namespace["signature"] = "\n".join(siglines)
if desclines:
namespace["doc"] = "\n".join(desclines)
self.st[nspath] = locals["symbols"][name] = namespace
self.nsstack.append(namespace)
self.visit(node.code)
self.nsstack.pop()
def visitFunction(self, node):
log.info("visitFunction:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
parent = self.nsstack[-1]
parentIsClass = _isclass(parent)
namespace = {
"types": {"function": 1},
"returns": {},
"arguments": [],
"symbols": {},
}
namespace["declaration"] = namespace
if node.lineno:
namespace["line"] = node.lineno
lastNode = node
while lastNode.getChildNodes():
lastNode = lastNode.getChildNodes()[-1]
if lastNode.lineno:
namespace["lineend"] = lastNode.lineno
name = node.name
# Determine attributes
attributes = []
if name.startswith("__") and name.endswith("__"):
pass
elif name.startswith("__"):
attributes.append("private")
elif name.startswith("_"):
attributes.append("protected")
if name == "__init__" and parentIsClass:
attributes.append("__ctor__")
# process decorators
prop_var = None
if node.decorators:
for deco in node.decorators.nodes:
deco_name = getattr(deco, 'name', None)
prop_mode = None
if deco_name == 'staticmethod':
attributes.append("__staticmethod__")
continue
if deco_name == 'classmethod':
attributes.append("__classmethod__")
continue
if deco_name == 'property':
prop_mode = 'getter'
elif hasattr(deco, 'attrname') and deco.attrname in ('getter',
'setter',
'deleter'):
prop_mode = deco.attrname
if prop_mode:
if prop_mode == 'getter':
# it's a getter, create a pseudo-var
prop_var = parent["symbols"].get(name, None)
if prop_var is None:
prop_var = dict(name=name,
nspath=parent["nspath"] + (name,),
doc=None,
types={},
symbols={})
var_attrs = ['property']
if name.startswith("__") and name.endswith("__"):
pass
elif name.startswith("__"):
var_attrs.append("private")
elif name.startswith("_"):
var_attrs.append("protected")
prop_var["attributes"] = ' '.join(var_attrs)
prop_var["declaration"] = prop_var
parent["symbols"][name] = prop_var
if "is-class-var" not in prop_var:
prop_var["is-class-var"] = 1
# hide the function
attributes += ['__hidden__']
name += " (property %s)" % prop_mode
# only one property decorator makes sense
break
namespace["attributes"] = ' '.join(attributes)
if parentIsClass and name == "__init__":
fallbackSig = parent["name"]
else:
fallbackSig = name
namespace["name"] = name
nspath = parent["nspath"] + (name,)
namespace["nspath"] = nspath
# Handle arguments. The format of the relevant Function attributes
# makes this a little bit of a pain.
defaultArgsBaseIndex = len(node.argnames) - len(node.defaults)
if node.kwargs:
defaultArgsBaseIndex -= 1
if node.varargs:
defaultArgsBaseIndex -= 1
varargsIndex = len(node.argnames)-2
else:
varargsIndex = None
kwargsIndex = len(node.argnames)-1
elif node.varargs:
defaultArgsBaseIndex -= 1
varargsIndex = len(node.argnames)-1
kwargsIndex = None
else:
varargsIndex = kwargsIndex = None
sigArgs = []
for i in range(len(node.argnames)):
argOrArgTuple = node.argnames[i]
if isinstance(argOrArgTuple, tuple):
# If it is a tuple arg with a default assignment, then we
# drop that info (except for the sig): too hard and too rare
# to bother with.
sigArg = str(argOrArgTuple)
if i >= defaultArgsBaseIndex:
defaultNode = node.defaults[i-defaultArgsBaseIndex]
try:
default = self._getExprRepr(defaultNode)
except PythonCILEError, ex:
raise PythonCILEError("unexpected default argument node "
"type for Function '%s': %s"
% (node.name, ex))
sigArg += "="+default
sigArgs.append(sigArg)
arguments = []
for argName in argOrArgTuple:
argument = {"name": argName,
"nspath": nspath+(argName,),
"doc": None,
"types": {},
"line": node.lineno,
"symbols": {}}
arguments.append(argument)
else:
argName = argOrArgTuple
argument = {"name": argName,
"nspath": nspath+(argName,),
"doc": None,
"types": {},
"line": node.lineno,
"symbols": {}}
if i == kwargsIndex:
argument["attributes"] = "kwargs"
sigArgs.append("**"+argName)
elif i == varargsIndex:
argument["attributes"] = "varargs"
sigArgs.append("*"+argName)
elif i >= defaultArgsBaseIndex:
defaultNode = node.defaults[i-defaultArgsBaseIndex]
try:
argument["default"] = self._getExprRepr(defaultNode)
except PythonCILEError, ex:
raise PythonCILEError("unexpected default argument node "
"type for Function '%s': %s"
% (node.name, ex))
sigArgs.append(argName+'='+argument["default"])
for t in self._guessTypes(defaultNode):
log.info("guessed type: %s ::= %s", argName, t)
if t not in argument["types"]:
argument["types"][t] = 0
argument["types"][t] += 1
else:
sigArgs.append(argName)
if i == 0 and parentIsClass:
# If this is a class method, then the first arg is the class
# instance.
className = self.nsstack[-1]["nspath"][-1]
argument["types"][className] = 1
argument["declaration"] = self.nsstack[-1]
arguments = [argument]
for argument in arguments:
if "declaration" not in argument:
argument["declaration"] = argument  # namespace dict of the declaration
namespace["arguments"].append(argument)
namespace["symbols"][argument["name"]] = argument
# Drop first "self" argument from class method signatures.
# - This is a little bit of a compromise as the "self" argument
# should *sometimes* be included in a method's call signature.
if _isclass(parent) and sigArgs and "__staticmethod__" not in attributes:
# Delete the first "self" argument.
del sigArgs[0]
fallbackSig += "(%s)" % (", ".join(sigArgs))
if "__staticmethod__" in attributes:
fallbackSig += " - staticmethod"
elif "__classmethod__" in attributes:
fallbackSig += " - classmethod"
if node.doc:
siglines, desclines = util.parsePyFuncDoc(node.doc, [fallbackSig])
namespace["signature"] = "\n".join(siglines)
if desclines:
namespace["doc"] = "\n".join(desclines)
else:
namespace["signature"] = fallbackSig
self.st[nspath] = parent["symbols"][name] = namespace
self.nsstack.append(namespace)
self.visit(node.code)
self.nsstack.pop()
if prop_var:
# this is a property getter function,
# copy its return types to the corresponding property variable...
var_types = prop_var["types"]
for t in namespace["returns"]:
    if t not in var_types:
        var_types[t] = 0
    var_types[t] += 1
# ... as well as its line number
if "line" in namespace:
prop_var["line"] = namespace["line"]
def visitImport(self, node):
log.info("visitImport:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
imports = self.nsstack[-1].setdefault("imports", [])
for module, alias in node.names:
import_ = {"module": module}
if node.lineno:
import_["line"] = node.lineno
if alias:
import_["alias"] = alias
imports.append(import_)
def visitFrom(self, node):
log.info("visitFrom:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
imports = self.nsstack[-1].setdefault("imports", [])
module = node.modname
if node.level > 0:
module = ("." * node.level) + module
for symbol, alias in node.names:
import_ = {"module": module, "symbol": symbol}
if node.lineno:
import_["line"] = node.lineno
if alias:
import_["alias"] = alias
imports.append(import_)
# XXX
# def visitReturn(self, node):
# # set __rettypes__ on Functions
# pass
# def visitGlobal(self, node):
# # note for future visitAssign to control namespace
# pass
# def visitYield(self, node):
# # modify the Function into a generator??? what are the implications?
# pass
# def visitAssert(self, node):
# # support the assert hints that Wing does
# pass
def _assignVariable(self, varName, namespace, rhsNode, line,
isClassVar=0):
"""Handle a simple variable name assignment.
"varName" is the variable name being assigned to.
"namespace" is the namespace dict to which to assign the variable.
"rhsNode" is the ast.Node of the right-hand side of the
assignment.
"line" is the line number on which the variable is being assigned.
"isClassVar" (optional) is a boolean indicating if this var is
a class variable, as opposed to an instance variable
"""
log.debug("_assignVariable(varName=%r, namespace %s, rhsNode=%r, "
"line, isClassVar=%r)", varName,
'.'.join(namespace["nspath"]), rhsNode, isClassVar)
variable = namespace["symbols"].get(varName, None)
new_var = False
if variable is None:
new_var = True
variable = {"name": varName,
"nspath": namespace["nspath"]+(varName,),
# Could try to parse documentation from a near-by
# string.
"doc": None,
# 'types' is a dict mapping a type name to the number
# of times this was guessed as the variable type.
"types": {},
"symbols": {}}
# Determine attributes
attributes = []
if varName.startswith("__") and varName.endswith("__"):
pass
elif varName.startswith("__"):
attributes.append("private")
elif varName.startswith("_"):
attributes.append("protected")
variable["attributes"] = ' '.join(attributes)
variable["declaration"] = variable
if line:
variable["line"] = line
namespace["symbols"][varName] = variable
if isClassVar and "is-class-var" not in variable:
variable["is-class-var"] = 1
# line number of first class-level assignment wins
if line:
variable["line"] = line
if (not new_var and
_isfunction(variable) and
isinstance(rhsNode, ast.CallFunc) and
rhsNode.args and
isinstance(rhsNode.args[0], ast.Name) and
variable["name"] == rhsNode.args[0].name
):
# a special case for 2.4-style decorators
return
varTypes = variable["types"]
for t in self._guessTypes(rhsNode, namespace):
log.info("guessed type: %s ::= %s", varName, t)
if t not in varTypes:
varTypes[t] = 0
varTypes[t] += 1
def _visitSimpleAssign(self, lhsNode, rhsNode, line):
"""Handle a simple assignment: assignment to a symbol name or to
an attribute of a symbol name. If the given left-hand side (lhsNode)
is not a node type that can be handled, it is dropped.
"""
log.debug("_visitSimpleAssign(lhsNode=%r, rhsNode=%r)", lhsNode,
rhsNode)
if isinstance(lhsNode, ast.AssName):
# E.g.: foo = ...
# Assign this to the local namespace, unless there was a
# 'global' statement. (XXX Not handling 'global' yet.)
ns = self.nsstack[-1]
self._assignVariable(lhsNode.name, ns, rhsNode, line,
isClassVar=_isclass(ns))
elif isinstance(lhsNode, ast.AssAttr):
# E.g.: foo.bar = ...
# If we can resolve "foo", then we update that namespace.
variable, citdl = self._resolveObjectRef(lhsNode.expr)
if variable:
self._assignVariable(lhsNode.attrname,
variable["declaration"], rhsNode, line)
else:
log.debug("could not handle simple assign (module '%s'): "
"lhsNode=%r, rhsNode=%r", self.moduleName, lhsNode,
rhsNode)
def visitAssign(self, node):
log.info("visitAssign:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
lhsNode = node.nodes[0]
rhsNode = node.expr
if isinstance(lhsNode, (ast.AssName, ast.AssAttr)):
# E.g.:
# foo = ... (AssName)
# foo.bar = ... (AssAttr)
self._visitSimpleAssign(lhsNode, rhsNode, node.lineno)
elif isinstance(lhsNode, (ast.AssTuple, ast.AssList)):
# E.g.:
# foo, bar = ...
# [foo, bar] = ...
# If the RHS is a sequence with the same number of elements,
# then we update each assigned-to variable. Otherwise, bail.
if isinstance(rhsNode, (ast.Tuple, ast.List)):
if len(lhsNode.nodes) == len(rhsNode.nodes):
for i in range(len(lhsNode.nodes)):
self._visitSimpleAssign(lhsNode.nodes[i],
rhsNode.nodes[i],
node.lineno)
elif isinstance(rhsNode, ast.Dict):
if len(lhsNode.nodes) == len(rhsNode.items):
for i in range(len(lhsNode.nodes)):
self._visitSimpleAssign(lhsNode.nodes[i],
rhsNode.items[i][0],
node.lineno)
elif isinstance(rhsNode, ast.CallFunc):
for i in range(len(lhsNode.nodes)):
self._visitSimpleAssign(lhsNode.nodes[i],
None, # we don't have a good type.
node.lineno)
else:
log.info(
"visitAssign:: skipping unknown rhsNode type: %r - %r",
type(rhsNode), rhsNode)
elif isinstance(lhsNode, ast.Slice):
# E.g.: bar[1:2] = "foo"
# We don't bother with these: too hard.
log.info("visitAssign:: skipping slice - too hard")
pass
elif isinstance(lhsNode, ast.Subscript):
# E.g.: bar[1] = "foo"
# We don't bother with these: too hard.
log.info("visitAssign:: skipping subscript - too hard")
pass
else:
raise PythonCILEError("unexpected type of LHS of assignment: %r"
% lhsNode)
def _handleUnknownAssignment(self, assignNode, lineno):
if isinstance(assignNode, ast.AssName):
self._visitSimpleAssign(assignNode, None, lineno)
elif isinstance(assignNode, ast.AssTuple):
for anode in assignNode.nodes:
self._visitSimpleAssign(anode, None, lineno)
def visitFor(self, node):
log.info("visitFor:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
# E.g.:
# for foo in ...
# None: don't bother trying to resolve the type of the RHS
self._handleUnknownAssignment(node.assign, node.lineno)
self.visit(node.body)
def visitWith(self, node):
log.info("visitWith:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
self._handleUnknownAssignment(node.vars, node.lineno)
self.visit(node.body)
def visitTryExcept(self, node):
log.info("visitTryExcept:%d: %r", node.lineno,
self.lines and self.lines[node.lineno-1])
self.visit(node.body)
for handler in node.handlers:
try:
if handler[1]:
try:
lineno = handler[1].lineno
except AttributeError:
lineno = node.lineno
self._handleUnknownAssignment(handler[1], lineno)
if handler[2]:
self.visit(handler[2])
except IndexError:
pass
if node.else_:
self.visit(node.else_)
def _resolveObjectRef(self, expr):
"""Try to resolve the given expression to a variable namespace.
"expr" is some kind of ast.Node instance.
Returns the following 2-tuple for the object:
(<variable dict>, <CITDL string>)
where,
<variable dict> is the defining dict for the variable, e.g.
{'name': 'classvar', 'types': {'int': 1}}.
This is None if the variable could not be resolved.
<CITDL string> is a string of CITDL code (see the spec) describing
how to resolve the variable later. This is None if the
variable could be resolved or if the expression is not
expressible in CITDL (CITDL does not attempt to be a panacea).
"""
log.debug("_resolveObjectRef(expr=%r)", expr)
if isinstance(expr, ast.Name):
name = expr.name
nspath = self.nsstack[-1]["nspath"]
for i in range(len(nspath), -1, -1):
ns = self.st[nspath[:i]]
if name in ns["symbols"]:
return (ns["symbols"][name], None)
else:
log.debug(
"_resolveObjectRef: %r not in namespace %r", name,
'.'.join(ns["nspath"]))
elif isinstance(expr, ast.Getattr):
obj, citdl = self._resolveObjectRef(expr.expr)
decl = obj and obj["declaration"] or None # want the declaration
if (decl # and "symbols" in decl #XXX this "and"-part necessary?
and expr.attrname in decl["symbols"]):
return (decl["symbols"][expr.attrname], None)
elif isinstance(expr.expr, ast.Const):
# Special case: specifically refer to type object for
# attribute access on constants, e.g.:
# ' '.join
citdl = "__builtins__.%s.%s"\
% ((type(expr.expr.value).__name__), expr.attrname)
return (None, citdl)
# XXX Could optimize here for common built-in attributes. E.g.,
# we *know* that str.join() returns a string.
elif isinstance(expr, ast.Const):
# Special case: specifically refer to type object for constants.
return (None, "__builtins__.%s" % type(expr.value).__name__)
elif isinstance(expr, ast.CallFunc):
# XXX Would need flow analysis to have an object dict for whatever
# a __call__ would return.
pass
# Fallback: return CITDL code for delayed resolution.
log.debug("_resolveObjectRef: could not resolve %r", expr)
scope = '.'.join(self.nsstack[-1]["nspath"])
exprrepr = self._getCITDLExprRepr(expr)
if exprrepr:
if scope:
citdl = "%s %s" % (exprrepr, scope)
else:
citdl = exprrepr
else:
citdl = None
return (None, citdl)
def _guessTypes(self, expr, curr_ns=None):
log.debug("_guessTypes(expr=%r)", expr)
ts = []
if isinstance(expr, ast.Const):
ts = [type(expr.value).__name__]
elif isinstance(expr, ast.Tuple):
ts = [tuple.__name__]
elif isinstance(expr, (ast.List, ast.ListComp)):
ts = [list.__name__]
elif hasattr(ast, 'Set') and isinstance(expr, ast.Set):
ts = [set.__name__]
elif isinstance(expr, ast.Dict):
ts = [dict.__name__]
elif isinstance(expr, (ast.Add, ast.Sub, ast.Mul, ast.Div, ast.Mod,
ast.Power)):
order = ["int", "bool", "long", "float", "complex", "string",
"unicode"]
possibles = self._guessTypes(expr.left) + self._guessTypes(expr.right)
ts = []
highest = -1
for possible in possibles:
if possible not in order:
ts.append(possible)
else:
highest = max(highest, order.index(possible))
if not ts and highest > -1:
ts = [order[highest]]
elif isinstance(expr, (ast.FloorDiv, ast.Bitand, ast.Bitor,
ast.Bitxor, ast.RightShift, ast.LeftShift)):
ts = [int.__name__]
elif isinstance(expr, (ast.Or, ast.And)):
ts = []
for node in expr.nodes:
for t in self._guessTypes(node):
if t not in ts:
ts.append(t)
elif isinstance(expr, (ast.Compare, ast.Not)):
ts = [type(1 == 2).__name__]
elif isinstance(expr, (ast.UnaryAdd, ast.UnarySub, ast.Invert,
ast.Not)):
ts = self._guessTypes(expr.expr)
elif isinstance(expr, ast.Slice):
ts = [list.__name__]
elif isinstance(expr, ast.Backquote):
ts = [str.__name__]
elif isinstance(expr, (ast.Name, ast.Getattr)):
variable, citdl = self._resolveObjectRef(expr)
if variable:
if _isclass(variable) or _isfunction(variable):
ts = ['.'.join(variable["nspath"])]
else:
ts = variable["types"].keys()
elif citdl:
ts = [citdl]
elif isinstance(expr, ast.CallFunc):
variable, citdl = self._resolveObjectRef(expr.node)
if variable:
# XXX When/if we support <returns/> and if we have that
# info for this 'variable' we can return an actual
# value here.
# Optimizing shortcut: if the variable is a class, then just
# use that class definition as its type, i.e. 'mymodule.MyClass'
# instead of 'type(call(mymodule.MyClass))'.
# Remove the common leading namespace elements.
scope_parts = list(variable["nspath"])
if curr_ns is not None:
for part in curr_ns["nspath"]:
if scope_parts and part == scope_parts[0]:
scope_parts.pop(0)
else:
break
scope = '.'.join(scope_parts)
if _isclass(variable):
ts = [scope]
else:
ts = [scope+"()"]
elif citdl:
# For code like this:
# for line in lines:
# line = line.rstrip()
# this results in a type guess of "line.rstrip <funcname>".
# That sucks. Really it should at least be line.rstrip() so
# that runtime CITDL evaluation can try to determine that
# rstrip() is a _function_ call rather than _class creation_,
# which is the current result. (c.f. bug 33493)
# XXX We *could* attempt to guess based on where we know
# "line" to be a module import: the only way that
# 'rstrip' could be a class rather than a function.
# TW: I think it should always use "()" no matter if it's
# a class or a function. The codeintel handler can work
# out which one it is. This gives us the ability to then
# distinguish between class methods and instance methods,
# as class methods look like:
# MyClass.staticmethod()
# and instance methods like:
# MyClass().instancemethod()
# Updated to use "()".
# Ensure we only add the "()" to the type part, not to the
# scope (if it exists) part, which is separated by a space. Bug:
# http://bugs.activestate.com/show_bug.cgi?id=71987
# citdl in this case looks like "string.split myfunction"
ts = citdl.split(None, 1)
ts[0] += "()"
ts = [" ".join(ts)]
elif isinstance(expr, (ast.Subscript, ast.Lambda)):
pass
else:
log.info("don't know how to guess types from this expr: %r", expr)
return ts
def _getExprRepr(self, node):
"""Return a string representation for this Python expression.
Raises PythonCILEError if can't do it.
"""
s = None
if isinstance(node, ast.Name):
s = node.name
elif isinstance(node, ast.Const):
s = repr(node.value)
elif isinstance(node, ast.Getattr):
s = '.'.join([self._getExprRepr(node.expr), node.attrname])
elif isinstance(node, ast.List):
items = [self._getExprRepr(c) for c in node.getChildren()]
s = "[%s]" % ", ".join(items)
elif isinstance(node, ast.Tuple):
items = [self._getExprRepr(c) for c in node.getChildren()]
s = "(%s)" % ", ".join(items)
elif hasattr(ast, 'Set') and isinstance(node, ast.Set):
items = [self._getExprRepr(c) for c in node.getChildren()]
s = "{%s}" % ", ".join(items)
elif isinstance(node, ast.Dict):
items = ["%s: %s" % (self._getExprRepr(k), self._getExprRepr(v))
for (k, v) in node.items]
s = "{%s}" % ", ".join(items)
elif isinstance(node, ast.CallFunc):
s = self._getExprRepr(node.node)
s += "("
allargs = []
for arg in node.args:
allargs.append(self._getExprRepr(arg))
if node.star_args:
for arg in node.star_args:
allargs.append("*" + self._getExprRepr(arg))
if node.dstar_args:
for arg in node.dstar_args:
allargs.append("**" + self._getExprRepr(arg))
s += ",".join(allargs)
s += ")"
elif isinstance(node, ast.Subscript):
s = "[%s]" % self._getExprRepr(node.expr)
elif isinstance(node, ast.Backquote):
s = "`%s`" % self._getExprRepr(node.expr)
elif isinstance(node, ast.Slice):
dumpNode(node)
s = self._getExprRepr(node.expr)
s += "["
if node.lower:
s += self._getExprRepr(node.lower)
s += ":"
if node.upper:
s += self._getExprRepr(node.upper)
s += "]"
elif isinstance(node, ast.UnarySub):
s = "-" + self._getExprRepr(node.expr)
elif isinstance(node, ast.UnaryAdd):
s = "+" + self._getExprRepr(node.expr)
elif isinstance(node, ast.Add):
    s = self._getExprRepr(node.left) + "+" + self._getExprRepr(node.right)
elif isinstance(node, ast.Sub):
    s = self._getExprRepr(node.left) + "-" + self._getExprRepr(node.right)
elif isinstance(node, ast.Mul):
    s = self._getExprRepr(node.left) + "*" + self._getExprRepr(node.right)
elif isinstance(node, ast.Div):
    s = self._getExprRepr(node.left) + "/" + self._getExprRepr(node.right)
elif isinstance(node, ast.FloorDiv):
    s = self._getExprRepr(node.left) + "//" + self._getExprRepr(node.right)
elif isinstance(node, ast.Mod):
    s = self._getExprRepr(node.left) + "%" + self._getExprRepr(node.right)
elif isinstance(node, ast.Power):
    s = self._getExprRepr(node.left) + "**" + self._getExprRepr(node.right)
elif isinstance(node, ast.LeftShift):
    s = self._getExprRepr(node.left) + "<<" + self._getExprRepr(node.right)
elif isinstance(node, ast.RightShift):
    s = self._getExprRepr(node.left) + ">>" + self._getExprRepr(node.right)
elif isinstance(node, ast.Keyword):
s = node.name + "=" + self._getExprRepr(node.expr)
elif isinstance(node, ast.Bitor):
creprs = []
for cnode in node.nodes:
if isinstance(cnode, (ast.Const, ast.Name)):
crepr = self._getExprRepr(cnode)
else:
crepr = "(%s)" % self._getExprRepr(cnode)
creprs.append(crepr)
s = "|".join(creprs)
elif isinstance(node, ast.Bitand):
creprs = []
for cnode in node.nodes:
if isinstance(cnode, (ast.Const, ast.Name)):
crepr = self._getExprRepr(cnode)
else:
crepr = "(%s)" % self._getExprRepr(cnode)
creprs.append(crepr)
s = "&".join(creprs)
elif isinstance(node, ast.Bitxor):
creprs = []
for cnode in node.nodes:
if isinstance(cnode, (ast.Const, ast.Name)):
crepr = self._getExprRepr(cnode)
else:
crepr = "(%s)" % self._getExprRepr(cnode)
creprs.append(crepr)
s = "^".join(creprs)
elif isinstance(node, ast.Lambda):
s = "lambda"
defaultArgsBaseIndex = len(node.argnames) - len(node.defaults)
if node.kwargs:
defaultArgsBaseIndex -= 1
if node.varargs:
defaultArgsBaseIndex -= 1
varargsIndex = len(node.argnames)-2
else:
varargsIndex = None
kwargsIndex = len(node.argnames)-1
elif node.varargs:
defaultArgsBaseIndex -= 1
varargsIndex = len(node.argnames)-1
kwargsIndex = None
else:
varargsIndex = kwargsIndex = None
args = []
for i in range(len(node.argnames)):
argOrArgTuple = node.argnames[i]
if isinstance(argOrArgTuple, tuple):
arg = "(%s)" % ','.join(argOrArgTuple)
if i >= defaultArgsBaseIndex:
defaultNode = node.defaults[i-defaultArgsBaseIndex]
try:
arg += "="+self._getExprRepr(defaultNode)
except PythonCILEError:
# XXX Work around some trouble cases.
arg += "=..."
else:
argname = node.argnames[i]
if i == kwargsIndex:
arg = "**"+argname
elif i == varargsIndex:
arg = "*"+argname
elif i >= defaultArgsBaseIndex:
defaultNode = node.defaults[i-defaultArgsBaseIndex]
try:
arg = argname+"="+self._getExprRepr(defaultNode)
except PythonCILEError:
# XXX Work around some trouble cases.
arg = argname+"=..."
else:
arg = argname
args.append(arg)
if args:
s += " " + ",".join(args)
try:
s += ": " + self._getExprRepr(node.code)
except PythonCILEError:
# XXX Work around some trouble cases.
s += ":..."
else:
raise PythonCILEError("don't know how to get string repr "
"of expression: %r" % node)
return s
def _getCITDLExprRepr(self, node, _level=0):
"""Return a string repr for this expression that CITDL processing
can handle.
CITDL is no panacea -- it is meant to provide simple delayed type
determination. As a result, many complicated expressions cannot
be handled. If the expression is not within CITDL's scope, then None
is returned.
"""
s = None
if isinstance(node, ast.Name):
s = node.name
elif isinstance(node, ast.Const):
s = repr(node.value)
elif isinstance(node, ast.Getattr):
exprRepr = self._getCITDLExprRepr(node.expr, _level+1)
if exprRepr is None:
pass
else:
s = '.'.join([exprRepr, node.attrname])
elif isinstance(node, ast.List):
s = "[]"
elif isinstance(node, ast.Tuple):
s = "()"
elif hasattr(ast, 'Set') and isinstance(node, ast.Set):
s = "set()"
elif isinstance(node, ast.Dict):
s = "{}"
elif isinstance(node, ast.CallFunc):
# Only allow CallFunc at the top-level. I.e. this:
# spam.ham.eggs()
# is in scope, but this:
# spam.ham().eggs
# is not.
if _level != 0:
pass
else:
s = self._getCITDLExprRepr(node.node, _level+1)
if s is not None:
s += "()"
return s
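The visitor above streams CIX out through `xml.etree.ElementTree.TreeBuilder` via its `emit_start`/`emit_data`/`emit_end` helpers. A minimal standalone sketch of that emission pattern (the tag and attribute values here are illustrative, not the full CIX schema):

```python
import xml.etree.ElementTree as ET

# Build a tiny <scope><variable/></scope> tree the same way the emit_*
# helpers do: push start tags, pop end tags, then close() for the root.
tb = ET.TreeBuilder()
tb.start("scope", {"ilk": "blob", "lang": "Python", "name": "mymodule"})
tb.start("variable", {"name": "x", "citdl": "int"})
tb.end("variable")
tb.end("scope")
root = tb.close()
```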
def _quietCompilerParse(content):
oldstderr = sys.stderr
sys.stderr = StringIO()
try:
return compiler.parse(content)
finally:
sys.stderr = oldstderr
def _quietCompile(source, filename, kind):
oldstderr = sys.stderr
sys.stderr = StringIO()
try:
return compile(source, filename, kind)
finally:
sys.stderr = oldstderr
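Both `_quiet*` helpers share the same swap-stderr-then-restore pattern. A generic sketch of that pattern (the `quiet_call` and `noisy_double` names are hypothetical, not part of this module):

```python
import sys
from io import StringIO

def quiet_call(fn, *args):
    # Swallow anything fn writes to stderr (e.g. parser warnings),
    # restoring the real stream even if fn raises.
    old_stderr = sys.stderr
    sys.stderr = StringIO()
    try:
        return fn(*args)
    finally:
        sys.stderr = old_stderr

def noisy_double(x):
    sys.stderr.write("warning: doubling\n")
    return x * 2
```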
def _getAST(content):
"""Return an AST for the given Python content.
If cannot, raise an error describing the problem.
"""
# EOL issues:
# compiler.parse() can't handle '\r\n' EOLs on Mac OS X and can't
# handle '\r' EOLs on any platform. Let's just always normalize.
# Unfortunately this extra work is only needed for the exceptional
# case. The problem is most acute on the Mac.
content = '\n'.join(content.splitlines(0))
# Is this faster?
# content = content.replace('\r\n', '\n').replace('\r', '\n')
errlineno = None # line number of a SyntaxError
ast_ = None
try:
ast_ = _quietCompilerParse(content)
except SyntaxError, ex:
errlineno = ex.lineno
log.debug("compiler parse #1: syntax error on line %d", errlineno)
except parser.ParserError, ex:
log.debug("compiler parse #1: parse error")
# Try to get the offending line number.
# compile() only likes LFs for EOLs.
lfContent = content.replace("\r\n", "\n").replace("\r", "\n")
try:
_quietCompile(lfContent, "dummy.py", "exec")
except SyntaxError, ex2:
errlineno = ex2.lineno
except:
pass
if errlineno is None:
raise # Does this re-raise 'ex' (as we want) or 'ex2'?
if errlineno is not None:
# There was a syntax error at this line: try to recover by effectively
# nulling out the offending line.
lines = content.splitlines(1)
offender = lines[errlineno-1]
log.info("syntax error on line %d: %r: trying to recover",
errlineno, offender)
indent = ''
for i in range(0, len(offender)):
if offender[i] in " \t":
indent += offender[i]
else:
break
lines[errlineno-1] = indent+"pass"+"\n"
newContent = ''.join(lines)
errlineno2 = None
try:
ast_ = _quietCompilerParse(newContent)
except SyntaxError, ex:
errlineno2 = ex.lineno
log.debug("compiler parse #2: syntax error on line %d", errlineno2)
except parser.ParserError, ex:
log.debug("compiler parse #2: parse error")
# Try to get the offending line number.
# compile() only likes LFs for EOLs.
lfContent = newContent.replace("\r\n", "\n").replace("\r", "\n")
try:
_quietCompile(lfContent, "dummy.py", "exec")
except SyntaxError, ex2:
errlineno2 = ex2.lineno
except:
pass
if errlineno2 is None:
raise
if ast_ is not None:
pass
elif errlineno2 == errlineno:
raise ValueError("cannot recover from syntax error: line %d"
% errlineno)
else:
raise ValueError("cannot recover from multiple syntax errors: "
"line %d and then %d" % (errlineno, errlineno2))
if ast_ is None:
raise ValueError("could not generate AST")
return ast_
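The recovery strategy above (replace the offending line with an indented `pass`, then reparse) can be sketched with the modern `ast` module, since the `compiler` package this file relies on only exists in Python 2. This is a simplified, hypothetical illustration (the name `parse_with_recovery` is mine, and it retries only once rather than checking for a second distinct error):

```python
import ast

def parse_with_recovery(source):
    """One-shot sketch of _getAST's recovery idea: on a SyntaxError,
    null out the offending line with an indented `pass` and reparse."""
    try:
        return ast.parse(source)
    except SyntaxError as ex:
        lines = source.splitlines(True)
        offender = lines[ex.lineno - 1]
        # Preserve the offender's leading whitespace so the `pass`
        # stays at the same indentation level.
        indent = offender[:len(offender) - len(offender.lstrip(" \t"))]
        lines[ex.lineno - 1] = indent + "pass\n"
        return ast.parse("".join(lines))  # may still raise SyntaxError

# The bad `return )` on line 2 is replaced by `    pass`:
tree = parse_with_recovery("def f():\n    return )\n")
```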
_rx_cache = {}


def _rx(pattern, flags=0):
    if pattern not in _rx_cache:
        _rx_cache[pattern] = re.compile(pattern, flags)
    return _rx_cache[pattern]
def _convert3to2(src):
    # XXX: it might be much faster to do all of this by manipulating
    # parse trees produced by tdparser

    # except Foo as bar => except (Foo,), bar
    src = _rx(r'(\bexcept\s*)(\S.+?)\s+as\s+(\w+)\s*:').sub(
        r'\1(\2,), \3:', src)
    # 0o123 => 123
    src = _rx(r'\b0[oO](\d+)').sub(r'\1', src)
    # print(foo) => print_(foo)
    src = _rx(r'\bprint\s*\(').sub(r'print_(', src)
    # change forms of class Foo(metaclass=Cls3) to class Foo:
    src = _rx(r'(\bclass\s+\w+\s*)\(\s*\w+\s*=\s*\w+\s*\)\s*:').sub(
        r'\1:', src)
    # change forms of class Foo(..., arg=Base1, metaclass=Cls3) to
    # class Foo(...)
    src = _rx(r'(\bclass\s+\w+\s*\(.*?),?\s*\w+\s*=.+?\)\s*:').sub(
        r'\1):', src)
    # Remove return type annotations like def foo() -> int:
    src = _rx(r'(\bdef\s+\w+\s*\(.*?\))\s*->\s*\w+\s*:').sub(r'\1:', src)
    # def foo(foo:Bar, baz=lambda x: qoox): => def foo(bar, baz=_lambda(qoox)):
    src = _rx(r'(\bdef\s+\w+\s*\()(.+?)(\)\s*:)').sub(_clean_func_args, src)
    return src
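A quick, self-contained illustration of two of these textual rewrites (using plain `re` instead of the module's `_rx` cache), showing how Python 3 syntax is massaged into something the Python 2 `compiler` package will accept:

```python
import re

src = "try:\n    x = 0o17\nexcept ValueError as ex:\n    pass\n"

# except Foo as bar => except (Foo,), bar  (Python 2 except-clause syntax)
src = re.sub(r'(\bexcept\s*)(\S.+?)\s+as\s+(\w+)\s*:', r'\1(\2,), \3:', src)
# 0o123 => 123  (the 0o octal prefix postdates the Python 2 parser used here)
src = re.sub(r'\b0[oO](\d+)', r'\1', src)

# src is now:
#   try:
#       x = 17
#   except (ValueError,), ex:
#       pass
```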
def _clean_func_args(defn):
    argdef = defn.group(2)
    parser = tdparser.PyExprParser()
    try:
        arglist = parser.parse_bare_arglist(argdef)
        seen_args = False
        seen_kw = False
        py2 = []
        for arg in arglist:
            name, value, type = arg
            if name.id == "*":
                if not seen_kw:
                    name.value = "**kwargs"
                    py2.append(arg)
                    seen_kw = True
                seen_args = True
            elif name.value[:2] == "**":
                if not seen_kw:
                    py2.append(arg)
                    seen_kw = True
                seen_args = True
            elif name.value[0] == "*":
                if not seen_args:
                    seen_args = True
                    py2.append(arg)
            else:
                if seen_args or seen_kw:
                    break
                else:
                    py2.append(arg)
        cleared = tdparser.arg_list_py(py2)
    except tdparser.ParseError, ex:
        cleared = argdef
        log.exception("Couldn't parse (%r)" % argdef)
    return defn.group(1) + cleared + defn.group(3)
#---- public module interface

def scan_cix(content, filename, md5sum=None, mtime=None, lang="Python"):
    """Scan the given Python content and return Code Intelligence data
    conforming to the Code Intelligence XML format.

        "content" is the Python content to scan. This should be an
            encoded string: must be a string for `md5` and
            `compiler.parse` -- see bug 73461.
        "filename" is the source of the Python content (used in the
            generated output).
        "md5sum" (optional) if the MD5 hexdigest has already been calculated
            for the content, it can be passed in here. Otherwise this
            is calculated.
        "mtime" (optional) is a modified time for the file (in seconds since
            the "epoch"). If it is not specified the _current_ time is used.
            Note that the default is not to stat() the file and use that
            because the given content might not reflect the saved file state.
        "lang" (optional) is the language of the given file content.
            Typically this is "Python" (i.e. a pure Python file), but it
            may also be "DjangoHTML" or similar for Python embedded in
            other documents.
        XXX Add an optional 'eoltype' so that it need not be
            re-calculated if already known.

    This can raise one of SyntaxError, PythonCILEError or parser.ParserError
    if there was an error processing. Currently this implementation uses the
    Python 'compiler' package for processing, therefore the given Python
    content must be syntactically correct.
    """
    codeintel = scan_et(content, filename, md5sum, mtime, lang)
    tree = et.ElementTree(codeintel)

    stream = StringIO()
    # this is against the W3C spec, but ElementTree wants it lowercase
    tree.write(stream, "utf-8")
    raw_cix = stream.getvalue()
    # XXX: why is this 0xA -> &#10; conversion necessary?
    # It makes no sense, but some tests break without it
    # (like cile/scaninputs/path:cdata_close.py)
    cix = raw_cix.replace('\x0a', '&#10;')
    return cix
def scan_et(content, filename, md5sum=None, mtime=None, lang="Python"):
    """Scan the given Python content and return Code Intelligence data
    conforming to the Code Intelligence XML format.

        "content" is the Python content to scan. This should be an
            encoded string: must be a string for `md5` and
            `compiler.parse` -- see bug 73461.
        "filename" is the source of the Python content (used in the
            generated output).
        "md5sum" (optional) if the MD5 hexdigest has already been calculated
            for the content, it can be passed in here. Otherwise this
            is calculated.
        "mtime" (optional) is a modified time for the file (in seconds since
            the "epoch"). If it is not specified the _current_ time is used.
            Note that the default is not to stat() the file and use that
            because the given content might not reflect the saved file state.
        "lang" (optional) is the language of the given file content.
            Typically this is "Python" (i.e. a pure Python file), but it
            may also be "DjangoHTML" or similar for Python embedded in
            other documents.
        XXX Add an optional 'eoltype' so that it need not be
            re-calculated if already known.

    This can raise one of SyntaxError, PythonCILEError or parser.ParserError
    if there was an error processing. Currently this implementation uses the
    Python 'compiler' package for processing, therefore the given Python
    content must be syntactically correct.
    """
    log.info("scan '%s'", filename)
    if md5sum is None:
        md5sum = md5(content).hexdigest()
    if mtime is None:
        mtime = int(time.time())

    # 'compiler' both (1) wants a newline at the end and (2) can fail on
    # funky *whitespace* at the end of the file.
    content = content.rstrip() + '\n'

    if lang == "Python3":
        # Make Python3 code as compatible with pythoncile's Python2
        # parser as necessary for codeintel purposes.
        content = _convert3to2(content)

    if isinstance(filename, types.UnicodeType):
        filename = filename.encode('utf-8')
    # The 'path' attribute must use normalized dir separators.
    if sys.platform.startswith("win"):
        path = filename.replace('\\', '/')
    else:
        path = filename

    try:
        ast_ = _getAST(content)
        if _gClockIt:
            sys.stdout.write(" (ast:%.3fs)" % (_gClock() - _gStartTime))
    except Exception, ex:
        file = et.Element('file', _et_attrs(dict(lang=lang,
                                                 path=path,
                                                 error=str(ex))))
    else:
        moduleName = os.path.splitext(os.path.basename(filename))[0]
        visitor = AST2CIXVisitor(moduleName, content=content, lang=lang)
        if log.isEnabledFor(logging.DEBUG):
            walker = ExampleASTVisitor()
            walker.VERBOSE = 1
        else:
            walker = None
        compiler.walk(ast_, visitor, walker)
        if _gClockIt:
            sys.stdout.write(" (walk:%.3fs)" % (_gClock() - _gStartTime))

        if log.isEnabledFor(logging.INFO):
            # Dump a repr of the gathered info for debugging.
            # - We only have to dump the module namespace because
            #   everything else should be linked from it.
            for nspath, namespace in visitor.st.items():
                if len(nspath) == 0:  # this is the module namespace
                    pprint.pprint(namespace)

        file = visitor.getCIX(path)
        if _gClockIt:
            sys.stdout.write(" (getCIX:%.3fs)" % (_gClock() - _gStartTime))

    codeintel = et.Element('codeintel', _et_attrs(dict(version="2.0")))
    codeintel.append(file)
    return codeintel
#---- mainline

def main(argv):
    logging.basicConfig()

    # Parse options.
    try:
        opts, args = getopt.getopt(argv[1:], "Vvhf:cL:",
            ["version", "verbose", "help", "filename=", "md5=", "mtime=",
             "clock", "language="])
    except getopt.GetoptError, ex:
        log.error(str(ex))
        log.error("Try `pythoncile --help'.")
        return 1
    numVerboses = 0
    stdinFilename = None
    md5sum = None
    mtime = None
    lang = "Python"
    global _gClockIt
    for opt, optarg in opts:
        if opt in ("-h", "--help"):
            sys.stdout.write(__doc__)
            return
        elif opt in ("-V", "--version"):
            ver = '.'.join([str(part) for part in _version_])
            print "pythoncile %s" % ver
            return
        elif opt in ("-v", "--verbose"):
            numVerboses += 1
            if numVerboses == 1:
                log.setLevel(logging.INFO)
            else:
                log.setLevel(logging.DEBUG)
        elif opt in ("-f", "--filename"):
            stdinFilename = optarg
        elif opt in ("-L", "--language"):
            lang = optarg
        elif opt in ("--md5",):
            md5sum = optarg
        elif opt in ("--mtime",):
            mtime = optarg
        elif opt in ("-c", "--clock"):
            _gClockIt = 1
            import time
            global _gClock
            if sys.platform.startswith("win"):
                _gClock = time.clock
            else:
                _gClock = time.time

    if len(args) == 0:
        contentOnStdin = 1
        filenames = [stdinFilename or "<stdin>"]
    else:
        contentOnStdin = 0
        paths = []
        for arg in args:
            paths += glob.glob(arg)
        filenames = []
        for path in paths:
            if os.path.isfile(path):
                filenames.append(path)
            elif os.path.isdir(path):
                pyfiles = [os.path.join(path, n) for n in os.listdir(path)
                           if os.path.splitext(n)[1] == ".py"]
                pyfiles = [f for f in pyfiles if os.path.isfile(f)]
                filenames += pyfiles

    try:
        for filename in filenames:
            if contentOnStdin:
                log.debug("reading content from stdin")
                content = sys.stdin.read()
                log.debug("finished reading content from stdin")
                if mtime is None:
                    mtime = int(time.time())
            else:
                if mtime is None:
                    mtime = int(os.stat(filename)[stat.ST_MTIME])
                fin = open(filename, 'r')
                try:
                    content = fin.read()
                finally:
                    fin.close()

            if _gClockIt:
                sys.stdout.write("scanning '%s'..." % filename)
                global _gStartTime
                _gStartTime = _gClock()
            data = scan_cix(content, filename, md5sum=md5sum, mtime=mtime,
                            lang=lang)
            if _gClockIt:
                sys.stdout.write(" %.3fs\n" % (_gClock() - _gStartTime))
            elif data:
                sys.stdout.write(data)
    except PythonCILEError, ex:
        log.error(str(ex))
        if log.isEnabledFor(logging.DEBUG):
            print
            import traceback
            traceback.print_exception(*sys.exc_info())
        return 1
    except KeyboardInterrupt:
        log.debug("user abort")
        return 1


if __name__ == "__main__":
    sys.exit(main(sys.argv))
cb6e6dd109fe14cd4863e7a3ef1390182e64eed7 | 3142 | py | Python | lexical-parse-float/etc/limits.py | sjurajpuchky/rust-lexical | 827dd8722967030bfe75cfbd786f0e160ffb9c0c | ["Apache-2.0", "MIT"] | 249 stars | 76 issues | 35 forks

#!/usr/bin/env python3
"""
Generate the numeric limits for a given radix.
This is used for the fast-path algorithms, to calculate the
maximum number of digits or exponent bits that can be exactly
represented as a native value.
"""
import math
def is_pow2(value):
    '''Calculate if a value is a power of 2.'''
    floor = int(math.log2(value))
    return value == 2**floor


def remove_pow2(value):
    '''Remove a power of 2 from the value.'''
    while math.floor(value / 2) == value / 2:
        value //= 2
    return value


def feature(radix):
    '''Get the feature gate from the value.'''
    if radix == 10:
        return ''
    elif is_pow2(radix):
        return 'if cfg!(feature = "power-of-two") '
    return 'if cfg!(feature = "radix") '
def exponent_limit(radix, mantissa_size, max_exp):
    '''
    Calculate the exponent limit for a float, for a given
    float type, where `radix` is the numerical base
    for the float type, and mantissa size is the length
    of the mantissa in bits. max_exp is the maximum
    binary exponent, where all exponent bits except the lowest
    are set (or, `2**(exponent_size - 1) - 1`).
    '''
    if is_pow2(radix):
        # Can always be exactly represented. We can't handle
        # denormal floats, however.
        scaled = int(max_exp / math.log2(radix))
        return (-scaled, scaled)
    else:
        # Positive and negative should be the same,
        # since we need to find the maximum digit
        # representable with mantissa digits.
        # We first need to remove the highest power-of-
        # two from the radix, since these will be represented
        # with exponent digits.
        base = remove_pow2(radix)
        precision = mantissa_size + 1
        exp_limit = int(precision / math.log2(base))
        return (-exp_limit, exp_limit)
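The non-power-of-two branch can be sanity-checked against the well-known f64 decimal fast-path bound: 10**22 is the largest power of ten an f64 holds exactly. A self-contained check (restating `remove_pow2` so the snippet runs on its own):

```python
import math

def remove_pow2(value):
    # Strip the largest power of two dividing `value`.
    while math.floor(value / 2) == value / 2:
        value //= 2
    return value

# f64: 52 mantissa bits => 53 bits of precision; radix 10 leaves base 5
# once its power of two is stripped (the 2s go into the binary exponent).
base = remove_pow2(10)
exp_limit = int((52 + 1) / math.log2(base))

assert base == 5
assert exp_limit == 22          # 10**22 == 5**22 * 2**22, and 5**22 fits
assert 5**22 < 2**53 < 5**23    # ...in 53 bits, while 5**23 does not
```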
def mantissa_limit(radix, mantissa_size):
    '''
    Calculate mantissa limit for a float type, given
    the radix and the length of the mantissa in bits.
    '''
    precision = mantissa_size + 1
    return int(precision / math.log2(radix))
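This formula matches the familiar digit guarantees for IEEE floats: an f64 (52 mantissa bits) stores any 15-digit decimal integer exactly, an f32 (23 bits) any 7-digit one. A self-contained check, restating the function:

```python
import math

def mantissa_limit(radix, mantissa_size):
    precision = mantissa_size + 1
    return int(precision / math.log2(radix))

assert mantissa_limit(10, 52) == 15   # f64 decimal digits
assert mantissa_limit(10, 23) == 7    # f32 decimal digits
assert mantissa_limit(2, 52) == 53    # in binary, every precision bit counts
```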
def all_limits(mantissa_size, exponent_size, type_name):
    '''Print limits for all radixes.'''
    max_exp = 2**(exponent_size - 1) - 1

    print('/// Get the exponent limit as a const fn.')
    print('#[inline(always)]')
    print(f'pub const fn {type_name}_exponent_limit(radix: u32) -> (i64, i64) {{')
    print('    match radix {')
    for radix in range(2, 37):
        exp_limit = exponent_limit(radix, mantissa_size, max_exp)
        print(f'        {radix} {feature(radix)}=> {exp_limit},')
    print('        _ => (0, 0),')
    print('    }')
    print('}')
    print('')

    print('/// Get the mantissa limit as a const fn.')
    print('#[inline(always)]')
    print(f'pub const fn {type_name}_mantissa_limit(radix: u32) -> i64 {{')
    print('    match radix {')
    for radix in range(2, 37):
        mant_limit = mantissa_limit(radix, mantissa_size)
        print(f'        {radix} {feature(radix)}=> {mant_limit},')
    print('        _ => 0,')
    print('    }')
    print('}')
    print('')
all_limits(23, 8, 'f32')
all_limits(52, 11, 'f64')
cb7491736bfbd22fd0d03bf388c942817f7860a2 | 299 | py | Python | Python/9. Errors and Exceptions/exercise2.py | mukeshmithrakumar/HackerRankSolutions | cd9e71be5e8703287b9f4efc042df8827175af1b | ["MIT"] | 12 stars | 10 forks

# Incorrect Regex "https://www.hackerrank.com/challenges/incorrect-regex/problem"
# Enter your code here. Read input from STDIN. Print output to STDOUT
import re
for i in range(int(input())):
    try:
        re.compile(input())
        print("True")
    except re.error:   # an invalid pattern raises re.error, not ValueError
        print("False")
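The same check can be packaged as a reusable helper, decoupled from STDIN (the name `is_valid_regex` is illustrative, not part of the exercise). The key detail is that `re.compile` signals a bad pattern by raising `re.error`:

```python
import re

def is_valid_regex(pattern):
    """Return True if `pattern` compiles as a regular expression."""
    try:
        re.compile(pattern)
        return True
    except re.error:
        return False

assert is_valid_regex(r"[a-z]+\d*") is True
assert is_valid_regex(r"*+") is False         # "nothing to repeat"
assert is_valid_regex(r"(unclosed") is False  # missing ")"
```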
cb7c33f1129f73ea16522644b0dc0c8fff4d920f | 712 | py | Python | UDPMulticastClient.py | monteno-m/Challenge | facebb31d3e04dcd497577e4e6f38020b846c614 | ["MIT"]

# Basic UDP multicast receiving client built in Python
import socket
import struct
def UDPReceiveMultiCast(group, port):
    # Create a UDP socket; SO_REUSEADDR lets several listeners bind
    # the same multicast port.
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(('', port))
    # Join the multicast group on all interfaces.
    mreq = struct.pack("4sl", socket.inet_aton(group), socket.INADDR_ANY)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    try:
        while True:
            print s.recv(10240)
    except KeyboardInterrupt:
        print 'Client Interrupted, Closing Socket'
        s.close()


if __name__ == '__main__':
    UDPReceiveMultiCast('224.1.1.1', 5007)
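The `struct.pack("4sl", ...)` call builds the C `ip_mreq` structure that `IP_ADD_MEMBERSHIP` expects: four raw bytes of group address (network byte order, from `inet_aton`) followed by a native long selecting the interface (`INADDR_ANY` = 0 means "any"). A small Python 3 check of what that packing produces:

```python
import socket
import struct

group = '224.1.1.1'
mreq = struct.pack("4sl", socket.inet_aton(group), socket.INADDR_ANY)

# inet_aton yields network byte order: 224.1.1.1 -> e0 01 01 01
assert socket.inet_aton(group) == b'\xe0\x01\x01\x01'
assert mreq[:4] == b'\xe0\x01\x01\x01'
# The overall size is platform-dependent: "l" uses the native long's
# size and alignment, so calcsize("4sl") varies across platforms.
assert struct.calcsize("4sl") == len(mreq)
```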
cb7f6dcf22a868c02e40a1c04eb933a6c7a3c1bd | 341 | py | Python | celerytest/tasks.py | dpfried/mocs | d594a47c2e3125a30fe14c2bc646d0e0614cb8b2 | ["MIT"] | 8 stars | 10 forks

from celery.task import task
from time import sleep
@task()
def add(x, y):
    return x + y


@task()
def status(delay):
    status.update_state(state='PROGRESS', meta={'description': 'starting timer'})
    sleep(delay)
    status.update_state(state='PROGRESS', meta={'description': 'after first sleep'})
    sleep(delay)
    return 'done'