hexsha string | size int64 | ext string | lang string | max_stars_repo_path string | max_stars_repo_name string | max_stars_repo_head_hexsha string | max_stars_repo_licenses list | max_stars_count int64 | max_stars_repo_stars_event_min_datetime string | max_stars_repo_stars_event_max_datetime string | max_issues_repo_path string | max_issues_repo_name string | max_issues_repo_head_hexsha string | max_issues_repo_licenses list | max_issues_count int64 | max_issues_repo_issues_event_min_datetime string | max_issues_repo_issues_event_max_datetime string | max_forks_repo_path string | max_forks_repo_name string | max_forks_repo_head_hexsha string | max_forks_repo_licenses list | max_forks_count int64 | max_forks_repo_forks_event_min_datetime string | max_forks_repo_forks_event_max_datetime string | content string | avg_line_length float64 | max_line_length int64 | alphanum_fraction float64 | qsc_code_num_words_quality_signal int64 | qsc_code_num_chars_quality_signal float64 | qsc_code_mean_word_length_quality_signal float64 | qsc_code_frac_words_unique_quality_signal float64 | qsc_code_frac_chars_top_2grams_quality_signal float64 | qsc_code_frac_chars_top_3grams_quality_signal float64 | qsc_code_frac_chars_top_4grams_quality_signal float64 | qsc_code_frac_chars_dupe_5grams_quality_signal float64 | qsc_code_frac_chars_dupe_6grams_quality_signal float64 | qsc_code_frac_chars_dupe_7grams_quality_signal float64 | qsc_code_frac_chars_dupe_8grams_quality_signal float64 | qsc_code_frac_chars_dupe_9grams_quality_signal float64 | qsc_code_frac_chars_dupe_10grams_quality_signal float64 | qsc_code_frac_chars_replacement_symbols_quality_signal float64 | qsc_code_frac_chars_digital_quality_signal float64 | qsc_code_frac_chars_whitespace_quality_signal float64 | qsc_code_size_file_byte_quality_signal float64 | qsc_code_num_lines_quality_signal float64 | qsc_code_num_chars_line_max_quality_signal float64 | qsc_code_num_chars_line_mean_quality_signal float64 | 
qsc_code_frac_chars_alphabet_quality_signal float64 | qsc_code_frac_chars_comments_quality_signal float64 | qsc_code_cate_xml_start_quality_signal float64 | qsc_code_frac_lines_dupe_lines_quality_signal float64 | qsc_code_cate_autogen_quality_signal float64 | qsc_code_frac_lines_long_string_quality_signal float64 | qsc_code_frac_chars_string_length_quality_signal float64 | qsc_code_frac_chars_long_word_length_quality_signal float64 | qsc_code_frac_lines_string_concat_quality_signal float64 | qsc_code_cate_encoded_data_quality_signal float64 | qsc_code_frac_chars_hex_words_quality_signal float64 | qsc_code_frac_lines_prompt_comments_quality_signal float64 | qsc_code_frac_lines_assert_quality_signal float64 | qsc_codepython_cate_ast_quality_signal float64 | qsc_codepython_frac_lines_func_ratio_quality_signal float64 | qsc_codepython_cate_var_zero_quality_signal bool | qsc_codepython_frac_lines_pass_quality_signal float64 | qsc_codepython_frac_lines_import_quality_signal float64 | qsc_codepython_frac_lines_simplefunc_quality_signal float64 | qsc_codepython_score_lines_no_logic_quality_signal float64 | qsc_codepython_frac_lines_print_quality_signal float64 | qsc_code_num_words int64 | qsc_code_num_chars int64 | qsc_code_mean_word_length int64 | qsc_code_frac_words_unique null | qsc_code_frac_chars_top_2grams int64 | qsc_code_frac_chars_top_3grams int64 | qsc_code_frac_chars_top_4grams int64 | qsc_code_frac_chars_dupe_5grams int64 | qsc_code_frac_chars_dupe_6grams int64 | qsc_code_frac_chars_dupe_7grams int64 | qsc_code_frac_chars_dupe_8grams int64 | qsc_code_frac_chars_dupe_9grams int64 | qsc_code_frac_chars_dupe_10grams int64 | qsc_code_frac_chars_replacement_symbols int64 | qsc_code_frac_chars_digital int64 | qsc_code_frac_chars_whitespace int64 | qsc_code_size_file_byte int64 | qsc_code_num_lines int64 | qsc_code_num_chars_line_max int64 | qsc_code_num_chars_line_mean int64 | qsc_code_frac_chars_alphabet int64 | qsc_code_frac_chars_comments int64 | 
qsc_code_cate_xml_start int64 | qsc_code_frac_lines_dupe_lines int64 | qsc_code_cate_autogen int64 | qsc_code_frac_lines_long_string int64 | qsc_code_frac_chars_string_length int64 | qsc_code_frac_chars_long_word_length int64 | qsc_code_frac_lines_string_concat null | qsc_code_cate_encoded_data int64 | qsc_code_frac_chars_hex_words int64 | qsc_code_frac_lines_prompt_comments int64 | qsc_code_frac_lines_assert int64 | qsc_codepython_cate_ast int64 | qsc_codepython_frac_lines_func_ratio int64 | qsc_codepython_cate_var_zero int64 | qsc_codepython_frac_lines_pass int64 | qsc_codepython_frac_lines_import int64 | qsc_codepython_frac_lines_simplefunc int64 | qsc_codepython_score_lines_no_logic int64 | qsc_codepython_frac_lines_print int64 | effective string | hits int64 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
e6dd4e443aa75702085230306111987d70f3876d | 3,411 | py | Python | custom_operations.py | chutien/zpp-mem | 470dec89dda475f7272b876f191cef9f8266a6dc | [
"MIT"
] | 1 | 2019-10-22T11:33:23.000Z | 2019-10-22T11:33:23.000Z | custom_operations.py | chutien/zpp-mem | 470dec89dda475f7272b876f191cef9f8266a6dc | [
"MIT"
] | null | null | null | custom_operations.py | chutien/zpp-mem | 470dec89dda475f7272b876f191cef9f8266a6dc | [
"MIT"
] | null | null | null | import tensorflow as tf
import numpy as np
def feedback_alignment_fc(input, weights, initializer=tf.initializers.he_normal(), name="fa_fc"):
    # Fixed (non-trainable) random feedback matrix with the shape of W^T; feedback
    # alignment uses it in place of the transposed weights in the backward pass.
    random = tf.get_variable("random", shape=reversed(weights.get_shape().as_list()),
                             initializer=initializer, use_resource=True, trainable=False)
    @tf.custom_gradient
    def func(x):
        def grad(dy, variables=[weights]):
            dx = tf.matmul(dy, random)           # input gradient via the fixed random matrix
            dw = tf.matmul(tf.transpose(x), dy)  # weight gradient as in ordinary backprop
            return dx, [dw]
        return tf.matmul(x, weights), grad
    with tf.name_scope(name):
        return func(input)
def feedback_alignment_conv(input, weights, strides, padding, use_cudnn_on_gpu=True, data_format='NHWC',
dilations=[1, 1, 1, 1], initializer=tf.initializers.he_normal(),
name="fa_conv"):
random = tf.get_variable("random", shape=weights.get_shape().as_list(), initializer=initializer, use_resource=True, trainable=False)
@tf.custom_gradient
def func(x):
def grad(dy, variables=[weights]):
dx = tf.nn.conv2d_backprop_input(tf.shape(x), random, dy, strides, padding, use_cudnn_on_gpu,
data_format, dilations)
dw = tf.nn.conv2d_backprop_filter(x, weights.get_shape(), dy, strides, padding, use_cudnn_on_gpu,
data_format, dilations)
return dx, [dw]
return tf.nn.conv2d(input, weights, strides, padding, use_cudnn_on_gpu, data_format, dilations), grad
with tf.name_scope(name):
return func(input)
def direct_feedback_alignment_fc(input, weights, output_dim, error_container, initializer=tf.initializers.he_normal(),
name="dfa_fc"):
random = tf.get_variable("random", shape=[output_dim, weights.shape[0]], initializer=initializer, use_resource=True, trainable=False)
@tf.custom_gradient
def func(x):
def grad(dy, variables=[weights]):
dx = tf.matmul(error_container[0], random, name='matmul_grad_x')
dw = tf.matmul(tf.transpose(x), dy, name='matmul_grad_w')
return dx, [dw]
return tf.matmul(x, weights, name='matmul_forward_x'), grad
with tf.name_scope(name):
return func(input)
def direct_feedback_alignment_conv(input, weights, output_dim, error_container, strides, padding,
use_cudnn_on_gpu=True, data_format='NHWC', dilations=[1, 1, 1, 1],
initializer=tf.initializers.he_normal(), name="dfa_conv"):
input_shape = tf.shape(input)
input_flat_shape = np.prod(input.shape[1:])
random = tf.get_variable("random", shape=[output_dim, input_flat_shape],
initializer=initializer, use_resource=True, trainable=False)
@tf.custom_gradient
def func(x):
def grad(dy, variables=[weights]):
dx = tf.reshape(tf.matmul(error_container[0], random), input_shape)
dw = tf.nn.conv2d_backprop_filter(x, weights.get_shape(), dy, strides, padding, use_cudnn_on_gpu,
data_format, dilations)
return dx, [dw]
return tf.nn.conv2d(input, weights, strides, padding, use_cudnn_on_gpu, data_format, dilations), grad
with tf.name_scope(name):
return func(input)
| 50.910448 | 137 | 0.623571 | 431 | 3,411 | 4.716937 | 0.162413 | 0.027546 | 0.058534 | 0.07575 | 0.880964 | 0.852435 | 0.778652 | 0.697 | 0.621249 | 0.621249 | 0 | 0.006746 | 0.261214 | 3,411 | 66 | 138 | 51.681818 | 0.8 | 0 | 0 | 0.568966 | 0 | 0 | 0.029317 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.206897 | false | 0 | 0.034483 | 0 | 0.448276 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
e6f4df86118a4e826c2d9d44980cd1f50cb81cec | 1,459 | py | Python | ifo/ifo.py | DiegoHeer/ifo | af3190c2024274d7610142d972bc5be2028f7777 | [
"MIT"
] | null | null | null | ifo/ifo.py | DiegoHeer/ifo | af3190c2024274d7610142d972bc5be2028f7777 | [
"MIT"
] | null | null | null | ifo/ifo.py | DiegoHeer/ifo | af3190c2024274d7610142d972bc5be2028f7777 | [
"MIT"
] | null | null | null | import pymsgbox
import backend
import dashboard
import database
def update_ifo():
# TODO
# Updates all data of backend based on database
pass
def currency_update():
# TODO
# Updates all data based on the currency selected in the Transaction Block of the Dashboard
pass
def entry():
# TODO
# Provide a user form for transaction entry and selectively updates backend
pass
def manual_update():
# TODO
# Provide a user form to filter out data, exporting the filtered data to a table in a new sheet,
    # which can then be manually updated by the user. After the update is complete, the user can refresh
# the database with the updated data using the refresh database button
pass
def manual_remove():
# TODO
# Provides a user form to filter out data, exporting the filtered data to a table in a new sheet.
    # The user can then delete lines of data which are not required anymore.
    # The database can then be updated by the user using the refresh database button
pass
def refresh_database():
# TODO
# Works in conjunction with the manual_update or manual_remove button.
# After the user updated the data from the sheet,
# the table is exported to a dataframe which is then used to update the database
pass
def tester():
# Temporary function to test main functions of project
# dashboard.tester()
# backend.tester()
# database.tester()
pass
| 26.053571 | 103 | 0.712132 | 217 | 1,459 | 4.75576 | 0.35023 | 0.040698 | 0.026163 | 0.034884 | 0.229651 | 0.199612 | 0.199612 | 0.129845 | 0.129845 | 0.129845 | 0 | 0 | 0.246744 | 1,459 | 55 | 104 | 26.527273 | 0.939035 | 0.719671 | 0 | 0.388889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.018182 | 0 | 1 | 0.388889 | true | 0.388889 | 0.222222 | 0 | 0.611111 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
fc0468cb2e5c8935ccf146b56a77f3227dd3e564 | 2,296 | py | Python | nlr/utils/__init__.py | john-james-sf/nlr | 57dc67aadb5cfd0c8f0181ddf672c606a865e45b | [
"BSD-3-Clause"
] | null | null | null | nlr/utils/__init__.py | john-james-sf/nlr | 57dc67aadb5cfd0c8f0181ddf672c606a865e45b | [
"BSD-3-Clause"
] | null | null | null | nlr/utils/__init__.py | john-james-sf/nlr | 57dc67aadb5cfd0c8f0181ddf672c606a865e45b | [
"BSD-3-Clause"
] | null | null | null | #!/usr/bin/env python3
# -*- coding:utf-8 -*-
# ======================================================================================================================== #
# Project : Natural Language Recommendation #
# Version : 0.1.0 #
# File : \__init__.py #
# Language : Python 3.7.11 #
# ------------------------------------------------------------------------------------------------------------------------ #
# Author : John James #
# Company : nov8.ai #
# Email : john.james@nov8.ai #
# URL : https://github.com/john-james-sf/nlr #
# ------------------------------------------------------------------------------------------------------------------------ #
# Created : Saturday, November 6th 2021, 11:08:28 pm #
# Modified : Monday, November 8th 2021, 12:26:27 pm #
# Modifier : John James (john.james@nov8.ai) #
# ------------------------------------------------------------------------------------------------------------------------ #
# License : BSD 3-clause "New" or "Revised" License #
# Copyright: (c) 2021 nov8.ai #
# ======================================================================================================================== #
| 109.333333 | 124 | 0.150697 | 80 | 2,296 | 4.275 | 0.7125 | 0.131579 | 0.076023 | 0.087719 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.038168 | 0.543554 | 2,296 | 20 | 125 | 114.8 | 0.288168 | 0.974303 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
fc0de13f13c870a71f6670f0bef45c0fd4073396 | 525 | py | Python | pymctdh/__init__.py | addschile/pymctdh | 20a93ce543526de1919757defceef16f9005f423 | [
"MIT"
] | null | null | null | pymctdh/__init__.py | addschile/pymctdh | 20a93ce543526de1919757defceef16f9005f423 | [
"MIT"
] | null | null | null | pymctdh/__init__.py | addschile/pymctdh | 20a93ce543526de1919757defceef16f9005f423 | [
"MIT"
] | null | null | null | from pymctdh import units
from pymctdh import pbasis
from pymctdh import wavefunction
from pymctdh import hamiltonian
from pymctdh import qoperator
from pymctdh import vmfpropagate
from pymctdh import results
from pymctdh.wavefunction import Wavefunction
from pymctdh.hamiltonian import Hamiltonian
from pymctdh.qoperator import QOperator
from pymctdh.pbasis import PBasis
from pymctdh.vmfpropagate import vmfpropagate,vmfpropagatejumps
from pymctdh.cmffixpropagate import cmffixpropagate
from pymctdh.results import Results
| 35 | 63 | 0.878095 | 64 | 525 | 7.203125 | 0.1875 | 0.334056 | 0.258134 | 0.099783 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.106667 | 525 | 14 | 64 | 37.5 | 0.982942 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
fc190e40929339dfcfeb544bb49d19378747c528 | 770 | py | Python | overlays/test_basic.py | stroxler/upypyre | 6db6e659bf35f5c8d8b719d61959f29ac6ec2f22 | [
"CC0-1.0"
] | null | null | null | overlays/test_basic.py | stroxler/upypyre | 6db6e659bf35f5c8d8b719d61959f29ac6ec2f22 | [
"CC0-1.0"
] | null | null | null | overlays/test_basic.py | stroxler/upypyre | 6db6e659bf35f5c8d8b719d61959f29ac6ec2f22 | [
"CC0-1.0"
] | 1 | 2022-03-31T13:15:33.000Z | 2022-03-31T13:15:33.000Z | #!/usr/bin/env python3
from basic import create_env_stack
def test_env_stack():
(
code_env,
ast_env,
class_body_env,
class_parents_env,
class_grandparents_env
) = create_env_stack(code={
"a": """
class X: pass
class Y(a.X): pass
""",
"b": """
class Z(a.X): pass
class W(b.Z): pass
""",
})
assert class_grandparents_env.get("b.Z", "") == []
assert class_grandparents_env.get("b.W", "") == ["a.X"]
class_grandparents_env.update("b", code="""
class Z(a.Y): pass
class W(b.Z): pass
""")
assert class_grandparents_env.get("b.Z", "") == ["a.X"]
assert class_grandparents_env.get("b.W", "") == ["a.Y"]
| 24.83871 | 59 | 0.512987 | 101 | 770 | 3.673267 | 0.267327 | 0.274933 | 0.32345 | 0.280323 | 0.425876 | 0.425876 | 0.425876 | 0.425876 | 0.253369 | 0.253369 | 0 | 0.001887 | 0.311688 | 770 | 30 | 60 | 25.666667 | 0.698113 | 0.027273 | 0 | 0.153846 | 0 | 0 | 0.294118 | 0 | 0 | 0 | 0 | 0 | 0.153846 | 1 | 0.038462 | true | 0.230769 | 0.038462 | 0 | 0.076923 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
fc50744469154688548b9a68c2eaaa1d215737a9 | 8,530 | py | Python | core/timeline.py | oliverseal/babybuddy | de68d172d49b659372c54b30afac09d13eabe79e | [
"BSD-2-Clause"
] | null | null | null | core/timeline.py | oliverseal/babybuddy | de68d172d49b659372c54b30afac09d13eabe79e | [
"BSD-2-Clause"
] | null | null | null | core/timeline.py | oliverseal/babybuddy | de68d172d49b659372c54b30afac09d13eabe79e | [
"BSD-2-Clause"
] | null | null | null | # -*- coding: utf-8 -*-
from django.urls import reverse
from django.utils import timezone, timesince
from django.utils.translation import gettext as _
from core.models import DiaperChange, Feeding, Note, Pumping, Sleep, TummyTime
from datetime import timedelta
def get_objects(date, child=None):
"""
Create a time-sorted dictionary of all events for a child.
:param date: a DateTime instance for the day to be summarized.
:param child: Child instance to filter results for (no filter if `None`).
:returns: a list of the day's events.
"""
min_date = date
max_date = date.replace(hour=23, minute=59, second=59)
events = []
_add_diaper_changes(min_date, max_date, events, child)
_add_feedings(min_date, max_date, events, child)
_add_pumpings(min_date, max_date, events, child)
_add_sleeps(min_date, max_date, events, child)
_add_tummy_times(min_date, max_date, events, child)
_add_notes(min_date, max_date, events, child)
explicit_type_ordering = {'start': 0, 'end': 1}
events.sort(
key=lambda x: (
x['time'],
explicit_type_ordering.get(x.get('type'), -1),
),
reverse=True,
)
return events
def _add_tummy_times(min_date, max_date, events, child=None):
instances = TummyTime.objects.filter(
start__range=(min_date, max_date)).order_by('-start')
if child:
instances = instances.filter(child=child)
for instance in instances:
details = []
if instance.milestone:
details.append(instance.milestone)
edit_link = reverse('core:tummytime-update', args=[instance.id])
events.append({
'time': timezone.localtime(instance.start),
'event': _('%(child)s started tummy time!') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'model_name': instance.model_name,
'type': 'start'
})
events.append({
'time': timezone.localtime(instance.end),
'event': _('%(child)s finished tummy time.') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'duration': timesince.timesince(instance.start, now=instance.end),
'model_name': instance.model_name,
'type': 'end'
})
def _add_sleeps(min_date, max_date, events, child=None):
instances = Sleep.objects.filter(
start__range=(min_date, max_date)).order_by('-start')
if child:
instances = instances.filter(child=child)
for instance in instances:
details = []
if instance.notes:
details.append(instance.notes)
edit_link = reverse('core:sleep-update', args=[instance.id])
events.append({
'time': timezone.localtime(instance.start),
'event': _('%(child)s fell asleep.') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'model_name': instance.model_name,
'type': 'start'
})
events.append({
'time': timezone.localtime(instance.end),
'event': _('%(child)s woke up.') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'duration': timesince.timesince(instance.start, now=instance.end),
'model_name': instance.model_name,
'type': 'end'
})
def _add_feedings(min_date, max_date, events, child=None):
# Ensure first feeding has a previous.
yesterday = min_date - timedelta(days=1)
prev_start = None
instances = Feeding.objects.filter(
start__range=(yesterday, max_date)).order_by('start')
if child:
instances = instances.filter(child=child)
for instance in instances:
details = []
if instance.notes:
details.append(instance.notes)
time_since_prev = None
if prev_start:
time_since_prev = \
timesince.timesince(prev_start, now=instance.start)
prev_start = instance.start
if instance.start < min_date:
continue
edit_link = reverse('core:feeding-update', args=[instance.id])
if instance.amount:
details.append(_('Amount: %(amount).0f') % {
'amount': instance.amount,
})
events.append({
'time': timezone.localtime(instance.start),
'event': _('%(child)s started feeding.') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'time_since_prev': time_since_prev,
'model_name': instance.model_name,
'type': 'start'
})
events.append({
'time': timezone.localtime(instance.end),
'event': _('%(child)s finished feeding.') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'duration': timesince.timesince(instance.start, now=instance.end),
'model_name': instance.model_name,
'type': 'end'
})
def _add_pumpings(min_date, max_date, events, child=None):
    # Ensure first pumping has a previous.
yesterday = min_date - timedelta(days=1)
prev_start = None
instances = Pumping.objects.filter(
start__range=(yesterday, max_date)).order_by('start')
if child:
instances = instances.filter(child=child)
for instance in instances:
details = []
if instance.notes:
details.append(instance.notes)
time_since_prev = None
if prev_start:
time_since_prev = \
timesince.timesince(prev_start, now=instance.start)
prev_start = instance.start
if instance.start < min_date:
continue
edit_link = reverse('core:pumping-update', args=[instance.id])
if instance.amount:
details.append(_('Amount: %(amount).0f') % {
'amount': instance.amount,
})
events.append({
'time': timezone.localtime(instance.start),
'event': _('Started pumping for %(child)s.') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'time_since_prev': time_since_prev,
'model_name': instance.model_name,
'type': 'start'
})
events.append({
'time': timezone.localtime(instance.end),
'event': _('Finished pumping for %(child)s.') % {
'child': instance.child.first_name
},
'details': details,
'edit_link': edit_link,
'duration': timesince.timesince(instance.start, now=instance.end),
'model_name': instance.model_name,
'type': 'end'
})
def _add_diaper_changes(min_date, max_date, events, child):
instances = DiaperChange.objects.filter(
time__range=(min_date, max_date)).order_by('-time')
if child:
instances = instances.filter(child=child)
for instance in instances:
contents = []
if instance.wet:
contents.append('💧')
if instance.solid:
contents.append('💩')
events.append({
'time': timezone.localtime(instance.time),
'event': _('%(child)s had a %(type)s diaper change.') % {
'child': instance.child.first_name,
'type': ''.join(contents),
},
'edit_link': reverse('core:diaperchange-update',
args=[instance.id]),
'model_name': instance.model_name
})
def _add_notes(min_date, max_date, events, child):
instances = Note.objects.filter(
time__range=(min_date, max_date)).order_by('-time')
if child:
instances = instances.filter(child=child)
for instance in instances:
events.append({
'time': timezone.localtime(instance.time),
'details': [instance.note],
'edit_link': reverse('core:note-update',
args=[instance.id]),
'model_name': instance.model_name
})
| 35.541667 | 78 | 0.571043 | 912 | 8,530 | 5.148026 | 0.139254 | 0.037487 | 0.03983 | 0.04771 | 0.797444 | 0.791693 | 0.791693 | 0.76869 | 0.716507 | 0.663898 | 0 | 0.002363 | 0.30551 | 8,530 | 239 | 79 | 35.690377 | 0.789838 | 0.038687 | 0 | 0.666667 | 0 | 0 | 0.120318 | 0.005508 | 0 | 0 | 0 | 0 | 0 | 1 | 0.033333 | false | 0 | 0.02381 | 0 | 0.061905 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
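In `get_objects` above, events that share a timestamp are ordered so that, in the reverse-chronological feed, `'end'` events appear before `'start'` events (a session's end is the more recent happening). A small stdlib sketch of that composite key, with made-up sample data:

```python
from datetime import datetime

explicit_type_ordering = {'start': 0, 'end': 1}

events = [
    {'time': datetime(2021, 11, 6, 9, 0), 'type': 'start', 'event': 'nap started'},
    {'time': datetime(2021, 11, 6, 9, 0), 'type': 'end', 'event': 'feeding finished'},
    {'time': datetime(2021, 11, 6, 8, 0), 'event': 'note'},  # no 'type' -> sorts as -1
]
# Same key as get_objects: newest first; at equal times, 'end' (1) outranks 'start' (0).
events.sort(
    key=lambda x: (x['time'], explicit_type_ordering.get(x.get('type'), -1)),
    reverse=True,
)
```

Events without a `type` (diaper changes, notes) get `-1` and so sort after both `'start'` and `'end'` entries at the same instant.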
fc7b1fab3746de20fba7fdda46ab20853ef1f85b | 533 | py | Python | pystsup/utilities/__init__.py | rithinch/GeneticAlgorithms-StudentSupervisorAllocation | cbfec0dad469323e49a9affb0cfa550dda5e5a4f | [
"MIT"
] | 12 | 2019-08-30T18:42:05.000Z | 2022-03-26T15:26:44.000Z | pystsup/utilities/__init__.py | rithinch/GeneticAlgorithms-StudentSupervisorAllocation | cbfec0dad469323e49a9affb0cfa550dda5e5a4f | [
"MIT"
] | 6 | 2019-09-18T19:28:39.000Z | 2022-02-04T19:09:07.000Z | pystsup/utilities/__init__.py | rithinch/GeneticAlgorithms-StudentSupervisorAllocation | cbfec0dad469323e49a9affb0cfa550dda5e5a4f | [
"MIT"
] | 2 | 2020-12-21T11:32:29.000Z | 2021-06-12T14:49:29.000Z | from .acmParser import parseFile, getPath
from .createRandomData import createRandomData,createRandomDataExcel
from .createExperiments import createExperimentsFromRealData,createExperiments,readFile,parseConfigFile,strToOp,saveExpResults,updateConfigFile, calcFitnessCache
from .runExperiments import runExperiments
from .integerPartition import partition
from .generateData import getData,writeFrontier,createExcelFile, scanInputData
from .runExperimentsOpt import createGAMSFileSup, runExperimentsOpt, runAllExperimentsOptStudent
| 59.222222 | 161 | 0.896811 | 42 | 533 | 11.380952 | 0.619048 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.06379 | 533 | 8 | 162 | 66.625 | 0.957916 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
5d8a35e495a0a7540263fb8d178360395040c497 | 121 | py | Python | app/errors/__init__.py | mredle/expenseapp | 0e95974ca48e63c56b83e7bdbc76630fb79ea6d4 | [
"MIT"
] | null | null | null | app/errors/__init__.py | mredle/expenseapp | 0e95974ca48e63c56b83e7bdbc76630fb79ea6d4 | [
"MIT"
] | 22 | 2019-02-20T21:32:49.000Z | 2020-10-21T22:16:54.000Z | app/errors/__init__.py | mredle/expenseapp | 0e95974ca48e63c56b83e7bdbc76630fb79ea6d4 | [
"MIT"
] | null | null | null | # -*- coding: utf-8 -*-
from flask import Blueprint
bp = Blueprint('errors', __name__)
from app.errors import handlers | 17.285714 | 34 | 0.710744 | 16 | 121 | 5.125 | 0.75 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009804 | 0.157025 | 121 | 7 | 35 | 17.285714 | 0.794118 | 0.173554 | 0 | 0 | 0 | 0 | 0.060606 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0.666667 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 0 | 5 |
5d8b050ca043d4825b596c46884880a1325015f7 | 319 | py | Python | ciex/compat.py | walkr/ciex | bbb61dff82ba767ce1c97caa83be69c0e139a0f5 | [
"MIT"
] | null | null | null | ciex/compat.py | walkr/ciex | bbb61dff82ba767ce1c97caa83be69c0e139a0f5 | [
"MIT"
] | null | null | null | ciex/compat.py | walkr/ciex | bbb61dff82ba767ce1c97caa83be69c0e139a0f5 | [
"MIT"
] | null | null | null | # Import certain modules based on python version
try:
from urlparse import urlparse
except ImportError:
from urllib.parse import urlparse
try:
from Queue import Queue
except ImportError:
from queue import Queue
try:
import configparser
except ImportError:
import ConfigParser as configparser
| 18.764706 | 48 | 0.768025 | 39 | 319 | 6.282051 | 0.435897 | 0.208163 | 0.171429 | 0.163265 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.206897 | 319 | 16 | 49 | 19.9375 | 0.968379 | 0.144201 | 0 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 0 | 0 | 0 | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
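The `compat.py` record above uses the standard try/except `ImportError` idiom for Python 2/3 dual support. The same pattern, with a quick check that the resolved name behaves as expected:

```python
# Try the Python 3 module path first, fall back to the Python 2 name.
try:
    from urllib.parse import urlparse   # Python 3
except ImportError:
    from urlparse import urlparse       # Python 2

parts = urlparse("https://example.com/scan?verbose=1")
```

Callers then import the name from the compat module itself, so version branching stays in one place.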
5dc884ee200680b78219988c5db78074548a35c2 | 557 | py | Python | onlinejudge/service/__init__.py | beet-aizu/online-judge-tools | 989ed65ae45bfe3153b726da1e80cf34d5dd88fc | [
"MIT"
] | null | null | null | onlinejudge/service/__init__.py | beet-aizu/online-judge-tools | 989ed65ae45bfe3153b726da1e80cf34d5dd88fc | [
"MIT"
] | null | null | null | onlinejudge/service/__init__.py | beet-aizu/online-judge-tools | 989ed65ae45bfe3153b726da1e80cf34d5dd88fc | [
"MIT"
] | null | null | null | # Python Version: 3.x
import onlinejudge.service.anarchygolf
import onlinejudge.service.aoj
import onlinejudge.service.atcoder
import onlinejudge.service.codechef
import onlinejudge.service.codeforces
import onlinejudge.service.csacademy
import onlinejudge.service.facebook
import onlinejudge.service.hackerrank
import onlinejudge.service.kattis
import onlinejudge.service.library_checker
import onlinejudge.service.poj
import onlinejudge.service.spoj
import onlinejudge.service.topcoder
import onlinejudge.service.toph
import onlinejudge.service.yukicoder
| 32.764706 | 42 | 0.879713 | 65 | 557 | 7.523077 | 0.353846 | 0.521472 | 0.736196 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001912 | 0.061041 | 557 | 16 | 43 | 34.8125 | 0.933078 | 0.034111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
5dcd84bb3b602362f88e9fa085d09108b9d968a3 | 271 | py | Python | config.py | allanyung/clamav-rest | cbf302ec28d15b19bba657d90e4123a0d6fbf7ca | [
"MIT"
] | 1 | 2021-10-20T09:21:27.000Z | 2021-10-20T09:21:27.000Z | config.py | allanyung/clamav-rest | cbf302ec28d15b19bba657d90e4123a0d6fbf7ca | [
"MIT"
] | 7 | 2019-08-31T11:30:12.000Z | 2021-04-10T06:40:40.000Z | config.py | allanyung/clamav-rest | cbf302ec28d15b19bba657d90e4123a0d6fbf7ca | [
"MIT"
] | 4 | 2021-04-26T08:11:23.000Z | 2021-11-08T08:34:15.000Z | import os
LOGLEVEL = os.environ.get('LOGLEVEL', 'INFO')
CLAMD_HOST = os.environ.get('CLAMD_HOST', 'clamav')
CLAMD_PORT = int(os.environ.get('CLAMD_PORT', 3310))
AUTH_USERNAME = os.environ.get('AUTH_USERNAME', None)
AUTH_PASSWORD = os.environ.get('AUTH_PASSWORD', None)
| 30.111111 | 53 | 0.741697 | 41 | 271 | 4.707317 | 0.390244 | 0.233161 | 0.310881 | 0.176166 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.016194 | 0.088561 | 271 | 8 | 54 | 33.875 | 0.765182 | 0 | 0 | 0 | 0 | 0 | 0.236162 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0.166667 | 0.166667 | 0 | 0.166667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
5dd63a7cb298b7f4fdcd3a48f08946f3a6f80f40 | 65 | py | Python | jarbas_hive_mind/slave/terminal.py | flo-mic/HiveMind-core | ccc394d53f69900c3a119cce54f2f2630d8099ea | [
"Apache-2.0"
] | 43 | 2020-11-23T17:53:47.000Z | 2022-02-07T13:30:57.000Z | jarbas_hive_mind/slave/terminal.py | flo-mic/HiveMind-core | ccc394d53f69900c3a119cce54f2f2630d8099ea | [
"Apache-2.0"
] | 24 | 2020-11-10T07:53:09.000Z | 2021-12-13T22:58:50.000Z | jarbas_hive_mind/slave/terminal.py | flo-mic/HiveMind-core | ccc394d53f69900c3a119cce54f2f2630d8099ea | [
"Apache-2.0"
] | 5 | 2020-12-26T00:44:29.000Z | 2021-09-14T16:38:51.000Z | from jarbas_hive_mind.nodes.terminal import *
# backwards compat
| 21.666667 | 45 | 0.830769 | 9 | 65 | 5.777778 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.107692 | 65 | 2 | 46 | 32.5 | 0.896552 | 0.246154 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
5de4c9103e2a6fbb7e15324357bbdfc259e51c44 | 316 | py | Python | apps/life_sci/dgllife/model/model_zoo/__init__.py | arangoml/dgl | d135058f9986fadcbdf6aa1011a00c3ad45a8ce3 | [
"Apache-2.0"
] | 3 | 2020-02-28T07:28:52.000Z | 2020-06-03T08:41:55.000Z | apps/life_sci/dgllife/model/model_zoo/__init__.py | arangoml/dgl | d135058f9986fadcbdf6aa1011a00c3ad45a8ce3 | [
"Apache-2.0"
] | null | null | null | apps/life_sci/dgllife/model/model_zoo/__init__.py | arangoml/dgl | d135058f9986fadcbdf6aa1011a00c3ad45a8ce3 | [
"Apache-2.0"
] | null | null | null | """Collection of model architectures"""
from .jtnn import *
from .dgmg import *
from .attentivefp_predictor import *
from .gat_predictor import *
from .gcn_predictor import *
from .mlp_predictor import *
from .schnet_predictor import *
from .mgcn_predictor import *
from .mpnn_predictor import *
from .acnn import *
| 26.333333 | 39 | 0.778481 | 41 | 316 | 5.829268 | 0.414634 | 0.376569 | 0.556485 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.139241 | 316 | 11 | 40 | 28.727273 | 0.878676 | 0.10443 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
f8de2c74bad3157adab737d6c5220c9395d1bae9 | 207 | py | Python | tridet/layers/__init__.py | flipson/dd3d | 86d8660c29612b79836dad9b6c39972ac2ca1557 | [
"MIT"
] | 227 | 2021-08-17T02:42:28.000Z | 2022-03-31T22:35:06.000Z | tridet/layers/__init__.py | flipson/dd3d | 86d8660c29612b79836dad9b6c39972ac2ca1557 | [
"MIT"
] | 21 | 2021-08-20T06:51:59.000Z | 2022-03-31T16:47:18.000Z | tridet/layers/__init__.py | flipson/dd3d | 86d8660c29612b79836dad9b6c39972ac2ca1557 | [
"MIT"
] | 35 | 2021-08-21T08:22:17.000Z | 2022-03-30T05:32:45.000Z | # Copyright 2021 Toyota Research Institute. All rights reserved.
from tridet.layers.bev_nms import bev_nms
from tridet.layers.iou_loss import IOULoss
from tridet.layers.smooth_l1_loss import smooth_l1_loss
| 41.4 | 65 | 0.850242 | 33 | 207 | 5.121212 | 0.575758 | 0.177515 | 0.284024 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.032432 | 0.10628 | 207 | 4 | 66 | 51.75 | 0.881081 | 0.304348 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
f8f01afd52199a12dc89d7d3a91611d9f2402e8c | 61 | py | Python | ledger/partner/strategy.py | jawaidm/ledger | 7094f3320d6a409a2a0080e70fa7c2b9dba4a715 | [
"Apache-2.0"
] | 5 | 2018-02-12T03:16:36.000Z | 2019-09-07T20:36:37.000Z | ledger/partner/strategy.py | jawaidm/ledger | 7094f3320d6a409a2a0080e70fa7c2b9dba4a715 | [
"Apache-2.0"
] | 162 | 2018-02-16T05:13:03.000Z | 2021-05-14T02:47:37.000Z | ledger/partner/strategy.py | jawaidm/ledger | 7094f3320d6a409a2a0080e70fa7c2b9dba4a715 | [
"Apache-2.0"
] | 14 | 2018-02-15T05:22:36.000Z | 2022-02-15T08:24:43.000Z | from oscar.apps.partner.strategy import PurchaseInfo, Base
| 15.25 | 58 | 0.819672 | 8 | 61 | 6.25 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114754 | 61 | 3 | 59 | 20.333333 | 0.925926 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
5d00ead655621922e9f6c38db6cc98909bfa6dcf | 137 | py | Python | performance/driver/core/summarizer/__init__.py | mesosphere/dcos-perf-test-driver | 8fba87cb6c6f64690c0b5bef5c7d9f2aa0fba06b | [
"Apache-2.0"
] | 2 | 2018-02-27T18:21:21.000Z | 2018-03-16T12:12:12.000Z | performance/driver/core/summarizer/__init__.py | mesosphere/dcos-perf-test-driver | 8fba87cb6c6f64690c0b5bef5c7d9f2aa0fba06b | [
"Apache-2.0"
] | 1 | 2018-06-25T07:14:41.000Z | 2018-06-25T07:14:41.000Z | performance/driver/core/summarizer/__init__.py | mesosphere/dcos-perf-test-driver | 8fba87cb6c6f64690c0b5bef5c7d9f2aa0fba06b | [
"Apache-2.0"
] | 1 | 2020-06-25T10:37:21.000Z | 2020-06-25T10:37:21.000Z | from .axis import SummarizerAxis, SummarizerAxisParameters
from .core import Summarizer
from .timeseries import SummarizerAxisTimeseries
| 34.25 | 58 | 0.875912 | 13 | 137 | 9.230769 | 0.692308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.094891 | 137 | 3 | 59 | 45.666667 | 0.967742 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
5d1a2d9cc637922f8fe59c82cdacb32d1db7309e | 40 | py | Python | tests/__init__.py | s-ball/remo_serv | 66accbd77183db0628a9618cf258656ec2d81316 | [
"MIT"
] | null | null | null | tests/__init__.py | s-ball/remo_serv | 66accbd77183db0628a9618cf258656ec2d81316 | [
"MIT"
] | null | null | null | tests/__init__.py | s-ball/remo_serv | 66accbd77183db0628a9618cf258656ec2d81316 | [
"MIT"
] | null | null | null | # Copyright (c) 2020 SBA- MIT License
| 13.333333 | 38 | 0.675 | 6 | 40 | 4.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.129032 | 0.225 | 40 | 2 | 39 | 20 | 0.741935 | 0.875 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
5d1c74303ab13d50ac482e21f7b73748e3591549 | 987 | py | Python | python/blender_addon/blender-flow.py | delldu/Study | 61b06e7712e20f75b23b048dc3bda5a6b471e9b5 | [
"Apache-2.0"
] | 2 | 2018-12-06T15:16:31.000Z | 2019-08-21T03:53:11.000Z | python/blender_addon/blender-flow.py | delldu/Study | 61b06e7712e20f75b23b048dc3bda5a6b471e9b5 | [
"Apache-2.0"
] | null | null | null | python/blender_addon/blender-flow.py | delldu/Study | 61b06e7712e20f75b23b048dc3bda5a6b471e9b5 | [
"Apache-2.0"
] | null | null | null | import bpy
# Add cube
bpy.ops.mesh.primitive_cube_add(location=(0, 0, 0))
bpy.context.object.scale = [5, 5, 5]
bpy.ops.object.modifier_add(type = 'FLUID')
bpy.context.object.modifiers['Fluid'].fluid_type = 'DOMAIN'
bpy.context.object.modifiers['Fluid'].domain_settings.domain_type = 'LIQUID'
bpy.context.object.modifiers['Fluid'].domain_settings.use_mesh = 1
bpy.context.object.modifiers['Fluid'].domain_settings.cache_type = 'ALL'
bpy.context.object.modifiers['Fluid'].domain_settings.cache_frame_end = 60
# Add ball
bpy.ops.mesh.primitive_ico_sphere_add(location=(0, 0, 0))
bpy.ops.object.modifier_add(type='FLUID')
bpy.context.object.modifiers['Fluid'].fluid_type = 'FLOW'
bpy.context.object.modifiers['Fluid'].flow_settings.flow_type = 'LIQUID'
bpy.context.object.modifiers['Fluid'].flow_settings.flow_behavior = 'INFLOW'
bpy.context.object.modifiers['Fluid'].flow_settings.use_initial_velocity = 1
bpy.context.object.modifiers['Fluid'].flow_settings.velocity_coord = [0, 0, -2]
| 42.913043 | 79 | 0.777102 | 148 | 987 | 5 | 0.25 | 0.148649 | 0.237838 | 0.337838 | 0.752703 | 0.752703 | 0.708108 | 0.448649 | 0.191892 | 0.191892 | 0 | 0.017241 | 0.059777 | 987 | 22 | 80 | 44.863636 | 0.780172 | 0.017224 | 0 | 0.125 | 0 | 0 | 0.094301 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.0625 | 0 | 0.0625 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
537670665b199c865097c31471df364a6ac252da | 168 | py | Python | bin/tda-order-codegen.py | zhangted/tda-api | 1169c87129b80c120217d420e4996a439c5903dc | [
"MIT"
] | 986 | 2020-04-14T21:50:03.000Z | 2022-03-29T19:09:31.000Z | bin/tda-order-codegen.py | zhangted/tda-api | 1169c87129b80c120217d420e4996a439c5903dc | [
"MIT"
] | 243 | 2020-04-26T14:05:34.000Z | 2022-03-12T13:02:51.000Z | bin/tda-order-codegen.py | zhangted/tda-api | 1169c87129b80c120217d420e4996a439c5903dc | [
"MIT"
] | 286 | 2020-04-14T22:17:04.000Z | 2022-03-27T07:30:15.000Z | #!/usr/bin/env python
from tda.scripts.orders_codegen import latest_order_main
if __name__ == '__main__':
import sys
sys.exit(latest_order_main(sys.argv[1:]))
| 24 | 56 | 0.744048 | 26 | 168 | 4.307692 | 0.730769 | 0.196429 | 0.267857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.006849 | 0.130952 | 168 | 6 | 57 | 28 | 0.760274 | 0.119048 | 0 | 0 | 0 | 0 | 0.054422 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
539c13097d3dfe5c595711e6424e47ee11cedfb3 | 148 | py | Python | PythonExercicios/ex021.py | Renanfn/python | f8b930599f76c4eee57e2917c924283a1deac4db | [
"MIT"
] | null | null | null | PythonExercicios/ex021.py | Renanfn/python | f8b930599f76c4eee57e2917c924283a1deac4db | [
"MIT"
] | null | null | null | PythonExercicios/ex021.py | Renanfn/python | f8b930599f76c4eee57e2917c924283a1deac4db | [
"MIT"
] | null | null | null | import pygame
pygame.mixer.init()
pygame.init()
pygame.mixer.music.load('ex021.mp3')
pygame.mixer.music.play(loops=0, start=0.0)
pygame.event.wait() | 24.666667 | 43 | 0.763514 | 25 | 148 | 4.52 | 0.56 | 0.292035 | 0.283186 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.049645 | 0.047297 | 148 | 6 | 44 | 24.666667 | 0.751773 | 0 | 0 | 0 | 0 | 0 | 0.060403 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.166667 | 0 | 0.166667 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
53db62554e83082ef6cb740d54084c66e13183ec | 20,848 | py | Python | core/policy/lbc_policy.py | L-Net-1992/DI-drive | cc7f47bedbf60922acbcf3a5f77fc8e274df62cf | [
"Apache-2.0"
] | 219 | 2021-07-07T21:55:21.000Z | 2022-03-31T14:56:43.000Z | core/policy/lbc_policy.py | L-Net-1992/DI-drive | cc7f47bedbf60922acbcf3a5f77fc8e274df62cf | [
"Apache-2.0"
] | 7 | 2021-08-11T05:26:19.000Z | 2022-03-29T22:21:24.000Z | core/policy/lbc_policy.py | L-Net-1992/DI-drive | cc7f47bedbf60922acbcf3a5f77fc8e274df62cf | [
"Apache-2.0"
] | 29 | 2021-07-08T03:17:22.000Z | 2022-03-16T03:51:43.000Z | from collections import namedtuple
import os
from ding.torch_utils.data_helper import to_device, to_dtype, to_tensor
import torch
from torchvision import transforms
import numpy as np
from typing import Dict, List, Any, Optional
from .base_carla_policy import BaseCarlaPolicy
from core.models import PIDController, CustomController
from core.models.lbc_model import LBCBirdviewModel, LBCImageModel
from core.utils.model_utils import common
from ding.utils.data import default_collate, default_decollate
from core.utils.learner_utils.loss_utils import LocationLoss
STEPS = 5
SPEED_STEPS = 3
COMMANDS = 4
class LBCBirdviewPolicy(BaseCarlaPolicy):
"""
LBC driving policy with Bird-eye View inputs. It has an LBC NN model which can handle
observations from several environments by collating data into batch. Each environment
has a PID controller related to it to get final control signals. In each updating, all
envs should use the correct env id to make the PID controller works well, and the
controller should be reset when starting a new episode.
It contains 2 modes: `eval` and `learn`. The learn mode will calculate all losses,
but will not back-propregate it. In `eval` mode, the output control signal will be
postprocessed to standard control signal in Carla.
:Arguments:
- cfg (Dict): Config Dict.
- enable_field(List): Enable policy filed, default to ['eval', 'learn']
:Interfaces:
reset, forward
"""
config = dict(
cuda=True,
model=dict(),
learn=dict(loss='l1', ),
steer_points=None,
pid=None,
gap=5,
dt=0.1,
crop_size=192,
pixels_per_meter=5,
)
def __init__(self, cfg: dict, enable_field: List = ['eval', 'learn']) -> None:
super().__init__(cfg, enable_field=enable_field)
self._controller_dict = dict()
if self._cfg.cuda:
if not torch.cuda.is_available():
print('[POLICY] No cuda device found! Use cpu by default')
self._device = torch.device('cpu')
else:
self._device = torch.device('cuda')
else:
self._device = torch.device('cpu')
self._one_hot = torch.FloatTensor(torch.eye(4))
self._transform = transforms.ToTensor()
self._gap = self._cfg.gap
self._dt = self._cfg.dt
self._crop_size = self._cfg.crop_size
self._pixels_per_meter = self._cfg.pixels_per_meter
self._steer_points = self._cfg.steer_points
self._pid = self._cfg.pid
if self._steer_points is None:
self._steer_points = {"1": 3, "2": 2, "3": 2, "4": 2}
if self._pid is None:
self._pid = {
"1": {
"Kp": 1.0,
"Ki": 0.1,
"Kd": 0
}, # Left
"2": {
"Kp": 1.0,
"Ki": 0.1,
"Kd": 0
}, # Right
"3": {
"Kp": 0.8,
"Ki": 0.1,
"Kd": 0
}, # Straight
"4": {
"Kp": 0.8,
"Ki": 0.1,
"Kd": 0
}, # Follow
}
self._speed_control_func = lambda: PIDController(K_P=1.0, K_I=0.1, K_D=2.5)
self._turn_control_func = lambda: CustomController(self._pid)
self._model = LBCBirdviewModel(**self._cfg.model)
self._model.to(self._device)
for field in self._enable_field:
getattr(self, '_init_' + field)()
def _init_learn(self) -> None:
if self._cfg.learn.loss == 'l1':
self._criterion = LocationLoss(choice='l1')
        elif self._cfg.learn.loss == 'l2':
self._criterion = LocationLoss(choice='l2')
def _postprocess(self, steer, throttle, brake):
control = {}
control.update(
{
'steer': np.clip(steer, -1.0, 1.0),
'throttle': np.clip(throttle, 0.0, 1.0),
'brake': np.clip(brake, 0.0, 1.0),
}
)
return control
def _reset_single(self, data_id):
if data_id in self._controller_dict:
self._controller_dict.pop(data_id)
self._controller_dict[data_id] = (self._speed_control_func(), self._turn_control_func())
def _reset(self, data_ids: Optional[List[int]] = None) -> None:
if data_ids is not None:
for id in data_ids:
self._reset_single(id)
else:
for id in self._controller_dict:
self._reset_single(id)
@torch.no_grad()
def _forward_eval(self, data: Dict) -> Dict[str, Any]:
"""
Running forward to get control signal of `eval` mode.
:Arguments:
- data (Dict): Input dict, with env id in keys and related observations in values,
:Returns:
Dict: Control and waypoints dict stored in values for each provided env id.
"""
data_ids = list(data.keys())
data = default_collate(list(data.values()))
birdview = to_dtype(data['birdview'], dtype=torch.float32).permute(0, 3, 1, 2)
speed = data['speed']
command_index = [i.item() - 1 for i in data['command']]
command = self._one_hot[command_index]
if command.ndim == 1:
command = command.unsqueeze(0)
_birdview = birdview.to(self._device)
_speed = speed.to(self._device)
_command = command.to(self._device)
if self._model._all_branch:
_locations, _ = self._model(_birdview, _speed, _command)
else:
_locations = self._model(_birdview, _speed, _command)
_locations = _locations.detach().cpu().numpy()
map_locations = _locations
actions = {}
for index, data_id in enumerate(data_ids):
# Pixel coordinates.
map_location = map_locations[index, ...]
map_location = (map_location + 1) / 2 * self._crop_size
targets = list()
for i in range(STEPS):
pixel_dx, pixel_dy = map_location[i]
pixel_dx = pixel_dx - self._crop_size / 2
pixel_dy = self._crop_size - pixel_dy
angle = np.arctan2(pixel_dx, pixel_dy)
dist = np.linalg.norm([pixel_dx, pixel_dy]) / self._pixels_per_meter
targets.append([dist * np.cos(angle), dist * np.sin(angle)])
target_speed = 0.0
for i in range(1, SPEED_STEPS):
pixel_dx, pixel_dy = map_location[i]
prev_dx, prev_dy = map_location[i - 1]
dx = pixel_dx - prev_dx
dy = pixel_dy - prev_dy
delta = np.linalg.norm([dx, dy])
target_speed += delta / (self._pixels_per_meter * self._gap * self._dt) / (SPEED_STEPS - 1)
_cmd = data['command'][index].item()
_sp = data['speed'][index].item()
n = self._steer_points.get(str(_cmd), 1)
targets = np.concatenate([[[0, 0]], targets], 0)
c, r = ls_circle(targets)
closest = common.project_point_to_circle(targets[n], c, r)
v = [1.0, 0.0, 0.0]
w = [closest[0], closest[1], 0.0]
alpha = common.signed_angle(v, w)
steer = self._controller_dict[data_id][1].run_step(alpha, _cmd)
throttle = self._controller_dict[data_id][0].step(target_speed - _sp)
brake = 0.0
if target_speed < 1.0:
steer = 0.0
throttle = 0.0
brake = 1.0
control = self._postprocess(steer, throttle, brake)
control.update({'map_locations': map_location})
actions[data_id] = {'action': control}
return actions
def _reset_eval(self, data_ids: Optional[List[int]] = None) -> None:
"""
        Reset policy of `eval` mode. It will change the NN model into 'eval' mode and reset
        the controllers for the provided env ids.
:Arguments:
- data_id (List[int], optional): List of env id to reset. Defaults to None.
"""
self._model.eval()
self._reset(data_ids)
def _forward_learn(self, data: Dict) -> Dict[str, Any]:
"""
Running forward of `learn` mode to get loss.
:Arguments:
- data (Dict): Input dict, with env id in keys and related observations in values,
:Returns:
Dict: information about training loss.
"""
birdview = to_dtype(data['birdview'], dtype=torch.float32).permute(0, 3, 1, 2)
speed = to_dtype(data['speed'], dtype=torch.float32)
command_index = [i.item() - 1 for i in data['command']]
command = self._one_hot[command_index]
if command.ndim == 1:
command = command.unsqueeze(0)
_birdview = birdview.to(self._device)
_speed = speed.to(self._device)
_command = command.to(self._device)
if self._model._all_branch:
_locations, _all_branch_locations = self._model(_birdview, _speed, _command)
else:
_locations = self._model(_birdview, _speed, _command)
locations_pred = _locations
if self._model._all_branch:
all_branch_locations_pred = _all_branch_locations
location_gt = data['location'].to(self._device)
loss = self._criterion(locations_pred, location_gt)
if self._model._all_branch:
return {
'loss': loss,
'locations_pred': locations_pred,
'all_branch_locations_pred': all_branch_locations_pred
}
return {
'loss': loss,
'locations_pred': locations_pred,
}
def _reset_learn(self, data_ids: Optional[List[int]] = None) -> None:
"""
Reset policy of `learn` mode. It will change the NN model into 'train' mode.
:Arguments:
- data_id (List[int], optional): List of env id to reset. Defaults to None.
"""
self._model.train()
class LBCImagePolicy(BaseCarlaPolicy):
"""
    LBC driving policy with RGB image inputs. It has an LBC NN model which can handle
    observations from several environments by collating data into a batch. Each environment
    has a PID controller related to it to produce final control signals. In each update, all
    envs should use the correct env id so that the PID controllers work well, and each
    controller should be reset when starting a new episode.
:Arguments:
- cfg (Dict): Config Dict.
:Interfaces:
reset, forward
"""
config = dict(
cuda=True,
model=dict(),
learn=dict(loss='l1', ),
camera_args=dict(
fixed_offset=4.0,
fov=90,
h=160,
w=384,
world_y=1.4,
),
steer_points=None,
pid=None,
gap=5,
dt=0.1,
)
def __init__(self, cfg: dict, enable_field: List = ['eval', 'learn']) -> None:
super().__init__(cfg, enable_field=enable_field)
self._controller_dict = dict()
if self._cfg.cuda:
if not torch.cuda.is_available():
print('[POLICY] No cuda device found! Use cpu by default')
self._device = torch.device('cpu')
else:
self._device = torch.device('cuda')
else:
self._device = torch.device('cpu')
self._one_hot = torch.FloatTensor(torch.eye(4))
self._transform = transforms.ToTensor()
self._camera_args = self._cfg.camera_args
self._fixed_offset = self._camera_args.fixed_offset
w = float(self._camera_args.w)
h = float(self._camera_args.h)
self._img_size = np.array([w, h])
self._gap = self._cfg.gap
self._dt = self._cfg.dt
self._steer_points = self._cfg.steer_points
self._pid = self._cfg.pid
if self._steer_points is None:
self._steer_points = {"1": 4, "2": 3, "3": 2, "4": 2}
if self._pid is None:
self._pid = {
"1": {
"Kp": 0.5,
"Ki": 0.20,
"Kd": 0.0
},
"2": {
"Kp": 0.7,
"Ki": 0.10,
"Kd": 0.0
},
"3": {
"Kp": 1.0,
"Ki": 0.10,
"Kd": 0.0
},
"4": {
"Kp": 1.0,
"Ki": 0.50,
"Kd": 0.0
}
}
self._speed_control_func = lambda: PIDController(K_P=.8, K_I=.08, K_D=0.)
self._turn_control_func = lambda: CustomController(self._pid)
self._engine_brake_threshold = 2.0
self._brake_threshold = 2.0
self._model = LBCImageModel(**self._cfg.model)
self._model.to(self._device)
for field in self._enable_field:
getattr(self, '_init_' + field)()
def _init_learn(self) -> None:
if self._cfg.learn.loss == 'l1':
self._criterion = LocationLoss(choice='l1')
        elif self._cfg.learn.loss == 'l2':
self._criterion = LocationLoss(choice='l2')
def _reset_single(self, data_id):
if data_id in self._controller_dict:
self._controller_dict.pop(data_id)
self._controller_dict[data_id] = (self._speed_control_func(), self._turn_control_func())
def _reset(self, data_ids: Optional[List[int]] = None) -> None:
if data_ids is not None:
for id in data_ids:
self._reset_single(id)
else:
for id in self._controller_dict:
self._reset_single(id)
def _postprocess(self, steer, throttle, brake):
control = {}
control.update(
{
'steer': np.clip(steer, -1.0, 1.0),
'throttle': np.clip(throttle, 0.0, 1.0),
'brake': np.clip(brake, 0.0, 1.0),
}
)
return control
def _unproject(self, output, world_y=1.4, fov=90):
cx, cy = self._img_size / 2
w, h = self._img_size
f = w / (2 * np.tan(fov * np.pi / 360))
xt = (output[..., 0:1] - cx) / f
yt = (output[..., 1:2] - cy) / f
world_z = world_y / yt
world_x = world_z * xt
world_output = np.stack([world_x, world_z], axis=-1)
if self._fixed_offset:
world_output[..., 1] -= self._fixed_offset
world_output = world_output.squeeze()
return world_output
def _forward_eval(self, data: Dict) -> Dict:
"""
Running forward to get control signal of `eval` mode.
:Arguments:
- data (Dict): Input dict, with env id in keys and related observations in values,
:Returns:
Dict: Control and waypoints dict stored in values for each provided env id.
"""
data_ids = list(data.keys())
data = default_collate(list(data.values()))
rgb = to_dtype(data['rgb'], dtype=torch.float32).permute(0, 3, 1, 2)
speed = data['speed']
command_index = [i.item() - 1 for i in data['command']]
command = self._one_hot[command_index]
if command.ndim == 1:
command = command.unsqueeze(0)
with torch.no_grad():
_rgb = rgb.to(self._device)
_speed = speed.to(self._device)
_command = command.to(self._device)
if self._model._all_branch:
model_pred, _ = self._model(_rgb, _speed, _command)
else:
model_pred = self._model(_rgb, _speed, _command)
model_pred = model_pred.detach().cpu().numpy()
pixels_pred = model_pred
actions = {}
for index, data_id in enumerate(data_ids):
# Project back to world coordinate
pixel_pred = pixels_pred[index, ...]
pixel_pred = (pixel_pred + 1) * self._img_size / 2
world_pred = self._unproject(pixel_pred, self._camera_args.world_y, self._camera_args.fov)
targets = [(0, 0)]
for i in range(STEPS):
pixel_dx, pixel_dy = world_pred[i]
angle = np.arctan2(pixel_dx, pixel_dy)
dist = np.linalg.norm([pixel_dx, pixel_dy])
targets.append([dist * np.cos(angle), dist * np.sin(angle)])
targets = np.array(targets)
target_speed = np.linalg.norm(targets[:-1] - targets[1:], axis=1).mean() / (self._gap * self._dt)
_cmd = data['command'][index].item()
_sp = data['speed'][index].item()
c, r = ls_circle(targets)
n = self._steer_points.get(str(_cmd), 1)
closest = common.project_point_to_circle(targets[n], c, r)
v = [1.0, 0.0, 0.0]
w = [closest[0], closest[1], 0.0]
alpha = common.signed_angle(v, w)
steer = self._controller_dict[data_id][1].run_step(alpha, _cmd)
throttle = self._controller_dict[data_id][0].step(target_speed - _sp)
brake = 0.0
# Slow or stop.
if target_speed <= self._engine_brake_threshold:
steer = 0.0
throttle = 0.0
if target_speed <= self._brake_threshold:
brake = 1.0
control = self._postprocess(steer, throttle, brake)
control.update({'map_locations': pixels_pred})
actions[data_id] = {'action': control}
return actions
    def _reset_eval(self, data_ids: Optional[List[int]] = None) -> None:
"""
        Reset policy of `eval` mode. It will change the NN model into 'eval' mode and reset
        the controllers for the provided env ids.
:Arguments:
- data_id (List[int], optional): List of env id to reset. Defaults to None.
"""
self._model.eval()
self._reset(data_ids)
def _forward_learn(self, data: Dict) -> Dict[str, Any]:
"""
Running forward of `learn` mode to get loss.
:Arguments:
- data (Dict): Input dict, with env id in keys and related observations in values,
:Returns:
Dict: information about training loss.
"""
# rgb = to_dtype(data['rgb'], dtype=torch.float32).permute(0, 3, 1, 2)
rgb = to_dtype(data['rgb'], dtype=torch.float32)
speed = to_dtype(data['speed'], dtype=torch.float32)
command_index = [i.item() - 1 for i in data['command']]
command = self._one_hot[command_index]
if command.ndim == 1:
command = command.unsqueeze(0)
_rgb = rgb.to(self._device)
_speed = speed.to(self._device)
_command = command.to(self._device)
if self._model._all_branch:
_locations, _all_branch_locations = self._model(_rgb, _speed, _command)
else:
_locations = self._model(_rgb, _speed, _command)
locations_pred = _locations
if self._model._all_branch:
all_branch_locations_pred = _all_branch_locations
location_gt = data['location'].to(self._device)
loss = self._criterion(locations_pred, location_gt)
if self._model._all_branch:
return {
'loss': loss,
'locations_pred': locations_pred,
'all_branch_locations_pred': all_branch_locations_pred
}
return {
'loss': loss,
'locations_pred': locations_pred,
}
def _reset_learn(self, data_ids: Optional[List[int]] = None) -> None:
"""
Reset policy of `learn` mode. It will change the NN model into 'train' mode.
:Arguments:
- data_id (List[int], optional): List of env id to reset. Defaults to None.
"""
self._model.train()
def ls_circle(points):
'''
Input: Nx2 points
Output: cx, cy, r
'''
xs = points[:, 0]
ys = points[:, 1]
us = xs - np.mean(xs)
vs = ys - np.mean(ys)
Suu = np.sum(us ** 2)
Suv = np.sum(us * vs)
Svv = np.sum(vs ** 2)
Suuu = np.sum(us ** 3)
Suvv = np.sum(us * vs * vs)
Svvv = np.sum(vs ** 3)
Svuu = np.sum(vs * us * us)
A = np.array([[Suu, Suv], [Suv, Svv]])
b = np.array([1 / 2. * Suuu + 1 / 2. * Suvv, 1 / 2. * Svvv + 1 / 2. * Svuu])
cx, cy = np.linalg.solve(A, b)
r = np.sqrt(cx * cx + cy * cy + (Suu + Svv) / len(xs))
cx += np.mean(xs)
cy += np.mean(ys)
return np.array([cx, cy]), r
| 33.197452 | 109 | 0.553243 | 2,606 | 20,848 | 4.198388 | 0.122794 | 0.004936 | 0.017549 | 0.010237 | 0.777534 | 0.747464 | 0.737318 | 0.727264 | 0.70213 | 0.680011 | 0 | 0.020379 | 0.331543 | 20,848 | 627 | 110 | 33.250399 | 0.76471 | 0.148216 | 0 | 0.64891 | 0 | 0 | 0.031226 | 0.002902 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048426 | false | 0 | 0.031477 | 0 | 0.113801 | 0.004843 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
53e01379b20fbafa93d2972956cf24ff393df66d | 188 | py | Python | Ipython_Help.py | liangguohuan/ZScripts | 6df7c9835fe04b9644bc47ba224b40cafd7ed650 | [
"MIT"
] | null | null | null | Ipython_Help.py | liangguohuan/ZScripts | 6df7c9835fe04b9644bc47ba224b40cafd7ed650 | [
"MIT"
] | null | null | null | Ipython_Help.py | liangguohuan/ZScripts | 6df7c9835fe04b9644bc47ba224b40cafd7ed650 | [
"MIT"
] | null | null | null | # Enter script code
keyboard.send_key("<backspace>")
keyboard.send_key("<home>")
keyboard.send_keys("help(")
keyboard.send_key("<end>")
keyboard.send_key(")")
keyboard.send_key("<enter>") | 23.5 | 32 | 0.728723 | 26 | 188 | 5.038462 | 0.423077 | 0.549618 | 0.572519 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.053191 | 188 | 8 | 33 | 23.5 | 0.735955 | 0.090426 | 0 | 0 | 0 | 0 | 0.205882 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
53fb348b8f0a605768c36273d4c47e7eead91360 | 354 | py | Python | src/main.py | paveles/Machine_Learning_and_Equity_Index_Returns | c5fbbb83fecea0ffd114dad51ace22738a67991c | [
"MIT"
] | 3 | 2019-06-30T13:18:25.000Z | 2019-07-29T14:06:30.000Z | src/main.py | paveles/Machine_Learning_and_Equity_Index_Returns | c5fbbb83fecea0ffd114dad51ace22738a67991c | [
"MIT"
] | 1 | 2019-06-29T12:04:47.000Z | 2019-06-29T12:04:47.000Z | src/main.py | paveles/Machine_Learning_and_Equity_Index_Returns | c5fbbb83fecea0ffd114dad51ace22738a67991c | [
"MIT"
] | null | null | null | #!/usr/bin/python
"""
Predicting Equity Index Returns using Machine Learning Methods - Main file that runs all scripts
"""
print("Execute src/main.py")
if __name__ == "__main__":
print("Execute src/data.py")
import src.data
print("Execute src/train.py")
import src.train
print("Execute src/visualize.py")
import src.visualize | 22.125 | 96 | 0.694915 | 49 | 354 | 4.857143 | 0.55102 | 0.201681 | 0.252101 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.180791 | 354 | 16 | 97 | 22.125 | 0.82069 | 0.319209 | 0 | 0 | 0 | 0 | 0.386266 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.375 | 0 | 0.375 | 0.5 | 0 | 0 | 0 | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 5 |
54c9176087553c4b980d9183eb8b1df63c0b3498 | 451 | py | Python | mtgcompiler/frontend/compilers/LarkMtgJson/MtgJsonPreprocessor.py | rmmilewi/mtgcompiler | b79b1608dbc1aa0b1eb427c8eb58dd40676b406d | [
"MIT"
] | 4 | 2018-09-06T03:56:59.000Z | 2021-10-30T21:41:37.000Z | mtgcompiler/frontend/compilers/LarkMtgJson/MtgJsonPreprocessor.py | rmmilewi/mtgcompiler | b79b1608dbc1aa0b1eb427c8eb58dd40676b406d | [
"MIT"
] | null | null | null | mtgcompiler/frontend/compilers/LarkMtgJson/MtgJsonPreprocessor.py | rmmilewi/mtgcompiler | b79b1608dbc1aa0b1eb427c8eb58dd40676b406d | [
"MIT"
] | null | null | null | from mtgcompiler.frontend.compilers.BaseImplementation.BasePreprocessor import BasePreprocessor
class MtgJsonPreprocessor(BasePreprocessor):
"""The MtgJson preprocessor."""
def __init__(self,options):
pass #TODO
def prelex(self,inputobj,flags):
return inputobj
def postlex(self,inputobj,flags):
return inputobj | 32.214286 | 95 | 0.578714 | 34 | 451 | 7.558824 | 0.676471 | 0.093385 | 0.132296 | 0.178988 | 0.241245 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.356984 | 451 | 14 | 96 | 32.214286 | 0.886207 | 0.066519 | 0 | 0.25 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 0 | 1 | 0.375 | false | 0.125 | 0.125 | 0.25 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 5 |
54d3c8004dc54db1b3816da7095afa04a30307d4 | 64,342 | py | Python | indigo/views.py | INDIGO-Initiative/database-app | 0bb8ef553ea01b6e3cc8ce045322b3337cb2364a | [
"MIT"
] | 1 | 2020-08-06T09:11:06.000Z | 2020-08-06T09:11:06.000Z | indigo/views.py | INDIGO-Initiative/database-app | 0bb8ef553ea01b6e3cc8ce045322b3337cb2364a | [
"MIT"
] | 61 | 2020-07-02T17:23:04.000Z | 2022-03-25T16:45:56.000Z | indigo/views.py | INDIGO-Initiative/database-app | 0bb8ef553ea01b6e3cc8ce045322b3337cb2364a | [
"MIT"
] | null | null | null | import csv
import os
import random
import tempfile
import jsondataferret
import jsondataferret.utils
import jsonpointer
import spreadsheetforms.api
from django.conf import settings
from django.contrib import messages
from django.contrib.auth.decorators import permission_required
from django.contrib.auth.mixins import PermissionRequiredMixin
from django.core.files.storage import default_storage
from django.db import connection
from django.db.models.functions import Now
from django.http import Http404, HttpResponse, HttpResponseRedirect, JsonResponse
from django.shortcuts import render
from django.urls import reverse
from django.views import View
from jsondataferret.models import Edit, Event, Record, Type
from jsondataferret.pythonapi.newevent import NewEventData, newEvent
import indigo.processdata
import indigo.utils
from indigo import (
TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID,
TYPE_FUND_PUBLIC_ID,
TYPE_ORGANISATION_PUBLIC_ID,
TYPE_PROJECT_PUBLIC_ID,
)
from indigo.dataqualityreport import DataQualityReportForProject
from indigo.tasks import task_process_imported_project_file
from .forms import (
AssessmentResourceNewForm,
FundNewForm,
ModelImportForm,
OrganisationImportForm,
OrganisationNewForm,
ProjectImportForm,
ProjectImportStage2Form,
ProjectMakeDisputedForm,
ProjectMakePrivateForm,
ProjectNewForm,
RecordChangeStatusForm,
)
from .models import (
AssessmentResource,
Fund,
Organisation,
Project,
ProjectImport,
Sandbox,
)
from .spreadsheetforms import (
convert_assessment_resource_data_to_spreadsheetforms_data,
convert_fund_data_to_spreadsheetforms_data,
convert_organisation_data_to_spreadsheetforms_data,
convert_project_data_to_spreadsheetforms_data,
extract_edits_from_assessment_resource_spreadsheet,
extract_edits_from_fund_spreadsheet,
extract_edits_from_organisation_spreadsheet,
extract_edits_from_project_spreadsheet,
)
########################### Home Page
def index(request):
return render(request, "indigo/index.html")
########################### Public - Project
def projects_list(request):
projects = Project.objects.filter(exists=True, status_public=True).order_by(
"public_id"
)
return render(request, "indigo/projects.html", {"projects": projects},)
def projects_list_download(request):
projects = Project.objects.filter(exists=True, status_public=True).order_by(
"public_id"
)
return _projects_list_download_worker(projects)
def projects_list_download_social_investment_prototype(request):
projects = Project.objects.filter(
exists=True, status_public=True, social_investment_prototype=True
).order_by("public_id")
return _projects_list_download_worker(projects)
def _projects_list_download_worker(projects):
response = HttpResponse(content_type="text/csv")
response["Content-Disposition"] = 'attachment; filename="projects.csv"'
labels = ["ID"]
keys = []
for config in settings.JSONDATAFERRET_TYPE_INFORMATION["project"]["fields"]:
if config.get("type", "") != "list" and config.get("key").find("/status") == -1:
labels.append(config.get("title"))
keys.append(config.get("key"))
labels.append("Organisations")
labels.append("Countries")
writer = csv.writer(response)
writer.writerow(labels)
for project in projects:
# id
row = [project.public_id]
# fields
for key in keys:
try:
row.append(jsonpointer.resolve_pointer(project.data_public, key))
except jsonpointer.JsonPointerException:
row.append("")
# orgs
orgs_list = jsonpointer.resolve_pointer(
project.data_public, "/organisations", []
)
if isinstance(orgs_list, list):
orgs = [jsonpointer.resolve_pointer(d, "/id", "") for d in orgs_list]
row.append(", ".join([i for i in orgs if isinstance(i, str) and i]))
else:
row.append("")
# Countries
delivery_locations_list = jsonpointer.resolve_pointer(
project.data_public, "/delivery_locations", []
)
if isinstance(delivery_locations_list, list):
delivery_locations = [
jsonpointer.resolve_pointer(d, "/location_country/value", "")
for d in delivery_locations_list
]
# List/set removes duplicates
row.append(
", ".join(
list(
set([i for i in delivery_locations if isinstance(i, str) and i])
)
)
)
else:
row.append("")
# project done
writer.writerow(row)
return response
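# NOTE: hedged sketch, not part of the original module. The CSV worker above
# twice joins the non-empty string entries of a list (organisation ids, and
# de-duplicated country names); a small helper (hypothetical name) captures
# that pattern in one testable place:
def _join_clean(values, unique=False):
    """Join the non-empty string entries of ``values`` with ", "."""
    items = [v for v in values if isinstance(v, str) and v]
    if unique:
        items = list(set(items))  # set() drops duplicates; order is not preserved
    return ", ".join(items)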
def project_download_blank_form(request):
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
guide_file = settings.JSONDATAFERRET_TYPE_INFORMATION["project"][
"spreadsheet_public_form_guide"
]
spreadsheetforms.api.make_empty_form(guide_file, out_file)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = "inline; filename=project.xlsx"
return response
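# NOTE: hedged sketch, not part of the original module. The random temp-file
# path built inline above recurs throughout this file; a helper like this
# (hypothetical name) would centralise the convention:
def _temp_xlsx_path():
    """Return a fresh, unlikely-to-collide .xlsx path in the temp directory."""
    return os.path.join(
        tempfile.gettempdir(),
        "indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
    )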
def project_index(request, public_id):
try:
project = Project.objects.get(
exists=True, status_public=True, public_id=public_id
)
except Project.DoesNotExist:
raise Http404("Project does not exist")
if not project.status_public or not project.exists:
raise Http404("Project does not exist")
field_data = jsondataferret.utils.get_field_list_from_json(
TYPE_PROJECT_PUBLIC_ID, project.data_public
)
return render(
request,
"indigo/project/index.html",
{"project": project, "field_data": field_data},
)
def project_download_form(request, public_id):
try:
project = Project.objects.get(public_id=public_id)
except Project.DoesNotExist:
raise Http404("Project does not exist")
if not project.status_public or not project.exists:
raise Http404("Project does not exist")
data = convert_project_data_to_spreadsheetforms_data(project, public_only=True)
guide_file = settings.JSONDATAFERRET_TYPE_INFORMATION["project"][
"spreadsheet_public_form_guide"
]
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
spreadsheetforms.api.put_data_in_form(guide_file, data, out_file)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = (
"inline; filename=project" + project.public_id + ".xlsx"
)
return response
########################### Public - Organisation
def organisations_list(request):
organisations = Organisation.objects.filter(
exists=True, status_public=True
).order_by("public_id")
return render(
request, "indigo/organisations.html", {"organisations": organisations},
)
def organisations_list_download(request):
organisations = Organisation.objects.filter(
exists=True, status_public=True
).order_by("public_id")
response = HttpResponse(content_type="text/csv")
response["Content-Disposition"] = 'attachment; filename="organisations.csv"'
labels = ["ID"]
keys = []
for config in settings.JSONDATAFERRET_TYPE_INFORMATION["organisation"]["fields"]:
if (
config.get("type", "") != "list"
and config.get("key").find("/contact") == -1
):
labels.append(config.get("title"))
keys.append(config.get("key"))
writer = csv.writer(response)
writer.writerow(labels)
for organisation in organisations:
row = [organisation.public_id]
for key in keys:
try:
row.append(jsonpointer.resolve_pointer(organisation.data_public, key))
except jsonpointer.JsonPointerException:
row.append("")
writer.writerow(row)
return response
def organisation_download_blank_form(request):
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
guide_file = os.path.join(
settings.BASE_DIR,
"indigo",
"spreadsheetform_guides",
"organisation_public_v003.xlsx",
)
spreadsheetforms.api.make_empty_form(guide_file, out_file)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = "inline; filename=organisation.xlsx"
return response
def organisation_index(request, public_id):
try:
organisation = Organisation.objects.get(
exists=True, status_public=True, public_id=public_id
)
except Organisation.DoesNotExist:
raise Http404("Organisation does not exist")
if not organisation.status_public or not organisation.exists:
raise Http404("Organisation does not exist")
field_data = jsondataferret.utils.get_field_list_from_json(
TYPE_ORGANISATION_PUBLIC_ID, organisation.data_public
)
return render(
request,
"indigo/organisation/index.html",
{"organisation": organisation, "field_data": field_data},
)
def organisation_download_form(request, public_id):
try:
organisation = Organisation.objects.get(public_id=public_id)
except Organisation.DoesNotExist:
raise Http404("Organisation does not exist")
if not organisation.status_public or not organisation.exists:
raise Http404("Organisation does not exist")
data = convert_organisation_data_to_spreadsheetforms_data(
organisation, public_only=True
)
guide_file = os.path.join(
settings.BASE_DIR,
"indigo",
"spreadsheetform_guides",
"organisation_public_v003.xlsx",
)
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
spreadsheetforms.api.put_data_in_form(guide_file, data, out_file)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = (
"inline; filename=organisation" + organisation.public_id + ".xlsx"
)
return response
########################### Public - Fund & Assessment Resource
class ModelList(View):
def get(self, request):
datas = self.__class__._model.objects.filter(
exists=True, status_public=True
).order_by("public_id")
return render(
request,
"indigo/" + self.__class__._model.__name__.lower() + "s.html",
{"datas": datas},
)
class FundList(ModelList):
_model = Fund
class AssessmentResourceList(ModelList):
_model = AssessmentResource
class ModelIndex(View):
def get(self, request, public_id):
try:
data = self.__class__._model.objects.get(
exists=True, status_public=True, public_id=public_id
)
except self._model.DoesNotExist:
raise Http404("Data does not exist")
if not data.status_public or not data.exists:
raise Http404("Data does not exist")
field_data = jsondataferret.utils.get_field_list_from_json(
self.__class__._type_public_id, data.data_public
)
return render(
request,
"indigo/" + self.__class__._model.__name__.lower() + "/index.html",
{"data": data, "field_data": field_data},
)
class FundIndex(ModelIndex):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
class AssessmentResourceIndex(ModelIndex):
_model = AssessmentResource
_type_public_id = TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID
class ModelDownloadForm(View):
def get(self, request, public_id):
try:
data = self.__class__._model.objects.get(
exists=True, status_public=True, public_id=public_id
)
except self._model.DoesNotExist:
raise Http404("Data does not exist")
if not data.status_public or not data.exists:
raise Http404("Data does not exist")
data_for_spreadsheet = self.__class__._convert_function(data, public_only=True)
guide_file = os.path.join(
settings.BASE_DIR,
"indigo",
"spreadsheetform_guides",
self.__class__._spreadsheet_file_name,
)
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
spreadsheetforms.api.put_data_in_form(
guide_file, data_for_spreadsheet, out_file
)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = (
"inline; filename=" + data.__class__.__name__ + data.public_id + ".xlsx"
)
return response
class FundDownloadForm(ModelDownloadForm):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
_spreadsheet_file_name = "fund_public_v001.xlsx"
_convert_function = convert_fund_data_to_spreadsheetforms_data
########################### Public - All
def all_public_data_file_per_record_in_zip(request):
    if not default_storage.exists("public/all_data_as_spreadsheets.zip"):
        raise Http404("File does not exist")
    wrapper = default_storage.open("public/all_data_as_spreadsheets.zip")
    response = HttpResponse(wrapper, content_type="application/zip")
    response[
        "Content-Disposition"
    ] = "attachment; filename=all_data_as_spreadsheets.zip"
    return response
def all_public_data_file_per_data_type_csv_in_zip(request):
    if not default_storage.exists("public/all_data_per_data_type_csv.zip"):
        raise Http404("File does not exist")
    wrapper = default_storage.open("public/all_data_per_data_type_csv.zip")
    response = HttpResponse(wrapper, content_type="application/zip")
    response[
        "Content-Disposition"
    ] = "attachment; filename=all_data_per_data_type_csv.zip"
    return response
########################### Public - Project - API
def api1_projects_list(request):
projects = Project.objects.filter()
data = {
"projects": [
{"id": p.public_id, "public": (p.exists and p.status_public)}
for p in projects
]
}
return JsonResponse(data)
def api1_project_index(request, public_id):
try:
project = Project.objects.get(public_id=public_id)
except Project.DoesNotExist:
raise Http404("Project does not exist")
if not project.status_public or not project.exists:
raise Http404("Project does not exist")
data = {"project": {"id": project.public_id, "data": project.data_public,}}
if (
settings.API_SANDBOX_DATA_PASSWORD
and request.GET.get("sandbox_data_password", "")
== settings.API_SANDBOX_DATA_PASSWORD
):
data["project"]["sandboxes"] = project.data_sandboxes
return JsonResponse(data)
########################### Public - Organisation - API
def api1_organisations_list(request):
organisations = Organisation.objects.filter()
data = {
"organisations": [
{"id": p.public_id, "public": (p.exists and p.status_public)}
for p in organisations
]
}
return JsonResponse(data)
def api1_organisation_index(request, public_id):
try:
organisation = Organisation.objects.get(public_id=public_id)
except Organisation.DoesNotExist:
raise Http404("Organisation does not exist")
if not organisation.status_public or not organisation.exists:
raise Http404("Organisation does not exist")
data = {
"organisation": {
"id": organisation.public_id,
"data": organisation.data_public,
}
}
return JsonResponse(data)
########################### Public - Fund & Assessment Resource - API
class API1ModelList(View):
def get(self, request):
datas = self.__class__._model.objects.filter().order_by("public_id")
output = {
self.__class__._model.type_id
+ "s": [
{"id": d.public_id, "public": (d.exists and d.status_public)}
for d in datas
]
}
return JsonResponse(output)
class API1FundList(API1ModelList):
_model = Fund
class API1AssessmentResourceList(API1ModelList):
_model = AssessmentResource
class API1ModelIndex(View):
def get(self, request, public_id):
try:
data = self.__class__._model.objects.get(
exists=True, status_public=True, public_id=public_id
)
except self._model.DoesNotExist:
raise Http404("Data does not exist")
if not data.status_public or not data.exists:
raise Http404("Data does not exist")
data = {
self.__class__._model.type_id: {
"id": data.public_id,
"data": data.data_public,
}
}
return JsonResponse(data)
class API1FundIndex(API1ModelIndex):
_model = Fund
class API1AssessmentResourceIndex(API1ModelIndex):
_model = AssessmentResource
########################### Admin
@permission_required("indigo.admin")
def admin_index(request):
return render(request, "indigo/admin/index.html")
########################### Admin - Projects
@permission_required("indigo.admin")
def admin_project_download_blank_form(request):
type_data = settings.JSONDATAFERRET_TYPE_INFORMATION.get(TYPE_PROJECT_PUBLIC_ID, {})
if not type_data.get("spreadsheet_form_guide"):
raise Http404("Feature not available")
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
spreadsheetforms.api.make_empty_form(
type_data.get("spreadsheet_form_guide"), out_file
)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = "inline; filename=project.xlsx"
return response
@permission_required("indigo.admin")
def admin_projects_list(request):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
except Type.DoesNotExist:
raise Http404("Type does not exist")
projects = Record.objects.filter(type=type).order_by("public_id")
return render(request, "indigo/admin/projects.html", {"projects": projects},)
@permission_required("indigo.admin")
def admin_project_index(request, public_id):
try:
project = Project.objects.get(public_id=public_id)
except Project.DoesNotExist:
raise Http404("Project does not exist")
field_data = jsondataferret.utils.get_field_list_from_json(
TYPE_PROJECT_PUBLIC_ID, project.data_private
)
return render(
request,
"indigo/admin/project/index.html",
{"project": project, "field_data": field_data},
)
@permission_required("indigo.admin")
def admin_project_download_form(request, public_id):
type_data = settings.JSONDATAFERRET_TYPE_INFORMATION.get(TYPE_PROJECT_PUBLIC_ID, {})
try:
project = Project.objects.get(public_id=public_id)
except Project.DoesNotExist:
raise Http404("Project does not exist")
data = convert_project_data_to_spreadsheetforms_data(project, public_only=False)
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
spreadsheetforms.api.put_data_in_form(
type_data.get("spreadsheet_form_guide"), data, out_file
)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = "inline; filename=project.xlsx"
return response
@permission_required("indigo.admin")
def admin_project_import_form(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
project = Project.objects.get(public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
except Project.DoesNotExist:
raise Http404("Project does not exist")
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = ProjectImportForm(request.POST, request.FILES)
# Check if the form is valid:
if form.is_valid():
# Save the data
project_import = ProjectImport()
project_import.user = request.user
project_import.project = project
with open(request.FILES["file"].temporary_file_path(), "rb") as fp:
project_import.file_data = fp.read()
project_import.save()
# Make celery call to start background worker
task_process_imported_project_file.delay(project_import.id)
# redirect to a new URL so user can wait for stage 2 of the process to be ready
return HttpResponseRedirect(
reverse(
"indigo_admin_project_import_form_stage_2",
kwargs={
"public_id": project.public_id,
"import_id": project_import.id,
},
)
)
# If this is a GET (or any other method) create the default form.
else:
form = ProjectImportForm()
context = {
"record": record,
"project": project,
"form": form,
}
return render(request, "indigo/admin/project/import_form.html", context)
@permission_required("indigo.admin")
def admin_project_import_form_stage_2(request, public_id, import_id):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
project = Project.objects.get(public_id=public_id)
project_import = ProjectImport.objects.get(id=import_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
except Project.DoesNotExist:
raise Http404("Project does not exist")
except ProjectImport.DoesNotExist:
raise Http404("Import does not exist")
if project_import.project != project:
raise Http404("Import is for another project")
if project_import.user != request.user:
raise Http404("Import is for another user")
if project_import.imported:
raise Http404("Import already done")
if project_import.exception:
return render(
request,
"indigo/admin/project/import_form_stage_2_exception.html",
{"record": record, "project": project, "import": project_import},
)
if project_import.file_not_valid:
return render(
request,
"indigo/admin/project/import_form_stage_2_file_not_valid.html",
{"record": record, "project": project},
)
if not project_import.data:
return render(
request,
"indigo/admin/project/import_form_stage_2_wait.html",
{"record": record, "project": project, "import": project_import},
)
data_quality_report = DataQualityReportForProject(project_import.data)
level_zero_errors = data_quality_report.get_errors_for_priority_level(0)
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = ProjectImportStage2Form(request.POST, request.FILES)
# Check if the form is valid:
if form.is_valid():
# process the data as required
# Save the event
new_event_datas = extract_edits_from_project_spreadsheet(
record, project_import.data
)
newEvent(
new_event_datas,
user=request.user,
comment=form.cleaned_data["comment"],
)
# mark import done
project_import.imported = Now()
project_import.save()
# redirect to project page with message
messages.add_message(
request,
messages.INFO,
"The data has been imported; remember to moderate it!",
)
return HttpResponseRedirect(
reverse(
"indigo_admin_project_index",
kwargs={"public_id": project.public_id},
)
)
# If this is a GET (or any other method) create the default form.
else:
form = ProjectImportStage2Form()
context = {
"record": record,
"project": project,
"form": form,
"level_zero_errors": level_zero_errors,
}
return render(request, "indigo/admin/project/import_form_stage_2.html", context)
@permission_required("indigo.admin")
def admin_project_make_private(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = ProjectMakePrivateForm(request.POST)
# Check if the form is valid:
if form.is_valid():
# Save the event
new_event_data = NewEventData(
type,
record,
{"status": "PRIVATE"},
mode=jsondataferret.EVENT_MODE_MERGE,
)
newEvent(
[new_event_data],
user=request.user,
comment=form.cleaned_data["comment"],
)
# redirect to a new URL:
return HttpResponseRedirect(
reverse(
"indigo_admin_project_index",
kwargs={"public_id": record.public_id},
)
)
else:
form = ProjectMakePrivateForm()
context = {
"record": record,
"form": form,
}
return render(request, "indigo/admin/project/make_private.html", context,)
@permission_required("indigo.admin")
def admin_project_make_disputed(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = ProjectMakeDisputedForm(request.POST)
# Check if the form is valid:
if form.is_valid():
# Save the event
new_event_data = NewEventData(
type,
record,
{"status": "DISPUTED"},
mode=jsondataferret.EVENT_MODE_MERGE,
)
newEvent(
[new_event_data],
user=request.user,
comment=form.cleaned_data["comment"],
)
# redirect to a new URL:
return HttpResponseRedirect(
reverse(
"indigo_admin_project_index",
kwargs={"public_id": record.public_id},
)
)
else:
form = ProjectMakeDisputedForm()
context = {
"record": record,
"form": form,
}
return render(request, "indigo/admin/project/make_disputed.html", context,)
@permission_required("indigo.admin")
def admin_projects_new(request):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
except Type.DoesNotExist:
raise Http404("Type does not exist")
# If this is a POST request then process the Form data
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = ProjectNewForm(request.POST)
# Check if the form is valid:
if form.is_valid():
# process the data in form.cleaned_data as required
# Save the event
id = form.cleaned_data["id"]
            existing_record = Record.objects.filter(type=type, public_id=id)
            if existing_record.exists():
form.add_error("id", "This ID already exists")
else:
data = NewEventData(
type,
id,
{"name": {"value": form.cleaned_data["name"]}},
approved=True,
)
newEvent(
[data], user=request.user, comment=form.cleaned_data["comment"]
)
# redirect to a new URL:
return HttpResponseRedirect(
reverse("indigo_admin_project_index", kwargs={"public_id": id},)
)
# If this is a GET (or any other method) create the default form.
else:
form = ProjectNewForm()
context = {
"form": form,
}
return render(request, "indigo/admin/project/new.html", context)
@permission_required("indigo.admin")
def admin_project_moderate(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
edits = Edit.objects.filter(record=record, approval_event=None, refusal_event=None)
if request.method == "POST":
        # TODO check CSRF
actions = []
for edit in edits:
action = request.POST.get("action_" + str(edit.id))
if action == "approve":
actions.append(jsondataferret.pythonapi.newevent.NewEventApproval(edit))
elif action == "reject":
actions.append(
jsondataferret.pythonapi.newevent.NewEventRejection(edit)
)
if actions:
jsondataferret.pythonapi.newevent.newEvent(
actions, user=request.user, comment=request.POST.get("comment")
)
return HttpResponseRedirect(
reverse("indigo_admin_project_index", kwargs={"public_id": public_id},)
)
for edit in edits:
        # TODO This does not take account of data_key on an edit. If we start using that, we will need to check this.
edit.field_datas = jsondataferret.utils.get_field_list_from_json(
TYPE_PROJECT_PUBLIC_ID, edit.data
)
return render(
request,
"indigo/admin/project/moderate.html",
{"type": type, "record": record, "edits": edits},
)
@permission_required("indigo.admin")
def admin_project_history(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_PROJECT_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
events = Event.objects.filter_by_record(record)
return render(
request,
"indigo/admin/project/history.html",
{"type": type, "record": record, "events": events},
)
@permission_required("indigo.admin")
def admin_project_data_quality_report(request, public_id):
try:
project = Project.objects.get(public_id=public_id)
except Project.DoesNotExist:
raise Http404("Project does not exist")
dqr = DataQualityReportForProject(project.record.cached_data)
return render(
request,
"indigo/admin/project/data_quality_report.html",
{
"project": project,
"record": project.record,
"data_quality_report": dqr,
"errors_by_priority_level": dqr.get_errors_in_priority_levels(),
},
)
@permission_required("indigo.admin")
def admin_all_projects_data_quality_report(request):
return render(
request,
"indigo/admin/projects_data_quality_report.html",
{
"fields_single": [
i
for i in settings.JSONDATAFERRET_TYPE_INFORMATION["project"]["fields"]
if i.get("type") != "list"
],
},
)
@permission_required("indigo.admin")
def admin_all_projects_data_quality_report_field_single(request):
field_path = request.GET.get("field", "")
    # Note we MUST explicitly check that the field the user passed is in our
    # pre-calculated config list! If we don't, we open ourselves up to SQL
    # injection security holes.
fields = [
i
for i in settings.JSONDATAFERRET_TYPE_INFORMATION["project"]["fields"]
if i.get("type") != "list" and i.get("key") == field_path
]
if not fields:
        raise Http404("Field does not exist")
field = fields[0]
field_bits = ["'" + i + "'" for i in field["key"].split("/") if i]
sql_start = "select count(*) as c from indigo_project"
sql_where = "CAST(data_private::json->" + "->".join(field_bits) + " as text)"
with connection.cursor() as cursor:
cursor.execute(
sql_start + " WHERE " + sql_where + " = 'null' OR " + sql_where + " IS NULL"
)
count_no_data = cursor.fetchone()[0]
cursor.execute(sql_start + " WHERE " + sql_where + " != 'null'")
count_data = cursor.fetchone()[0]
return render(
request,
"indigo/admin/projects_data_quality_report_single_field.html",
{"field": field, "count_no_data": count_no_data, "count_data": count_data,},
)
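# NOTE: hedged sketch, not part of the original module. The view above builds
# its Postgres JSON path inline; the conversion from a JSON-pointer style key
# to a quoted "->" path can be isolated in a pure helper (hypothetical name),
# which keeps the quoting logic testable on its own:
def _field_key_to_sql_json_path(key):
    """Turn a key like "/stage/value" into "'stage'->'value'"."""
    return "->".join("'" + bit + "'" for bit in key.split("/") if bit)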
@permission_required("indigo.admin")
def admin_all_projects_data_quality_list_projects_by_priority_highest(
request, priority
):
priority = int(priority)
if priority < 0 or priority > 3:
raise Http404("Priority does not exist")
projects = Project.objects.filter(exists=True)
projects = [
p
for p in projects
if p.data_quality_report_counts_by_priority.get(str(priority), 0) > 0
]
projects = sorted(
projects,
key=lambda x: x.data_quality_report_counts_by_priority.get(str(priority)),
reverse=True,
)
return render(
request,
"indigo/admin/projects_data_quality_report_list_projects_by_priority_highest.html",
{
"priority": priority,
"projects_with_count": [
(
project,
project.data_quality_report_counts_by_priority.get(str(priority)),
)
for project in projects
],
},
)
########################### Admin - Organisations
@permission_required("indigo.admin")
def admin_organisation_download_blank_form(request):
type_data = settings.JSONDATAFERRET_TYPE_INFORMATION.get(
TYPE_ORGANISATION_PUBLIC_ID, {}
)
if not type_data.get("spreadsheet_form_guide"):
raise Http404("Feature not available")
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
spreadsheetforms.api.make_empty_form(
type_data.get("spreadsheet_form_guide"), out_file
)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = "inline; filename=organisation.xlsx"
return response
@permission_required("indigo.admin")
def admin_organisations_list(request):
return render(request, "indigo/admin/organisations.html", {},)
@permission_required("indigo.admin")
def admin_organisations_goto(request):
    goto = request.POST.get("goto", "").strip()
try:
organisation = Organisation.objects.get(public_id=goto)
except Organisation.DoesNotExist:
raise Http404("Organisation does not exist")
return HttpResponseRedirect(
reverse(
"indigo_admin_organisation_index",
kwargs={"public_id": organisation.public_id},
)
)
@permission_required("indigo.admin")
def admin_organisations_search(request):
search_term = request.GET.get("search", "").strip()
organisations = Organisation.objects
if search_term:
organisations = organisations.filter(
full_text_search_private__search=search_term
)
organisations = organisations.order_by("public_id")
return render(
request,
"indigo/admin/organisations_search.html",
{"search_term": search_term, "organisations": organisations},
)
@permission_required("indigo.admin")
def admin_organisation_download_all_csv(request):
try:
type = Type.objects.get(public_id=TYPE_ORGANISATION_PUBLIC_ID)
except Type.DoesNotExist:
raise Http404("Type does not exist")
organisations = Record.objects.filter(type=type).order_by("public_id")
response = HttpResponse(content_type="text/csv")
response["Content-Disposition"] = 'attachment; filename="organisations-admin.csv"'
labels = ["ID"]
keys = []
for config in settings.JSONDATAFERRET_TYPE_INFORMATION["organisation"]["fields"]:
if config.get("type", "") != "list":
labels.append(config.get("title"))
keys.append(config.get("key"))
writer = csv.writer(response)
writer.writerow(labels)
for organisation in organisations:
row = [organisation.public_id]
for key in keys:
try:
row.append(jsonpointer.resolve_pointer(organisation.cached_data, key))
except jsonpointer.JsonPointerException:
row.append("")
writer.writerow(row)
return response
@permission_required("indigo.admin")
def admin_organisation_index(request, public_id):
try:
organisation = Organisation.objects.get(public_id=public_id)
except Organisation.DoesNotExist:
raise Http404("Organisation does not exist")
field_data = jsondataferret.utils.get_field_list_from_json(
TYPE_ORGANISATION_PUBLIC_ID, organisation.data_private
)
return render(
request,
"indigo/admin/organisation/index.html",
{"organisation": organisation, "field_data": field_data},
)
@permission_required("indigo.admin")
def admin_organisation_change_status(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_ORGANISATION_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = RecordChangeStatusForm(request.POST)
# Check if the form is valid:
if form.is_valid():
# Save the event
new_event_data = NewEventData(
type,
record,
{"status": form.cleaned_data["status"]},
mode=jsondataferret.EVENT_MODE_MERGE,
)
newEvent(
[new_event_data],
user=request.user,
comment=form.cleaned_data["comment"],
)
# redirect to a new URL:
messages.add_message(
request, messages.INFO, "Done; remember to moderate it!",
)
return HttpResponseRedirect(
reverse(
"indigo_admin_organisation_index",
kwargs={"public_id": record.public_id},
)
)
else:
form = RecordChangeStatusForm()
context = {
"record": record,
"form": form,
}
return render(request, "indigo/admin/organisation/change_status.html", context,)
@permission_required("indigo.admin")
def admin_organisation_projects(request, public_id):
try:
organisation = Organisation.objects.get(public_id=public_id)
except Organisation.DoesNotExist:
raise Http404("Organisation does not exist")
return render(
request,
"indigo/admin/organisation/projects.html",
{
"organisation": organisation,
"project_links": organisation.included_by_projects.all(),
},
)
@permission_required("indigo.admin")
def admin_organisation_download_form(request, public_id):
try:
organisation = Organisation.objects.get(public_id=public_id)
except Organisation.DoesNotExist:
raise Http404("Organisation does not exist")
guide_file = os.path.join(
settings.BASE_DIR, "indigo", "spreadsheetform_guides", "organisation_v004.xlsx",
)
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
data = convert_organisation_data_to_spreadsheetforms_data(
organisation, public_only=False
)
spreadsheetforms.api.put_data_in_form(guide_file, data, out_file)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = "inline; filename=organisation.xlsx"
return response
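The download views above build temp paths from `random.randrange`, which can collide in principle. A hedged alternative sketch (hypothetical helper, not part of this codebase) that lets `tempfile` reserve a unique name instead:

```python
import os
import tempfile

def make_xlsx_tempfile(prefix="indigo"):
    """Reserve a unique .xlsx path under the system temp dir and return it."""
    fd, path = tempfile.mkstemp(prefix=prefix, suffix=".xlsx")
    os.close(fd)  # keep only the reserved path; the caller writes and removes the file
    return path
```

Because `mkstemp` creates the file atomically, two concurrent requests can never be handed the same path.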
@permission_required("indigo.admin")
def admin_organisation_import_form(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_ORGANISATION_PUBLIC_ID)
data = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = OrganisationImportForm(request.POST, request.FILES)
# Check if the form is valid:
if form.is_valid():
# get data
version = indigo.utils.get_organisation_spreadsheet_version(
request.FILES["file"].temporary_file_path()
)
if (
version
not in settings.JSONDATAFERRET_TYPE_INFORMATION["organisation"][
"spreadsheet_form_guide_spec_versions"
].keys()
):
raise Exception("This does not seem to be an organisation spreadsheet")
import_json = spreadsheetforms.api.get_data_from_form_with_guide_spec(
settings.JSONDATAFERRET_TYPE_INFORMATION["organisation"][
"spreadsheet_form_guide_spec_versions"
][version],
request.FILES["file"].temporary_file_path(),
date_format=getattr(
settings, "JSONDATAFERRET_SPREADSHEET_FORM_DATE_FORMAT", None
),
)
# process the data in form.cleaned_data as required
# Save the event
new_event_datas = extract_edits_from_organisation_spreadsheet(
data, import_json
)
newEvent(
new_event_datas,
user=request.user,
comment=form.cleaned_data["comment"],
)
# redirect to a new URL:
messages.add_message(
request,
messages.INFO,
"The data has been imported; remember to moderate it!",
)
return HttpResponseRedirect(
reverse(
"indigo_admin_organisation_index",
kwargs={"public_id": data.public_id},
)
)
# If this is a GET (or any other method) create the default form.
else:
form = OrganisationImportForm()
context = {
"record": data,
"form": form,
}
return render(request, "indigo/admin/organisation/import_form.html", context)
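The import view above first checks the spreadsheet's embedded version against the configured guide specs and refuses unknown versions. The same gate as a pure function (hypothetical version table, illustrative only):

```python
GUIDE_SPECS = {
    "004": "organisation_v004 guide spec",  # hypothetical entries
    "005": "organisation_v005 guide spec",
}

def select_guide_spec(version, specs=GUIDE_SPECS):
    """Return the guide spec for a spreadsheet version, or raise for unknown ones."""
    if version not in specs:
        raise ValueError("This does not seem to be an organisation spreadsheet")
    return specs[version]
```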
@permission_required("indigo.admin")
def admin_organisations_new(request):
try:
type = Type.objects.get(public_id=TYPE_ORGANISATION_PUBLIC_ID)
except Type.DoesNotExist:
raise Http404("Type does not exist")
# If this is a POST request then process the Form data
if request.method == "POST":
# Create a form instance and populate it with data from the request (binding):
form = OrganisationNewForm(request.POST)
# Check if the form is valid:
if form.is_valid():
# process the data in form.cleaned_data as required
# Save the event
id = form.cleaned_data["id"]
if Record.objects.filter(type=type, public_id=id).exists():
form.add_error("id", "This ID already exists")
else:
data = NewEventData(
type,
id,
{"name": {"value": form.cleaned_data["name"]}},
approved=True,
)
newEvent(
[data], user=request.user, comment=form.cleaned_data["comment"]
)
# redirect to a new URL:
return HttpResponseRedirect(
reverse(
"indigo_admin_organisation_index", kwargs={"public_id": id},
)
)
# If this is a GET (or any other method) create the default form.
else:
form = OrganisationNewForm()
context = {
"form": form,
}
return render(request, "indigo/admin/organisation/new.html", context)
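`admin_organisations_new` above rejects an ID that is already taken before recording the creation event. The guard in isolation (a plain set standing in for the `Record` queryset, hypothetical helper name):

```python
def validate_new_id(new_id, existing_ids):
    """Return an error message when the ID is taken, or None when it is free."""
    if new_id in existing_ids:
        return "This ID already exists"
    return None
```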
@permission_required("indigo.admin")
def admin_organisation_moderate(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_ORGANISATION_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
edits = Edit.objects.filter(record=record, approval_event=None, refusal_event=None)
if request.method == "POST":
# TODO check CSRF
actions = []
for edit in edits:
action = request.POST.get("action_" + str(edit.id))
if action == "approve":
actions.append(jsondataferret.pythonapi.newevent.NewEventApproval(edit))
elif action == "reject":
actions.append(
jsondataferret.pythonapi.newevent.NewEventRejection(edit)
)
if actions:
jsondataferret.pythonapi.newevent.newEvent(
actions, user=request.user, comment=request.POST.get("comment")
)
return HttpResponseRedirect(
reverse("indigo_admin_organisation_index", kwargs={"public_id": public_id},)
)
for edit in edits:
# TODO This does not take account of data_key on an edit. If we start using that, we will need to check this.
edit.field_datas = jsondataferret.utils.get_field_list_from_json(
TYPE_ORGANISATION_PUBLIC_ID, edit.data
)
return render(
request,
"indigo/admin/organisation/moderate.html",
{"type": type, "record": record, "edits": edits},
)
@permission_required("indigo.admin")
def admin_organisation_history(request, public_id):
try:
type = Type.objects.get(public_id=TYPE_ORGANISATION_PUBLIC_ID)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
events = Event.objects.filter_by_record(record)
return render(
request,
"indigo/admin/organisation/history.html",
{"type": type, "record": record, "events": events},
)
########################### Admin - funds & assessment resources
class AdminModelDownloadBlankForm(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request):
type_data = settings.JSONDATAFERRET_TYPE_INFORMATION.get(
self.__class__._type_public_id, {}
)
if not type_data.get("spreadsheet_form_guide"):
raise Http404("Feature not available")
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
spreadsheetforms.api.make_empty_form(
type_data.get("spreadsheet_form_guide"), out_file
)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = (
"inline; filename=" + self.__class__._model.__name__.lower() + ".xlsx"
)
return response
class AdminFundDownloadBlankForm(AdminModelDownloadBlankForm):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
class AdminAssessmentResourceDownloadBlankForm(AdminModelDownloadBlankForm):
_model = AssessmentResource
_type_public_id = TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID
class AdminModelList(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request):
try:
type = Type.objects.get(public_id=self.__class__._type_public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
datas = Record.objects.filter(type=type).order_by("public_id")
return render(
request,
"indigo/admin/" + self.__class__._model.__name__.lower() + "s.html",
{"datas": datas},
)
class AdminFundList(AdminModelList):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
class AdminAssessmentResourceList(AdminModelList):
_model = AssessmentResource
_type_public_id = TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID
class AdminModelIndex(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request, public_id):
try:
data = self.__class__._model.objects.get(public_id=public_id)
except self._model.DoesNotExist:
raise Http404("Data does not exist")
field_data = jsondataferret.utils.get_field_list_from_json(
self.__class__._model.type_id, data.data_private
)
return render(
request,
"indigo/admin/" + self.__class__._model.__name__.lower() + "/index.html",
{"data": data, "field_data": field_data},
)
class AdminFundIndex(AdminModelIndex):
_model = Fund
class AdminAssessmentResourceIndex(AdminModelIndex):
_model = AssessmentResource
@permission_required("indigo.admin")
def admin_fund_projects(request, public_id):
try:
fund = Fund.objects.get(public_id=public_id)
except Fund.DoesNotExist:
raise Http404("Fund does not exist")
return render(
request,
"indigo/admin/fund/projects.html",
{"fund": fund, "project_links": fund.included_by_projects.all(),},
)
class AdminModelDownloadForm(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request, public_id):
try:
data = self.__class__._model.objects.get(public_id=public_id)
except self._model.DoesNotExist:
raise Http404("Data does not exist")
guide_file = os.path.join(
settings.BASE_DIR,
"indigo",
"spreadsheetform_guides",
self.__class__._guide_file_name,
)
out_file = os.path.join(
tempfile.gettempdir(),
"indigo" + str(random.randrange(1, 100000000000)) + ".xlsx",
)
data_for_form = self._get_data_for_form(data)
spreadsheetforms.api.put_data_in_form(guide_file, data_for_form, out_file)
with open(out_file, "rb") as fh:
response = HttpResponse(fh.read(), content_type="application/vnd.ms-excel")
response["Content-Disposition"] = (
"inline; filename=" + self.__class__._model.__name__.lower() + ".xlsx"
)
return response
class AdminFundDownloadForm(AdminModelDownloadForm):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
_guide_file_name = "fund_v003.xlsx"
def _get_data_for_form(self, data):
return convert_fund_data_to_spreadsheetforms_data(data, public_only=False)
class AdminAssessmentResourceDownloadForm(AdminModelDownloadForm):
_model = AssessmentResource
_type_public_id = TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID
_guide_file_name = "assessment_resource_v001.xlsx"
def _get_data_for_form(self, data):
return convert_assessment_resource_data_to_spreadsheetforms_data(
data, public_only=False
)
class AdminModelImportForm(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request, public_id):
try:
data = self.__class__._model.objects.get(public_id=public_id)
except self._model.DoesNotExist:
raise Http404("Data does not exist")
form = self.__class__._form_class()
return render(
request,
"indigo/admin/"
+ self.__class__._model.__name__.lower()
+ "/import_form.html",
{"data": data, "form": form,},
)
def post(self, request, public_id):
try:
data = self.__class__._model.objects.get(public_id=public_id)
except self._model.DoesNotExist:
raise Http404("Data does not exist")
form = self.__class__._form_class(request.POST, request.FILES)
if form.is_valid():
# get data
import_json = spreadsheetforms.api.get_data_from_form_with_guide_spec(
settings.JSONDATAFERRET_TYPE_INFORMATION[self.__class__._model.type_id][
"spreadsheet_form_guide_spec"
],
request.FILES["file"].temporary_file_path(),
date_format=getattr(
settings, "JSONDATAFERRET_SPREADSHEET_FORM_DATE_FORMAT", None
),
)
# process the data in form.cleaned_data as required
# Save the event
newEvent(
self._get_edits(data, import_json),
user=request.user,
comment=form.cleaned_data["comment"],
)
# redirect to a new URL:
messages.add_message(
request,
messages.INFO,
"The data has been imported; remember to moderate it!",
)
return HttpResponseRedirect(
reverse(
self.__class__._redirect_view, kwargs={"public_id": data.public_id},
)
)
else:
return render(
request,
"indigo/admin/"
+ self.__class__._model.__name__.lower()
+ "/import_form.html",
{"data": data, "form": form,},
)
class AdminFundImportForm(AdminModelImportForm):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
_form_class = ModelImportForm
_redirect_view = "indigo_admin_fund_index"
def _get_edits(self, data, import_json):
return extract_edits_from_fund_spreadsheet(data.record, import_json)
class AdminAssessmentResourceImportForm(AdminModelImportForm):
_model = AssessmentResource
_type_public_id = TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID
_form_class = ModelImportForm
_redirect_view = "indigo_admin_assessment_resource_index"
def _get_edits(self, data, import_json):
return extract_edits_from_assessment_resource_spreadsheet(
data.record, import_json
)
class AdminModelNew(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request):
form = self.__class__._form_class()
return render(
request,
"indigo/admin/" + self.__class__._model.__name__.lower() + "/new.html",
{"form": form,},
)
def post(self, request):
try:
type = Type.objects.get(public_id=self.__class__._type_public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
form = self.__class__._form_class(request.POST)
if form.is_valid():
# process the data in form.cleaned_data as required
# Save the event
id = form.cleaned_data["id"]
if Record.objects.filter(type=type, public_id=id).exists():
form.add_error("id", "This ID already exists")
else:
data = NewEventData(
type,
id,
{"name": {"value": form.cleaned_data["name"]}},
approved=True,
)
newEvent(
[data], user=request.user, comment=form.cleaned_data["comment"]
)
# redirect to a new URL:
return HttpResponseRedirect(
reverse(self.__class__._redirect_view, kwargs={"public_id": id},)
)
return render(
request,
"indigo/admin/" + self.__class__._model.__name__.lower() + "/new.html",
{"form": form,},
)
class AdminFundNew(AdminModelNew):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
_form_class = FundNewForm
_redirect_view = "indigo_admin_fund_index"
class AdminAssessmentResourceNew(AdminModelNew):
_model = AssessmentResource
_type_public_id = TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID
_form_class = AssessmentResourceNewForm
_redirect_view = "indigo_admin_assessment_resource_index"
class AdminModelModerate(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request, public_id):
return self.post(request, public_id)
def post(self, request, public_id):
try:
type = Type.objects.get(public_id=self.__class__._model.type_id)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
edits = Edit.objects.filter(
record=record, approval_event=None, refusal_event=None
)
if request.method == "POST":
# TODO check CSRF
actions = []
for edit in edits:
action = request.POST.get("action_" + str(edit.id))
if action == "approve":
actions.append(
jsondataferret.pythonapi.newevent.NewEventApproval(edit)
)
elif action == "reject":
actions.append(
jsondataferret.pythonapi.newevent.NewEventRejection(edit)
)
if actions:
jsondataferret.pythonapi.newevent.newEvent(
actions, user=request.user, comment=request.POST.get("comment")
)
return HttpResponseRedirect(
reverse(
self.__class__._redirect_view,
kwargs={"public_id": record.public_id},
)
)
for edit in edits:
# TODO This does not take account of data_key on an edit. If we start using that, we will need to check this.
edit.field_datas = jsondataferret.utils.get_field_list_from_json(
self.__class__._model.type_id, edit.data
)
return render(
request,
"indigo/admin/" + self.__class__._model.__name__.lower() + "/moderate.html",
{"type": type, "record": record, "edits": edits},
)
class AdminFundModerate(AdminModelModerate):
_model = Fund
_redirect_view = "indigo_admin_fund_index"
class AdminAssessmentResourceModerate(AdminModelModerate):
_model = AssessmentResource
_redirect_view = "indigo_admin_assessment_resource_index"
class AdminModelHistory(PermissionRequiredMixin, View):
permission_required = "indigo.admin"
def get(self, request, public_id):
try:
type = Type.objects.get(public_id=self.__class__._type_public_id)
record = Record.objects.get(type=type, public_id=public_id)
except Type.DoesNotExist:
raise Http404("Type does not exist")
except Record.DoesNotExist:
raise Http404("Record does not exist")
events = Event.objects.filter_by_record(record)
return render(
request,
"indigo/admin/" + self.__class__._model.__name__.lower() + "/history.html",
{"type": type, "record": record, "events": events},
)
class AdminFundHistory(AdminModelHistory):
_model = Fund
_type_public_id = TYPE_FUND_PUBLIC_ID
class AdminAssessmentResourceHistory(AdminModelHistory):
_model = AssessmentResource
_type_public_id = TYPE_ASSESSMENT_RESOURCE_PUBLIC_ID
########################### Admin - sandboxes
@permission_required("indigo.admin")
def admin_sandbox_list(request):
sandboxes = Sandbox.objects.all()
return render(request, "indigo/admin/sandboxes.html", {"sandboxes": sandboxes},)
@permission_required("indigo.admin")
def admin_sandbox_index(request, public_id):
try:
sandbox = Sandbox.objects.get(public_id=public_id)
except Sandbox.DoesNotExist:
raise Http404("Sandbox does not exist")
return render(request, "indigo/admin/sandbox/index.html", {"sandbox": sandbox},)
########################### Admin - Event
@permission_required("indigo.admin")
def admin_event_index(request, event_id):
try:
event = Event.objects.get(public_id=event_id)
except Event.DoesNotExist:
raise Http404("Event does not exist")
edits_created = event.edits_created.all()
edits_approved = event.edits_approved.all()
edits_refused = event.edits_refused.all()
edits_created_and_approved = list(set(edits_created).intersection(edits_approved))
edits_only_created = [
edit for edit in edits_created if edit not in edits_created_and_approved
]
edits_only_approved = [
edit for edit in edits_approved if edit not in edits_created_and_approved
]
return render(
request,
"indigo/admin/event/index.html",
{
"event": event,
"edits_created": edits_created,
"edits_approved": edits_approved,
"edits_refused": edits_refused,
"edits_only_created": edits_only_created,
"edits_only_approved": edits_only_approved,
"edits_created_and_approved": edits_created_and_approved,
},
)
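The moderate views above read one `action_<edit_id>` key per pending edit from the POST body and split edits into approvals and rejections. That key scheme in isolation (a plain dict standing in for `request.POST`, hypothetical helper name):

```python
def collect_moderation_actions(post_data, edit_ids):
    """Split edit ids into approvals and rejections based on action_<id> keys."""
    approved, rejected = [], []
    for edit_id in edit_ids:
        action = post_data.get("action_" + str(edit_id))
        if action == "approve":
            approved.append(edit_id)
        elif action == "reject":
            rejected.append(edit_id)
    return approved, rejected
```

Edits whose key is missing or carries any other value are simply left pending, matching the views' behaviour.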
# nick_derobertis_site/common/updating.py (nickderobertis/nick-derobertis-site, MIT)
from awesome_panel_extensions.updating import UpdatingItem
# python/pyrutok/__init__.py (alesapin/rutok, MIT)
from .api import GraphemTag, SemanticTag, TokenType
from .api import Token, Sentence, Tokenizer
# tags/translation.py (pmaigutyak/mp-tags, 0BSD)
from modeltranslation.translator import translator
from tags.models import Tag

translator.register(Tag, fields=['text'])
# dsbox/datapreprocessing/featurizer/pass/__init__.py (usc-isi-i2/dsbox-featurizer, MIT)
from pkgutil import extend_path
from .do_nothing import DoNothing
from .do_nothing_dataset import DoNothingForDataset

__path__ = extend_path(__path__, __name__)  # type: ignore
# machinelearnlib/__init__.py (Kai-Bailey/machinelearnlib, MIT)
from . import activationFunc
from . import featureScaling
from . import initializeWeights
from . import loadData
from . import machinelearnlib
from . import plots
from . import processList
from . import train
from .models import *
# server/tests/test_parse.py (wasauce/air-quality, Apache-2.0)
# Copyright 2020
#
# pytest file
# Best invoked from the server/ directory as pytest tests/
import sys
import pytest
from pathlib import Path
sys.path.append(str(Path(__file__).parents[1] / "update_data/"))
import purpleair
SAMPLE_DATA = """{
"api_version" : "V1.0.6-0.0.9",
"time_stamp" : 1608141357,
"data_time_stamp" : 1608141326,
"location_type" : 1,
"max_age" : 300,
"fields" : [
"sensor_index",
"latitude",
"longitude",
"last_seen",
"pm2.5_10minute",
"pm2.5_30minute",
"pm2.5_60minute",
"pm2.5_6hour",
"pm2.5_24hour",
"humidity"
],
"data" : [
[65539,38.295,-122.4606,1608141289,2.3,2.3,2.3,1.7,1.1,31],
[65543,37.9084,-122.5563,1608141295,4.5,4.7,4.8,4.4,3.1,25],
[65545,37.8903,-122.1868,1608141288,4.3,4.2,4.0,2.4,1.1,31],
[65557,37.7751,-122.4239,1608141279,4.5,3.9,3.2,9.4,8.6,33],
[65559,37.7775,-122.4686,1608141220,0.2,0.2,0.4,0.7,0.5,33],
[65561,37.7783,-122.3977,1608141224,0.1,0.1,0.2,8.6,21.1,30],
[65563,38.3147,-122.3014,1608141144,8.4,9.4,9.7,8.8,7.7,30],
[65569,37.7377,-122.253,1608141308,3.7,4.5,4.4,3.2,3.7,25],
[65573,38.3147,-122.3015,1608141322,7.7,7.8,8.0,7.9,6.7,28],
[65575,38.1623,-122.1787,1608141319,8.8,6.2,5.0,3.9,10.5,21],
[65577,37.7508,-122.2386,1608141254,2.8,5.2,6.8,5.8,3.7,25],
[65579,37.7775,-122.4685,1608141112,1.2,1.5,1.5,1.5,1.2,32],
[65583,37.2928,-122.0053,1608141235,5.3,4.0,3.2,5.0,4.8,30],
[65589,37.3546,-122.0808,1608141219,4.1,4.2,4.3,4.5,4.5,29],
[65597,37.8867,-122.2952,1608141223,3.3,3.0,3.0,4.2,4.8,33],
[65599,37.6903,-122.0425,1608141303,5.1,5.0,4.4,3.5,3.6,24],
[65603,37.3425,-122.0763,1608141256,3.0,3.1,3.3,7.6,10.5,24],
[71161,37.2573,-122.0174,1608141209,2.2,2.3,2.2,2.4,2.9,21],
[71167,37.8959,-122.2961,1608141314,2.8,2.4,2.1,1.3,1.2,null],
[74559,null,null,1608141265,4.1,3.7,3.3,2.8,5.7,32],
[71166,37.8959,-122.2961,1608141314,2.8,2.4,null,1.3,1.2,null]
]
}"""
def test_parse():
purpleair.parse_api(SAMPLE_DATA)
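The sample payload above pairs a shared `fields` list with row arrays in `data`. A minimal sketch of turning that layout into per-sensor dicts (illustrative only, not the real `purpleair.parse_api`):

```python
import json

def rows_to_dicts(raw):
    """Zip each data row against the shared fields list."""
    doc = json.loads(raw)
    fields = doc["fields"]
    return [dict(zip(fields, row)) for row in doc["data"]]
```

Note that JSON `null` cells (e.g. the missing coordinates in the last rows) come through as `None`.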
# app/model/all_models/__all_models.py (seldish-og/Teselium, MIT)
from . import auth_models
from . import cards_models
# django/app/views.py (shimakaze-git/docker_django_celery_vote, BSD-2-Clause)
from django.shortcuts import render

# from django.views.generic import DetailView, CreateView
# Create your views here.
# https://yu-nix.com/blog/2021/7/31/django-create-view/
# mayatools/utils.py (westernx/mayatools, BSD-3-Clause)
from metatools.imports import load_entrypoint as resolve_entrypoint
# jina/clients/mixin.py (varghesejose2020/jina, Apache-2.0)
from functools import partialmethod
from typing import Optional, Dict, List, AsyncGenerator, TYPE_CHECKING, Union
from jina.helper import run_async
if TYPE_CHECKING:
from jina.clients.base import CallbackFnType, InputType
from jina.types.request import Response
from jina import DocumentArray
class PostMixin:
"""The Post Mixin class for Client and Flow"""
def post(
self,
on: str,
inputs: Optional['InputType'] = None,
on_done: Optional['CallbackFnType'] = None,
on_error: Optional['CallbackFnType'] = None,
on_always: Optional['CallbackFnType'] = None,
parameters: Optional[Dict] = None,
target_executor: Optional[str] = None,
request_size: int = 100,
show_progress: bool = False,
continue_on_error: bool = False,
return_results: bool = False,
**kwargs,
) -> Optional[Union['DocumentArray', List['Response']]]:
"""Post a general data request to the Flow.
:param inputs: input data which can be an Iterable, a function which returns an Iterable, or a single Document id.
:param on: the endpoint is used for identifying the user-defined ``request_type``, labeled by ``@requests(on='/abc')``
:param on_done: the function to be called when the :class:`Request` object is resolved.
:param on_error: the function to be called when the :class:`Request` object is rejected.
:param on_always: the function to be called when the :class:`Request` object is either resolved or rejected.
:param parameters: the kwargs that will be sent to the executor
:param target_executor: a regex string. Only matching Executors will process the request.
:param request_size: the number of Documents per request. <=0 means all inputs in one request.
:param show_progress: if set, client will show a progress bar on receiving every request.
:param continue_on_error: if set, a Request that causes a callback error will only be logged, without blocking subsequent requests.
:param return_results: if set, the Documents resulting from all Requests will be returned as a DocumentArray. This is useful when one wants to process Responses in bulk instead of using callbacks.
:param kwargs: additional parameters
:return: None or DocumentArray containing all response Documents
.. warning::
``target_executor`` uses ``re.match`` for checking if the pattern is matched.
``target_executor='foo'`` will match both a pod named ``foo`` and one named ``foo_what_ever_suffix``.
"""
async def _get_results(*args, **kwargs):
result = []
c = self.client
c.show_progress = show_progress
c.continue_on_error = continue_on_error
async for resp in c._get_results(*args, **kwargs):
if return_results:
result.append(resp)
if return_results:
if c.args.results_as_docarray:
docs = [r.data.docs for r in result]
if len(docs) < 1:
return docs
else:
return docs[0].reduce_all(docs[1:])
else:
return result
if (on_always is None) and (on_done is None):
return_results = True
return run_async(
_get_results,
inputs=inputs,
on_done=on_done,
on_error=on_error,
on_always=on_always,
exec_endpoint=on,
target_executor=target_executor,
parameters=parameters,
request_size=request_size,
**kwargs,
)
# ONLY CRUD, for other request please use `.post`
index = partialmethod(post, '/index')
search = partialmethod(post, '/search')
update = partialmethod(post, '/update')
delete = partialmethod(post, '/delete')
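The CRUD aliases above are plain `functools.partialmethod` bindings of `post` with the endpoint argument pre-filled. A minimal, self-contained sketch of the same pattern (the `MiniClient` class and its echo behaviour are invented for illustration, not part of Jina):

```python
from functools import partialmethod


class MiniClient:
    """Hypothetical stand-in showing how PostMixin derives CRUD verbs."""

    def post(self, on, inputs=None, **kwargs):
        # Echo which endpoint was addressed and what inputs arrived.
        return {'endpoint': on, 'inputs': inputs}

    # Each CRUD verb is just `post` with the endpoint pre-bound.
    index = partialmethod(post, '/index')
    search = partialmethod(post, '/search')


client = MiniClient()
print(client.search(inputs=['doc1']))  # {'endpoint': '/search', 'inputs': ['doc1']}
```

Because `partialmethod` defers binding of `self`, the aliases behave exactly like ordinary methods on every instance.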
class AsyncPostMixin:
"""The Async Post Mixin class for AsyncClient and AsyncFlow"""
async def post(
self,
on: str,
inputs: Optional['InputType'] = None,
on_done: Optional['CallbackFnType'] = None,
on_error: Optional['CallbackFnType'] = None,
on_always: Optional['CallbackFnType'] = None,
parameters: Optional[Dict] = None,
target_executor: Optional[str] = None,
request_size: int = 100,
show_progress: bool = False,
continue_on_error: bool = False,
**kwargs,
) -> AsyncGenerator[None, 'Response']:
"""Post a general data request to the Flow.
:param inputs: input data which can be an Iterable, a function which returns an Iterable, or a single Document id.
:param on: the endpoint used for identifying the user-defined ``request_type``, labeled by ``@requests(on='/abc')``
:param on_done: the function to be called when the :class:`Request` object is resolved.
:param on_error: the function to be called when the :class:`Request` object is rejected.
:param on_always: the function to be called when the :class:`Request` object is either resolved or rejected.
:param parameters: the kwargs that will be sent to the executor
:param target_executor: a regex string. Only matching Executors will process the request.
:param request_size: the number of Documents per request. <=0 means all inputs in one request.
:param show_progress: if set, client will show a progress bar on receiving every request.
:param continue_on_error: if set, a Request that causes a callback error will only be logged, without blocking subsequent requests.
:param kwargs: additional parameters
:yield: Response object
"""
c = self.client
c.show_progress = show_progress
c.continue_on_error = continue_on_error
async for r in c._get_results(
inputs=inputs,
on_done=on_done,
on_error=on_error,
on_always=on_always,
exec_endpoint=on,
target_executor=target_executor,
parameters=parameters,
request_size=request_size,
**kwargs,
):
yield r
# ONLY CRUD, for other request please use `.post`
index = partialmethod(post, '/index')
search = partialmethod(post, '/search')
update = partialmethod(post, '/update')
delete = partialmethod(post, '/delete')
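Unlike the sync mixin, `AsyncPostMixin.post` is an async generator that yields each `Response` as it arrives. A toy sketch of producing and consuming such a generator (the `fake_post` coroutine is invented for illustration, not the real client):

```python
import asyncio


async def fake_post(on, n=3):
    # Stands in for AsyncPostMixin.post: yields one response per request.
    for i in range(n):
        yield {'endpoint': on, 'seq': i}


async def collect():
    seqs = []
    # `async for` drives the generator, one yielded response at a time.
    async for resp in fake_post('/search'):
        seqs.append(resp['seq'])
    return seqs


received = asyncio.run(collect())
print(received)  # [0, 1, 2]
```

This streaming style lets callers start handling early responses before the last request has even been sent.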
| 44.59589 | 200 | 0.637997 | 805 | 6,511 | 5.042236 | 0.212422 | 0.027593 | 0.029564 | 0.022173 | 0.715447 | 0.715447 | 0.715447 | 0.715447 | 0.715447 | 0.715447 | 0 | 0.002352 | 0.281677 | 6,511 | 145 | 201 | 44.903448 | 0.865512 | 0.264322 | 0 | 0.652174 | 0 | 0 | 0.053936 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.01087 | false | 0 | 0.065217 | 0 | 0.228261 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
ab18b7412f1875ca7f6779aa2241255a19b247fb | 92 | py | Python | tournaments/charactersRearrangement/charactersRearrangement.py | gurfinkel/codeSignal | 114817947ac6311bd53a48f0f0e17c0614bf7911 | [
"MIT"
] | 5 | 2020-02-06T09:51:22.000Z | 2021-03-19T00:18:44.000Z | tournaments/charactersRearrangement/charactersRearrangement.py | gurfinkel/codeSignal | 114817947ac6311bd53a48f0f0e17c0614bf7911 | [
"MIT"
] | null | null | null | tournaments/charactersRearrangement/charactersRearrangement.py | gurfinkel/codeSignal | 114817947ac6311bd53a48f0f0e17c0614bf7911 | [
"MIT"
] | 3 | 2019-09-27T13:06:21.000Z | 2021-04-20T23:13:17.000Z | def charactersRearrangement(string1, string2):
return sorted(string1) == sorted(string2) | 46 | 46 | 0.782609 | 9 | 92 | 8 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.04878 | 0.108696 | 92 | 2 | 47 | 46 | 0.829268 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.5 | false | 0 | 0 | 0.5 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
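`charactersRearrangement` answers whether two strings are anagrams by comparing their sorted forms, which costs O(n log n). An equivalent counting-based check runs in O(n); `Counter` equality compares the character multisets directly (this variant is a sketch for comparison, not part of the original solution):

```python
from collections import Counter


def characters_rearrangement(string1, string2):
    # Same truth table as sorted(string1) == sorted(string2),
    # but built on character-multiset equality instead of sorting.
    return Counter(string1) == Counter(string2)


print(characters_rearrangement('listen', 'silent'))  # True
print(characters_rearrangement('ab', 'abc'))         # False
```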
db3e6507a34750d9c3c6ccecd2b99aea357c1312 | 2,688 | py | Python | src/tests/models/eventbrite_event_model_tests.py | RoaringForkTech/rfv-events | 8d3c48c9f125197d5879317f5b3bc911ae9b774b | [
"MIT"
] | null | null | null | src/tests/models/eventbrite_event_model_tests.py | RoaringForkTech/rfv-events | 8d3c48c9f125197d5879317f5b3bc911ae9b774b | [
"MIT"
] | 37 | 2019-07-07T15:18:41.000Z | 2020-01-03T03:00:34.000Z | src/tests/models/eventbrite_event_model_tests.py | COWestSlopeTech/project-kronos | 8d3c48c9f125197d5879317f5b3bc911ae9b774b | [
"MIT"
] | 1 | 2019-10-09T00:23:45.000Z | 2019-10-09T00:23:45.000Z | from unittest import TestCase
from src.models.eventbrite_event_model import Eventbrite_Event
from src.tests.models.config import eventbrite_model_fields
class EventBriteModelTest(TestCase):
def test_eventbrite_event_model_has_required_props(self):
"""
Testing that required properties are set via the constructor
"""
event_mock = Eventbrite_Event(eventbrite_model_fields["source"], eventbrite_model_fields["source_id"],
eventbrite_model_fields["name"])
assert event_mock.name == eventbrite_model_fields["name"]
assert event_mock.source == eventbrite_model_fields["source"]
assert event_mock.source_id == eventbrite_model_fields["source_id"]
def test_eventbrite_event_model_has_optional_props(self):
"""
Testing that optional properties can be set after construction
"""
event_mock = Eventbrite_Event(eventbrite_model_fields["source"], eventbrite_model_fields["source_id"],
eventbrite_model_fields["name"])
event_mock.start_time = eventbrite_model_fields["start_time"]
event_mock.end_time = eventbrite_model_fields["end_time"]
event_mock.description = eventbrite_model_fields["description"]
assert event_mock.start_time == eventbrite_model_fields["start_time"]
assert event_mock.end_time == eventbrite_model_fields["end_time"]
assert event_mock.description == eventbrite_model_fields["description"]
event_mock.status = eventbrite_model_fields["status"]
event_mock.capacity = eventbrite_model_fields["capacity"]
event_mock.source_url = eventbrite_model_fields["source_url"]
event_mock.venue_id = eventbrite_model_fields["venue_id"]
event_mock.organization_id = eventbrite_model_fields["organization_id"]
event_mock.invite_only = eventbrite_model_fields["invite_only"]
event_mock.online_event = eventbrite_model_fields["online_event"]
event_mock.organizer_id = eventbrite_model_fields["organizer_id"]
event_mock.cost = eventbrite_model_fields["cost"]
assert event_mock.capacity == eventbrite_model_fields["capacity"]
assert event_mock.source_url == eventbrite_model_fields["source_url"]
assert event_mock.venue_id == eventbrite_model_fields["venue_id"]
assert event_mock.organization_id == eventbrite_model_fields["organization_id"]
assert event_mock.invite_only == eventbrite_model_fields["invite_only"]
assert event_mock.online_event == eventbrite_model_fields["online_event"]
assert event_mock.organizer_id == eventbrite_model_fields["organizer_id"]
| 49.777778 | 110 | 0.729539 | 315 | 2,688 | 5.768254 | 0.152381 | 0.264172 | 0.36984 | 0.113924 | 0.813979 | 0.775454 | 0.720969 | 0.582829 | 0.554761 | 0.167309 | 0 | 0 | 0.1875 | 2,688 | 53 | 111 | 50.716981 | 0.83196 | 0.033854 | 0 | 0.114286 | 0 | 0 | 0.108969 | 0 | 0 | 0 | 0 | 0 | 0.371429 | 1 | 0.057143 | false | 0 | 0.085714 | 0 | 0.171429 | 0 | 0 | 0 | 0 | null | 1 | 1 | 0 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
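The tests above exercise a model whose required fields are constructor arguments and whose optional fields are assigned afterwards. A minimal sketch of that shape (the `Event` class below is hypothetical, not the real `Eventbrite_Event`):

```python
class Event:
    """Hypothetical model: required props via __init__, optional props later."""

    def __init__(self, source, source_id, name):
        # Required properties must be supplied at construction time.
        self.source = source
        self.source_id = source_id
        self.name = name
        # Optional properties default to None until explicitly assigned.
        self.start_time = None
        self.end_time = None
        self.description = None


event = Event('eventbrite', '42', 'Monthly Meetup')
event.description = 'A hypothetical event'
print(event.name, event.description)  # Monthly Meetup A hypothetical event
```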
db5be3730c233940a6d53189f552b7ff5c70e8df | 42 | py | Python | Python/1. Introduction/07 - Print Function.py | rosiejh/HackerRank | bfb07b8add04d3f3b67a61754db483f88a79e5a5 | [
"Apache-2.0"
] | null | null | null | Python/1. Introduction/07 - Print Function.py | rosiejh/HackerRank | bfb07b8add04d3f3b67a61754db483f88a79e5a5 | [
"Apache-2.0"
] | null | null | null | Python/1. Introduction/07 - Print Function.py | rosiejh/HackerRank | bfb07b8add04d3f3b67a61754db483f88a79e5a5 | [
"Apache-2.0"
] | null | null | null | print(*range(1, int(input()) + 1), sep='') | 42 | 42 | 0.547619 | 7 | 42 | 3.285714 | 0.857143 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.052632 | 0.095238 | 42 | 1 | 42 | 42 | 0.552632 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
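The one-liner above prints the integers 1 through n with no separator, using argument unpacking (`*range(...)`) together with `sep=''`. An equivalent, spelled-out version (with a fixed `n` standing in for `int(input())` so the sketch is self-contained):

```python
n = 5  # stands in for int(input())

# print(*range(1, n + 1), sep='') unpacks the range into positional
# arguments of print; joining the digits by hand yields the same line.
line = ''.join(str(i) for i in range(1, n + 1))
print(line)  # 12345
```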
db6c2100e48458aeb0295604295bf33e8b9b4a0d | 42 | py | Python | homeassistant/components/comed_hourly_pricing/__init__.py | domwillcode/home-assistant | f170c80bea70c939c098b5c88320a1c789858958 | [
"Apache-2.0"
] | 30,023 | 2016-04-13T10:17:53.000Z | 2020-03-02T12:56:31.000Z | homeassistant/components/comed_hourly_pricing/__init__.py | jagadeeshvenkatesh/core | 1bd982668449815fee2105478569f8e4b5670add | [
"Apache-2.0"
] | 31,101 | 2020-03-02T13:00:16.000Z | 2022-03-31T23:57:36.000Z | homeassistant/components/comed_hourly_pricing/__init__.py | jagadeeshvenkatesh/core | 1bd982668449815fee2105478569f8e4b5670add | [
"Apache-2.0"
] | 11,956 | 2016-04-13T18:42:31.000Z | 2020-03-02T09:32:12.000Z | """The comed_hourly_pricing component."""
| 21 | 41 | 0.761905 | 5 | 42 | 6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.071429 | 42 | 1 | 42 | 42 | 0.769231 | 0.833333 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
dba4b51b42204925e9e6b3c40e87491a680eb8dd | 16,236 | py | Python | tests/test_entity_package.py | MisterOwlPT/enlil | 0d2386f664acd69169675ae68036f83aa32df836 | [
"MIT"
] | null | null | null | tests/test_entity_package.py | MisterOwlPT/enlil | 0d2386f664acd69169675ae68036f83aa32df836 | [
"MIT"
] | null | null | null | tests/test_entity_package.py | MisterOwlPT/enlil | 0d2386f664acd69169675ae68036f83aa32df836 | [
"MIT"
] | null | null | null | # Enlil
#
# Copyright © 2021 Pedro Pereira, Rafael Arrais
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
import unittest
from unittest import mock
from pipeline.loader.entities.robot import Robot
from pipeline.loader.entities.package import Package
class TestEntityPackage(unittest.TestCase):
__robotic_area = {
'id': 'dummy_area',
'robots': ['dummy_ros1_robot', 'dummy_ros2_robot']
}
__robot_ros1_data = {
'id': 'dummy_ros1_robot',
'ros': 'melodic:11311',
'images': ['dummy_image']
}
__robot_ros2_data = {
'id': 'dummy_ros2_robot',
'ros': 'foxy:42',
'images': ['dummy_image']
}
__robot_ros1 = Robot(__robot_ros1_data, __robotic_area)
__robot_ros2 = Robot(__robot_ros2_data, __robotic_area)
@mock.patch.object(Package, '_Package__parse_yaml_data')
def test_parsing_package(self, mock):
""" Test if the same provided data is the one being parsed.
"""
yaml_data = {'dummy': 'dummy'}
package = Package(yaml_data, self.__robot_ros1)
mock.assert_called_once_with(yaml_data, self.__robot_ros1)
self.assertEqual(package.yaml_data, yaml_data)
def test_loading_package_without_id(self):
""" Test if execution is terminated if provided data has no required field "id".
"""
yaml_data = {'dummy': 'dummy'}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_invalid_id(self):
""" Test if execution is terminated if provided data has an empty "id" field.
"""
yaml_data = {'id': ''}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_without_path(self):
""" Test if execution is terminated if provided data has no required field "path".
"""
yaml_data = {'id': 'dummy_package', 'command': 'dummy_command'}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_invalid_path(self):
""" Test if execution is terminated if provided data has an empty "path" field.
"""
yaml_data = {'id': 'dummy_package', 'path': '', 'command': 'dummy_command'}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_without_command(self):
""" Test if execution is terminated if provided data has no required field "command".
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path'}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_invalid_command(self):
""" Test if execution is terminated if provided data has an empty "path" field.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': ''}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_id(self):
""" Test if "id" is set properly.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package_id = yaml_data['id']
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(package.id, f"{self.__robot_ros1.id}-{package_id}")
def test_loading_package_no_content(self):
""" Test if execution is terminated if any of the field "apt", "git" and "rosinstall" are not declared.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command'}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_empty_git(self):
""" Test if execution is terminated if field "git" is declared but not set.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': []}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_no_git(self):
""" Test if no "git clone" command is added when "git" field is not declared.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'apt': ['dummt_apt']}
package = Package(yaml_data, self.__robot_ros1)
self.assertTrue('git_cmds' not in package.yaml_data)
def test_loading_package_git_default_branch(self):
""" Test if "git clone" command are properly added when no branch is specified.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_git']}
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(len(package.yaml_data['git_cmds']), 1)
self.assertEqual(
package.yaml_data['git_cmds'][0],
f"git -C /ros_workspace/src clone -b {self.__robot_ros1_data['ros'].split(':')[0]} {yaml_data['git'][0]}"
)
def test_loading_package_git_branch(self):
""" Test if "git clone" command are properly added.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_git:branch']}
git_repo, git_branch = yaml_data['git'][0].split(':')
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(len(package.yaml_data['git_cmds']), 1)
self.assertEqual(
package.yaml_data['git_cmds'][0],
f"git -C /ros_workspace/src clone -b {git_branch} {git_repo}"
)
def test_loading_package_empty_apt(self):
""" Test if execution is terminated if field "apt" is declared but not set.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'apt': []}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_empty_rosinstall(self):
""" Test if execution is terminated if field "rosinstall" is declared but not set.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'rosinstall': []}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros1)
self.assertEqual(exception.exception.code, 1)
def test_loading_package_ros1_environment_variables(self):
""" Test if default environment variables are set properly for ROS1 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(len(package.yaml_data['environment']), 2)
self.assertTrue(f"ROS_HOSTNAME={yaml_data['id']}" in package.yaml_data['environment'])
self.assertTrue('ROS_MASTER_URI=http://roscore-{{ROBOT_ID}}:{{ROBOT_ROS_PORT}}' in package.yaml_data['environment'])
def test_loading_package_ros2_environment_variables(self):
""" Test if default environment variables are set properly for ROS2 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros2)
self.assertEqual(len(package.yaml_data['environment']), 1)
self.assertTrue('ROS_DOMAIN_ID={{ROBOT_ROS_DOMAIN}}' in package.yaml_data['environment'])
def test_loading_package_ros(self):
""" Test if field "ros" is set properly.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(package.yaml_data['ros'], '{{ROBOT_ROS_DISTRO}}')
def test_loading_package_ros1_networks(self):
""" Test if default networks are properly set for ROS1 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(len(package.yaml_data['networks']), 1)
self.assertTrue(f"{self.__robotic_area['id']}-network" in package.yaml_data['networks'])
def test_loading_package_ros2_networks(self):
""" Test if default networks are properly set for ROS2 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros2)
self.assertEqual(len(package.yaml_data['networks']), 1)
self.assertTrue(f"{self.__robotic_area['id']}-network" in package.yaml_data['networks'])
def test_loading_package_ros1_depends_on(self):
""" Test if field "depends_on" is properly set for ROS1 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(len(package.yaml_data['depends_on']), 1)
self.assertTrue(f"roscore-{self.__robot_ros1.yaml_data['id']}" in package.yaml_data['depends_on'])
def test_loading_package_ros2_depends_on(self):
""" Test if field "depends_on" is not set for ROS2 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros2)
self.assertEqual(len(package.yaml_data['depends_on']), 0)
def test_loading_package_ros1_restart_default(self):
""" Test if field "restart" is properly set to default for ROS1 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(package.yaml_data['restart'], 'always')
def test_loading_package_ros1_restart(self):
""" Test if field "restart" is properly set when specified for ROS1 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'restart': 'on-failure'}
package = Package(yaml_data, self.__robot_ros1)
self.assertEqual(package.yaml_data['restart'], yaml_data['restart'])
def test_loading_package_ros2_restart_default(self):
""" Test if field "restart" is properly set to default for ROS2 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo']}
package = Package(yaml_data, self.__robot_ros2)
self.assertEqual(package.yaml_data['restart'], 'always')
def test_loading_package_ros2_restart(self):
""" Test if field "restart" is properly set when specified for ROS2 packages.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'restart': 'on-failure'}
package = Package(yaml_data, self.__robot_ros2)
self.assertEqual(package.yaml_data['restart'], yaml_data['restart'])
def test_ssh_empty(self):
""" Test if execution finishes if field "ssh" is declared empty
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'ssh': []}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros2)
self.assertEqual(exception.exception.code, 1)
def test_ssh_not_list(self):
""" Test if execution finishes if field "ssh" is not a list.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'ssh': 'random_path'}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros2)
self.assertEqual(exception.exception.code, 1)
def test_ssh_not_list_of_files(self):
""" Test if execution finishes if field "ssh" is not a list of files.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'ssh': [[], 'dummy']}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros2)
self.assertEqual(exception.exception.code, 1)
def test_ssh_value_set(self):
""" Test if field "ssh" is properly set when defined.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'ssh': ['random_path']}
package = Package(yaml_data, self.__robot_ros2)
self.assertEqual(package.yaml_data['ssh'], yaml_data['ssh'])
def test_files_empty(self):
""" Test if execution finishes if field "files" is declared empty
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'files': []}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros2)
self.assertEqual(exception.exception.code, 1)
def test_files_not_list(self):
""" Test if execution finishes if field "files" is not a list.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'files': 'random_path'}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros2)
self.assertEqual(exception.exception.code, 1)
def test_files_not_list_of_files(self):
""" Test if execution finishes if field "files" is not a list of files.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'files': [[], 'dummy']}
with self.assertRaises(SystemExit) as exception:
Package(yaml_data, self.__robot_ros2)
self.assertEqual(exception.exception.code, 1)
def test_files_value_set(self):
""" Test if field "files" is properly set when defined.
"""
yaml_data = {'id': 'dummy_package', 'path': 'dummy_path', 'command': 'dummy_command', 'git': ['dummy_repo'], 'files': ['random_path']}
package = Package(yaml_data, self.__robot_ros2)
self.assertEqual(package.yaml_data['files'], yaml_data['files'])
if __name__ == '__main__':
unittest.main()
| 49.651376 | 142 | 0.658537 | 2,046 | 16,236 | 4.951613 | 0.101662 | 0.082914 | 0.087356 | 0.058731 | 0.792617 | 0.773369 | 0.758069 | 0.743855 | 0.72293 | 0.697957 | 0 | 0.008398 | 0.207933 | 16,236 | 326 | 143 | 49.803681 | 0.779316 | 0.22407 | 0 | 0.497409 | 0 | 0.005181 | 0.222824 | 0.024489 | 0 | 0 | 0 | 0 | 0.305699 | 1 | 0.176166 | false | 0 | 0.020725 | 0 | 0.227979 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
dba7339315253d931265b731f6ccf9148130bd0b | 52 | py | Python | com/LimePencil/Q24751/Betting.py | LimePencil/baekjoonProblems | 61eeeeb875585d165d9e39ecdb3d905b4ba6aa87 | [
"MIT"
] | null | null | null | com/LimePencil/Q24751/Betting.py | LimePencil/baekjoonProblems | 61eeeeb875585d165d9e39ecdb3d905b4ba6aa87 | [
"MIT"
] | null | null | null | com/LimePencil/Q24751/Betting.py | LimePencil/baekjoonProblems | 61eeeeb875585d165d9e39ecdb3d905b4ba6aa87 | [
"MIT"
] | null | null | null | n=int(input())
print((100-n)/n+1)
print(n/(100-n)+1) | 17.333333 | 18 | 0.596154 | 13 | 52 | 2.384615 | 0.461538 | 0.258065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.16 | 0.038462 | 52 | 3 | 19 | 17.333333 | 0.46 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.666667 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
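The two expressions compute decimal betting odds: if one side wins with probability n%, the fair total payout per unit stake is the profit `(100-n)/n` plus the returned stake, which simplifies to `100/n`; the second line is the same quantity for the other side. A small check of that simplification (the function name is illustrative):

```python
def decimal_odds(n):
    # (100 - n) / n + 1 == (100 - n + n) / n == 100 / n
    return (100 - n) / n + 1


assert abs(decimal_odds(50) - 2.0) < 1e-9  # even chances pay double
assert abs(decimal_odds(25) - 4.0) < 1e-9  # a 25% chance pays 4x
print(decimal_odds(20))  # 5.0
```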
dbbe69ba8c9ec5f85d5d0105f524d87dbd7a0b31 | 191 | py | Python | det3d/models/bbox_heads/__init__.py | Lelin-HUNUST/VISTA | 7bf34132d719cb0e5e803b92cd15451df58a9a5d | [
"MIT"
] | 47 | 2022-03-21T02:41:39.000Z | 2022-03-30T17:25:29.000Z | det3d/models/bbox_heads/__init__.py | Lelin-HUNUST/VISTA | 7bf34132d719cb0e5e803b92cd15451df58a9a5d | [
"MIT"
] | 1 | 2022-03-28T15:11:26.000Z | 2022-03-28T16:27:40.000Z | det3d/models/bbox_heads/__init__.py | Lelin-HUNUST/VISTA | 7bf34132d719cb0e5e803b92cd15451df58a9a5d | [
"MIT"
] | 2 | 2022-03-23T12:56:14.000Z | 2022-03-27T14:25:50.000Z | from .clear_mg_ohs_head import OHSHeadClear
from .deep_decouple_clear_mg_ohs_head import DeepMultiGroupOHSHeadClear_Decouple
__all__ = ["OHSHeadClear","DeepMultiGroupOHSHeadClear_Decouple"]
| 38.2 | 80 | 0.884817 | 21 | 191 | 7.380952 | 0.52381 | 0.090323 | 0.129032 | 0.180645 | 0.258065 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.062827 | 191 | 4 | 81 | 47.75 | 0.865922 | 0 | 0 | 0 | 0 | 0 | 0.246073 | 0.183246 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
91876bf04eeddbfefbd7072b2f55cc61515ad960 | 91 | py | Python | ingrex/__init__.py | Cojad/ingrex_lib | c2cff0b5fd1f8bc66bdc1d476a9d52720be214be | [
"MIT"
] | 70 | 2015-07-02T23:12:07.000Z | 2021-06-24T20:21:48.000Z | ingrex/__init__.py | Cojad/ingrex_lib | c2cff0b5fd1f8bc66bdc1d476a9d52720be214be | [
"MIT"
] | 10 | 2015-07-03T00:12:46.000Z | 2020-12-18T20:00:16.000Z | ingrex/__init__.py | Cojad/ingrex_lib | c2cff0b5fd1f8bc66bdc1d476a9d52720be214be | [
"MIT"
] | 52 | 2015-07-17T10:14:24.000Z | 2021-08-08T21:26:14.000Z | "Init"
from . intel import Intel
from . praser import Message
from . import utils as Utils
| 18.2 | 28 | 0.758242 | 14 | 91 | 4.928571 | 0.571429 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.186813 | 91 | 4 | 29 | 22.75 | 0.932432 | 0.043956 | 0 | 0 | 0 | 0 | 0.043956 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.75 | 0 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
91b3d7f41759cd3415c2a362dfbc09418f918ddb | 18,798 | py | Python | tensorflow/contrib/rnn/python/kernel_tests/rnn_test.py | atfkaka/tensorflow | 5657d0dee8d87f4594b3e5902ed3e3ca8d6dfc0a | [
"Apache-2.0"
] | 2 | 2018-04-10T11:50:28.000Z | 2019-01-08T02:40:17.000Z | tensorflow/contrib/rnn/python/kernel_tests/rnn_test.py | atfkaka/tensorflow | 5657d0dee8d87f4594b3e5902ed3e3ca8d6dfc0a | [
"Apache-2.0"
] | null | null | null | tensorflow/contrib/rnn/python/kernel_tests/rnn_test.py | atfkaka/tensorflow | 5657d0dee8d87f4594b3e5902ed3e3ca8d6dfc0a | [
"Apache-2.0"
] | 6 | 2017-04-14T07:11:14.000Z | 2019-11-20T08:19:15.000Z | # Copyright 2015 The TensorFlow Authors. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# ==============================================================================
"""Tests for rnn module."""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import itertools
import numpy as np
import tensorflow as tf
class StackBidirectionalRNNTest(tf.test.TestCase):
def setUp(self):
self._seed = 23489
np.random.seed(self._seed)
def _createStackBidirectionalRNN(self,
use_gpu,
use_shape,
use_sequence_length,
initial_states_fw=None,
initial_states_bw=None,
scope=None):
self.layers = [2, 3]
input_size = 5
batch_size = 2
max_length = 8
initializer = tf.random_uniform_initializer(-0.01, 0.01, seed=self._seed)
sequence_length = tf.placeholder(tf.int64) if use_sequence_length else None
self.cells_fw = [tf.nn.rnn_cell.LSTMCell(
num_units, input_size, initializer=initializer, state_is_tuple=False)
for num_units in self.layers]
self.cells_bw = [tf.nn.rnn_cell.LSTMCell(
num_units, input_size, initializer=initializer, state_is_tuple=False)
for num_units in self.layers]
inputs = max_length * [
tf.placeholder(
tf.float32,
shape=(batch_size, input_size) if use_shape else (None, input_size))
]
outputs, state_fw, state_bw = tf.contrib.rnn.stack_bidirectional_rnn(
self.cells_fw,
self.cells_bw,
inputs,
initial_states_fw,
initial_states_bw,
dtype=tf.float32,
sequence_length=sequence_length,
scope=scope)
self.assertEqual(len(outputs), len(inputs))
for out in outputs:
self.assertEqual(
out.get_shape().as_list(),
[batch_size if use_shape else None, 2 * self.layers[-1]])
input_value = np.random.randn(batch_size, input_size)
outputs = tf.stack(outputs)
return input_value, inputs, outputs, state_fw, state_bw, sequence_length
def _testStackBidirectionalRNN(self, use_gpu, use_shape):
with self.test_session(use_gpu=use_gpu, graph=tf.Graph()) as sess:
input_value, inputs, outputs, state_fw, state_bw, sequence_length = (
self._createStackBidirectionalRNN(use_gpu, use_shape, True))
tf.global_variables_initializer().run()
# Run with pre-specified sequence lengths of 2, 3.
out, s_fw, s_bw = sess.run([outputs, state_fw, state_bw],
feed_dict={inputs[0]: input_value,
sequence_length: [2, 3]})
# Since the forward and backward LSTM cells were initialized with the
# same parameters, the forward and backward states of the first layer
# must be the same.
# For the next layers, since the input is a concat of forward and backward
# outputs of the previous layers the symmetry is broken and the following
# states and outputs differ.
# We cannot access the intermediate values between layers but we can
# check that the forward and backward states of the first layer match.
self.assertAllClose(s_fw[0], s_bw[0])
# If outputs were not concatenated between layers, the forward and
# backward outputs would be the same but symmetric (time-reversed).
# Check that this is not the case.
# Due to depth concatenation (as num_units=3 for both RNNs):
# - forward output: out[t][b][depth] for 0 <= depth < 3
# - backward output: out[t][b][depth] for 3 <= depth < 6
# First sequence in batch is length=2
# Check that the time=0 forward output is not equal to time=1 backward.
self.assertNotEqual(out[0][0][0], out[1][0][3])
self.assertNotEqual(out[0][0][1], out[1][0][4])
self.assertNotEqual(out[0][0][2], out[1][0][5])
# Check that the time=1 forward output is not equal to time=0 backward.
self.assertNotEqual(out[1][0][0], out[0][0][3])
self.assertNotEqual(out[1][0][1], out[0][0][4])
self.assertNotEqual(out[1][0][2], out[0][0][5])
# Second sequence in batch is length=3
# Check that the time=0 forward output is not equal to time=2 backward.
self.assertNotEqual(out[0][1][0], out[2][1][3])
self.assertNotEqual(out[0][1][1], out[2][1][4])
self.assertNotEqual(out[0][1][2], out[2][1][5])
# Check that the time=1 forward output is not equal to time=1 backward.
self.assertNotEqual(out[1][1][0], out[1][1][3])
self.assertNotEqual(out[1][1][1], out[1][1][4])
self.assertNotEqual(out[1][1][2], out[1][1][5])
# Check that the time=2 forward output is not equal to time=0 backward.
self.assertNotEqual(out[2][1][0], out[0][1][3])
self.assertNotEqual(out[2][1][1], out[0][1][4])
self.assertNotEqual(out[2][1][2], out[0][1][5])
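The depth layout the checks above rely on (forward half at depths 0..2, backward half at depths 3..5 after concatenation) can be sketched with plain NumPy; the arrays below are illustrative stand-ins, not values produced by the test:

```python
import numpy as np

# Hypothetical per-direction outputs of one layer with num_units=3,
# shaped (time, batch, num_units).
fw = np.arange(6, dtype=np.float32).reshape(2, 1, 3)
bw = -np.arange(6, dtype=np.float32).reshape(2, 1, 3)

# Bidirectional RNNs concatenate along the depth axis, giving
# shape (time, batch, 2 * num_units).
out = np.concatenate([fw, bw], axis=-1)
assert out.shape == (2, 1, 6)
# Forward half occupies depths 0..2, backward half depths 3..5.
assert (out[1, 0, :3] == fw[1, 0]).all()
assert (out[1, 0, 3:] == bw[1, 0]).all()
```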
def _testStackBidirectionalRNNStates(self, use_gpu):
# Check that the states are correctly initialized.
# - Create a net and iterate for 3 steps. Keep the state (state_3).
# - Reset states, and iterate for 5 steps. Last state is state_5.
# - Reset the states to state_3 and iterate for 2 more steps;
#   the last state will be state_5'.
# - Check that the state_5 and state_5' (forward and backward) are the
# same for the first layer (it does not apply for the second layer since
# it has forward-backward dependencies).
with self.test_session(use_gpu=use_gpu, graph=tf.Graph()) as sess:
batch_size = 2
# Create states placeholders.
initial_states_fw = [tf.placeholder(tf.float32, shape=(batch_size, layer*2))
for layer in self.layers]
initial_states_bw = [tf.placeholder(tf.float32, shape=(batch_size, layer*2))
for layer in self.layers]
# Create the net
input_value, inputs, outputs, state_fw, state_bw, sequence_length = (
self._createStackBidirectionalRNN(use_gpu, True, True,
initial_states_fw, initial_states_bw))
tf.global_variables_initializer().run()
# Run 3 steps.
feed_dict = {inputs[0]: input_value, sequence_length: [3, 2]}
# Initialize to empty state.
for i, layer in enumerate(self.layers):
feed_dict[initial_states_fw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
feed_dict[initial_states_bw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
_, st_3_fw, st_3_bw = sess.run([outputs, state_fw, state_bw],
feed_dict=feed_dict)
# Reset the net and run 5 steps.
feed_dict = {inputs[0]: input_value, sequence_length: [5, 3]}
for i, layer in enumerate(self.layers):
feed_dict[initial_states_fw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
feed_dict[initial_states_bw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
_, st_5_fw, st_5_bw = sess.run([outputs, state_fw, state_bw],
feed_dict=feed_dict)
# Reset the net to state_3 and run 2 more steps.
feed_dict = {inputs[0]: input_value, sequence_length: [2, 1]}
for i, _ in enumerate(self.layers):
feed_dict[initial_states_fw[i]] = st_3_fw[i]
feed_dict[initial_states_bw[i]] = st_3_bw[i]
out_5p, st_5p_fw, st_5p_bw = sess.run([outputs, state_fw, state_bw],
feed_dict=feed_dict)
# Check that the 3+2-step and the 5-step first layer states match.
self.assertAllEqual(st_5_fw[0], st_5p_fw[0])
self.assertAllEqual(st_5_bw[0], st_5p_bw[0])
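The save/restore property verified above (3 steps, then 2 more resumed from the saved state, matches 5 steps from scratch for the first layer) has a simple scalar analogue; the toy recurrence below is illustrative only, not the LSTM update:

```python
def step(state, x):
    # Toy recurrence standing in for one RNN cell update.
    return 0.5 * state + x

def run(state, n_steps):
    # Iterate the recurrence n_steps times from the given state.
    for _ in range(n_steps):
        state = step(state, 1.0)
    return state

state_3 = run(0.0, 3)       # iterate 3 steps from the zero state
state_5 = run(0.0, 5)       # iterate 5 steps from the zero state
state_5p = run(state_3, 2)  # resume from state_3 for 2 more steps
assert state_5 == state_5p  # exact for this deterministic recurrence
```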
def testStackBidirectionalRNN(self):
self._testStackBidirectionalRNN(use_gpu=False, use_shape=False)
self._testStackBidirectionalRNN(use_gpu=True, use_shape=False)
self._testStackBidirectionalRNN(use_gpu=False, use_shape=True)
self._testStackBidirectionalRNN(use_gpu=True, use_shape=True)
self._testStackBidirectionalRNNStates(use_gpu=False)
self._testStackBidirectionalRNNStates(use_gpu=True)
def _createStackBidirectionalDynamicRNN(self,
use_gpu,
use_shape,
use_state_tuple,
initial_states_fw=None,
initial_states_bw=None,
scope=None):
self.layers = [2, 3]
input_size = 5
batch_size = 2
max_length = 8
initializer = tf.random_uniform_initializer(-0.01, 0.01, seed=self._seed)
sequence_length = tf.placeholder(tf.int64)
self.cells_fw = [tf.nn.rnn_cell.LSTMCell(
num_units, input_size, initializer=initializer, state_is_tuple=False)
for num_units in self.layers]
self.cells_bw = [tf.nn.rnn_cell.LSTMCell(
num_units, input_size, initializer=initializer, state_is_tuple=False)
for num_units in self.layers]
inputs = max_length * [
tf.placeholder(
tf.float32,
shape=(batch_size, input_size) if use_shape else (None, input_size))
]
inputs_c = tf.stack(inputs)
inputs_c = tf.transpose(inputs_c, [1, 0, 2])
outputs, st_fw, st_bw = tf.contrib.rnn.stack_bidirectional_dynamic_rnn(
self.cells_fw,
self.cells_bw,
inputs_c,
initial_states_fw=initial_states_fw,
initial_states_bw=initial_states_bw,
dtype=tf.float32,
sequence_length=sequence_length,
scope=scope)
# Outputs have shape (batch_size, max_length, 2 * self.layers[-1]).
output_shape = [None, max_length, 2 * self.layers[-1]]
if use_shape:
output_shape[0] = batch_size
self.assertAllEqual(outputs.get_shape().as_list(), output_shape)
input_value = np.random.randn(batch_size, input_size)
return input_value, inputs, outputs, st_fw, st_bw, sequence_length
def _testStackBidirectionalDynamicRNN(self, use_gpu, use_shape,
use_state_tuple):
with self.test_session(use_gpu=use_gpu, graph=tf.Graph()) as sess:
input_value, inputs, outputs, state_fw, state_bw, sequence_length = (
self._createStackBidirectionalDynamicRNN(use_gpu, use_shape,
use_state_tuple))
tf.global_variables_initializer().run()
# Run with pre-specified sequence length of 2, 3
out, s_fw, s_bw = sess.run([outputs, state_fw, state_bw],
feed_dict={inputs[0]: input_value,
sequence_length: [2, 3]})
# Since the forward and backward LSTM cells were initialized with the
# same parameters, the forward and backward states of the first layer have
# to be the same.
# For the next layers, since the input is a concat of forward and backward
# outputs of the previous layers the symmetry is broken and the following
# states and outputs differ.
# We cannot access the intermediate values between layers but we can
# check that the forward and backward states of the first layer match.
self.assertAllClose(s_fw[0], s_bw[0])
out = np.swapaxes(out, 0, 1)
# If outputs were not concatenated between layers, the forward and
# backward outputs would be the same but symmetric (time-reversed).
# Check that this is not the case.
# Due to depth concatenation (as num_units=3 for both RNNs):
# - forward output: out[t][b][depth] for 0 <= depth < 3
# - backward output: out[t][b][depth] for 3 <= depth < 6
# First sequence in batch is length=2
# Check that the time=0 forward output is not equal to time=1 backward.
self.assertNotEqual(out[0][0][0], out[1][0][3])
self.assertNotEqual(out[0][0][1], out[1][0][4])
self.assertNotEqual(out[0][0][2], out[1][0][5])
# Check that the time=1 forward output is not equal to time=0 backward.
self.assertNotEqual(out[1][0][0], out[0][0][3])
self.assertNotEqual(out[1][0][1], out[0][0][4])
self.assertNotEqual(out[1][0][2], out[0][0][5])
# Second sequence in batch is length=3
# Check that the time=0 forward output is not equal to time=2 backward.
self.assertNotEqual(out[0][1][0], out[2][1][3])
self.assertNotEqual(out[0][1][1], out[2][1][4])
self.assertNotEqual(out[0][1][2], out[2][1][5])
# Check that the time=1 forward output is not equal to time=1 backward.
self.assertNotEqual(out[1][1][0], out[1][1][3])
self.assertNotEqual(out[1][1][1], out[1][1][4])
self.assertNotEqual(out[1][1][2], out[1][1][5])
# Check that the time=2 forward output is not equal to time=0 backward.
self.assertNotEqual(out[2][1][0], out[0][1][3])
self.assertNotEqual(out[2][1][1], out[0][1][4])
self.assertNotEqual(out[2][1][2], out[0][1][5])
def _testStackBidirectionalDynamicRNNStates(self, use_gpu):
# Check that the states are correctly initialized.
# - Create a net and iterate for 3 steps. Keep the state (state_3).
# - Reset states, and iterate for 5 steps. Last state is state_5.
# - Reset the states to state_3 and iterate for 2 more steps;
#   the last state will be state_5'.
# - Check that the state_5 and state_5' (forward and backward) are the
# same for the first layer (it does not apply for the second layer since
# it has forward-backward dependencies).
with self.test_session(use_gpu=use_gpu, graph=tf.Graph()) as sess:
batch_size = 2
# Create states placeholders.
initial_states_fw = [tf.placeholder(tf.float32, shape=(batch_size, layer*2))
for layer in self.layers]
initial_states_bw = [tf.placeholder(tf.float32, shape=(batch_size, layer*2))
for layer in self.layers]
# Create the net
input_value, inputs, outputs, state_fw, state_bw, sequence_length = (
self._createStackBidirectionalDynamicRNN(
use_gpu,
use_shape=True,
use_state_tuple=False,
initial_states_fw=initial_states_fw,
initial_states_bw=initial_states_bw))
tf.global_variables_initializer().run()
# Run 3 steps.
feed_dict = {inputs[0]: input_value, sequence_length: [3, 2]}
# Initialize to empty state.
for i, layer in enumerate(self.layers):
feed_dict[initial_states_fw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
feed_dict[initial_states_bw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
_, st_3_fw, st_3_bw = sess.run([outputs, state_fw, state_bw],
feed_dict=feed_dict)
# Reset the net and run 5 steps.
feed_dict = {inputs[0]: input_value, sequence_length: [5, 3]}
for i, layer in enumerate(self.layers):
feed_dict[initial_states_fw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
feed_dict[initial_states_bw[i]] = np.zeros((batch_size, layer*2),
dtype=np.float32)
_, st_5_fw, st_5_bw = sess.run([outputs, state_fw, state_bw],
feed_dict=feed_dict)
# Reset the net to state_3 and run 2 more steps.
feed_dict = {inputs[0]: input_value, sequence_length: [2, 1]}
for i, _ in enumerate(self.layers):
feed_dict[initial_states_fw[i]] = st_3_fw[i]
feed_dict[initial_states_bw[i]] = st_3_bw[i]
out_5p, st_5p_fw, st_5p_bw = sess.run([outputs, state_fw, state_bw],
feed_dict=feed_dict)
# Check that the 3+2-step and the 5-step first layer states match.
self.assertAllEqual(st_5_fw[0], st_5p_fw[0])
self.assertAllEqual(st_5_bw[0], st_5p_bw[0])
def testBidirectionalRNN(self):
# Generate all 2^3 option combinations,
# from (True, True, True) to (False, False, False).
options = itertools.product([True, False], repeat=3)
for option in options:
self._testStackBidirectionalDynamicRNN(
use_gpu=option[0], use_shape=option[1], use_state_tuple=option[2])
# Check States.
self._testStackBidirectionalDynamicRNNStates(
use_gpu=False)
self._testStackBidirectionalDynamicRNNStates(
use_gpu=True)
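The sweep above enumerates all 2^3 boolean option combinations via itertools.product; a standalone sketch of that enumeration:

```python
import itertools

# product([True, False], repeat=3) yields every 3-tuple of booleans,
# in lexicographic order of the input sequence.
options = list(itertools.product([True, False], repeat=3))
assert len(options) == 8                     # 2^3 combinations
assert options[0] == (True, True, True)      # first tuple
assert options[-1] == (False, False, False)  # last tuple
```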
def _testScope(self, factory, prefix="prefix", use_outer_scope=True):
# REMARKS: factory(scope) is a function accepting a scope
# argument; the scope can be None, a string,
# or a VariableScope instance.
with self.test_session(use_gpu=True, graph=tf.Graph()):
if use_outer_scope:
with tf.variable_scope(prefix) as scope:
factory(scope)
else:
factory(prefix)
# Check that all the variable names start with the proper scope.
tf.global_variables_initializer()
all_vars = tf.all_variables()
prefix = prefix or "stack_bidirectional_rnn"
scope_vars = [v for v in all_vars if v.name.startswith(prefix + "/")]
tf.logging.info("StackRNN with scope: %s (%s)"
% (prefix, "scope" if use_outer_scope else "str"))
for v in scope_vars:
tf.logging.info(v.name)
self.assertEqual(len(scope_vars), len(all_vars))
def testStackBidirectionalRNNScope(self):
def factory(scope):
return self._createStackBidirectionalRNN(
use_gpu=True, use_shape=True,
use_sequence_length=True, scope=scope)
self._testScope(factory, use_outer_scope=True)
self._testScope(factory, use_outer_scope=False)
self._testScope(factory, prefix=None, use_outer_scope=False)
def testBidirectionalDynamicRNNScope(self):
def factory(scope):
return self._createStackBidirectionalDynamicRNN(
use_gpu=True, use_shape=True, use_state_tuple=True, scope=scope)
self._testScope(factory, use_outer_scope=True)
self._testScope(factory, use_outer_scope=False)
self._testScope(factory, prefix=None, use_outer_scope=False)
if __name__ == "__main__":
tf.test.main()
# tests/hwsim/test_dfs.py (from repo zhijianli88/hostap)
# Test cases for DFS
# Copyright (c) 2013, Jouni Malinen <j@w1.fi>
#
# This software may be distributed under the terms of the BSD license.
# See README for more details.
from remotehost import remote_compatible
import os
import subprocess
import time
import logging
logger = logging.getLogger()
import hwsim_utils
import hostapd
from utils import HwsimSkip
def wait_dfs_event(hapd, event, timeout):
dfs_events = [ "DFS-RADAR-DETECTED", "DFS-NEW-CHANNEL",
"DFS-CAC-START", "DFS-CAC-COMPLETED",
"DFS-NOP-FINISHED", "AP-ENABLED", "AP-CSA-FINISHED" ]
ev = hapd.wait_event(dfs_events, timeout=timeout)
if not ev:
raise Exception("DFS event timed out")
if event and event not in ev:
raise Exception("Unexpected DFS event: " + ev + " (expected: %s)" % event)
return ev
def start_dfs_ap(ap, allow_failure=False, ssid="dfs", ht=True, ht40=False,
ht40minus=False, vht80=False, vht20=False, chanlist=None,
channel=None, country="FI"):
ifname = ap['ifname']
logger.info("Starting AP " + ifname + " on DFS channel")
hapd = hostapd.add_ap(ap, {}, no_enable=True)
hapd.set("ssid", ssid)
hapd.set("country_code", country)
hapd.set("ieee80211d", "1")
hapd.set("ieee80211h", "1")
hapd.set("hw_mode", "a")
hapd.set("channel", "52")
if not ht:
hapd.set("ieee80211n", "0")
if ht40:
hapd.set("ht_capab", "[HT40+]")
elif ht40minus:
hapd.set("ht_capab", "[HT40-]")
hapd.set("channel", "56")
if vht80:
hapd.set("ieee80211ac", "1")
hapd.set("vht_oper_chwidth", "1")
hapd.set("vht_oper_centr_freq_seg0_idx", "58")
if vht20:
hapd.set("ieee80211ac", "1")
hapd.set("vht_oper_chwidth", "0")
hapd.set("vht_oper_centr_freq_seg0_idx", "0")
if chanlist:
hapd.set("chanlist", chanlist)
if channel:
hapd.set("channel", str(channel))
hapd.enable()
ev = wait_dfs_event(hapd, "DFS-CAC-START", 5)
if "DFS-CAC-START" not in ev:
raise Exception("Unexpected DFS event: " + ev)
state = hapd.get_status_field("state")
if state != "DFS":
if allow_failure:
logger.info("Interface state not DFS: " + state)
if not os.path.exists("dfs"):
raise HwsimSkip("Assume DFS testing not supported")
raise Exception("Failed to start DFS AP")
raise Exception("Unexpected interface state: " + state)
return hapd
def dfs_simulate_radar(hapd):
logger.info("Trigger a simulated radar event")
phyname = hapd.get_driver_status_field("phyname")
radar_file = '/sys/kernel/debug/ieee80211/' + phyname + '/hwsim/dfs_simulate_radar'
with open(radar_file, 'w') as f:
f.write('1')
def test_dfs(dev, apdev):
"""DFS CAC functionality on clear channel"""
try:
hapd = None
hapd = start_dfs_ap(apdev[0], allow_failure=True, country="US")
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 70)
if "success=1" not in ev:
raise Exception("CAC failed")
if "freq=5260" not in ev:
raise Exception("Unexpected DFS freq result")
ev = hapd.wait_event(["AP-ENABLED"], timeout=5)
if not ev:
raise Exception("AP setup timed out")
state = hapd.get_status_field("state")
if state != "ENABLED":
raise Exception("Unexpected interface state")
freq = hapd.get_status_field("freq")
if freq != "5260":
raise Exception("Unexpected frequency")
dev[0].connect("dfs", key_mgmt="NONE")
hwsim_utils.test_connectivity(dev[0], hapd)
hapd.request("RADAR DETECTED freq=5260 ht_enabled=1 chan_width=1")
ev = hapd.wait_event(["DFS-RADAR-DETECTED"], timeout=10)
if ev is None:
raise Exception("DFS-RADAR-DETECTED event not reported")
if "freq=5260" not in ev:
raise Exception("Incorrect frequency in radar detected event: " + ev)
ev = hapd.wait_event(["DFS-NEW-CHANNEL"], timeout=70)
if ev is None:
raise Exception("DFS-NEW-CHANNEL event not reported")
if "freq=5260" in ev:
raise Exception("Channel did not change after radar was detected")
ev = hapd.wait_event(["AP-CSA-FINISHED"], timeout=70)
if ev is None:
raise Exception("AP-CSA-FINISHED event not reported")
if "freq=5260" in ev:
raise Exception("Channel did not change after radar was detected(2)")
time.sleep(1)
hwsim_utils.test_connectivity(dev[0], hapd)
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
def test_dfs_etsi(dev, apdev, params):
"""DFS and uniform spreading requirement for ETSI [long]"""
if not params['long']:
raise HwsimSkip("Skip test case with long duration due to --long not specified")
try:
hapd = None
hapd = start_dfs_ap(apdev[0], allow_failure=True)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 70)
if "success=1" not in ev:
raise Exception("CAC failed")
if "freq=5260" not in ev:
raise Exception("Unexpected DFS freq result")
ev = hapd.wait_event(["AP-ENABLED"], timeout=5)
if not ev:
raise Exception("AP setup timed out")
state = hapd.get_status_field("state")
if state != "ENABLED":
raise Exception("Unexpected interface state")
freq = hapd.get_status_field("freq")
if freq != "5260":
raise Exception("Unexpected frequency")
dev[0].connect("dfs", key_mgmt="NONE")
hwsim_utils.test_connectivity(dev[0], hapd)
hapd.request("RADAR DETECTED freq=%s ht_enabled=1 chan_width=1" % freq)
ev = hapd.wait_event(["DFS-RADAR-DETECTED"], timeout=5)
if ev is None:
raise Exception("DFS-RADAR-DETECTED event not reported")
if "freq=%s" % freq not in ev:
raise Exception("Incorrect frequency in radar detected event: " + ev)
ev = hapd.wait_event(["DFS-NEW-CHANNEL"], timeout=5)
if ev is None:
raise Exception("DFS-NEW-CHANNEL event not reported")
if "freq=%s" % freq in ev:
raise Exception("Channel did not change after radar was detected")
ev = hapd.wait_event(["AP-CSA-FINISHED", "DFS-CAC-START"], timeout=10)
if ev is None:
raise Exception("AP-CSA-FINISHED or DFS-CAC-START event not reported")
if "DFS-CAC-START" in ev:
# The selected new channel requires CAC
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 70)
if "success=1" not in ev:
raise Exception("CAC failed")
ev = hapd.wait_event(["AP-ENABLED"], timeout=5)
if not ev:
raise Exception("AP setup timed out")
ev = hapd.wait_event(["AP-STA-CONNECTED"], timeout=30)
if not ev:
raise Exception("STA did not reconnect on new DFS channel")
else:
# The new channel did not require CAC - try again
if "freq=%s" % freq in ev:
raise Exception("Channel did not change after radar was detected(2)")
time.sleep(1)
hwsim_utils.test_connectivity(dev[0], hapd)
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
def test_dfs_radar(dev, apdev):
"""DFS CAC functionality with radar detected"""
try:
hapd = None
hapd2 = None
hapd = start_dfs_ap(apdev[0], allow_failure=True)
time.sleep(1)
dfs_simulate_radar(hapd)
hapd2 = start_dfs_ap(apdev[1], ssid="dfs2", ht40=True)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 5)
if ev is None:
raise Exception("Timeout on DFS aborted event")
if "success=0 freq=5260" not in ev:
raise Exception("Unexpected DFS aborted event contents: " + ev)
ev = wait_dfs_event(hapd, "DFS-RADAR-DETECTED", 5)
if "freq=5260" not in ev:
raise Exception("Unexpected DFS radar detection freq")
ev = wait_dfs_event(hapd, "DFS-NEW-CHANNEL", 5)
if "freq=5260" in ev:
raise Exception("Unexpected DFS new freq")
ev = wait_dfs_event(hapd, None, 5)
if "AP-ENABLED" in ev:
logger.info("Started AP on non-DFS channel")
else:
logger.info("Trying to start AP on another DFS channel")
if "DFS-CAC-START" not in ev:
raise Exception("Unexpected DFS event: " + ev)
if "freq=5260" in ev:
raise Exception("Unexpected DFS CAC freq")
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 70)
if "success=1" not in ev:
raise Exception("CAC failed")
if "freq=5260" in ev:
raise Exception("Unexpected DFS freq result - radar channel")
ev = hapd.wait_event(["AP-ENABLED"], timeout=5)
if not ev:
raise Exception("AP setup timed out")
state = hapd.get_status_field("state")
if state != "ENABLED":
raise Exception("Unexpected interface state")
freq = hapd.get_status_field("freq")
if freq == "5260":
raise Exception("Unexpected frequency: " + freq)
dev[0].connect("dfs", key_mgmt="NONE")
ev = hapd2.wait_event(["AP-ENABLED"], timeout=70)
if not ev:
raise Exception("AP2 setup timed out")
dfs_simulate_radar(hapd2)
ev = wait_dfs_event(hapd2, "DFS-RADAR-DETECTED", 5)
if "freq=5260 ht_enabled=1 chan_offset=1 chan_width=2" not in ev:
raise Exception("Unexpected DFS radar detection freq from AP2")
ev = wait_dfs_event(hapd2, "DFS-NEW-CHANNEL", 5)
if "freq=5260" in ev:
raise Exception("Unexpected DFS new freq for AP2")
wait_dfs_event(hapd2, None, 5)
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
if hapd2:
hapd2.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
@remote_compatible
def test_dfs_radar_on_non_dfs_channel(dev, apdev):
"""DFS radar detection test code on non-DFS channel"""
params = { "ssid": "radar" }
hapd = hostapd.add_ap(apdev[0], params)
hapd.request("RADAR DETECTED freq=5260 ht_enabled=1 chan_width=1")
hapd.request("RADAR DETECTED freq=2412 ht_enabled=1 chan_width=1")
def test_dfs_radar_chanlist(dev, apdev):
"""DFS chanlist when radar is detected"""
try:
hapd = None
hapd = start_dfs_ap(apdev[0], chanlist="40 44", allow_failure=True)
time.sleep(1)
dfs_simulate_radar(hapd)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 5)
if ev is None:
raise Exception("Timeout on DFS aborted event")
if "success=0 freq=5260" not in ev:
raise Exception("Unexpected DFS aborted event contents: " + ev)
ev = wait_dfs_event(hapd, "DFS-RADAR-DETECTED", 5)
if "freq=5260" not in ev:
raise Exception("Unexpected DFS radar detection freq")
ev = wait_dfs_event(hapd, "DFS-NEW-CHANNEL", 5)
if "freq=5200 chan=40" not in ev and "freq=5220 chan=44" not in ev:
raise Exception("Unexpected DFS new freq: " + ev)
ev = wait_dfs_event(hapd, None, 5)
if "AP-ENABLED" not in ev:
raise Exception("Unexpected DFS event: " + ev)
dev[0].connect("dfs", key_mgmt="NONE")
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
def test_dfs_radar_chanlist_vht80(dev, apdev):
"""DFS chanlist when radar is detected and VHT80 configured"""
try:
hapd = None
hapd = start_dfs_ap(apdev[0], chanlist="36", ht40=True, vht80=True,
allow_failure=True)
time.sleep(1)
dfs_simulate_radar(hapd)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 5)
if ev is None:
raise Exception("Timeout on DFS aborted event")
if "success=0 freq=5260" not in ev:
raise Exception("Unexpected DFS aborted event contents: " + ev)
ev = wait_dfs_event(hapd, "DFS-RADAR-DETECTED", 5)
if "freq=5260" not in ev:
raise Exception("Unexpected DFS radar detection freq")
ev = wait_dfs_event(hapd, "DFS-NEW-CHANNEL", 5)
if "freq=5180 chan=36 sec_chan=1" not in ev:
raise Exception("Unexpected DFS new freq: " + ev)
ev = wait_dfs_event(hapd, None, 5)
if "AP-ENABLED" not in ev:
raise Exception("Unexpected DFS event: " + ev)
dev[0].connect("dfs", key_mgmt="NONE")
if hapd.get_status_field('vht_oper_centr_freq_seg0_idx') != "42":
raise Exception("Unexpected seg0 idx")
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
def test_dfs_radar_chanlist_vht20(dev, apdev):
"""DFS chanlist when radar is detected and VHT20 configured"""
try:
hapd = None
hapd = start_dfs_ap(apdev[0], chanlist="36", vht20=True,
allow_failure=True)
time.sleep(1)
dfs_simulate_radar(hapd)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 5)
if ev is None:
raise Exception("Timeout on DFS aborted event")
if "success=0 freq=5260" not in ev:
raise Exception("Unexpected DFS aborted event contents: " + ev)
ev = wait_dfs_event(hapd, "DFS-RADAR-DETECTED", 5)
if "freq=5260" not in ev:
raise Exception("Unexpected DFS radar detection freq")
ev = wait_dfs_event(hapd, "DFS-NEW-CHANNEL", 5)
if "freq=5180 chan=36 sec_chan=0" not in ev:
raise Exception("Unexpected DFS new freq: " + ev)
ev = wait_dfs_event(hapd, None, 5)
if "AP-ENABLED" not in ev:
raise Exception("Unexpected DFS event: " + ev)
dev[0].connect("dfs", key_mgmt="NONE")
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
def test_dfs_radar_no_ht(dev, apdev):
"""DFS chanlist when radar is detected and no HT configured"""
try:
hapd = None
hapd = start_dfs_ap(apdev[0], chanlist="36", ht=False,
allow_failure=True)
time.sleep(1)
dfs_simulate_radar(hapd)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 5)
if ev is None:
raise Exception("Timeout on DFS aborted event")
if "success=0 freq=5260" not in ev:
raise Exception("Unexpected DFS aborted event contents: " + ev)
ev = wait_dfs_event(hapd, "DFS-RADAR-DETECTED", 5)
if "freq=5260 ht_enabled=0" not in ev:
raise Exception("Unexpected DFS radar detection freq: " + ev)
ev = wait_dfs_event(hapd, "DFS-NEW-CHANNEL", 5)
if "freq=5180 chan=36 sec_chan=0" not in ev:
raise Exception("Unexpected DFS new freq: " + ev)
ev = wait_dfs_event(hapd, None, 5)
if "AP-ENABLED" not in ev:
raise Exception("Unexpected DFS event: " + ev)
dev[0].connect("dfs", key_mgmt="NONE")
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
def test_dfs_radar_ht40minus(dev, apdev):
"""DFS chanlist when radar is detected and HT40- configured"""
try:
hapd = None
hapd = start_dfs_ap(apdev[0], chanlist="36", ht40minus=True,
allow_failure=True)
time.sleep(1)
dfs_simulate_radar(hapd)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 5)
if ev is None:
raise Exception("Timeout on DFS aborted event")
if "success=0 freq=5280 ht_enabled=1 chan_offset=-1" not in ev:
raise Exception("Unexpected DFS aborted event contents: " + ev)
ev = wait_dfs_event(hapd, "DFS-RADAR-DETECTED", 5)
if "freq=5280 ht_enabled=1 chan_offset=-1" not in ev:
raise Exception("Unexpected DFS radar detection freq: " + ev)
ev = wait_dfs_event(hapd, "DFS-NEW-CHANNEL", 5)
if "freq=5180 chan=36 sec_chan=1" not in ev:
raise Exception("Unexpected DFS new freq: " + ev)
ev = wait_dfs_event(hapd, None, 5)
if "AP-ENABLED" not in ev:
raise Exception("Unexpected DFS event: " + ev)
dev[0].connect("dfs", key_mgmt="NONE")
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
def test_dfs_ht40_minus(dev, apdev, params):
"""DFS CAC functionality on channel 104 HT40- [long]"""
if not params['long']:
raise HwsimSkip("Skip test case with long duration due to --long not specified")
try:
hapd = None
hapd = start_dfs_ap(apdev[0], allow_failure=True, ht40minus=True,
channel=104)
ev = wait_dfs_event(hapd, "DFS-CAC-COMPLETED", 70)
if "success=1" not in ev:
raise Exception("CAC failed")
if "freq=5520" not in ev:
raise Exception("Unexpected DFS freq result")
ev = hapd.wait_event(["AP-ENABLED"], timeout=5)
if not ev:
raise Exception("AP setup timed out")
state = hapd.get_status_field("state")
if state != "ENABLED":
raise Exception("Unexpected interface state")
freq = hapd.get_status_field("freq")
if freq != "5520":
raise Exception("Unexpected frequency")
dev[0].connect("dfs", key_mgmt="NONE", scan_freq="5520")
hwsim_utils.test_connectivity(dev[0], hapd)
finally:
dev[0].request("DISCONNECT")
if hapd:
hapd.request("DISABLE")
subprocess.call(['iw', 'reg', 'set', '00'])
dev[0].flush_scan_cache()
# IceGame/h5game/config/__init__.py (from repo onebitxy/djangoing)
#!/usr/bin/python3.7
# -*- coding: utf-8 -*-
from .game_conf import *
# Mundo01/Python/aula07-007.py (from repo molonti/CursoemVideo---Python)
n1 = int(input('Enter grade 1: '))
n2 = int(input('Enter grade 2: '))
print(f'The average of the grades is: {(n1 + n2) / 2}')
| 29.25 | 42 | 0.598291 | 24 | 117 | 2.916667 | 0.625 | 0.228571 | 0.4 | 0.428571 | 0.542857 | 0 | 0 | 0 | 0 | 0 | 0 | 0.072165 | 0.17094 | 117 | 3 | 43 | 39 | 0.649485 | 0 | 0 | 0 | 0 | 0 | 0.564103 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0.333333 | 1 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
37db5744df6285baefcb20dc0fbbd7396be022a0 | 54 | py | Python | dictionary/__init__.py | eric-Rstats/pydictionary | 938ee3e1f345a43256eae9426927355f2a9889ac | [
"MIT"
] | 1 | 2019-11-25T12:16:51.000Z | 2019-11-25T12:16:51.000Z | dictionary/__init__.py | eric-Rstats/pydictionary | 938ee3e1f345a43256eae9426927355f2a9889ac | [
"MIT"
] | null | null | null | dictionary/__init__.py | eric-Rstats/pydictionary | 938ee3e1f345a43256eae9426927355f2a9889ac | [
"MIT"
] | 1 | 2021-04-22T11:46:48.000Z | 2021-04-22T11:46:48.000Z | from .cn_dict.baidu import BaiduChineseWordDictionary
| 27 | 53 | 0.888889 | 6 | 54 | 7.833333 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.074074 | 54 | 1 | 54 | 54 | 0.94 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
37e7da942368a357a6aa16bec5c53d8fb3c8e036 | 141 | py | Python | srblib.py | srbcheema1/srblib | 26146cb0d5586548da5f97a9fe3af355cd97f3ca | [
"MIT"
] | 2 | 2019-04-03T00:51:54.000Z | 2019-05-16T10:33:44.000Z | srblib.py | srbcheema1/srblib | 26146cb0d5586548da5f97a9fe3af355cd97f3ca | [
"MIT"
] | null | null | null | srblib.py | srbcheema1/srblib | 26146cb0d5586548da5f97a9fe3af355cd97f3ca | [
"MIT"
] | null | null | null | #!/usr/bin/env python3
# PYTHON_ARGCOMPLETE_OK
from srblib import *
if __name__ == "__main__":
from srblib.main import main
main()
| 15.666667 | 32 | 0.702128 | 19 | 141 | 4.684211 | 0.684211 | 0.224719 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.008772 | 0.191489 | 141 | 8 | 33 | 17.625 | 0.77193 | 0.304965 | 0 | 0 | 0 | 0 | 0.083333 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
37f0bbe562512166db41a4893310f0efbdda9806 | 85 | py | Python | telethon_tests/tl_test.py | Rezlazy/Telethon | 239ee16bc3c6e393d3777414fc504cacd7d70642 | [
"MIT"
] | 2 | 2018-03-31T05:43:23.000Z | 2018-07-11T12:35:50.000Z | telethon_tests/tl_test.py | Rezlazy/Telethon | 239ee16bc3c6e393d3777414fc504cacd7d70642 | [
"MIT"
] | 1 | 2018-03-20T21:15:47.000Z | 2018-03-20T21:15:47.000Z | telethon_tests/tl_test.py | Rezlazy/Telethon | 239ee16bc3c6e393d3777414fc504cacd7d70642 | [
"MIT"
] | 1 | 2019-05-10T21:53:16.000Z | 2019-05-10T21:53:16.000Z | import unittest
class TLTests(unittest.TestCase):
"""There are no tests yet"""
| 14.166667 | 33 | 0.705882 | 11 | 85 | 5.454545 | 0.909091 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.176471 | 85 | 5 | 34 | 17 | 0.857143 | 0.258824 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
530d26c48132100ff13c7aa51f35dcc07daae9ab | 445 | py | Python | ares/inference/__init__.py | jlashner/ares | 6df2b676ded6bd59082a531641cb1dadd475c8a8 | [
"MIT"
] | 10 | 2020-03-26T01:08:10.000Z | 2021-12-04T13:02:10.000Z | ares/inference/__init__.py | jlashner/ares | 6df2b676ded6bd59082a531641cb1dadd475c8a8 | [
"MIT"
] | 25 | 2020-06-08T14:52:28.000Z | 2022-03-08T02:30:54.000Z | ares/inference/__init__.py | jlashner/ares | 6df2b676ded6bd59082a531641cb1dadd475c8a8 | [
"MIT"
] | 8 | 2020-03-24T14:11:25.000Z | 2021-11-06T06:32:59.000Z | from ares.inference.ModelFit import ModelFit
from ares.inference.ModelGrid import ModelGrid
from ares.inference.ModelSample import ModelSample
from ares.inference.FitGlobal21cm import FitGlobal21cm
#from ares.inference.ModelEmulator import ModelEmulator
from ares.inference.CalibrateModel import CalibrateModel
#from ares.inference.OptimizeSpectrum import SpectrumOptimization
from ares.inference.FitGalaxyPopulation import FitGalaxyPopulation
| 44.5 | 66 | 0.885393 | 48 | 445 | 8.208333 | 0.270833 | 0.162437 | 0.345178 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.009709 | 0.074157 | 445 | 9 | 67 | 49.444444 | 0.946602 | 0.265169 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
531f46cb85f48446b9d6f6faad5814acc280bfde | 23 | py | Python | dziban/__init__.py | haldenl/dziban | c5afdadc751017ec94424e2b4f2e9deea8f665a5 | [
"BSD-3-Clause"
] | 19 | 2020-02-28T17:08:14.000Z | 2022-02-15T09:20:59.000Z | dziban/__init__.py | haldenl/dziban | c5afdadc751017ec94424e2b4f2e9deea8f665a5 | [
"BSD-3-Clause"
] | null | null | null | dziban/__init__.py | haldenl/dziban | c5afdadc751017ec94424e2b4f2e9deea8f665a5 | [
"BSD-3-Clause"
] | 2 | 2020-05-25T13:55:44.000Z | 2020-08-17T13:20:56.000Z | from .mkiv import Chart | 23 | 23 | 0.826087 | 4 | 23 | 4.75 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.130435 | 23 | 1 | 23 | 23 | 0.95 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
536456f0e2c9af0b7197952a21b87bb39e930a6a | 210 | py | Python | calculus.py | ianzim/calculadora | 421b2a23c5cf3ce7168df0f379f92f3e7462531d | [
"MIT"
] | null | null | null | calculus.py | ianzim/calculadora | 421b2a23c5cf3ce7168df0f379f92f3e7462531d | [
"MIT"
] | null | null | null | calculus.py | ianzim/calculadora | 421b2a23c5cf3ce7168df0f379f92f3e7462531d | [
"MIT"
] | null | null | null | import math
def integral():
a = float(input("Ponto inicial da integração: "))
b = float(input("Ponto final da integração: "))
func = str(input("Função a ser integrada: "))
def derivada():
pass | 23.333333 | 53 | 0.642857 | 28 | 210 | 4.821429 | 0.714286 | 0.148148 | 0.222222 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.214286 | 210 | 9 | 54 | 23.333333 | 0.818182 | 0 | 0 | 0 | 0 | 0 | 0.379147 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.285714 | false | 0.142857 | 0.142857 | 0 | 0.428571 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
72554c56a451825142c63f2399ddd2b2c916dca7 | 127 | py | Python | image-uploader/default_settings.py | 840599/flask-image-uploader-1 | b231809851e552900ec5af1226b62aeecf57bdb8 | [
"MIT"
] | 3 | 2019-04-20T09:01:48.000Z | 2020-05-19T08:19:02.000Z | image-uploader/default_settings.py | 840599/flask-image-uploader-1 | b231809851e552900ec5af1226b62aeecf57bdb8 | [
"MIT"
] | null | null | null | image-uploader/default_settings.py | 840599/flask-image-uploader-1 | b231809851e552900ec5af1226b62aeecf57bdb8 | [
"MIT"
] | 3 | 2018-04-27T12:07:42.000Z | 2021-01-14T01:32:23.000Z | # Example configuration
DEBUG = False
SECRET_KEY = 'X&\x0bx\x02\xac@\x03\x8f\x1e\xc6\xa4{\xe1\xfe}\xa2\x9a\x1c\xaf7\x1a\\\xa4'
| 31.75 | 88 | 0.716535 | 23 | 127 | 3.913043 | 0.956522 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135593 | 0.070866 | 127 | 3 | 89 | 42.333333 | 0.627119 | 0.165354 | 0 | 0 | 0 | 0.5 | 0.701923 | 0.701923 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | false | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
727d29c36c9821c7b414f175507446cf07ef3878 | 75 | py | Python | IntegrationTests/helpers/fixture_types/PathFixture.py | renovate-tests/Emcee | e0eaf8cbae7a4472aecf8b13b2dfb0c8fc4258f3 | [
"MIT"
] | null | null | null | IntegrationTests/helpers/fixture_types/PathFixture.py | renovate-tests/Emcee | e0eaf8cbae7a4472aecf8b13b2dfb0c8fc4258f3 | [
"MIT"
] | null | null | null | IntegrationTests/helpers/fixture_types/PathFixture.py | renovate-tests/Emcee | e0eaf8cbae7a4472aecf8b13b2dfb0c8fc4258f3 | [
"MIT"
] | null | null | null | class PathFixture():
def __init__(self, path):
self.path = path | 25 | 29 | 0.626667 | 9 | 75 | 4.777778 | 0.666667 | 0.372093 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.253333 | 75 | 3 | 30 | 25 | 0.767857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.333333 | false | 0 | 0 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
7284585ab58ac295aca6af7ec76c6fbdc2c5dfb0 | 124 | py | Python | drf/admin.py | Lovekesh-GH/Restapi | e66c057b67356545564f348f1d067e2eb5f89e66 | [
"MIT"
] | null | null | null | drf/admin.py | Lovekesh-GH/Restapi | e66c057b67356545564f348f1d067e2eb5f89e66 | [
"MIT"
] | null | null | null | drf/admin.py | Lovekesh-GH/Restapi | e66c057b67356545564f348f1d067e2eb5f89e66 | [
"MIT"
] | null | null | null | from django.contrib import admin
from drf.models import Students
# Register your models here.
admin.site.register(Students) | 24.8 | 32 | 0.822581 | 18 | 124 | 5.666667 | 0.666667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.112903 | 124 | 5 | 33 | 24.8 | 0.927273 | 0.209677 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
72b65c0c15773c661b1181bfbde40fdb31dd5355 | 312 | py | Python | src/stonktastic/config/__init__.py | aevear/Stonktastic | 69a7b33c29492c8d76f5bec892eefb6606c2eaab | [
"MIT"
] | 1 | 2021-01-20T02:00:08.000Z | 2021-01-20T02:00:08.000Z | src/stonktastic/config/__init__.py | KKR959/Stonktastic | bd7a5f43fb899368886d86ffe4a5e37b0cd7ad4d | [
"MIT"
] | null | null | null | src/stonktastic/config/__init__.py | KKR959/Stonktastic | bd7a5f43fb899368886d86ffe4a5e37b0cd7ad4d | [
"MIT"
] | 1 | 2021-01-18T23:18:50.000Z | 2021-01-18T23:18:50.000Z | """
There are two areas of Config that need to be imported
- Config : imports the variables saved in *config.csv* into the code.
- Paths : Using *pathlib* we generate the pathing needed for reference to the database, models and other files
"""
import stonktastic.config.config
import stonktastic.config.paths
| 28.363636 | 110 | 0.769231 | 47 | 312 | 5.106383 | 0.744681 | 0.141667 | 0.191667 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.163462 | 312 | 10 | 111 | 31.2 | 0.91954 | 0.759615 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
72d7475d0c721bbd9825ee21596201d7c16b99db | 189 | py | Python | homeinventory/inventory/admin.py | le4ndro/homeinventory | 3b5fedbbc86eaf1eac8c8475fd4f2649a3815a68 | [
"MIT"
] | null | null | null | homeinventory/inventory/admin.py | le4ndro/homeinventory | 3b5fedbbc86eaf1eac8c8475fd4f2649a3815a68 | [
"MIT"
] | 1 | 2022-01-13T00:49:07.000Z | 2022-01-13T00:49:07.000Z | homeinventory/inventory/admin.py | le4ndro/homeinventory | 3b5fedbbc86eaf1eac8c8475fd4f2649a3815a68 | [
"MIT"
] | 1 | 2017-10-19T11:49:37.000Z | 2017-10-19T11:49:37.000Z | from django.contrib import admin
from homeinventory.inventory.models import Category, Location, Item
admin.site.register(Category)
admin.site.register(Location)
admin.site.register(Item)
| 23.625 | 67 | 0.830688 | 25 | 189 | 6.28 | 0.52 | 0.171975 | 0.324841 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.079365 | 189 | 7 | 68 | 27 | 0.902299 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.4 | 0 | 0.4 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
f454e245ba4d81a1357e40d8dc8a37d616f59935 | 106 | py | Python | src/unittest/python/daemon_tests.py | KamdenChew/GitUp | f28b8723bf4200c4b46abd1384031100d000d79c | [
"MIT"
] | null | null | null | src/unittest/python/daemon_tests.py | KamdenChew/GitUp | f28b8723bf4200c4b46abd1384031100d000d79c | [
"MIT"
] | null | null | null | src/unittest/python/daemon_tests.py | KamdenChew/GitUp | f28b8723bf4200c4b46abd1384031100d000d79c | [
"MIT"
] | null | null | null | #Put daemon tests in here using Python's unittest module: https://docs.python.org/3/library/unittest.html
| 53 | 105 | 0.792453 | 18 | 106 | 4.666667 | 0.888889 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.010417 | 0.09434 | 106 | 1 | 106 | 106 | 0.864583 | 0.981132 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
f46fe6cd67296708f4be8acf633e85d163f45ab3 | 2,021 | py | Python | firmwire/vendor/mtk/hw/various.py | j4s0n/FirmWire | d3a20e2429cb4827f538d1a16163afde8b45826b | [
"BSD-3-Clause"
] | null | null | null | firmwire/vendor/mtk/hw/various.py | j4s0n/FirmWire | d3a20e2429cb4827f538d1a16163afde8b45826b | [
"BSD-3-Clause"
] | null | null | null | firmwire/vendor/mtk/hw/various.py | j4s0n/FirmWire | d3a20e2429cb4827f538d1a16163afde8b45826b | [
"BSD-3-Clause"
] | null | null | null | ## Copyright (c) 2022, Team FirmWire
## SPDX-License-Identifier: BSD-3-Clause
from . import PassthroughPeripheral
class CDMM_Periph(PassthroughPeripheral):
def __init__(self, name, address, size, **kwargs):
super().__init__(name, address, size, **kwargs)
class TOPSM_Periph(PassthroughPeripheral):
def __init__(self, name, address, size, **kwargs):
super().__init__(name, address, size, **kwargs)
def hw_read(self, offset, size):
if offset == 0x590:
# SM_PLL_STA
# TODO: boot hack, set all the bits?
return 0xFFFFFFFF
else:
return super().hw_read(offset, size)
class MODEML1_TOPSM_Periph(PassthroughPeripheral):
def __init__(self, name, address, size, **kwargs):
super().__init__(name, address, size, **kwargs)
def hw_read(self, offset, size):
if offset == 0xD4:
# probably some PWR_STA
# TODO: for now, just set all the bits
self.log.info(f"{self.name}: read PWR_STA")
return 0xFFFFFFFF
else:
return super().hw_read(offset, size)
class MDPERISYS_MISC_Periph(PassthroughPeripheral):
def __init__(self, name, address, size, **kwargs):
super().__init__(name, address, size, **kwargs)
# AP2MD_DUMMY (is AP blocked from MD?)
self.mem[0x300] = 1
class TDMABase_Periph(PassthroughPeripheral):
def __init__(self, name, address, size, **kwargs):
super().__init__(name, address, size, **kwargs)
self.timerHack = 0
def hw_read(self, offset, size):
if offset == 0x0:
# TQ_CURRENT_COUNT (TDMA timer)
# TODO
self.timerHack = self.timerHack + 1
return self.timerHack
else:
return super().hw_read(offset, size)
# dummy reads/writes to act as a barrier
class MCUSync_Periph(PassthroughPeripheral):
def __init__(self, name, address, size, **kwargs):
super().__init__(name, address, size, **kwargs)
| 31.092308 | 55 | 0.626423 | 237 | 2,021 | 5.050633 | 0.350211 | 0.110276 | 0.150376 | 0.210526 | 0.644946 | 0.644946 | 0.644946 | 0.619048 | 0.59315 | 0.59315 | 0 | 0.016043 | 0.259772 | 2,021 | 64 | 56 | 31.578125 | 0.784091 | 0.142009 | 0 | 0.605263 | 0 | 0 | 0.014526 | 0 | 0 | 0 | 0.021499 | 0.015625 | 0 | 1 | 0.236842 | false | 0.184211 | 0.026316 | 0 | 0.578947 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 1 | 0 | 0 | 5 |
be32ec3f45df7a640a430d5be76cf6c565ef59fb | 174 | py | Python | web/apps/inventories/admin.py | trantinan2512/Francis | f5f7cd3c5af6efd36d6c25c0c516dbf286195f11 | [
"MIT"
] | null | null | null | web/apps/inventories/admin.py | trantinan2512/Francis | f5f7cd3c5af6efd36d6c25c0c516dbf286195f11 | [
"MIT"
] | 2 | 2020-02-11T23:06:52.000Z | 2020-06-05T18:46:58.000Z | web/apps/inventories/admin.py | trantinan2512/francis-discord-bot | f5f7cd3c5af6efd36d6c25c0c516dbf286195f11 | [
"MIT"
] | 1 | 2019-06-12T21:33:20.000Z | 2019-06-12T21:33:20.000Z | from django.contrib import admin
from .models import Inventory, InventoryItem
# Register your models here.
admin.site.register(Inventory)
admin.site.register(InventoryItem)
| 24.857143 | 44 | 0.827586 | 22 | 174 | 6.545455 | 0.545455 | 0.125 | 0.236111 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.097701 | 174 | 6 | 45 | 29 | 0.917197 | 0.149425 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 0 | 0 | null | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
be4ea8478514f6cce271ccc3dd17ac2410581f60 | 462 | py | Python | setup.py | RFogarty1/sim_xps_spectra | 26933a8b00678494121507e66205cf4c02d9b357 | [
"MIT"
] | null | null | null | setup.py | RFogarty1/sim_xps_spectra | 26933a8b00678494121507e66205cf4c02d9b357 | [
"MIT"
] | null | null | null | setup.py | RFogarty1/sim_xps_spectra | 26933a8b00678494121507e66205cf4c02d9b357 | [
"MIT"
] | null | null | null | from distutils.core import setup
setup(name='sim_xps_spectra',
version='1.0',
author='Richard Fogarty',
author_email = 'richard.m.fogarty@gmail.com',
packages = ['sim_xps_spectra','sim_xps_spectra.gen_spectra','sim_xps_spectra.broad_functs', 'sim_xps_spectra.plotters',
'sim_xps_spectra.mol_spectra', 'sim_xps_spectra.x_sections', 'sim_xps_spectra.shared', 'sim_xps_spectra.parsers',
'sim_xps_spectra.interfaces']
)
| 38.5 | 128 | 0.722944 | 63 | 462 | 4.904762 | 0.47619 | 0.194175 | 0.420712 | 0.194175 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.005063 | 0.145022 | 462 | 11 | 129 | 42 | 0.777215 | 0 | 0 | 0 | 0 | 0 | 0.603037 | 0.498915 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.111111 | 0 | 0.111111 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
be6e7a9c566343679ec22c9b05633c2f0f243934 | 45 | py | Python | data_extraction/__init__.py | norberte/Statistical-Consulting | 58cf9c0b06d07221afaf5005c8ca3fddf91f4a5e | [
"MIT"
] | null | null | null | data_extraction/__init__.py | norberte/Statistical-Consulting | 58cf9c0b06d07221afaf5005c8ca3fddf91f4a5e | [
"MIT"
] | null | null | null | data_extraction/__init__.py | norberte/Statistical-Consulting | 58cf9c0b06d07221afaf5005c8ca3fddf91f4a5e | [
"MIT"
] | 1 | 2021-05-01T10:30:13.000Z | 2021-05-01T10:30:13.000Z | from map import Map
from parser import Parser | 22.5 | 25 | 0.844444 | 8 | 45 | 4.75 | 0.5 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.155556 | 45 | 2 | 25 | 22.5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
be717ec1bab19d1b56ed3d9355490085c17286a2 | 71 | py | Python | get_geo-example.py | jamesacampbell/python-examples | 03b8c0ec33bd0a6ef08b6d7469874e6e92112a0a | [
"MIT"
] | 39 | 2016-01-28T18:46:08.000Z | 2021-03-29T21:54:37.000Z | get_geo-example.py | jamesacampbell/python-examples | 03b8c0ec33bd0a6ef08b6d7469874e6e92112a0a | [
"MIT"
] | 1 | 2019-06-19T20:23:36.000Z | 2019-07-03T14:07:57.000Z | get_geo-example.py | jamesacampbell/python-examples | 03b8c0ec33bd0a6ef08b6d7469874e6e92112a0a | [
"MIT"
] | 25 | 2016-01-28T18:46:30.000Z | 2021-07-02T15:02:58.000Z | """Google maps to shodan ."""
print("google api killed this example.")
| 23.666667 | 40 | 0.690141 | 10 | 71 | 4.9 | 0.9 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.140845 | 71 | 2 | 41 | 35.5 | 0.803279 | 0.323944 | 0 | 0 | 0 | 0 | 0.738095 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 5 |
be7e37ac94b81d7b9185fc8a2ad49e146294e16e | 167 | py | Python | tests/web_platform/CSS2/linebox/test_vertical_align_applies_to.py | jonboland/colosseum | cbf974be54fd7f6fddbe7285704cfaf7a866c5c5 | [
"BSD-3-Clause"
] | 71 | 2015-04-13T09:44:14.000Z | 2019-03-24T01:03:02.000Z | tests/web_platform/CSS2/linebox/test_vertical_align_applies_to.py | jonboland/colosseum | cbf974be54fd7f6fddbe7285704cfaf7a866c5c5 | [
"BSD-3-Clause"
] | 35 | 2019-05-06T15:26:09.000Z | 2022-03-28T06:30:33.000Z | tests/web_platform/CSS2/linebox/test_vertical_align_applies_to.py | jonboland/colosseum | cbf974be54fd7f6fddbe7285704cfaf7a866c5c5 | [
"BSD-3-Clause"
] | 139 | 2015-05-30T18:37:43.000Z | 2019-03-27T17:14:05.000Z | from tests.utils import W3CTestCase
class TestVerticalAlignAppliesTo(W3CTestCase):
vars().update(W3CTestCase.find_tests(__file__, 'vertical-align-applies-to-'))
| 27.833333 | 81 | 0.802395 | 18 | 167 | 7.166667 | 0.833333 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.019608 | 0.083832 | 167 | 5 | 82 | 33.4 | 0.823529 | 0 | 0 | 0 | 0 | 0 | 0.155689 | 0.155689 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
be7f4b6da3107da4f244dc7258a248c70a4aff5a | 280 | py | Python | categories/models.py | Rat-Shop/RatShop | e3878584fe8cd865bd00a36b0b039e543aaf85aa | [
"MIT"
] | null | null | null | categories/models.py | Rat-Shop/RatShop | e3878584fe8cd865bd00a36b0b039e543aaf85aa | [
"MIT"
] | null | null | null | categories/models.py | Rat-Shop/RatShop | e3878584fe8cd865bd00a36b0b039e543aaf85aa | [
"MIT"
] | null | null | null | from django.db import models
# Create your models here.
class ShopCategory(models.Model):
name = models.CharField(max_length=32)
description = models.CharField(max_length=255)
image = models.CharField(max_length=255)
def __str__(self):
return self.name
| 23.333333 | 50 | 0.725 | 37 | 280 | 5.297297 | 0.621622 | 0.229592 | 0.27551 | 0.367347 | 0.27551 | 0 | 0 | 0 | 0 | 0 | 0 | 0.034935 | 0.182143 | 280 | 11 | 51 | 25.454545 | 0.820961 | 0.085714 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0.142857 | 1 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
be8b1d3a97ab219941d822c8475de3c2ba073b18 | 44 | py | Python | python/file-tests/student-submission-tests.py | CodeSteak/write-your-python-program | 13a10f1e8f4fe23c66a8762854c8bb12f0fb9ff4 | [
"BSD-3-Clause"
] | 1 | 2021-09-30T10:17:57.000Z | 2021-09-30T10:17:57.000Z | python/file-tests/student-submission-tests.py | CodeSteak/write-your-python-program | 13a10f1e8f4fe23c66a8762854c8bb12f0fb9ff4 | [
"BSD-3-Clause"
] | 47 | 2020-11-16T14:02:52.000Z | 2022-03-18T12:44:38.000Z | python/file-tests/student-submission-tests.py | CodeSteak/write-your-python-program | 13a10f1e8f4fe23c66a8762854c8bb12f0fb9ff4 | [
"BSD-3-Clause"
] | 4 | 2020-10-28T13:54:44.000Z | 2022-01-20T17:36:24.000Z | from wypp import *
check(incByOne(41), 42)
| 11 | 23 | 0.704545 | 7 | 44 | 4.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.108108 | 0.159091 | 44 | 3 | 24 | 14.666667 | 0.72973 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.5 | 0 | 0.5 | 0 | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
beb3632a474a44426f6d9be18e29785f62bb775c | 164 | py | Python | peframe/__init__.py | ki1556ki/MJUOpenSource | 4087db825bbc7c460f8275428703e5c7066a84ae | [
"MIT"
] | null | null | null | peframe/__init__.py | ki1556ki/MJUOpenSource | 4087db825bbc7c460f8275428703e5c7066a84ae | [
"MIT"
] | null | null | null | peframe/__init__.py | ki1556ki/MJUOpenSource | 4087db825bbc7c460f8275428703e5c7066a84ae | [
"MIT"
] | 1 | 2020-07-14T03:39:06.000Z | 2020-07-14T03:39:06.000Z | import os
# 루트를 파일의 절대경로로 설정
_ROOT = os.path.abspath(os.path.dirname(__file__))
# 경로의 데어터 반환
def get_data(path):
return os.path.join(_ROOT, 'signatures', path)
| 23.428571 | 50 | 0.719512 | 28 | 164 | 3.964286 | 0.714286 | 0.162162 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.146341 | 164 | 6 | 51 | 27.333333 | 0.792857 | 0.164634 | 0 | 0 | 0 | 0 | 0.074627 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0 | 0.25 | 0.25 | 0.75 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 5 |
fe45b30d426330d7f914fce8309ef978da22001b | 119 | py | Python | src/embeddingdb/__init__.py | cthoyt/embeddingdb | e6c67e92e540c4315045a0b4de5b31490331c177 | [
"MIT"
] | 2 | 2019-12-19T05:56:09.000Z | 2021-08-07T16:35:14.000Z | src/embeddingdb/__init__.py | cthoyt/embeddingdb | e6c67e92e540c4315045a0b4de5b31490331c177 | [
"MIT"
] | null | null | null | src/embeddingdb/__init__.py | cthoyt/embeddingdb | e6c67e92e540c4315045a0b4de5b31490331c177 | [
"MIT"
] | 1 | 2021-08-07T16:35:18.000Z | 2021-08-07T16:35:18.000Z | # -*- coding: utf-8 -*-
"""A package for storing and querying entity embeddings."""
from .version import get_version
| 19.833333 | 59 | 0.697479 | 16 | 119 | 5.125 | 0.9375 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.01 | 0.159664 | 119 | 5 | 60 | 23.8 | 0.81 | 0.638655 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
fe489431ca8664d5507ab0da53957a10aaee2b2d | 61 | py | Python | enthought/traits/ui/editors/tabular_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 3 | 2016-12-09T06:05:18.000Z | 2018-03-01T13:00:29.000Z | enthought/traits/ui/editors/tabular_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | 1 | 2020-12-02T00:51:32.000Z | 2020-12-02T08:48:55.000Z | enthought/traits/ui/editors/tabular_editor.py | enthought/etsproxy | 4aafd628611ebf7fe8311c9d1a0abcf7f7bb5347 | [
"BSD-3-Clause"
] | null | null | null | # proxy module
from traitsui.editors.tabular_editor import *
| 20.333333 | 45 | 0.819672 | 8 | 61 | 6.125 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114754 | 61 | 2 | 46 | 30.5 | 0.907407 | 0.196721 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
fe6e65cbd2b3fda395f62b502ec5b63cebd9bc07 | 40 | py | Python | tests/components/picnic/__init__.py | MrDelik/core | 93a66cc357b226389967668441000498a10453bb | [
"Apache-2.0"
] | 30,023 | 2016-04-13T10:17:53.000Z | 2020-03-02T12:56:31.000Z | tests/components/picnic/__init__.py | jagadeeshvenkatesh/core | 1bd982668449815fee2105478569f8e4b5670add | [
"Apache-2.0"
] | 31,101 | 2020-03-02T13:00:16.000Z | 2022-03-31T23:57:36.000Z | tests/components/picnic/__init__.py | jagadeeshvenkatesh/core | 1bd982668449815fee2105478569f8e4b5670add | [
"Apache-2.0"
] | 11,956 | 2016-04-13T18:42:31.000Z | 2020-03-02T09:32:12.000Z | """Tests for the Picnic integration."""
| 20 | 39 | 0.7 | 5 | 40 | 5.6 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.125 | 40 | 1 | 40 | 40 | 0.8 | 0.825 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
fe85ddb855ce8ca02a339a97a93f6d38d544bd99 | 105 | py | Python | tikibar/__init__.py | eventbrite/tikibar | cc1675ee500eb7fca80bd68bdddacf8301b5e154 | [
"Apache-2.0"
] | 13 | 2017-06-22T20:59:56.000Z | 2022-01-09T17:50:06.000Z | tikibar/__init__.py | eventbrite/tikibar | cc1675ee500eb7fca80bd68bdddacf8301b5e154 | [
"Apache-2.0"
] | 5 | 2017-06-22T20:01:49.000Z | 2017-11-28T20:45:34.000Z | tikibar/__init__.py | eventbrite/tikibar | cc1675ee500eb7fca80bd68bdddacf8301b5e154 | [
"Apache-2.0"
] | 6 | 2017-06-23T17:39:34.000Z | 2021-09-08T11:21:14.000Z | from __future__ import absolute_import
from tikibar.version import __version__, __version_info__ # noqa
| 35 | 65 | 0.857143 | 13 | 105 | 5.846154 | 0.615385 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.114286 | 105 | 2 | 66 | 52.5 | 0.817204 | 0.038095 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
fea1239f547845804437ee72fd2c9bc71978851c | 3,489 | py | Python | app/tasks/docker/network.py | Clivern/Kraven | 5d8d2de26e170d853d7d5f2b1f2d453ab07e4401 | [
"Apache-2.0"
] | 3 | 2018-07-22T22:36:09.000Z | 2019-05-31T10:29:54.000Z | app/tasks/docker/network.py | Clivern/Kraven | 5d8d2de26e170d853d7d5f2b1f2d453ab07e4401 | [
"Apache-2.0"
] | 41 | 2018-07-22T22:07:52.000Z | 2018-11-14T11:07:48.000Z | app/tasks/docker/network.py | Clivern/Kraven | 5d8d2de26e170d853d7d5f2b1f2d453ab07e4401 | [
"Apache-2.0"
] | 1 | 2020-04-24T12:55:27.000Z | 2020-04-24T12:55:27.000Z | """
Docker Network Tasks
"""
# Third party
from celery import shared_task
# Django
from django.utils.translation import gettext as _
# Local Django
from app.modules.service.docker.network import Network as Network_Module
@shared_task
def create_network(host_id):
try:
_network = Network_Module()
if not _network.set_host(host_id).check_health():
return {
"status": "failed",
"result": {
"error": _("Error, Unable to connect to docker host!")
},
"notify_type": "failed"
}
except Exception as e:
return {
"status": "error",
"result": {
"error": str(e)
},
"notify_type": "error"
}
@shared_task
def remove_network_by_id(host_id, network_id):
try:
_network = Network_Module()
if not _network.set_host(host_id).check_health():
return {
"status": "failed",
"result": {
"error": _("Error, Unable to connect to docker host!")
},
"notify_type": "failed"
}
except Exception as e:
return {
"status": "error",
"result": {
"error": str(e)
},
"notify_type": "error"
}
@shared_task
def connect_network_container(host_id):
try:
_network = Network_Module()
if not _network.set_host(host_id).check_health():
return {
"status": "failed",
"result": {
"error": _("Error, Unable to connect to docker host!")
},
"notify_type": "failed"
}
except Exception as e:
return {
"status": "error",
"result": {
"error": str(e)
},
"notify_type": "error"
}
@shared_task
def disconnect_network_container(host_id):
try:
_network = Network_Module()
if not _network.set_host(host_id).check_health():
return {
"status": "failed",
"result": {
"error": _("Error, Unable to connect to docker host!")
},
"notify_type": "failed"
}
except Exception as e:
return {
"status": "error",
"result": {
"error": str(e)
},
"notify_type": "error"
}


@shared_task
def prune_unused_networks(host_id):
    try:
        _network = Network_Module()
        if not _network.set_host(host_id).check_health():
            return {
                "status": "failed",
                "result": {
                    "error": _("Error, Unable to connect to docker host!")
                },
                "notify_type": "failed"
            }
        result = _network.prune()
        if result:
            return {
                "status": "passed",
                "result": "{}",
                "notify_type": "passed"
            }
        else:
            return {
                "status": "failed",
                "result": "{}",
                "notify_type": "failed"
            }
    except Exception as e:
        return {
            "status": "error",
            "result": {
                "error": str(e)
            },
            "notify_type": "error"
        }


import pandas as pd
import numpy as np
np.random.seed(99)
from sklearn.model_selection import train_test_split
from sklearn.model_selection import KFold
from sklearn.model_selection import GridSearchCV
from sklearn.multioutput import MultiOutputClassifier, MultiOutputRegressor
from sklearn.multiclass import OneVsRestClassifier
import xgboost as xgb
from xgboost.sklearn import XGBClassifier
from xgboost.sklearn import XGBRegressor
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import LabelEncoder
import lightgbm as lgbm
from sklearn.model_selection import KFold, cross_val_score,StratifiedKFold
import seaborn as sns
from sklearn.preprocessing import OneHotEncoder, LabelEncoder, label_binarize
import csv
import re
from xgboost import XGBRegressor, XGBClassifier
from sklearn.metrics import mean_squared_log_error, mean_squared_error,balanced_accuracy_score
from scipy import stats
from sklearn.model_selection import RandomizedSearchCV
import scipy as sp
import time
import copy
from sklearn.preprocessing import StandardScaler, MinMaxScaler
from collections import Counter, defaultdict
import pdb
from tqdm.notebook import tqdm
from pathlib import Path
#sklearn data_preprocessing
from sklearn.preprocessing import StandardScaler, MinMaxScaler
#sklearn categorical encoding
import category_encoders as ce
#sklearn modelling
from sklearn.model_selection import KFold
from collections import Counter, defaultdict
from sklearn.base import BaseEstimator, ClassifierMixin, TransformerMixin
# boosting library
import xgboost as xgb
import matplotlib.pyplot as plt
import warnings
warnings.filterwarnings("ignore")
import copy
#################################################################################
#### Regression or Classification type problem
def analyze_problem_type(train, target, verbose=0):
    target = copy.deepcopy(target)
    train = copy.deepcopy(train)
    if isinstance(train, pd.Series):
        train = pd.DataFrame(train)
    ### the number of categories cannot be more than 2% of train size ####
    ### this determines the number of categories above which an integer target becomes a regression problem ##
    cat_limit = int(train.shape[0] * 0.02)
    cat_limit = min(cat_limit, 100)  ## anything over 100 categories is a regression problem ##
    cat_limit = max(cat_limit, 10)   ## but use a floor of at least 10 categories
    float_limit = 15  ### number of categories above which a float target becomes a Regression problem
    if isinstance(target, str):
        target = [target]
    if len(target) == 1:
        targ = target[0]
        model_label = 'Single_Label'
    else:
        targ = target[0]
        model_label = 'Multi_Label'
    #### This is where you detect what kind of problem it is #################
    if train[targ].dtype in ['int64', 'int32', 'int16', 'int8']:
        if len(train[targ].unique()) <= 2:
            model_class = 'Binary_Classification'
        elif len(train[targ].unique()) <= cat_limit:
            model_class = 'Multi_Classification'
        else:
            model_class = 'Regression'
    elif train[targ].dtype in ['float16', 'float32', 'float64', 'float']:
        if len(train[targ].unique()) <= 2:
            model_class = 'Binary_Classification'
        elif len(train[targ].unique()) <= float_limit:
            model_class = 'Multi_Classification'
        else:
            model_class = 'Regression'
    else:
        if len(train[targ].unique()) <= 2:
            model_class = 'Binary_Classification'
        else:
            model_class = 'Multi_Classification'
    ########### print this for the start of next step ###########
    if verbose <= 1:
        print('''#### %s %s Feature Selection Started ####''' % (
            model_label, model_class))
    return model_class
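A quick numeric check of the category-limit heuristic above. This is an illustrative sketch only: the helper name `cat_limit_for` and the row counts are made up, while the 2% / 100-cap / 10-floor values come straight from `analyze_problem_type`.

```python
def cat_limit_for(n_rows):
    cat_limit = int(n_rows * 0.02)   # 2% of train size
    cat_limit = min(cat_limit, 100)  # cap at 100 categories
    cat_limit = max(cat_limit, 10)   # floor of 10 categories
    return cat_limit

# 1,000 rows -> 20; 100 rows -> the floor kicks in (10); 1M rows -> the cap kicks in (100)
limits = [cat_limit_for(n) for n in (1_000, 100, 1_000_000)]
```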
#####################################################################################
from sklearn.base import TransformerMixin, BaseEstimator
from collections import defaultdict
class My_LabelEncoder(BaseEstimator, TransformerMixin):
    """
    ################################################################################################
    ######     The My_LabelEncoder class works just like sklearn's LabelEncoder but better!  #######
    ##### It label encodes any object or category dtype in your dataset. It also handles NaN's.####
    ##  The beauty of this function is that it takes care of encoding unknown (future) values. #####
    ##################### This is the BEST working version - don't mess with it!! ##################
    ################################################################################################
    Usage:
        le = My_LabelEncoder()
        le.fit_transform(train[column])  ## this will give your transformed values as an array
        le.transform(test[column])       ## this will give your transformed values as an array

    Usage in Column Transformers and Pipelines:
        No. It cannot be used in pipelines since it needs to produce two columns for the next stage in the pipeline.
        See my other module called My_LabelEncoder_Pipe() to see how it can be used in Pipelines.
    """
    def __init__(self):
        self.transformer = defaultdict(str)
        self.inverse_transformer = defaultdict(str)
        self.max_val = 0

    def fit(self, testx, y=None):
        ## testx must be a pd.Series for this encoder to work!
        if isinstance(testx, pd.Series):
            pass
        elif isinstance(testx, np.ndarray):
            testx = pd.Series(testx)
        else:
            #### There is no way to transform dataframes since you will get a nested renamer error if you try ###
            ### But if it is a one-dimensional dataframe, convert it into a Series
            #### Do not change this since I have tested it and it works.
            if testx.shape[1] == 1:
                testx = pd.Series(testx.values.ravel(), name=testx.columns[0])
            else:
                #### Since it is multi-dimensional, just return the object as is
                return self
        ins = np.unique(testx.factorize()[1]).tolist()
        outs = np.unique(testx.factorize()[0]).tolist()
        #ins = testx.value_counts(dropna=False).index
        if -1 in outs:
            # a -1 code means the data contains NaN; make sure NaN is in ins as well
            if np.nan not in ins:
                ins.insert(0, np.nan)
        self.transformer = dict(zip(ins, outs))
        self.inverse_transformer = dict(zip(outs, ins))
        return self

    def transform(self, testx, y=None):
        ## testx must be a pd.Series for this encoder to work!
        if isinstance(testx, pd.Series):
            pass
        elif isinstance(testx, np.ndarray):
            testx = pd.Series(testx)
        else:
            #### There is no way to transform dataframes since you will get a nested renamer error if you try ###
            ### But if it is a one-dimensional dataframe, convert it into a Series
            #### Do not change this since I have tested it and it works.
            if testx.shape[1] == 1:
                testx = pd.Series(testx.values.ravel(), name=testx.columns[0])
            else:
                #### Since it is multi-dimensional, just return the data as is
                #### Do not change this since I have tested it and it works.
                return testx
        ### now convert the input to transformer dictionary values
        new_ins = np.unique(testx.factorize()[1]).tolist()
        missing = [x for x in new_ins if x not in self.transformer.keys()]
        if len(missing) > 0:
            for each_missing in missing:
                self.transformer[each_missing] = int(self.max_val + 1)
                self.inverse_transformer[int(self.max_val + 1)] = each_missing
                self.max_val = int(self.max_val + 1)
        else:
            self.max_val = np.max(list(self.transformer.values()))
        ### To handle category dtype you must do the next step #####
        #### Do not change this since I have tested it and it works.
        testk = testx.map(self.transformer)
        if testx.dtype not in [np.int16, np.int32, np.int64, float, bool, object]:
            if testx.isnull().sum().sum() > 0:
                fillval = self.transformer[np.nan]
                testk = testk.cat.add_categories([fillval])
                testk = testk.fillna(fillval)
                testk = testk.astype(int)
                return testk
            else:
                testk = testk.astype(int)
                return testk
        else:
            outs = testx.map(self.transformer).values.astype(int)
            return outs

    def inverse_transform(self, testx, y=None):
        ### now convert the input back using the inverse transformer dictionary
        if isinstance(testx, pd.Series):
            outs = testx.map(self.inverse_transformer).values
        elif isinstance(testx, np.ndarray):
            outs = pd.Series(testx).map(self.inverse_transformer).values
        else:
            outs = testx[:]
        return outs
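A minimal, self-contained sketch of the `pd.Series.factorize` behavior the encoder above relies on (toy values): NaN is coded as -1 and excluded from the uniques, which is how `fit` detects missing values.

```python
import numpy as np
import pandas as pd

s = pd.Series(["b", "a", np.nan, "a"])
codes, uniques = s.factorize()  # NaN gets code -1 and is excluded from uniques
# codes -> [0, 1, -1, 1]; uniques -> ['b', 'a']
```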
#################################################################################
from sklearn.impute import SimpleImputer
def data_transform(X_train, Y_train, X_test="", Y_test="", modeltype='Classification',
                   multi_label=False, enc_method='label', scaler=StandardScaler()):
    ##### Use My_LabelEncoder to transform label targets if needed #####
    if multi_label:
        if modeltype != 'Regression':
            targets = Y_train.columns
            Y_train_encoded = copy.deepcopy(Y_train)
            for each_target in targets:
                if Y_train[each_target].dtype not in ['int64', 'int32', 'int16', 'int8',
                                                      'float16', 'float32', 'float64', 'float']:
                    mlb = My_LabelEncoder()
                    Y_train_encoded[each_target] = mlb.fit_transform(Y_train[each_target])
                    if not isinstance(Y_test, str):
                        Y_test_encoded = mlb.transform(Y_test)
                    else:
                        Y_test_encoded = copy.deepcopy(Y_test)
                else:
                    Y_train_encoded = copy.deepcopy(Y_train)
                    Y_test_encoded = copy.deepcopy(Y_test)
        else:
            Y_train_encoded = copy.deepcopy(Y_train)
            Y_test_encoded = copy.deepcopy(Y_test)
    else:
        if modeltype != 'Regression':
            if Y_train.dtype not in ['int64', 'int32', 'int16', 'int8',
                                     'float16', 'float32', 'float64', 'float']:
                mlb = My_LabelEncoder()
                Y_train_encoded = mlb.fit_transform(Y_train)
                if not isinstance(Y_test, str):
                    Y_test_encoded = mlb.transform(Y_test)
                else:
                    Y_test_encoded = copy.deepcopy(Y_test)
            else:
                Y_train_encoded = copy.deepcopy(Y_train)
                Y_test_encoded = copy.deepcopy(Y_test)
        else:
            Y_train_encoded = copy.deepcopy(Y_train)
            Y_test_encoded = copy.deepcopy(Y_test)
    #### This is where we find datetime vars and convert them into features ####
    datetime_feats = X_train.select_dtypes(include='datetime').columns.tolist()
    ### if there are datetime values, convert them into features here ###
    from .featurewiz import FE_create_time_series_features
    for date_col in datetime_feats:
        fillnum = X_train[date_col].mode()[0]
        X_train[date_col].fillna(fillnum, inplace=True)
        X_train, ts_adds = FE_create_time_series_features(X_train, date_col)
        if not isinstance(X_test, str):
            X_test[date_col].fillna(fillnum, inplace=True)
            X_test, _ = FE_create_time_series_features(X_test, date_col, ts_adds)
        print('    Adding time series features from %s to data...' % date_col)
    ####### Set up the features to encode ####################
    ##### First make sure that the originals are not modified ##########
    X_train_encoded = copy.deepcopy(X_train)
    X_test_encoded = copy.deepcopy(X_test)
    feature_to_encode = X_train.select_dtypes(include='object').columns.tolist() + \
                        X_train.select_dtypes(include='category').columns.tolist()
    #### Do label encoding now #################
    if enc_method == 'label':
        for feat in feature_to_encode:
            # Initialize the encoder model
            lbEncoder = My_LabelEncoder()
            fillnum = X_train[feat].mode()[0]
            X_train[feat].fillna(fillnum, inplace=True)
            # fit the train data
            lbEncoder.fit(X_train[feat])
            # transform training set
            X_train_encoded[feat] = lbEncoder.transform(X_train[feat])
            # transform test set
            if not isinstance(X_test_encoded, str):
                X_test[feat].fillna(fillnum, inplace=True)
                X_test_encoded[feat] = lbEncoder.transform(X_test[feat])
    elif enc_method == 'glmm':
        # Initialize the encoder model
        GLMMEncoder = ce.glmm.GLMMEncoder(verbose=0, binomial_target=False)
        # fit the train data
        GLMMEncoder.fit(X_train[feature_to_encode], Y_train_encoded)
        # transform training set ####
        X_train_encoded[feature_to_encode] = GLMMEncoder.transform(X_train[feature_to_encode])
        # transform test set
        if not isinstance(X_test_encoded, str):
            X_test_encoded[feature_to_encode] = GLMMEncoder.transform(X_test[feature_to_encode])
    else:
        print('No encoding transform performed')
    ### make sure there are no missing values ###
    try:
        imputer = SimpleImputer(strategy='constant', fill_value=0, verbose=0, add_indicator=True)
        imputer.fit_transform(X_train_encoded)
        if not isinstance(X_test_encoded, str):
            imputer.transform(X_test_encoded)
    except Exception:
        X_train_encoded = X_train_encoded.fillna(0)
        if not isinstance(X_test_encoded, str):
            X_test_encoded = X_test_encoded.fillna(0)
    # fit the scaler on the entire train set only, then transform both train and test
    scaler.fit(X_train_encoded)
    # transform training set
    X_train_scaled = pd.DataFrame(scaler.transform(X_train_encoded),
                                  columns=X_train_encoded.columns, index=X_train_encoded.index)
    # transform test set
    if not isinstance(X_test_encoded, str):
        X_test_scaled = pd.DataFrame(scaler.transform(X_test_encoded),
                                     columns=X_test_encoded.columns, index=X_test_encoded.index)
    else:
        X_test_scaled = ""
    return X_train_scaled, Y_train_encoded, X_test_scaled, Y_test_encoded
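The scaler pattern used in `data_transform` (fit on train only, then transform both splits), sketched with toy numbers; the column name and values are made up.

```python
import pandas as pd
from sklearn.preprocessing import StandardScaler

X_tr = pd.DataFrame({"x": [1.0, 2.0, 3.0]})
X_te = pd.DataFrame({"x": [4.0]})
sc = StandardScaler().fit(X_tr)  # statistics come from the train set only
X_te_scaled = pd.DataFrame(sc.transform(X_te), columns=X_te.columns, index=X_te.index)
# test value scaled with train mean (2) and train population std (sqrt(2/3)) ~ 2.449
```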
##################################################################################
from sklearn.model_selection import KFold, cross_val_score,StratifiedKFold
import seaborn as sns
from sklearn.preprocessing import OneHotEncoder, LabelEncoder, label_binarize
import csv
import re
from xgboost import XGBRegressor, XGBClassifier
from sklearn.metrics import mean_squared_log_error, mean_squared_error,balanced_accuracy_score
from scipy import stats
from sklearn.model_selection import RandomizedSearchCV
import scipy as sp
import time
##################################################################################
import lightgbm as lgbm
def lightgbm_model_fit(random_search_flag, x_train, y_train, x_test, y_test, modeltype,
                       multi_label, log_y, model=""):
    start_time = time.time()
    if multi_label:
        rand_params = {
        }
    else:
        rand_params = {
            'learning_rate': sp.stats.uniform(scale=1),
            'num_leaves': sp.stats.randint(20, 100),
            'n_estimators': sp.stats.randint(100, 500),
            "max_depth": sp.stats.randint(3, 15),
        }
    if modeltype == 'Regression':
        lgb = lgbm.LGBMRegressor()
        objective = 'regression'
        metric = 'rmse'
        is_unbalance = False
        class_weight = None
        score_name = 'Score'
    else:
        if modeltype == 'Binary_Classification':
            lgb = lgbm.LGBMClassifier()
            objective = 'binary'
            metric = 'auc'
            is_unbalance = True
            class_weight = None
            score_name = 'ROC AUC'
            num_class = 1
        else:
            lgb = lgbm.LGBMClassifier()
            objective = 'multiclass'
            #objective = 'multiclassova'
            metric = 'multi_logloss'
            is_unbalance = True
            class_weight = 'balanced'
            score_name = 'Multiclass Logloss'
            if multi_label:
                if isinstance(y_train, np.ndarray):
                    num_class = np.unique(y_train).max() + 1
                else:
                    num_class = y_train.nunique().max()
            else:
                if isinstance(y_train, np.ndarray):
                    num_class = np.unique(y_train).max() + 1
                else:
                    num_class = y_train.nunique()
    early_stopping_params = {"early_stopping_rounds": 10,
                             "eval_metric": metric,
                             "eval_set": [[x_test, y_test]],
                             }
    if modeltype == 'Regression':
        ## there is no num_class in regression for the LGBM model ##
        lgbm_params = {'learning_rate': 0.001,
                       'objective': objective,
                       'metric': metric,
                       'boosting_type': 'gbdt',
                       'max_depth': 8,
                       'subsample': 0.2,
                       'colsample_bytree': 0.3,
                       'reg_alpha': 0.54,
                       'reg_lambda': 0.4,
                       'min_split_gain': 0.7,
                       'min_child_weight': 26,
                       'num_leaves': 32,
                       'save_binary': True,
                       'seed': 1337, 'feature_fraction_seed': 1337,
                       'bagging_seed': 1337, 'drop_seed': 1337,
                       'data_random_seed': 1337,
                       'verbose': -1,
                       'n_estimators': 400,
                       }
    else:
        lgbm_params = {'learning_rate': 0.001,
                       'objective': objective,
                       'metric': metric,
                       'boosting_type': 'gbdt',
                       'max_depth': 8,
                       'subsample': 0.2,
                       'colsample_bytree': 0.3,
                       'reg_alpha': 0.54,
                       'reg_lambda': 0.4,
                       'min_split_gain': 0.7,
                       'min_child_weight': 26,
                       'num_leaves': 32,
                       'save_binary': True,
                       'seed': 1337, 'feature_fraction_seed': 1337,
                       'bagging_seed': 1337, 'drop_seed': 1337,
                       'data_random_seed': 1337,
                       'verbose': -1,
                       'num_class': num_class,
                       'is_unbalance': is_unbalance,
                       'class_weight': class_weight,
                       'n_estimators': 400,
                       }
    lgb.set_params(**lgbm_params)
    if multi_label:
        if modeltype == 'Regression':
            lgb = MultiOutputRegressor(lgb)
        else:
            lgb = MultiOutputClassifier(lgb)
    ######## Now let's perform randomized search to find the best hyper parameters ######
    if random_search_flag:
        if modeltype == 'Regression':
            scoring = 'neg_mean_squared_error'
        else:
            scoring = 'precision'
        model = RandomizedSearchCV(lgb,
                                   param_distributions=rand_params,
                                   n_iter=10,
                                   return_train_score=True,
                                   random_state=99,
                                   n_jobs=-1,
                                   cv=3,
                                   refit=True,
                                   scoring=scoring,
                                   verbose=False)
        ##### This is where we search for hyper params for the model #######
        if multi_label:
            model.fit(x_train, y_train)
        else:
            model.fit(x_train, y_train, **early_stopping_params)
        print('Time taken for Hyper Param tuning of LGBM (in minutes) = %0.1f' % (
            (time.time() - start_time) / 60))
        cv_results = pd.DataFrame(model.cv_results_)
        if modeltype == 'Regression':
            print('Mean cross-validated train %s = %0.04f' % (score_name, np.sqrt(abs(cv_results['mean_train_score'].mean()))))
            print('Mean cross-validated test %s = %0.04f' % (score_name, np.sqrt(abs(cv_results['mean_test_score'].mean()))))
        else:
            print('Mean cross-validated train %s = %0.04f' % (score_name, cv_results['mean_train_score'].mean()))
            print('Mean cross-validated test %s = %0.04f' % (score_name, cv_results['mean_test_score'].mean()))
    else:
        try:
            model.fit(x_train, y_train, verbose=-1)
        except Exception:
            print('lightgbm model is crashing. Please check your inputs and try again...')
    return model
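The `param_distributions` entries above are scipy frozen distributions that RandomizedSearchCV samples with `.rvs()`. A small sketch of their ranges (the draws themselves are arbitrary):

```python
import scipy.stats as st

lr_dist = st.uniform(scale=1)      # learning_rate drawn from the [0, 1] interval
leaves_dist = st.randint(20, 100)  # num_leaves drawn from {20, ..., 99} (high is exclusive)
lr = lr_dist.rvs(random_state=99)
leaves = leaves_dist.rvs(random_state=99)
```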
##############################################################################################
def complex_XGBoost_model(X_train, y_train, X_test, log_y=False, GPU_flag=False,
                          scaler='', enc_method='label', n_splits=5, verbose=-1):
    """
    This model is called complex because it handles multi-label, multi-class datasets which XGBoost ordinarily can't.
    Just send in X_train, y_train and what you want to predict, X_test.
    It will automatically split X_train into multiple folds and train and predict each time on X_test.
    It will then use averaging (or mode) to combine the results and give you a y_test.
    It will automatically detect modeltype as "Regression" or 'Classification'.
    It will also add MultiOutputClassifier and MultiOutputRegressor to multi_label problems.
    The underlying estimator in all cases is XGB. So you get the best of both worlds.

    Inputs:
    ------------
    X_train: pandas dataframe only: do not send in numpy arrays. This is the X_train of your dataset.
    y_train: pandas Series or DataFrame only: do not send in numpy arrays. This is the y_train of your dataset.
    X_test: pandas dataframe only: do not send in numpy arrays. This is the X_test of your dataset.
    log_y: default = False: If True, it means use the log of the target variable "y" to train and test.
    GPU_flag: if your machine has a GPU, set this flag and it will use XGBoost GPU to speed up processing.
    scaler: default is an empty string, which means StandardScaler is used.
        But you can explicitly send in 'minmax' to select MinMaxScaler().
        Alternatively, you can send in a scaler object that you define here: MaxAbsScaler(), etc.
    enc_method: default is 'label' encoding. But you can choose 'glmm' as an alternative. Those are the only two.
    verbose: default = -1. Choosing 1 will give you a lot more output.

    Outputs:
    ------------
    y_preds: Predicted values for your X_XGB_test dataframe.
        They have been averaged after repeatedly predicting on X_XGB_test, so they are likely to be better than a single model.
    """
    X_XGB = copy.deepcopy(X_train)
    Y_XGB = copy.deepcopy(y_train)
    X_XGB_test = copy.deepcopy(X_test)
    ####################################
    start_time = time.time()
    top_num = 10
    num_boost_round = 400
    if isinstance(Y_XGB, pd.Series):
        targets = [Y_XGB.name]
    else:
        targets = Y_XGB.columns.tolist()
    if len(targets) == 1:
        multi_label = False
        if isinstance(Y_XGB, pd.DataFrame):
            Y_XGB = pd.Series(Y_XGB.values.ravel(), name=targets[0], index=Y_XGB.index)
    else:
        multi_label = True
    modeltype = analyze_problem_type(Y_XGB, targets)
    columns = X_XGB.columns
    ###################################################################################
    #########     S C A L E R     P R O C E S S I N G      B E G I N S    ############
    ###################################################################################
    if isinstance(scaler, str):
        if not scaler == '':
            scaler = scaler.lower()
        if scaler == 'standard':
            scaler = StandardScaler()
        elif scaler == 'minmax':
            scaler = MinMaxScaler()
        else:
            scaler = StandardScaler()
    else:
        pass
    #########     G P U     P R O C E S S I N G      B E G I N S    ############
    ###### This is where we set the CPU and GPU parameters for XGBoost
    if GPU_flag:
        GPU_exists = check_if_GPU_exists()
    else:
        GPU_exists = False
    ##### Set the Scoring Parameters here based on each model and preferences of user ###
    cpu_params = {}
    param = {}
    cpu_params['tree_method'] = 'hist'
    cpu_params['gpu_id'] = 0
    cpu_params['updater'] = 'grow_colmaker'
    cpu_params['predictor'] = 'cpu_predictor'
    if GPU_exists:
        param['tree_method'] = 'gpu_hist'
        param['gpu_id'] = 0
        param['updater'] = 'grow_gpu_hist'  #'prune'
        param['predictor'] = 'gpu_predictor'
        print('    Hyper Param Tuning XGBoost with GPU parameters. This will take time. Please be patient...')
    else:
        param = copy.deepcopy(cpu_params)
        print('    Hyper Param Tuning XGBoost with CPU parameters. This will take time. Please be patient...')
    #################################################################################
    if modeltype == 'Regression':
        if log_y:
            Y_XGB.loc[Y_XGB == 0] = 1e-15  ### set zero values to a very small number so log() works
    ######### Now set the number of rows we need to tune hyper params ###
    scoreFunction = {"precision": "precision_weighted", "recall": "recall_weighted"}
    random_search_flag = True
    #### We need a small validation data set for hyper-param tuning #########################
    hyper_frac = 0.2
    #### now select a random sample from X_XGB ##
    if modeltype == 'Regression':
        X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
                                                              random_state=999)
    else:
        try:
            X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
                                                                  random_state=999, stratify=Y_XGB)
        except Exception:
            ## In some small cases there are too few samples to stratify, hence just split them as is
            X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
                                                                  random_state=999)
    ###### This step is needed to make sure y is transformed to log_y ####################
    if modeltype == 'Regression' and log_y:
        Y_train = np.log(Y_train)
        Y_valid = np.log(Y_valid)
    #### First convert validation data into numeric using train data ###
    X_train, Y_train, X_valid, Y_valid = data_transform(X_train, Y_train, X_valid, Y_valid,
                                                        modeltype, multi_label, scaler=scaler, enc_method=enc_method)
    ###### Time to hyper-param tune the model using RandomizedSearchCV and partial train data #########
    num_boost_round = xgbm_model_fit(random_search_flag, X_train, Y_train, X_valid, Y_valid, modeltype,
                                     multi_label, log_y, num_boost_round=num_boost_round)
    #### First convert test data into numeric using train data ###############################
    if not isinstance(X_XGB_test, str):
        x_train, y_train, x_test, _ = data_transform(X_XGB, Y_XGB, X_XGB_test, "",
                                                     modeltype, multi_label, scaler=scaler, enc_method=enc_method)
    ###### Time to train the hyper-tuned model on full train data ##########################
    random_search_flag = False
    model = xgbm_model_fit(random_search_flag, x_train, y_train, x_test, "", modeltype,
                           multi_label, log_y, num_boost_round=num_boost_round)
    ############# Time to get feature importances based on full train data   ################
    if multi_label:
        for i, target_name in enumerate(targets):
            each_model = model.estimators_[i]
            imp_feats = dict(zip(x_train.columns, each_model.feature_importances_))
            importances = pd.Series(imp_feats).sort_values(ascending=False)[:top_num].values
            important_features = pd.Series(imp_feats).sort_values(ascending=False)[:top_num].index.tolist()
            print('Top 10 features for {}: {}'.format(target_name, important_features))
    else:
        imp_feats = model.get_score(fmap='', importance_type='gain')
        importances = pd.Series(imp_feats).sort_values(ascending=False)[:top_num].values
        important_features = pd.Series(imp_feats).sort_values(ascending=False)[:top_num].index.tolist()
        print('Top 10 features:\n%s' % important_features[:top_num])
    #######  order this in the same order in which they were collected ######
    feature_importances = pd.DataFrame(importances,
                                       index=important_features,
                                       columns=['importance'])
    ######  Time to consolidate the predictions on test data ################################
    if not multi_label and not isinstance(X_XGB_test, str):
        x_test = xgb.DMatrix(x_test)
    if isinstance(X_XGB_test, str):
        print('No predictions since X_XGB_test is an empty string. Returning...')
        return {}
    if modeltype == 'Regression':
        if not isinstance(X_XGB_test, str):
            if log_y:
                pred_xgbs = np.exp(model.predict(x_test))
            else:
                pred_xgbs = model.predict(x_test)
        #### if there is no test data, just return empty strings ###
        else:
            pred_xgbs = []
    else:
        if multi_label:
            pred_xgbs = model.predict(x_test)
            pred_probas = model.predict_proba(x_test)
        else:
            pred_probas = model.predict(x_test)
            if modeltype == 'Multi_Classification':
                pred_xgbs = pred_probas.argmax(axis=1)
            else:
                pred_xgbs = (pred_probas > 0.5).astype(int)
    ##### once the entire model is trained on full train data  ##################
    print('    Time taken for training XGBoost on entire train data (in minutes) = %0.1f' % (
        (time.time() - start_time) / 60))
    if multi_label:
        for i, target_name in enumerate(targets):
            each_model = model.estimators_[i]
            xgb.plot_importance(each_model, importance_type='gain', max_num_features=top_num,
                                title='XGBoost model feature importances for %s' % target_name)
    else:
        xgb.plot_importance(model, importance_type='gain', max_num_features=top_num,
                            title='XGBoost final model feature importances')
    print('Returning the following:')
    print('    Model = %s' % model)
    if modeltype == 'Regression':
        if not isinstance(X_XGB_test, str):
            print('    final predictions', pred_xgbs[:10])
        return (pred_xgbs, model)
    else:
        if not isinstance(X_XGB_test, str):
            print('    final predictions (may need to be transformed to original labels)', pred_xgbs[:10])
            print('    predicted probabilities', pred_probas[:1])
        return (pred_xgbs, pred_probas, model)
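The prediction-consolidation step above turns probabilities into class labels with `argmax` (multi-class) or a 0.5 threshold (binary); a toy numpy sketch with made-up probabilities:

```python
import numpy as np

pred_probas = np.array([[0.1, 0.7, 0.2],
                        [0.5, 0.3, 0.2]])
multi_preds = pred_probas.argmax(axis=1)                  # highest-probability class per row
binary_preds = (np.array([0.3, 0.8]) > 0.5).astype(int)   # 0.5 cut-off for binary
```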
##############################################################################################
import xgboost as xgb
def xgbm_model_fit(random_search_flag, x_train, y_train, x_test, y_test, modeltype,
                   multi_label, log_y, num_boost_round=100):
    start_time = time.time()
    if multi_label and not random_search_flag:
        model = num_boost_round
    else:
        rand_params = {
            'learning_rate': sp.stats.uniform(scale=1),
            'gamma': sp.stats.randint(0, 100),
            'n_estimators': sp.stats.randint(100, 500),
            "max_depth": sp.stats.randint(3, 15),
        }
    if modeltype == 'Regression':
        objective = 'reg:squarederror'
        eval_metric = 'rmse'
        shuffle = False
        stratified = False
        num_class = 0
        score_name = 'Score'
        scale_pos_weight = 1
    else:
        if modeltype == 'Binary_Classification':
            objective = 'binary:logistic'
            eval_metric = 'auc'  ## don't change this. AUC works well.
            shuffle = True
            stratified = True
            num_class = 1
            score_name = 'AUC'
            scale_pos_weight = get_scale_pos_weight(y_train)
        else:
            objective = 'multi:softprob'
            eval_metric = 'auc'  ## don't change this. AUC works well for now.
            shuffle = True
            stratified = True
            if multi_label:
                num_class = y_train.nunique().max()
            else:
                if isinstance(y_train, np.ndarray):
                    num_class = np.unique(y_train).max() + 1
                elif isinstance(y_train, pd.Series):
                    num_class = y_train.nunique()
                else:
                    num_class = y_train.nunique().max()
            score_name = 'Multiclass AUC'
            scale_pos_weight = 1  ### use sample_weights in multi-class settings ##
    ######################################################
    final_params = {
        'booster': 'gbtree',
        'colsample_bytree': 0.5,
        'alpha': 0.015,
        'gamma': 4,
        'learning_rate': 0.01,
        'max_depth': 8,
        'min_child_weight': 2,
        'reg_lambda': 0.5,
        'subsample': 0.7,
        'random_state': 99,
        'objective': objective,
        'eval_metric': eval_metric,
        'verbosity': 0,
        'n_jobs': -1,
        'scale_pos_weight': scale_pos_weight,
        'num_class': num_class,
        'silent': True
    }
    #######  This is where we split into single and multi label ############
    if multi_label:
        ######   This is for Multi_Label problems ############
        rand_params = {'estimator__learning_rate': [0.1, 0.5, 0.01, 0.05],
                       'estimator__n_estimators': [50, 100, 150, 200, 250],
                       'estimator__gamma': [2, 4, 8, 16, 32],
                       'estimator__max_depth': [3, 5, 8, 12],
                       }
        if random_search_flag:
            if modeltype == 'Regression':
                clf = XGBRegressor(n_jobs=-1, random_state=999, max_depth=6)
                clf.set_params(**final_params)
                model = MultiOutputRegressor(clf, n_jobs=-1)
            else:
                clf = XGBClassifier(n_jobs=-1, random_state=999, max_depth=6)
                clf.set_params(**final_params)
                model = MultiOutputClassifier(clf, n_jobs=-1)
            if modeltype == 'Regression':
                scoring = 'neg_mean_squared_error'
            else:
                scoring = 'precision'
            model = RandomizedSearchCV(model,
                                       param_distributions=rand_params,
                                       n_iter=15,
                                       return_train_score=True,
                                       random_state=99,
                                       n_jobs=-1,
                                       cv=3,
                                       refit=True,
                                       scoring=scoring,
                                       verbose=False)
            model.fit(x_train, y_train)
            print('Time taken for Hyper Param tuning of multi_label XGBoost (in minutes) = %0.1f' % (
                (time.time() - start_time) / 60))
            cv_results = pd.DataFrame(model.cv_results_)
            if modeltype == 'Regression':
                print('Mean cross-validated train %s = %0.04f' % (score_name, np.sqrt(abs(cv_results['mean_train_score'].mean()))))
                print('Mean cross-validated test %s = %0.04f' % (score_name, np.sqrt(abs(cv_results['mean_test_score'].mean()))))
            else:
                print('Mean cross-validated train %s = %0.04f' % (score_name, cv_results['mean_train_score'].mean()))
                print('Mean cross-validated test %s = %0.04f' % (score_name, cv_results['mean_test_score'].mean()))
            ### In this case, there are no boost rounds, so just return the best estimator
            return model.best_estimator_
        else:
            try:
                model.fit(x_train, y_train)
            except Exception:
                print('Multi_label XGBoost model is crashing during training. Please check your inputs and try again...')
            return model
    else:
        #### This is for Single Label Problems #############
        if modeltype == 'Multi_Classification':
            wt_array = get_sample_weight_array(y_train)
            dtrain = xgb.DMatrix(x_train, label=y_train, weight=wt_array)
        else:
            dtrain = xgb.DMatrix(x_train, label=y_train)
        ########   Now let's perform randomized search to find the best hyper parameters ######
        if random_search_flag:
            cv_results = xgb.cv(final_params, dtrain, num_boost_round=400, nfold=5,
                                stratified=stratified, metrics=eval_metric, early_stopping_rounds=10,
                                seed=999, shuffle=shuffle)
            # Find the best eval_metric and the boosting round where it was achieved
            best_eval = 'test-' + eval_metric + '-mean'
            if modeltype == 'Regression':
                mean_mae = cv_results[best_eval].min()
                boost_rounds = cv_results[best_eval].argmin()
            else:
                mean_mae = cv_results[best_eval].max()
                boost_rounds = cv_results[best_eval].argmax()
            print("Cross-validated %s = %0.3f in num rounds = %s" % (score_name, mean_mae, boost_rounds))
            print('Time taken for Hyper Param tuning of XGBoost (in minutes) = %0.1f' % (
                (time.time() - start_time) / 60))
            return boost_rounds
        else:
            try:
                model = xgb.train(
                    final_params,
                    dtrain,
                    num_boost_round=num_boost_round,
                    verbose_eval=False,
                )
            except Exception:
                print('XGBoost model is crashing. Please check your inputs and try again...')
            return model
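How the best boosting round is read off the `xgb.cv` results frame above, with a toy frame standing in for `cv_results` (the column name follows XGBoost's `test-<metric>-mean` convention; the numbers are made up):

```python
import pandas as pd

cv_results = pd.DataFrame({"test-rmse-mean": [1.20, 0.90, 0.95]})
best_score = cv_results["test-rmse-mean"].min()       # lower is better for rmse
boost_rounds = cv_results["test-rmse-mean"].argmin()  # round index of the minimum
```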
####################################################################################
# Calculate class weight
from sklearn.utils.class_weight import compute_class_weight
import copy
from collections import Counter
def find_rare_class(classes, verbose=0):
######### Print the % count of each class in a Target variable #####
"""
Works on Multi Class too. Prints class percentages count of target variable.
It returns the name of the Rare class (the one with the minimum class member count).
This can also be helpful in using it as pos_label in Binary and Multi Class problems.
"""
counts = OrderedDict(Counter(classes))
total = sum(counts.values())
if verbose >= 1:
print(' Class -> Counts -> Percent')
sorted_keys = sorted(counts.keys())
for cls in sorted_keys:
print("%12s: % 7d -> % 5.1f%%" % (cls, counts[cls], counts[cls]/total*100))
rare_class = pd.Series(counts).idxmin()
if isinstance(rare_class, str):
return rare_class
else:
return int(rare_class)
###################################################################################
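The helper above boils down to a Counter plus `idxmin`; a minimal standalone sketch of the same logic on a toy target:

```python
from collections import Counter, OrderedDict

import pandas as pd

labels = [0, 1, 1, 1, 2, 1, 0, 1]        # toy target: class 2 has the fewest rows
counts = OrderedDict(Counter(labels))    # class -> count, e.g. {0: 2, 1: 5, 2: 1}
rare_class = pd.Series(counts).idxmin()  # label of the class with the minimum count
```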
def get_sample_weight_array(y_train):
y_train = copy.deepcopy(y_train)
if isinstance(y_train, np.ndarray):
y_train = pd.Series(y_train)
elif isinstance(y_train, pd.Series):
pass
elif isinstance(y_train, pd.DataFrame):
### if it is a dataframe, use only its first column ##
y_train = y_train.iloc[:,0]
else:
### if you cannot detect the type or if it is a multi-column dataframe, ignore it
return None
classes = np.unique(y_train)
class_weights = compute_class_weight('balanced', classes=classes, y=y_train)
if len(class_weights[(class_weights < 1)]) > 0:
### if any weight is below 1, rescale so that the lowest weight becomes 1.
class_weights = class_weights/min(class_weights)
else:
class_weights = (class_weights)
### even after you change weights if they are all below 1.5 do this ##
#if (class_weights<=1.5).all():
# class_weights = np.around(class_weights+0.49)
class_weights = class_weights.astype(int)
wt = dict(zip(classes, class_weights))
### Map class weights to corresponding target class values
### You have to make sure class labels have range (0, n_classes-1)
wt_array = y_train.map(wt)
#set(zip(y_train, wt_array))
# Convert wt series to wt array
wt_array = wt_array.values
return wt_array
###############################################################################
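The per-row weights built above come from the 'balanced' class-weight heuristic, rescaled so the smallest weight is 1 and mapped back onto the target rows. A self-contained sketch on a toy imbalanced target (using the same formula sklearn's `compute_class_weight('balanced', ...)` applies, written out directly):

```python
import numpy as np
import pandas as pd

# toy imbalanced target: 90 rows of class 0, 10 rows of class 1
y = pd.Series([0] * 90 + [1] * 10)
classes, counts = np.unique(y, return_counts=True)
# 'balanced' heuristic: n_samples / (n_classes * class_count)
w = len(y) / (len(classes) * counts)             # approx [0.56, 5.0]
w = (w / w.min()).astype(int)                    # rescale so the smallest weight is 1 -> [1, 9]
wt_array = y.map(dict(zip(classes, w))).values   # one sample weight per training row
```

The resulting `wt_array` is what gets passed as `weight=` to `xgb.DMatrix` above.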
from collections import OrderedDict
def get_scale_pos_weight(y_input):
y_input = copy.deepcopy(y_input)
if isinstance(y_input, np.ndarray):
y_input = pd.Series(y_input)
elif isinstance(y_input, pd.Series):
pass
elif isinstance(y_input, pd.DataFrame):
### if it is a dataframe, use only its first column ##
y_input = y_input.iloc[:,0]
else:
### if you cannot detect the type or if it is a multi-column dataframe, ignore it
return None
classes = np.unique(y_input)
rare_class = find_rare_class(y_input)
xp = Counter(y_input)
class_weights = compute_class_weight('balanced', classes=classes, y=y_input)
if len(class_weights[(class_weights < 1)]) > 0:
### if any weight is below 1, rescale so that the lowest weight becomes 1.
class_weights = class_weights/min(class_weights)
else:
class_weights = (class_weights)
### even after you change weights if they are all below 1.5 do this ##
#if (class_weights<=1.5).all():
# class_weights = np.around(class_weights+0.49)
class_weights = class_weights.astype(int)
class_weights[(class_weights<1)]=1
class_rows = class_weights*[xp[x] for x in classes]
class_rows = class_rows.astype(int)
class_weighted_rows = dict(zip(classes,class_weights))
rare_class_weight = class_weighted_rows[rare_class]
print(' For class %s, weight = %s' %(rare_class, rare_class_weight))
return rare_class_weight
############################################################################################
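For a binary target, the rare-class weight computed above reduces to the familiar negative-to-positive count ratio that XGBoost's `scale_pos_weight` expects; a quick check on a toy target:

```python
import numpy as np

y = np.array([0] * 90 + [1] * 10)  # class 1 is the rare (positive) class
scale_pos_weight = (y == 0).sum() / (y == 1).sum()  # 90 / 10 = 9.0
```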
def xgboost_model_fit(model, x_train, y_train, x_test, y_test, modeltype, log_y, params,
cpu_params, early_stopping_params={}):
early_stopping = 10
eval_metric = 'rmse' if modeltype == 'Regression' else 'auc' ### default in case fitting fails before eval_metric is set below
start_time = time.time()
if str(model).split("(")[0] == 'RandomizedSearchCV':
model.fit(x_train, y_train, **early_stopping_params)
print('Time taken for Hyper Param tuning of XGB (in minutes) = %0.1f' %(
(time.time()-start_time)/60))
else:
try:
if modeltype == 'Regression':
if log_y:
model.fit(x_train, np.log(y_train), early_stopping_rounds=early_stopping, eval_metric=['rmse'],
eval_set=[(x_test, np.log(y_test))], verbose=0)
else:
model.fit(x_train, y_train, early_stopping_rounds=early_stopping, eval_metric=['rmse'],
eval_set=[(x_test, y_test)], verbose=0)
else:
if modeltype == 'Binary_Classification':
objective='binary:logistic'
eval_metric = 'auc'
else:
objective='multi:softprob'
eval_metric = 'auc'
model.fit(x_train, y_train, early_stopping_rounds=early_stopping, eval_metric = eval_metric,
eval_set=[(x_test, y_test)], verbose=0)
except Exception as e:
print('Training failed with error: %s. GPU may be present but not enabled. Falling back to CPU parameters...' %e)
if str(model).split("(")[0] == 'RandomizedSearchCV':
xgb = model.estimator
xgb.set_params(**cpu_params)
if modeltype == 'Regression':
scoring = 'neg_mean_squared_error'
else:
scoring = 'precision'
model = RandomizedSearchCV(xgb,
param_distributions = params,
n_iter = 15,
n_jobs=-1,
cv = 3,
scoring=scoring,
refit=True,
)
model.fit(x_train, y_train, **early_stopping_params)
return model
else:
model = model.set_params(**cpu_params)
if modeltype == 'Regression':
if log_y:
model.fit(x_train, np.log(y_train), early_stopping_rounds=6, eval_metric=['rmse'],
eval_set=[(x_test, np.log(y_test))], verbose=0)
else:
model.fit(x_train, y_train, early_stopping_rounds=6, eval_metric=['rmse'],
eval_set=[(x_test, y_test)], verbose=0)
else:
model.fit(x_train, y_train, early_stopping_rounds=6,eval_metric=eval_metric,
eval_set=[(x_test, y_test)], verbose=0)
return model
#################################################################################
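When `log_y=True`, the fit above trains on `np.log(y)` and predictions are inverted with `np.exp`; the round trip is exact (to floating-point tolerance) for strictly positive targets:

```python
import numpy as np

y = np.array([1.0, 10.0, 100.0, 1000.0])  # strictly positive targets
y_log = np.log(y)                         # what the model is actually fitted on
y_back = np.exp(y_log)                    # inverse transform applied to predictions
```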
def simple_XGBoost_model(X_train, y_train, X_test, log_y=False, GPU_flag=False,
scaler = '', enc_method='label', n_splits=5, verbose=0):
"""
Easy to use XGBoost model. Just send in X_train, y_train and what you want to predict, X_test.
It will automatically split X_train into multiple folds (n_splits) and train and predict each time on X_test.
It will then use the average (or the mode, for classification) to combine the results and give you a y_test.
The modeltype ("Regression" or "Classification") is detected automatically from y_train.
Inputs:
------------
X_train: pandas dataframe only: do not send in numpy arrays. This is the X_train of your dataset.
y_train: pandas Series or DataFrame only: do not send in numpy arrays. This is the y_train of your dataset.
X_test: pandas dataframe only: do not send in numpy arrays. This is the X_test of your dataset.
log_y: default = False: If True, it means use the log of the target variable "y" to train and test.
GPU_flag: if your machine has a GPU set this flag and it will use XGBoost GPU to speed up processing.
scaler : default is StandardScaler(). But you can send in MinMaxScaler() as input to change it or any other scaler.
enc_method: default is 'label' encoding. But you can choose 'glmm' as an alternative. But those are the only two.
verbose: default = 0. Choosing 1 will give you a lot more output.
Outputs:
------------
y_preds: Predicted values for your X_XGB_test dataframe.
It has been averaged after repeatedly predicting on X_XGB_test. So likely to be better than one model.
"""
X_XGB = copy.deepcopy(X_train)
Y_XGB = copy.deepcopy(y_train)
X_XGB_test = copy.deepcopy(X_test)
start_time = time.time()
if isinstance(Y_XGB, pd.Series):
targets = [Y_XGB.name]
else:
targets = Y_XGB.columns.tolist()
Y_XGB_index = Y_XGB.index
if len(targets) == 1:
multi_label = False
if isinstance(Y_XGB, pd.DataFrame):
Y_XGB = pd.Series(Y_XGB.values.ravel(),name=targets[0], index=Y_XGB.index)
else:
multi_label = True
print('Multi_label is not supported in simple_XGBoost_model. Try the complex_XGBoost_model...Returning')
return {}
##### Start your analysis of the data ############
modeltype = analyze_problem_type(Y_XGB, targets)
columns = X_XGB.columns
###################################################################################
######### S C A L E R P R O C E S S I N G B E G I N S ############
###################################################################################
if isinstance(scaler, str):
if not scaler == '':
scaler = scaler.lower()
if scaler == 'standard':
scaler = StandardScaler()
elif scaler == 'minmax':
scaler = MinMaxScaler()
else:
scaler = StandardScaler()
else:
scaler = StandardScaler()
else:
pass
######### G P U P R O C E S S I N G B E G I N S ############
###### This is where we set the CPU and GPU parameters for XGBoost
if GPU_flag:
GPU_exists = check_if_GPU_exists()
else:
GPU_exists = False
##### Set the Scoring Parameters here based on each model and preferences of user ###
cpu_params = {}
param = {}
cpu_params['tree_method'] = 'hist'
cpu_params['gpu_id'] = 0
cpu_params['updater'] = 'grow_colmaker'
cpu_params['predictor'] = 'cpu_predictor'
if GPU_exists:
param['tree_method'] = 'gpu_hist'
param['gpu_id'] = 0
param['updater'] = 'grow_gpu_hist' #'prune'
param['predictor'] = 'gpu_predictor'
print(' Hyper Param Tuning XGBoost with GPU parameters. This will take time. Please be patient...')
else:
param = copy.deepcopy(cpu_params)
print(' Hyper Param Tuning XGBoost with CPU parameters. This will take time. Please be patient...')
#################################################################################
if modeltype == 'Regression':
if log_y:
Y_XGB.loc[Y_XGB==0] = 1e-15 ### replace zeros with a tiny positive value so that log() is defined
xgb = XGBRegressor(
booster = 'gbtree',
colsample_bytree=0.5,
alpha=0.015,
gamma=4,
learning_rate=0.01,
max_depth=8,
min_child_weight=2,
n_estimators=1000,
reg_lambda=0.5,
#reg_alpha=8,
subsample=0.7,
random_state=99,
objective='reg:squarederror',
eval_metric='rmse',
verbosity = 0,
n_jobs=-1,
#grow_policy='lossguide',
silent = True)
objective='reg:squarederror'
eval_metric = 'rmse'
score_name = 'RMSE'
else:
if multi_label:
num_class = Y_XGB.nunique().max()
else:
if isinstance(Y_XGB, np.ndarray):
num_class = np.unique(Y_XGB).max() + 1
else:
num_class = Y_XGB.nunique()
if num_class == 2:
num_class = 1
if num_class <= 2:
objective='binary:logistic'
eval_metric = 'auc'
score_name = 'ROC AUC'
else:
objective='multi:softprob'
eval_metric = 'auc'
score_name = 'Multiclass ROC AUC'
xgb = XGBClassifier(
booster = 'gbtree',
colsample_bytree=0.5,
alpha=0.015,
gamma=4,
learning_rate=0.01,
max_depth=8,
min_child_weight=2,
n_estimators=1000,
reg_lambda=0.5,
objective=objective,
subsample=0.7,
random_state=99,
n_jobs=-1,
#grow_policy='lossguide',
num_class = num_class,
verbosity = 0,
silent = True)
#testing for GPU
model = xgb.set_params(**param)
hyper_frac = 0.2
#### now select a random sample from X_XGB and Y_XGB ################
if modeltype == 'Regression':
X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
random_state=99)
else:
X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
random_state=99, stratify=Y_XGB)
scoreFunction = { "precision": "precision_weighted","recall": "recall_weighted"}
params = {
'learning_rate': sp.stats.uniform(scale=1),
'gamma': sp.stats.randint(0, 32),
'n_estimators': sp.stats.randint(100,500),
"max_depth": sp.stats.randint(3, 15),
}
early_stopping_params={"early_stopping_rounds":5,
"eval_metric" : eval_metric,
"eval_set" : [[X_valid, Y_valid]]
}
if modeltype == 'Regression':
scoring = 'neg_mean_squared_error'
else:
scoring = 'precision'
model = RandomizedSearchCV(xgb.set_params(**param),
param_distributions = params,
n_iter = 15,
return_train_score = True,
random_state = 99,
n_jobs=-1,
cv = 3,
refit=True,
scoring=scoring,
verbose = False)
X_train, Y_train, X_valid, Y_valid = data_transform(X_train, Y_train, X_valid, Y_valid,
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
gbm_model = xgboost_model_fit(model, X_train, Y_train, X_valid, Y_valid, modeltype,
log_y, params, cpu_params, early_stopping_params)
#############################################################################
ls=[]
if modeltype == 'Regression':
fold = KFold(n_splits=n_splits)
else:
fold = StratifiedKFold(shuffle=True, n_splits=n_splits, random_state=99)
scores=[]
if not isinstance(X_XGB_test, str):
pred_xgbs = np.zeros(len(X_XGB_test))
pred_probas = np.zeros(len(X_XGB_test))
else:
pred_xgbs = []
pred_probas = []
#### First convert test data into numeric using train data ###
if not isinstance(X_XGB_test, str):
X_XGB_train_enc, Y_XGB, X_XGB_test_enc, _ = data_transform(X_XGB, Y_XGB, X_XGB_test,"",
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
else:
X_XGB_train_enc, Y_XGB, X_XGB_test_enc, _ = data_transform(X_XGB, Y_XGB, "","",
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
#### now run all the folds each one by one ##################################
start_time = time.time()
for folds, (train_index, test_index) in tqdm(enumerate(fold.split(X_XGB,Y_XGB))):
x_train, x_valid = X_XGB.iloc[train_index], X_XGB.iloc[test_index]
### you need to keep y_valid as-is in the same original state as it was given ####
if isinstance(Y_XGB, np.ndarray):
Y_XGB = pd.Series(Y_XGB,name=targets[0], index=Y_XGB_index)
### y_valid here will be transformed into log_y to ensure training and validation ####
if modeltype == 'Regression':
if log_y:
y_train, y_valid = np.log(Y_XGB.iloc[train_index]), np.log(Y_XGB.iloc[test_index])
else:
y_train, y_valid = Y_XGB.iloc[train_index], Y_XGB.iloc[test_index]
else:
y_train, y_valid = Y_XGB.iloc[train_index], Y_XGB.iloc[test_index]
## scale the x_train and x_valid values - use all columns -
x_train, y_train, x_valid, y_valid = data_transform(x_train, y_train, x_valid, y_valid,
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
model = gbm_model.best_estimator_
model = xgboost_model_fit(model, x_train, y_train, x_valid, y_valid, modeltype,
log_y, params, cpu_params)
#### now make predictions on validation data and compare it to y_valid which is in original state ##
if modeltype == 'Regression':
if log_y:
preds = np.exp(model.predict(x_valid))
else:
preds = model.predict(x_valid)
else:
preds = model.predict(x_valid)
feature_importances = pd.DataFrame(model.feature_importances_,
index = X_XGB.columns,
columns=['importance'])
sum_all=feature_importances.values
ls.append(sum_all)
###### Time to consolidate the predictions on test data #########
if modeltype == 'Regression':
if not isinstance(X_XGB_test, str):
if log_y:
pred_xgb=np.exp(model.predict(X_XGB_test_enc[columns]))
else:
pred_xgb=model.predict(X_XGB_test_enc[columns])
if folds == 0:
pred_xgbs = copy.deepcopy(pred_xgb)
else:
pred_xgbs = np.vstack([pred_xgbs, pred_xgb])
pred_xgbs = pred_xgbs.mean(axis=0)
#### preds here is for only one fold and we are comparing it to original y_valid ####
score = np.sqrt(mean_squared_error(y_valid, preds))
print('%s score in fold %d = %s' %(score_name, folds+1, score))
else:
if not isinstance(X_XGB_test, str):
pred_xgb=model.predict(X_XGB_test_enc[columns])
pred_proba = model.predict_proba(X_XGB_test_enc[columns])
if folds == 0:
pred_xgbs = copy.deepcopy(pred_xgb)
pred_probas = copy.deepcopy(pred_proba)
else:
pred_xgbs = np.vstack([pred_xgbs, pred_xgb])
pred_xgbs = stats.mode(pred_xgbs, axis=0)[0][0]
pred_probas = np.mean( np.array([ pred_probas, pred_proba ]), axis=0 )
#### preds here is for only one fold and we are comparing it to original y_valid ####
score = balanced_accuracy_score(y_valid, preds)
print('%s score in fold %d = %0.1f%%' %(score_name, folds+1, score*100))
scores.append(score)
print(' Time taken for Cross Validation of XGBoost (in minutes) = %0.1f' %(
(time.time()-start_time)/60))
print("\nCross-validated Average scores are: ", np.sum(scores)/len(scores))
##### Train on full train data set and predict #################################
print('Training model on full train dataset...')
start_time1 = time.time()
model = gbm_model.best_estimator_
model.fit(X_XGB_train_enc, Y_XGB)
if not isinstance(X_XGB_test, str):
pred_xgbs = model.predict(X_XGB_test_enc)
if modeltype != 'Regression':
pred_probas = model.predict_proba(X_XGB_test_enc)
else:
pred_probas = np.array([])
else:
pred_xgbs = np.array([])
pred_probas = np.array([])
print(' Time taken for training XGBoost (in minutes) = %0.1f' %((time.time()-start_time1)/60))
if verbose:
plot_importances_XGB(train_set=X_XGB, labels=Y_XGB, ls=ls, y_preds=pred_xgbs,
modeltype=modeltype, top_num='all')
print('Returning the following:')
if modeltype == 'Regression':
if not isinstance(X_XGB_test, str):
print(' final predictions', pred_xgbs[:10])
else:
print(' no X_test given. Returning empty array.')
print(' Model = %s' %model)
return (pred_xgbs, model)
else:
if not isinstance(X_XGB_test, str):
print(' final predictions (may need to be transformed to original labels)', pred_xgbs[:10])
print(' predicted probabilities', pred_probas[:1])
else:
print(' no X_test given. Returning empty array.')
print(' Model = %s' %model)
return (pred_xgbs, pred_probas, model)
##################################################################################
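The classification folds above are combined by a per-row majority vote via scipy's `stats.mode`; a standalone sketch with three toy folds:

```python
import numpy as np
from scipy import stats

# class predictions from 3 folds on the same 5 test rows
fold_preds = np.array([[0, 1, 1, 0, 2],
                       [0, 1, 2, 0, 2],
                       [1, 1, 1, 0, 2]])
# majority vote per column (row of the test set); ravel() handles both
# old (keepdims) and new scipy return shapes
combined = np.asarray(stats.mode(fold_preds, axis=0)[0]).ravel()
```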
def complex_LightGBM_model(X_train, y_train, X_test, log_y=False, GPU_flag=False,
scaler = '', enc_method='label', n_splits=5, verbose=-1):
"""
This model is called complex because it handles multi-label, multi-class datasets which LGBM ordinarily can't.
Just send in X_train, y_train and what you want to predict, X_test
It will automatically split X_train into multiple folds (10) and train and predict each time on X_test.
It will then use average (or use mode) to combine the results and give you a y_test.
It will automatically detect modeltype as "Regression" or 'Classification'
It will also add MultiOutputClassifier and MultiOutputRegressor to multi_label problems.
The underlying estimator in all cases is LGBM. So you get the best of both worlds.
Inputs:
------------
X_train: pandas dataframe only: do not send in numpy arrays. This is the X_train of your dataset.
y_train: pandas Series or DataFrame only: do not send in numpy arrays. This is the y_train of your dataset.
X_test: pandas dataframe only: do not send in numpy arrays. This is the X_test of your dataset.
log_y: default = False: If True, it means use the log of the target variable "y" to train and test.
GPU_flag: if your machine has a GPU, set this flag and it will use GPU parameters to speed up processing.
scaler : default is StandardScaler(). But you can send in MinMaxScaler() as input to change it or any other scaler.
enc_method: default is 'label' encoding. But you can choose 'glmm' as an alternative. But those are the only two.
verbose: default = -1. Choosing 1 will give you a lot more output.
Outputs:
------------
y_preds: Predicted values for your X_XGB_test dataframe.
It has been averaged after repeatedly predicting on X_XGB_test. So likely to be better than one model.
"""
X_XGB = copy.deepcopy(X_train)
Y_XGB = copy.deepcopy(y_train)
X_XGB_test = copy.deepcopy(X_test)
####################################
start_time = time.time()
top_num = 10
if isinstance(Y_XGB, pd.Series):
targets = [Y_XGB.name]
else:
targets = Y_XGB.columns.tolist()
if len(targets) == 1:
multi_label = False
if isinstance(Y_XGB, pd.DataFrame):
Y_XGB = pd.Series(Y_XGB.values.ravel(),name=targets[0], index=Y_XGB.index)
else:
multi_label = True
modeltype = analyze_problem_type(Y_XGB, targets)
columns = X_XGB.columns
#### In some cases, there are special chars in column names. Remove them. ###
if np.array([':' in x for x in columns]).any():
sel_preds = columns[np.array([':' in x for x in columns])].tolist()
print('removing special char : in %s since LightGBM does not like it...' %sel_preds)
columns = ["_".join(x.split(":")) for x in columns]
X_XGB.columns = columns
if not isinstance(X_XGB_test, str):
X_XGB_test.columns = columns
###################################################################################
######### S C A L E R P R O C E S S I N G B E G I N S ############
###################################################################################
if isinstance(scaler, str):
if not scaler == '':
scaler = scaler.lower()
if scaler == 'standard':
scaler = StandardScaler()
elif scaler == 'minmax':
scaler = MinMaxScaler()
else:
scaler = StandardScaler()
else:
scaler = StandardScaler()
else:
pass
######### G P U P R O C E S S I N G B E G I N S ############
###### This is where we set the CPU and GPU parameters for XGBoost
if GPU_flag:
GPU_exists = check_if_GPU_exists()
else:
GPU_exists = False
##### Set the Scoring Parameters here based on each model and preferences of user ###
cpu_params = {}
param = {}
cpu_params['tree_method'] = 'hist'
cpu_params['gpu_id'] = 0
cpu_params['updater'] = 'grow_colmaker'
cpu_params['predictor'] = 'cpu_predictor'
if GPU_exists:
param['tree_method'] = 'gpu_hist'
param['gpu_id'] = 0
param['updater'] = 'grow_gpu_hist' #'prune'
param['predictor'] = 'gpu_predictor'
print(' Hyper Param Tuning LightGBM with GPU parameters. This will take time. Please be patient...')
else:
param = copy.deepcopy(cpu_params)
print(' Hyper Param Tuning LightGBM with CPU parameters. This will take time. Please be patient...')
#################################################################################
if modeltype == 'Regression':
if log_y:
Y_XGB.loc[Y_XGB==0] = 1e-15 ### replace zeros with a tiny positive value so that log() is defined
######### Now set the number of rows we need to tune hyper params ###
scoreFunction = { "precision": "precision_weighted","recall": "recall_weighted"}
#### We need a small validation data set for hyper-param tuning #############
hyper_frac = 0.2
#### now select a random sample from X_XGB ##
if modeltype == 'Regression':
X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
random_state=999)
else:
try:
X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
random_state=999, stratify = Y_XGB)
except:
## In some small cases, you cannot stratify since there are too few samples. So leave it as is ##
X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
random_state=999)
#### First convert test data into numeric using train data ###
X_train, Y_train, X_valid, Y_valid = data_transform(X_train, Y_train, X_valid, Y_valid,
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
###### This step is needed for making sure y is transformed to log_y ######
if modeltype == 'Regression' and log_y:
Y_train = np.log(Y_train)
Y_valid = np.log(Y_valid)
random_search_flag = True
###### Time to hyper-param tune model using randomizedsearchcv #########
gbm_model = lightgbm_model_fit(random_search_flag, X_train, Y_train, X_valid, Y_valid, modeltype,
multi_label, log_y, model="")
model = gbm_model.best_estimator_
#### First convert test data into numeric using train data ###
if not isinstance(X_XGB_test, str):
x_train, y_train, x_test, _ = data_transform(X_XGB, Y_XGB, X_XGB_test, "",
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
###### Time to train the hyper-tuned model on full train data #########
random_search_flag = False
model = lightgbm_model_fit(random_search_flag, x_train, y_train, x_test, "", modeltype,
multi_label, log_y, model=model)
############# Time to get feature importances based on full train data ################
if multi_label:
for i,target_name in enumerate(targets):
print('Top 10 features for {}: {}'.format(target_name,pd.DataFrame(model.estimators_[i].feature_importances_,
index=model.estimators_[i].feature_name_,
columns=['importance']).sort_values('importance', ascending=False).index.tolist()[:10]))
else:
print('Top 10 features:\n', pd.DataFrame(model.feature_importances_,index=model.feature_name_,
columns=['importance']).sort_values('importance', ascending=False).index.tolist()[:10])
###### Time to consolidate the predictions on test data #########
if modeltype == 'Regression':
if not isinstance(X_XGB_test, str):
if log_y:
pred_xgbs = np.exp(model.predict(x_test))
else:
pred_xgbs = model.predict(x_test)
#### if there is no test data just return empty strings ###
else:
pred_xgbs = []
else:
if not isinstance(X_XGB_test, str):
if not multi_label:
pred_xgbs = model.predict(x_test)
pred_probas = model.predict_proba(x_test)
else:
### This is how you have to process if it is multi_label ##
pred_probas = model.predict_proba(x_test)
predsy = [np.argmax(line,axis=1) for line in pred_probas]
pred_xgbs = np.array(predsy)
else:
pred_xgbs = []
pred_probas = []
##### once the entire model is trained on full train data ##################
print(' Time taken for training LightGBM on entire train data (in minutes) = %0.1f' %(
(time.time()-start_time)/60))
if multi_label:
for i,target_name in enumerate(targets):
lgbm.plot_importance(model.estimators_[i], importance_type='gain', max_num_features=top_num,
title='LGBM model feature importances for %s' %target_name)
else:
lgbm.plot_importance(model, importance_type='gain', max_num_features=top_num,
title='LGBM final model feature importances')
print('Returning the following:')
print(' Model = %s' %model)
if modeltype == 'Regression':
if not isinstance(X_XGB_test, str):
print(' final predictions', pred_xgbs[:10])
return (pred_xgbs, model)
else:
if not isinstance(X_XGB_test, str):
print(' final predictions (may need to be transformed to original labels)', pred_xgbs[:10])
print(' predicted probabilities', pred_probas[:1])
return (pred_xgbs, pred_probas, model)
##################################################################################
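For the multi_label branch above, `predict_proba` returns one `(n_rows, n_classes)` probability matrix per target, and class labels are recovered with an argmax per target; a minimal sketch:

```python
import numpy as np

# one probability matrix per target (2 test rows each), as a multi-output
# classifier would return
pred_probas = [np.array([[0.9, 0.1],
                         [0.2, 0.8]]),       # binary target
               np.array([[0.1, 0.7, 0.2],
                         [0.3, 0.3, 0.4]])]  # 3-class target
# argmax over the class axis of each matrix -> shape (n_targets, n_rows)
preds = np.array([np.argmax(p, axis=1) for p in pred_probas])
```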
def simple_LightGBM_model(X_train, y_train, X_test, log_y=False, GPU_flag=False,
scaler = '', enc_method='label', n_splits=5, verbose=-1):
"""
Easy to use LightGBM model. Just send in X_train, y_train and what you want to predict, X_test.
It will automatically split X_train into multiple folds (n_splits) and train and predict each time on X_test.
It will then use the average (or the mode, for classification) to combine the results and give you a y_test.
The modeltype ("Regression" or "Classification") is detected automatically from y_train.
Inputs:
------------
X_train: pandas dataframe only: do not send in numpy arrays. This is the X_train of your dataset.
y_train: pandas Series or DataFrame only: do not send in numpy arrays. This is the y_train of your dataset.
X_test: pandas dataframe only: do not send in numpy arrays. This is the X_test of your dataset.
log_y: default = False: If True, it means use the log of the target variable "y" to train and test.
GPU_flag: if your machine has a GPU, set this flag and it will use GPU parameters to speed up processing.
scaler : default is StandardScaler(). But you can send in MinMaxScaler() as input to change it or any other scaler.
enc_method: default is 'label' encoding. But you can choose 'glmm' as an alternative. But those are the only two.
verbose: default = -1. Choosing 1 will give you a lot more output.
Outputs:
------------
y_preds: Predicted values for your X_XGB_test dataframe.
It has been averaged after repeatedly predicting on X_XGB_test. So likely to be better than one model.
"""
X_XGB = copy.deepcopy(X_train)
Y_XGB = copy.deepcopy(y_train)
X_XGB_test = copy.deepcopy(X_test)
#######################################
start_time = time.time()
if isinstance(Y_XGB, pd.Series):
targets = [Y_XGB.name]
else:
targets = Y_XGB.columns.tolist()
if len(targets) == 1:
multi_label = False
if isinstance(Y_XGB, pd.DataFrame):
Y_XGB = pd.Series(Y_XGB.values.ravel(),name=targets[0], index=Y_XGB.index)
else:
multi_label = True
print('Multi_label is not supported in simple_LightGBM_model. Try the complex_LightGBM_model...Returning')
return {}
##### Start your analysis of the data ############
modeltype = analyze_problem_type(Y_XGB, targets)
columns = X_XGB.columns
#### In some cases, there are special chars in column names. Remove them. ###
if np.array([':' in x for x in columns]).any():
sel_preds = columns[np.array([':' in x for x in columns])].tolist()
print('removing special char : in %s since LightGBM does not like it...' %sel_preds)
columns = ["_".join(x.split(":")) for x in columns]
X_XGB.columns = columns
if not isinstance(X_XGB_test, str):
X_XGB_test.columns = columns
###################################################################################
######### S C A L E R P R O C E S S I N G B E G I N S ############
###################################################################################
if isinstance(scaler, str):
if not scaler == '':
scaler = scaler.lower()
if scaler == 'standard':
scaler = StandardScaler()
elif scaler == 'minmax':
scaler = MinMaxScaler()
else:
scaler = StandardScaler()
else:
scaler = StandardScaler()
else:
pass
######### G P U P R O C E S S I N G B E G I N S ############
###### This is where we set the CPU and GPU parameters for XGBoost
if GPU_flag:
GPU_exists = check_if_GPU_exists()
else:
GPU_exists = False
##### Set the Scoring Parameters here based on each model and preferences of user ###
cpu_params = {}
param = {}
cpu_params['tree_method'] = 'hist'
cpu_params['gpu_id'] = 0
cpu_params['updater'] = 'grow_colmaker'
cpu_params['predictor'] = 'cpu_predictor'
if GPU_exists:
param['tree_method'] = 'gpu_hist'
param['gpu_id'] = 0
param['updater'] = 'grow_gpu_hist' #'prune'
param['predictor'] = 'gpu_predictor'
print(' Hyper Param Tuning LightGBM with GPU parameters. This will take time. Please be patient...')
else:
param = copy.deepcopy(cpu_params)
print(' Hyper Param Tuning LightGBM with CPU parameters. This will take time. Please be patient...')
#################################################################################
if modeltype == 'Regression':
if log_y:
Y_XGB.loc[Y_XGB==0] = 1e-15 ### replace zeros with a tiny positive value so that log() is defined
#testing for GPU
hyper_frac = 0.2
#### now select a random sample from X_XGB and Y_XGB ################
if modeltype == 'Regression':
X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
random_state=99)
else:
X_train, X_valid, Y_train, Y_valid = train_test_split(X_XGB, Y_XGB, test_size=hyper_frac,
random_state=99, stratify=Y_XGB)
scoreFunction = { "precision": "precision_weighted","recall": "recall_weighted"}
X_train, Y_train, X_valid, Y_valid = data_transform(X_train, Y_train, X_valid, Y_valid,
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
if modeltype == 'Regression':
if log_y:
Y_train, Y_valid = np.log(Y_train), np.log(Y_valid)
random_search_flag = True
gbm_model = lightgbm_model_fit(random_search_flag, X_train, Y_train, X_valid, Y_valid, modeltype,
multi_label, log_y, model="")
model = gbm_model.best_estimator_
random_search_flag = False
#############################################################################
ls=[]
if modeltype == 'Regression':
fold = KFold(n_splits=n_splits)
else:
fold = StratifiedKFold(shuffle=True, n_splits=n_splits, random_state=99)
scores=[]
if not isinstance(X_XGB_test, str):
pred_xgbs = np.zeros(len(X_XGB_test))
pred_probas = np.zeros(len(X_XGB_test))
else:
pred_xgbs = []
pred_probas = []
#### First convert test data into numeric using train data ###
if not isinstance(X_XGB_test, str):
X_XGB_train_enc,_, X_XGB_test_enc,_ = data_transform(X_XGB, Y_XGB, X_XGB_test, "",
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
else:
### no test data given: still encode the train data for the final full-data fit below ##
X_XGB_train_enc,_, X_XGB_test_enc,_ = data_transform(X_XGB, Y_XGB, "", "",
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
#### now run all the folds each one by one ##################################
start_time = time.time()
for folds, (train_index, test_index) in tqdm(enumerate(fold.split(X_XGB,Y_XGB))):
x_train, x_test = X_XGB.iloc[train_index], X_XGB.iloc[test_index]
### you need to keep y_test as-is in the same original state as it was given ####
y_test = Y_XGB.iloc[test_index]
### y_valid here will be transformed into log_y to ensure training and validation ####
if modeltype == 'Regression':
if log_y:
y_train, y_valid = np.log(Y_XGB.iloc[train_index]), np.log(Y_XGB.iloc[test_index])
else:
y_train, y_valid = Y_XGB.iloc[train_index], Y_XGB.iloc[test_index]
else:
y_train, y_valid = Y_XGB.iloc[train_index], Y_XGB.iloc[test_index]
## scale the x_train and x_test values - use all columns -
x_train, y_train, x_test, _ = data_transform(x_train, y_train, x_test, y_test,
modeltype, multi_label, scaler=scaler, enc_method=enc_method)
model = gbm_model.best_estimator_
model = lightgbm_model_fit(random_search_flag, x_train, y_train, x_test, y_valid, modeltype,
multi_label, log_y, model=model)
#### now make predictions on validation data and compare it to y_test which is in original state ##
if modeltype == 'Regression':
if log_y:
preds = np.exp(model.predict(x_test))
else:
preds = model.predict(x_test)
else:
preds = model.predict(x_test)
feature_importances = pd.DataFrame(model.feature_importances_,
index = X_XGB.columns,
columns=['importance'])
sum_all=feature_importances.values
ls.append(sum_all)
###### Time to consolidate the predictions on test data #########
if modeltype == 'Regression':
if not isinstance(X_XGB_test, str):
if log_y:
pred_xgb=np.exp(model.predict(X_XGB_test_enc[columns]))
else:
pred_xgb=model.predict(X_XGB_test_enc[columns])
pred_xgbs = np.vstack([pred_xgbs, pred_xgb])
pred_xgbs = pred_xgbs.mean(axis=0)
#### preds here is for only one fold and we are comparing it to original y_test ####
score = np.sqrt(mean_squared_error(y_test, preds))
print('RMSE score in fold %d = %s' %(folds+1, score))
else:
if not isinstance(X_XGB_test, str):
pred_xgb=model.predict(X_XGB_test_enc[columns])
pred_proba = model.predict_proba(X_XGB_test_enc[columns])
if folds == 0:
pred_xgbs = copy.deepcopy(pred_xgb)
pred_probas = copy.deepcopy(pred_proba)
else:
pred_xgbs = np.vstack([pred_xgbs, pred_xgb])
pred_xgbs = stats.mode(pred_xgbs, axis=0)[0][0]
pred_probas = np.mean( np.array([ pred_probas, pred_proba ]), axis=0 )
#### preds here is for only one fold and we are comparing it to original y_test ####
score = balanced_accuracy_score(y_test, preds)
print('AUC score in fold %d = %0.1f%%' %(folds+1, score*100))
scores.append(score)
print("\nCross-validated average scores are: ", np.sum(scores)/len(scores))
############# F I N A L T R A I N I N G ###################################
print('Training model on full train dataset...')
start_time1 = time.time()
model = gbm_model.best_estimator_
model.fit(X_XGB_train_enc, Y_XGB)
if not isinstance(X_XGB_test, str):
pred_xgbs = model.predict(X_XGB_test_enc)
if modeltype != 'Regression':
pred_probas = model.predict_proba(X_XGB_test_enc)
else:
pred_probas = np.array([])
else:
pred_xgbs = np.array([])
pred_probas = np.array([])
print(' Time taken for training LightGBM (in minutes) = %0.1f' %((time.time()-start_time1)/60))
if verbose:
plot_importances_XGB(train_set=X_XGB, labels=Y_XGB, ls=ls, y_preds=pred_xgbs,
modeltype=modeltype, top_num='all')
print('Returning the following:')
if modeltype == 'Regression':
if not isinstance(X_XGB_test, str):
print(' final predictions', pred_xgbs[:10])
else:
print(' no X_test given. Returning empty array.')
print(' Model = %s' %model)
return (pred_xgbs, model)
else:
if not isinstance(X_XGB_test, str):
print(' final predictions (may need to be transformed to original labels)', pred_xgbs[:10])
print(' predicted probabilities', pred_probas[:1])
else:
print(' no X_test given. Returning empty array.')
print(' Model = %s' %model)
return (pred_xgbs, pred_probas, model)
########################################################################################
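When `log_y` is set, the function above fits the estimator on `np.log` of the target and applies `np.exp` to every prediction. A minimal sketch of that round trip, using a hypothetical `MeanModel` stand-in instead of the actual LightGBM estimator:

```python
import numpy as np

# Hypothetical stand-in for the LightGBM estimator: it just memorizes the
# mean of whatever target it was fit on.
class MeanModel:
    def fit(self, X, y):
        self.mean_ = float(np.mean(y))
        return self

    def predict(self, X):
        return np.full(len(X), self.mean_)

y = np.array([1.0, 10.0, 100.0])       # skewed target
X = np.zeros((3, 1))                   # features are irrelevant here

model = MeanModel().fit(X, np.log(y))  # train in log space (the log_y=True path)
preds = np.exp(model.predict(X))       # exponentiate back to the original scale
# preds is the geometric mean of y (10.0), not the arithmetic mean (37.0)
```

Averaging in log space and exponentiating back yields the geometric mean on the original scale, which is why the transform helps on skewed regression targets.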
def plot_importances_XGB(train_set, labels, ls, y_preds, modeltype, top_num='all'):
    add_items = 0
    for item in ls:
        add_items += item
    if isinstance(top_num, str):
        feat_imp = pd.DataFrame(add_items / len(ls), index=train_set.columns,
                                columns=["importance"]).sort_values('importance', ascending=False)
    else:
        ## this limits the number of items to the top_num items
        feat_imp = pd.DataFrame(add_items / len(ls), index=train_set.columns[:top_num],
                                columns=["importance"]).sort_values('importance', ascending=False)
    feat_imp2 = feat_imp[feat_imp > 0.00005]
    ##### Now plot the feature importances #################
    imp_columns = []
    for item in pd.DataFrame(feat_imp2).reset_index()["index"].tolist():
        fcols = re.sub("[(),]", "", str(item))
        try:
            ## integer column names come back as strings, so convert them back
            columns = int(re.sub("['']", "", fcols))
        except ValueError:
            columns = re.sub("['']", "", fcols)
        imp_columns.append(columns)
    fig = plt.figure(figsize=(15, 8))
    ax1 = plt.subplot(2, 2, 1)
    if isinstance(top_num, str):
        feat_imp2[:].plot(kind='barh', ax=ax1, title='Feature importances of model on test data')
    else:
        feat_imp2[:top_num].plot(kind='barh', ax=ax1, title='Feature importances of model on test data')
    ax2 = plt.subplot(2, 2, 2)
    if modeltype == 'Regression':
        pd.Series(y_preds).plot(ax=ax2, color='b', title='Model predictions on test data')
    else:
        pd.Series(y_preds).hist(ax=ax2, color='b', label='Model predictions histogram on test data')
##################################################################################
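The fold-consolidation logic in the training loop above averages stacked predictions for regression and takes a `stats.mode` majority vote for classification. For integer class labels, the mode step is equivalent to a per-column `bincount` vote; a self-contained sketch with toy fold predictions (made-up values, not output of the actual model):

```python
import numpy as np

# Toy per-fold predictions for 4 test rows across 3 folds (binary labels).
fold_preds = np.array([[0, 1, 1, 0],
                       [0, 1, 0, 0],
                       [1, 1, 1, 0]])

# Regression-style consolidation: row-wise mean across folds.
mean_preds = fold_preds.mean(axis=0)

# Classification-style consolidation: per-column majority vote, which is
# what stats.mode(..., axis=0)[0][0] computes for integer labels.
vote_preds = np.array([np.bincount(col).argmax() for col in fold_preds.T])
```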
feaf037fc7a9e40b9ce1a615735eaaf3916d64a4 | 630 | py | Python | src/semu.robotics.ros2_bridge/semu/robotics/ros2_bridge/packages/control_msgs/msg/__init__.py | Toni-SM/omni.add_on.ros2_bridge | 9c5e47153d51da3a401d7f4ce679b773b32beffc | [
"MIT"
] | null | null | null | src/semu.robotics.ros2_bridge/semu/robotics/ros2_bridge/packages/control_msgs/msg/__init__.py | Toni-SM/omni.add_on.ros2_bridge | 9c5e47153d51da3a401d7f4ce679b773b32beffc | [
"MIT"
] | null | null | null | src/semu.robotics.ros2_bridge/semu/robotics/ros2_bridge/packages/control_msgs/msg/__init__.py | Toni-SM/omni.add_on.ros2_bridge | 9c5e47153d51da3a401d7f4ce679b773b32beffc | [
"MIT"
] | null | null | null |
from control_msgs.msg._dynamic_joint_state import DynamicJointState  # noqa: F401
from control_msgs.msg._gripper_command import GripperCommand # noqa: F401
from control_msgs.msg._interface_value import InterfaceValue # noqa: F401
from control_msgs.msg._joint_controller_state import JointControllerState # noqa: F401
from control_msgs.msg._joint_jog import JointJog # noqa: F401
from control_msgs.msg._joint_tolerance import JointTolerance # noqa: F401
from control_msgs.msg._joint_trajectory_controller_state import JointTrajectoryControllerState # noqa: F401
from control_msgs.msg._pid_state import PidState # noqa: F401
22a1dbc90ea832c65ae734b380194d8ea06ee41c | 121 | py | Python | django_api/storefront2/playground/views.py | SyedArsalanAmin/webdev | 28fd7fc6c865588604c9e965a4416c7e0eb4a1c8 | [
"MIT"
] | null | null | null | django_api/storefront2/playground/views.py | SyedArsalanAmin/webdev | 28fd7fc6c865588604c9e965a4416c7e0eb4a1c8 | [
"MIT"
] | null | null | null | django_api/storefront2/playground/views.py | SyedArsalanAmin/webdev | 28fd7fc6c865588604c9e965a4416c7e0eb4a1c8 | [
"MIT"
] | null | null | null |
from django.shortcuts import render


def say_hello(request):
    return render(request, 'hello.html', {'name': 'Mosh'})
22da151cea22cf3cedf70e146e62673b9e520f17 | 2,374 | py | Python | code/python/src/vm/compare.py | ShakeM/luago-book | ea7ceaea677454be714235b2982343684044922e | [
"MIT"
] | 723 | 2018-01-08T04:55:42.000Z | 2022-03-27T14:30:53.000Z | code/python/src/vm/compare.py | KEVINYZY/luago-book | 88e64dfeb37b75a7fed147c51a199c41ef9f7bc4 | [
"MIT"
] | 23 | 2018-04-03T06:05:02.000Z | 2021-07-06T00:58:31.000Z | code/python/src/vm/compare.py | KEVINYZY/luago-book | 88e64dfeb37b75a7fed147c51a199c41ef9f7bc4 | [
"MIT"
] | 179 | 2018-01-08T08:16:32.000Z | 2022-03-20T02:49:44.000Z |
from vm.lua_table import LuaTable
from vm.lua_value import LuaValue


class Compare:
    @staticmethod
    def eq(a, b, ls):
        if a is None:
            return b is None
        if isinstance(a, bool) or isinstance(a, str):
            return a == b
        if isinstance(a, int):
            if isinstance(b, int):
                return a == b
            elif isinstance(b, float):
                return float(a) == b
            else:
                return False
        if isinstance(a, float):
            if isinstance(b, float):
                return a == b
            elif isinstance(b, int):
                return a == float(b)
            else:
                return False
        if isinstance(a, LuaTable):
            if isinstance(b, LuaTable) and a != b and ls:
                mm = ls.get_metamethod(a, b, '__eq')
                if mm:
                    return LuaValue.to_boolean(ls.call_metamethod(a, mm, b))
            return a == b

    @staticmethod
    def lt(a, b, ls):
        if isinstance(a, str) and isinstance(b, str):
            return a < b
        if isinstance(a, int):
            if isinstance(b, int):
                return a < b
            elif isinstance(b, float):
                return float(a) < b
        if isinstance(a, float):
            if isinstance(b, float):
                return a < b
            elif isinstance(b, int):
                return a < float(b)
        mm = ls.get_metamethod(a, b, '__lt')
        if mm:
            return LuaValue.to_boolean(ls.call_metamethod(a, mm, b))
        raise Exception('Comparison Error')

    @staticmethod
    def le(a, b, ls):
        if isinstance(a, str) and isinstance(b, str):
            return a <= b
        if isinstance(a, int):
            if isinstance(b, int):
                return a <= b
            elif isinstance(b, float):
                return float(a) <= b
        if isinstance(a, float):
            if isinstance(b, float):
                return a <= b
            elif isinstance(b, int):
                return a <= float(b)
        mm = ls.get_metamethod(a, b, '__le')
        if mm:
            return LuaValue.to_boolean(ls.call_metamethod(a, mm, b))
        mm = ls.get_metamethod(b, a, '__lt')
        if mm:
            # Lua 5.3 fallback: a <= b is defined as not (b < a)
            return not LuaValue.to_boolean(ls.call_metamethod(b, mm, a))
        raise Exception('Comparison Error')
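The `int`/`float` branches in `eq`, `lt`, and `le` above all apply the same promotion rule: the integer operand is converted with `float()` before comparing, mirroring Lua's number semantics. Stripped of the metamethod machinery, that numeric core is just this standalone sketch (not part of the `Compare` class itself):

```python
def lua_num_lt(a, b):
    # Mixed int/float comparison promotes the int operand to float first,
    # exactly as the eq/lt/le branches do (simplified sketch).
    if isinstance(a, int) and isinstance(b, float):
        return float(a) < b
    if isinstance(a, float) and isinstance(b, int):
        return a < float(b)
    return a < b
```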
22fbea9ca848e268863cb6cf2f604ba4eb66a78e | 30,950 | py | Python | graphql/parsetab.py | ivelum/graphql.py | 9859dff2d1adb1a738c5bcfa31ff8cef5b6caad1 | [
"MIT"
] | 58 | 2015-08-21T15:35:50.000Z | 2022-03-05T17:42:25.000Z | graphql/parsetab.py | ivelum/graphql.py | 9859dff2d1adb1a738c5bcfa31ff8cef5b6caad1 | [
"MIT"
] | 14 | 2015-09-21T17:07:23.000Z | 2021-01-14T10:30:13.000Z | graphql/parsetab.py | ivelum/graphql.py | 9859dff2d1adb1a738c5bcfa31ff8cef5b6caad1 | [
"MIT"
] | 10 | 2015-09-02T17:54:34.000Z | 2021-12-09T07:48:50.000Z |
# /Users/dvs/Dropbox/Code/graphql-py/graphql/parsetab.py
# This file is automatically generated. Do not edit.
_tabversion = '3.5'
_lr_method = 'LALR'
_lr_signature = 'E6D6D5E915094EAB3A68E387CED054AB'
_lr_action_items = {'BRACE_L':([0,8,10,11,12,22,24,25,26,27,28,29,30,31,32,33,34,35,38,39,53,54,55,61,62,64,70,71,73,74,76,83,84,85,86,90,93,94,95,99,101,102,106,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,129,138,139,140,141,143,147,151,152,153,155,157,158,159,160,161,162,163,164,165,169,170,171,173,176,177,178,180,],[5,5,-18,-19,-20,5,-79,-74,-75,-76,-77,-78,-80,-81,-82,5,5,5,-58,-60,5,5,5,5,5,5,-59,-62,5,5,5,5,-57,-131,5,-67,-61,5,5,-63,130,5,5,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,130,-72,130,-111,-113,-119,166,-110,-112,-118,130,-93,-94,-95,-96,-97,-98,-99,-100,166,166,-115,-117,-124,-114,-116,-123,166,]),'FRAGMENT':([0,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,36,38,39,40,41,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,63,65,66,69,70,71,72,73,74,75,76,77,78,79,80,82,87,88,89,93,95,96,97,98,99,100,101,103,104,105,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,137,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179,180,181,],[9,9,9,-7,26,-8,-9,26,41,-18,-19,-20,-6,9,-5,26,-23,-24,-25,-26,26,-42,41,-79,-74,-75,-76,-77,-78,-80,-81,-82,-17,-58,-60,26,-50,-49,-51,-52,-53,-54,-55,-56,-4,-21,-22,-38,-39,-40,-41,-83,26,-44,26,-13,-15,-16,26,-59,-62,26,-37,-36,-35,-34,-33,-32,26,-65,-43,-11,-12,-14,-61,-31,-30,-29,-28,-63,-64,124,-48,-10,26,-46,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,26,-103,-101,-102,-104,-105,-106,-107,-108,-109,124,26,-47,26,-45,-72,124,-111,-113,26,-119,-121,124,-110,-112,-118,-120,124,-93,-94,-95,-96,-97,-98,-99,-100,124,26,-122,124,-115,-117,26,-124,-126,-114,-116,-123,-125,124,-127,]),'QUERY':([0,2,4,5,6,7,8,9,10,11,12,13,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,36,38,39,40,41,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,63,65,66,69,70,71,72,73,74,75,76,77,78,79,80,82
,87,88,89,93,95,96,97,98,99,100,101,103,104,105,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,137,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179,180,181,],[10,10,-7,27,-8,-9,27,44,-18,-19,-20,-6,27,-23,-24,-25,-26,27,-42,44,-79,-74,-75,-76,-77,-78,-80,-81,-82,-17,-58,-60,27,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,27,-44,27,-13,-15,-16,27,-59,-62,27,-37,-36,-35,-34,-33,-32,27,-65,-43,-11,-12,-14,-61,-31,-30,-29,-28,-63,-64,125,-48,-10,27,-46,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,27,-103,-101,-102,-104,-105,-106,-107,-108,-109,125,27,-47,27,-45,-72,125,-111,-113,27,-119,-121,125,-110,-112,-118,-120,125,-93,-94,-95,-96,-97,-98,-99,-100,125,27,-122,125,-115,-117,27,-124,-126,-114,-116,-123,-125,125,-127,]),'MUTATION':([0,2,4,5,6,7,8,9,10,11,12,13,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,36,38,39,40,41,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,63,65,66,69,70,71,72,73,74,75,76,77,78,79,80,82,87,88,89,93,95,96,97,98,99,100,101,103,104,105,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,137,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179,180,181,],[11,11,-7,28,-8,-9,28,45,-18,-19,-20,-6,28,-23,-24,-25,-26,28,-42,45,-79,-74,-75,-76,-77,-78,-80,-81,-82,-17,-58,-60,28,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,28,-44,28,-13,-15,-16,28,-59,-62,28,-37,-36,-35,-34,-33,-32,28,-65,-43,-11,-12,-14,-61,-31,-30,-29,-28,-63,-64,126,-48,-10,28,-46,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,28,-103,-101,-102,-104,-105,-106,-107,-108,-109,126,28,-47,28,-45,-72,126,-111,-113,28,-119,-121,126,-110,-112,-118,-120,126,-93,-94,-95,-96,-97,-98,-99,-100,126,28,-122,126,-115,-117,28,-124,-126,-114,-116,-123,-125,126,-127,]),'SUBSCRIPTION':([0,2,4,5,
6,7,8,9,10,11,12,13,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,36,38,39,40,41,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,63,65,66,69,70,71,72,73,74,75,76,77,78,79,80,82,87,88,89,93,95,96,97,98,99,100,101,103,104,105,107,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,137,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179,180,181,],[12,12,-7,29,-8,-9,29,46,-18,-19,-20,-6,29,-23,-24,-25,-26,29,-42,46,-79,-74,-75,-76,-77,-78,-80,-81,-82,-17,-58,-60,29,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,29,-44,29,-13,-15,-16,29,-59,-62,29,-37,-36,-35,-34,-33,-32,29,-65,-43,-11,-12,-14,-61,-31,-30,-29,-28,-63,-64,127,-48,-10,29,-46,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,29,-103,-101,-102,-104,-105,-106,-107,-108,-109,127,29,-47,29,-45,-72,127,-111,-113,29,-119,-121,127,-110,-112,-118,-120,127,-93,-94,-95,-96,-97,-98,-99,-100,127,29,-122,127,-115,-117,29,-124,-126,-114,-116,-123,-125,127,-127,]),'$end':([1,2,3,4,6,7,13,14,15,36,50,51,63,65,66,87,88,89,104,107,137,],[0,-1,-2,-7,-8,-9,-6,-3,-5,-17,-4,-21,-13,-15,-16,-11,-12,-14,-10,-46,-45,]),'SPREAD':([5,16,17,18,19,20,22,24,25,26,27,28,29,30,31,32,38,39,41,43,44,45,46,47,48,49,51,52,53,54,55,56,59,70,71,73,74,75,76,77,78,82,93,95,96,97,98,99,103,108,131,],[23,23,-23,-24,-25,-26,-42,-79,-74,-75,-76,-77,-78,-80,-81,-82,-58,-60,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-44,-59,-62,-37,-36,-35,-34,-33,-32,-43,-61,-31,-30,-29,-28,-63,-48,-27,-47,]),'NAME':([5,8,9,10,11,12,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,38,39,40,41,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,69,70,71,72,73,74,75,76,77,78,79,80,82,93,95,96,97,98,99,100,101,103,105,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,16
9,170,171,172,173,174,176,177,178,179,180,181,],[25,25,43,-18,-19,-20,25,-23,-24,-25,-26,25,-42,43,-79,-74,-75,-76,-77,-78,-80,-81,-82,-58,-60,25,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,25,-44,25,25,-59,-62,25,-37,-36,-35,-34,-33,-32,25,-65,-43,-61,-31,-30,-29,-28,-63,-64,123,-48,25,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,25,-103,-101,-102,-104,-105,-106,-107,-108,-109,123,25,-47,25,-72,123,-111,-113,25,-119,-121,123,-110,-112,-118,-120,123,-93,-94,-95,-96,-97,-98,-99,-100,123,25,-122,123,-115,-117,25,-124,-126,-114,-116,-123,-125,123,-127,]),'ON':([5,8,10,11,12,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,38,39,40,41,42,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,69,70,71,72,73,74,75,76,77,78,79,80,82,93,95,96,97,98,99,100,101,103,105,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179,180,181,],[24,24,-18,-19,-20,24,-23,-24,-25,-26,24,-42,60,-79,-74,-75,-76,-77,-78,-80,-81,-82,-58,-60,24,-50,72,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,24,-44,24,24,-59,-62,24,-37,-36,-35,-34,-33,-32,24,-65,-43,-61,-31,-30,-29,-28,-63,-64,128,-48,24,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,24,-103,-101,-102,-104,-105,-106,-107,-108,-109,128,24,-47,24,-72,128,-111,-113,24,-119,-121,128,-110,-112,-118,-120,128,-93,-94,-95,-96,-97,-98,-99,-100,128,24,-122,128,-115,-117,24,-124,-126,-114,-116,-123,-125,128,-127,]),'TRUE':([5,8,9,10,11,12,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,38,39,40,41,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,69,70,71,72,73,74,75,76,77,78,79,80,82,93,95,96,97,98,99,100,101,103,105,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179
,180,181,],[30,30,47,-18,-19,-20,30,-23,-24,-25,-26,30,-42,47,-79,-74,-75,-76,-77,-78,-80,-81,-82,-58,-60,30,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,30,-44,30,30,-59,-62,30,-37,-36,-35,-34,-33,-32,30,-65,-43,-61,-31,-30,-29,-28,-63,-64,121,-48,30,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,30,-103,-101,-102,-104,-105,-106,-107,-108,-109,121,30,-47,30,-72,121,-111,-113,30,-119,-121,121,-110,-112,-118,-120,121,-93,-94,-95,-96,-97,-98,-99,-100,121,30,-122,121,-115,-117,30,-124,-126,-114,-116,-123,-125,121,-127,]),'FALSE':([5,8,9,10,11,12,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,38,39,40,41,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,69,70,71,72,73,74,75,76,77,78,79,80,82,93,95,96,97,98,99,100,101,103,105,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179,180,181,],[31,31,48,-18,-19,-20,31,-23,-24,-25,-26,31,-42,48,-79,-74,-75,-76,-77,-78,-80,-81,-82,-58,-60,31,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,31,-44,31,31,-59,-62,31,-37,-36,-35,-34,-33,-32,31,-65,-43,-61,-31,-30,-29,-28,-63,-64,122,-48,31,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,31,-103,-101,-102,-104,-105,-106,-107,-108,-109,122,31,-47,31,-72,122,-111,-113,31,-119,-121,122,-110,-112,-118,-120,122,-93,-94,-95,-96,-97,-98,-99,-100,122,31,-122,122,-115,-117,31,-124,-126,-114,-116,-123,-125,122,-127,]),'NULL':([5,8,9,10,11,12,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,38,39,40,41,43,44,45,46,47,48,49,51,52,53,54,55,56,57,58,59,60,69,70,71,72,73,74,75,76,77,78,79,80,82,93,95,96,97,98,99,100,101,103,105,108,109,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129,130,131,136,138,139,140,141,142,143,144,147,151,152,153,154,155,157,158,159,160,161,162,163,164,165,166,168,169,170,171,172,173,174,176,177,178,179,180,181,],[32,32,49,-18,-19,-20,32
,-23,-24,-25,-26,32,-42,49,-79,-74,-75,-76,-77,-78,-80,-81,-82,-58,-60,32,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-83,32,-44,32,32,-59,-62,32,-37,-36,-35,-34,-33,-32,32,-65,-43,-61,-31,-30,-29,-28,-63,-64,120,-48,32,-27,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,32,-103,-101,-102,-104,-105,-106,-107,-108,-109,120,32,-47,32,-72,120,-111,-113,32,-119,-121,120,-110,-112,-118,-120,120,-93,-94,-95,-96,-97,-98,-99,-100,120,32,-122,120,-115,-117,32,-124,-126,-114,-116,-123,-125,120,-127,]),'PAREN_L':([8,10,11,12,22,24,25,26,27,28,29,30,31,32,33,53,71,],[37,-18,-19,-20,58,-79,-74,-75,-76,-77,-78,-80,-81,-82,37,58,58,]),'AT':([8,10,11,12,22,24,25,26,27,28,29,30,31,32,33,34,38,39,41,43,44,45,46,47,48,49,53,54,59,61,70,71,73,83,84,85,90,93,94,99,],[40,-18,-19,-20,40,-79,-74,-75,-76,-77,-78,-80,-81,-82,40,40,40,-60,-50,-49,-51,-52,-53,-54,-55,-56,40,40,40,40,-59,-62,40,40,-57,-131,-67,-61,40,-63,]),'BRACE_R':([16,17,18,19,20,22,24,25,26,27,28,29,30,31,32,38,39,41,43,44,45,46,47,48,49,51,52,53,54,55,56,59,70,71,73,74,75,76,77,78,82,93,95,96,97,98,99,103,108,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,130,131,138,140,142,143,144,151,153,154,157,158,159,160,161,162,163,164,166,168,170,172,173,174,176,178,179,181,],[51,-23,-24,-25,-26,-42,-79,-74,-75,-76,-77,-78,-80,-81,-82,-58,-60,-50,-49,-51,-52,-53,-54,-55,-56,-21,-22,-38,-39,-40,-41,-44,-59,-62,-37,-36,-35,-34,-33,-32,-43,-61,-31,-30,-29,-28,-63,-48,-27,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,143,-47,-72,-111,153,-119,-121,-110,-118,-120,-93,-94,-95,-96,-97,-98,-99,-100,173,-122,-115,178,-124,-126,-114,-123,-125,-127,]),'COLON':([22,24,25,26,27,28,29,30,31,32,81,92,145,175,],[57,-79,-74,-75,-76,-77,-78,-80,-81,-82,101,105,155,180,]),'BANG':([24,25,26,27,28,29,30,31,32,85,133,134,167,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,-131,148,149,-132,]),'EQUALS':([24,25,26,27,28,29,30,31,32,85,132,133,134,135,148,149,167,],[-79,-74,-75,-76,-77,-78,-80,-81,-82
,-131,147,-128,-129,-130,-133,-134,-132,]),'PAREN_R':([24,25,26,27,28,29,30,31,32,67,68,79,80,85,91,100,109,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,132,133,134,135,138,140,143,146,148,149,151,153,156,157,158,159,160,161,162,163,164,167,170,173,176,178,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,90,-69,99,-65,-131,-68,-64,-66,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,-71,-128,-129,-130,-72,-111,-119,-70,-133,-134,-110,-118,-73,-93,-94,-95,-96,-97,-98,-99,-100,-132,-115,-124,-114,-123,]),'DOLLAR':([24,25,26,27,28,29,30,31,32,37,67,68,85,91,101,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,129,132,133,134,135,138,139,140,141,143,146,148,149,151,152,153,155,156,157,158,159,160,161,162,163,164,167,170,173,176,178,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,69,69,-69,-131,-68,119,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,119,-71,-128,-129,-130,-72,119,-111,-113,-119,-70,-133,-134,-110,-112,-118,119,-73,-93,-94,-95,-96,-97,-98,-99,-100,-132,-115,-124,-114,-123,]),'BRACKET_R':([24,25,26,27,28,29,30,31,32,85,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,129,133,134,135,138,139,140,141,143,148,149,150,151,152,153,157,158,159,160,161,162,163,164,165,167,169,170,171,173,176,177,178,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,-131,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,140,-128,-129,-130,-72,151,-111,-113,-119,-133,-134,167,-110,-112,-118,-93,-94,-95,-96,-97,-98,-99,-100,170,-132,176,-115,-117,-124,-114,-116,-123,]),'INT_VALUE':([24,25,26,27,28,29,30,31,32,101,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,129,138,139,140,141,143,147,151,152,153,155,157,158,159,160,161,162,163,164,165,169,170,171,173,176,177,178,180,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,111,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,111,-72,111,-111,-113,-119,157,
-110,-112,-118,111,-93,-94,-95,-96,-97,-98,-99,-100,157,157,-115,-117,-124,-114,-116,-123,157,]),'FLOAT_VALUE':([24,25,26,27,28,29,30,31,32,101,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,129,138,139,140,141,143,147,151,152,153,155,157,158,159,160,161,162,163,164,165,169,170,171,173,176,177,178,180,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,112,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,112,-72,112,-111,-113,-119,158,-110,-112,-118,112,-93,-94,-95,-96,-97,-98,-99,-100,158,158,-115,-117,-124,-114,-116,-123,158,]),'STRING_VALUE':([24,25,26,27,28,29,30,31,32,101,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,129,138,139,140,141,143,147,151,152,153,155,157,158,159,160,161,162,163,164,165,169,170,171,173,176,177,178,180,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,113,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,113,-72,113,-111,-113,-119,159,-110,-112,-118,113,-93,-94,-95,-96,-97,-98,-99,-100,159,159,-115,-117,-124,-114,-116,-123,159,]),'BRACKET_L':([24,25,26,27,28,29,30,31,32,101,105,110,111,112,113,114,115,116,117,118,120,121,122,123,124,125,126,127,128,129,136,138,139,140,141,143,147,151,152,153,155,157,158,159,160,161,162,163,164,165,169,170,171,173,176,177,178,180,],[-79,-74,-75,-76,-77,-78,-80,-81,-82,129,136,-84,-85,-86,-87,-88,-89,-90,-91,-92,-103,-101,-102,-104,-105,-106,-107,-108,-109,129,136,-72,129,-111,-113,-119,165,-110,-112,-118,129,-93,-94,-95,-96,-97,-98,-99,-100,165,165,-115,-117,-124,-114,-116,-123,165,]),}
_lr_action = {}
for _k, _v in _lr_action_items.items():
   for _x, _y in zip(_v[0], _v[1]):
      if not _x in _lr_action:  _lr_action[_x] = {}
      _lr_action[_x][_k] = _y
del _lr_action_items
_lr_goto_items = {'document':([0,],[1,]),'definition_list':([0,],[2,]),'selection_set':([0,8,22,33,34,35,53,54,55,61,62,64,73,74,76,83,86,94,95,102,106,],[3,36,56,63,65,66,75,77,78,87,88,89,96,97,98,103,104,107,108,131,137,]),'definition':([0,2,],[4,13,]),'operation_definition':([0,2,],[6,6,]),'fragment_definition':([0,2,3,14,],[7,7,15,50,]),'operation_type':([0,2,],[8,8,]),'fragment_list':([3,],[14,]),'selection_list':([5,],[16,]),'selection':([5,16,],[17,52,]),'field':([5,16,],[18,18,]),'fragment_spread':([5,16,],[19,19,]),'inline_fragment':([5,16,],[20,20,]),'alias':([5,16,],[21,21,]),'name':([5,8,16,21,40,58,60,69,72,79,105,119,130,136,142,166,172,],[22,33,22,53,71,81,85,92,85,81,85,138,145,85,145,175,175,]),'variable_definitions':([8,33,],[34,61,]),'directives':([8,22,33,34,53,54,59,61,73,83,94,],[35,55,62,64,74,76,82,86,95,102,106,]),'directive_list':([8,22,33,34,53,54,59,61,73,83,94,],[38,38,38,38,38,38,38,38,38,38,38,]),'directive':([8,22,33,34,38,53,54,59,61,73,83,94,],[39,39,39,39,70,39,39,39,39,39,39,39,]),'fragment_name':([9,23,],[42,59,]),'arguments':([22,53,71,],[54,73,93,]),'variable_definition_list':([37,],[67,]),'variable_definition':([37,67,],[68,91,]),'argument_list':([58,],[79,]),'argument':([58,79,],[80,100,]),'type_condition':([60,72,],[83,94,]),'named_type':([60,72,105,136,],[84,84,133,133,]),'value':([101,129,139,155,],[109,141,152,168,]),'variable':([101,129,139,155,],[110,110,110,110,]),'null_value':([101,129,139,147,155,165,169,180,],[114,114,114,160,114,160,160,160,]),'boolean_value':([101,129,139,147,155,165,169,180,],[115,115,115,161,115,161,161,161,]),'enum_value':([101,129,139,147,155,165,169,180,],[116,116,116,162,116,162,162,162,]),'list_value':([101,129,139,155,],[117,117,117,117,]),'object_value':([101,129,139,155,],[118,118,118,118,]),'type':([105,136,],[132,150,]),'list_type':([105,136,],[134,134,]),'non_null_type':([105,136,],[135,135,]),'value_list':([129,],[139,]),'object_field_list':([130,],[142,]),'object_field':([130,142,]
,[144,154,]),'default_value':([132,],[146,]),'const_value':([147,165,169,180,],[156,171,177,181,]),'const_list_value':([147,165,169,180,],[163,163,163,163,]),'const_object_value':([147,165,169,180,],[164,164,164,164,]),'const_value_list':([165,],[169,]),'const_object_field_list':([166,],[172,]),'const_object_field':([166,172,],[174,179,]),}
_lr_goto = {}
for _k, _v in _lr_goto_items.items():
   for _x, _y in zip(_v[0], _v[1]):
      if not _x in _lr_goto: _lr_goto[_x] = {}
      _lr_goto[_x][_k] = _y
del _lr_goto_items
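# The two loops above expand PLY's compressed layout -- each token maps to a
# pair of parallel lists (states, values) -- into table[state][token]
# dictionaries. A toy illustration of the same unpacking, with made-up entries
# rather than the real GraphQL grammar tables (not part of the generated file):

```python
# Compressed layout, as in _lr_action_items above: token -> (states, actions).
# Toy entries; positive action = shift to that state, negative = reduce by
# that rule number (PLY's convention).
items = {'NAME': ([0, 2], [5, -7]), 'BRACE_L': ([0], [3])}

action = {}
for tok, (states, acts) in items.items():
    for st, act in zip(states, acts):
        action.setdefault(st, {})[tok] = act

# The parser then looks up action[current_state][lookahead_token].
```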
_lr_productions = [
("S' -> document","S'",1,None,None,None),
('document -> definition_list','document',1,'p_document','parser.py',42),
('document -> selection_set','document',1,'p_document_shorthand','parser.py',48),
('document -> selection_set fragment_list','document',2,'p_document_shorthand_with_fragments','parser.py',54),
('fragment_list -> fragment_list fragment_definition','fragment_list',2,'p_fragment_list','parser.py',60),
('fragment_list -> fragment_definition','fragment_list',1,'p_fragment_list_single','parser.py',66),
('definition_list -> definition_list definition','definition_list',2,'p_definition_list','parser.py',72),
('definition_list -> definition','definition_list',1,'p_definition_list_single','parser.py',78),
('definition -> operation_definition','definition',1,'p_definition','parser.py',84),
('definition -> fragment_definition','definition',1,'p_definition','parser.py',85),
('operation_definition -> operation_type name variable_definitions directives selection_set','operation_definition',5,'p_operation_definition1','parser.py',99),
('operation_definition -> operation_type name variable_definitions selection_set','operation_definition',4,'p_operation_definition2','parser.py',110),
('operation_definition -> operation_type name directives selection_set','operation_definition',4,'p_operation_definition3','parser.py',120),
('operation_definition -> operation_type name selection_set','operation_definition',3,'p_operation_definition4','parser.py',130),
('operation_definition -> operation_type variable_definitions directives selection_set','operation_definition',4,'p_operation_definition5','parser.py',136),
('operation_definition -> operation_type variable_definitions selection_set','operation_definition',3,'p_operation_definition6','parser.py',146),
('operation_definition -> operation_type directives selection_set','operation_definition',3,'p_operation_definition7','parser.py',155),
('operation_definition -> operation_type selection_set','operation_definition',2,'p_operation_definition8','parser.py',164),
('operation_type -> QUERY','operation_type',1,'p_operation_type','parser.py',170),
('operation_type -> MUTATION','operation_type',1,'p_operation_type','parser.py',171),
('operation_type -> SUBSCRIPTION','operation_type',1,'p_operation_type','parser.py',172),
('selection_set -> BRACE_L selection_list BRACE_R','selection_set',3,'p_selection_set','parser.py',178),
('selection_list -> selection_list selection','selection_list',2,'p_selection_list','parser.py',184),
('selection_list -> selection','selection_list',1,'p_selection_list_single','parser.py',190),
('selection -> field','selection',1,'p_selection','parser.py',196),
('selection -> fragment_spread','selection',1,'p_selection','parser.py',197),
('selection -> inline_fragment','selection',1,'p_selection','parser.py',198),
('field -> alias name arguments directives selection_set','field',5,'p_field_all','parser.py',204),
('field -> name arguments directives selection_set','field',4,'p_field_optional1_1','parser.py',211),
('field -> alias name directives selection_set','field',4,'p_field_optional1_2','parser.py',218),
('field -> alias name arguments selection_set','field',4,'p_field_optional1_3','parser.py',224),
('field -> alias name arguments directives','field',4,'p_field_optional1_4','parser.py',230),
('field -> name directives selection_set','field',3,'p_field_optional2_1','parser.py',236),
('field -> name arguments selection_set','field',3,'p_field_optional2_2','parser.py',242),
('field -> name arguments directives','field',3,'p_field_optional2_3','parser.py',248),
('field -> alias name selection_set','field',3,'p_field_optional2_4','parser.py',254),
('field -> alias name directives','field',3,'p_field_optional2_5','parser.py',260),
('field -> alias name arguments','field',3,'p_field_optional2_6','parser.py',266),
('field -> alias name','field',2,'p_field_optional3_1','parser.py',272),
('field -> name arguments','field',2,'p_field_optional3_2','parser.py',278),
('field -> name directives','field',2,'p_field_optional3_3','parser.py',284),
('field -> name selection_set','field',2,'p_field_optional3_4','parser.py',290),
('field -> name','field',1,'p_field_optional4','parser.py',296),
('fragment_spread -> SPREAD fragment_name directives','fragment_spread',3,'p_fragment_spread1','parser.py',302),
('fragment_spread -> SPREAD fragment_name','fragment_spread',2,'p_fragment_spread2','parser.py',308),
('fragment_definition -> FRAGMENT fragment_name ON type_condition directives selection_set','fragment_definition',6,'p_fragment_definition1','parser.py',314),
('fragment_definition -> FRAGMENT fragment_name ON type_condition selection_set','fragment_definition',5,'p_fragment_definition2','parser.py',321),
('inline_fragment -> SPREAD ON type_condition directives selection_set','inline_fragment',5,'p_inline_fragment1','parser.py',328),
('inline_fragment -> SPREAD ON type_condition selection_set','inline_fragment',4,'p_inline_fragment2','parser.py',335),
('fragment_name -> NAME','fragment_name',1,'p_fragment_name','parser.py',341),
('fragment_name -> FRAGMENT','fragment_name',1,'p_fragment_name','parser.py',342),
('fragment_name -> QUERY','fragment_name',1,'p_fragment_name','parser.py',343),
('fragment_name -> MUTATION','fragment_name',1,'p_fragment_name','parser.py',344),
('fragment_name -> SUBSCRIPTION','fragment_name',1,'p_fragment_name','parser.py',345),
('fragment_name -> TRUE','fragment_name',1,'p_fragment_name','parser.py',346),
('fragment_name -> FALSE','fragment_name',1,'p_fragment_name','parser.py',347),
('fragment_name -> NULL','fragment_name',1,'p_fragment_name','parser.py',348),
('type_condition -> named_type','type_condition',1,'p_type_condition','parser.py',354),
('directives -> directive_list','directives',1,'p_directives','parser.py',360),
('directive_list -> directive_list directive','directive_list',2,'p_directive_list','parser.py',366),
('directive_list -> directive','directive_list',1,'p_directive_list_single','parser.py',372),
('directive -> AT name arguments','directive',3,'p_directive','parser.py',378),
('directive -> AT name','directive',2,'p_directive','parser.py',379),
('arguments -> PAREN_L argument_list PAREN_R','arguments',3,'p_arguments','parser.py',386),
('argument_list -> argument_list argument','argument_list',2,'p_argument_list','parser.py',392),
('argument_list -> argument','argument_list',1,'p_argument_list_single','parser.py',398),
('argument -> name COLON value','argument',3,'p_argument','parser.py',404),
('variable_definitions -> PAREN_L variable_definition_list PAREN_R','variable_definitions',3,'p_variable_definitions','parser.py',410),
('variable_definition_list -> variable_definition_list variable_definition','variable_definition_list',2,'p_variable_definition_list','parser.py',416),
('variable_definition_list -> variable_definition','variable_definition_list',1,'p_variable_definition_list_single','parser.py',422),
('variable_definition -> DOLLAR name COLON type default_value','variable_definition',5,'p_variable_definition1','parser.py',428),
('variable_definition -> DOLLAR name COLON type','variable_definition',4,'p_variable_definition2','parser.py',434),
('variable -> DOLLAR name','variable',2,'p_variable','parser.py',440),
('default_value -> EQUALS const_value','default_value',2,'p_default_value','parser.py',446),
('name -> NAME','name',1,'p_name','parser.py',452),
('name -> FRAGMENT','name',1,'p_name','parser.py',453),
('name -> QUERY','name',1,'p_name','parser.py',454),
('name -> MUTATION','name',1,'p_name','parser.py',455),
('name -> SUBSCRIPTION','name',1,'p_name','parser.py',456),
('name -> ON','name',1,'p_name','parser.py',457),
('name -> TRUE','name',1,'p_name','parser.py',458),
('name -> FALSE','name',1,'p_name','parser.py',459),
('name -> NULL','name',1,'p_name','parser.py',460),
('alias -> name COLON','alias',2,'p_alias','parser.py',466),
('value -> variable','value',1,'p_value','parser.py',472),
('value -> INT_VALUE','value',1,'p_value','parser.py',473),
('value -> FLOAT_VALUE','value',1,'p_value','parser.py',474),
('value -> STRING_VALUE','value',1,'p_value','parser.py',475),
('value -> null_value','value',1,'p_value','parser.py',476),
('value -> boolean_value','value',1,'p_value','parser.py',477),
('value -> enum_value','value',1,'p_value','parser.py',478),
('value -> list_value','value',1,'p_value','parser.py',479),
('value -> object_value','value',1,'p_value','parser.py',480),
('const_value -> INT_VALUE','const_value',1,'p_const_value','parser.py',486),
('const_value -> FLOAT_VALUE','const_value',1,'p_const_value','parser.py',487),
('const_value -> STRING_VALUE','const_value',1,'p_const_value','parser.py',488),
('const_value -> null_value','const_value',1,'p_const_value','parser.py',489),
('const_value -> boolean_value','const_value',1,'p_const_value','parser.py',490),
('const_value -> enum_value','const_value',1,'p_const_value','parser.py',491),
('const_value -> const_list_value','const_value',1,'p_const_value','parser.py',492),
('const_value -> const_object_value','const_value',1,'p_const_value','parser.py',493),
('boolean_value -> TRUE','boolean_value',1,'p_boolean_value','parser.py',499),
('boolean_value -> FALSE','boolean_value',1,'p_boolean_value','parser.py',500),
('null_value -> NULL','null_value',1,'p_null_value','parser.py',506),
('enum_value -> NAME','enum_value',1,'p_enum_value','parser.py',512),
('enum_value -> FRAGMENT','enum_value',1,'p_enum_value','parser.py',513),
('enum_value -> QUERY','enum_value',1,'p_enum_value','parser.py',514),
('enum_value -> MUTATION','enum_value',1,'p_enum_value','parser.py',515),
('enum_value -> SUBSCRIPTION','enum_value',1,'p_enum_value','parser.py',516),
('enum_value -> ON','enum_value',1,'p_enum_value','parser.py',517),
('list_value -> BRACKET_L value_list BRACKET_R','list_value',3,'p_list_value','parser.py',523),
('list_value -> BRACKET_L BRACKET_R','list_value',2,'p_list_value','parser.py',524),
('value_list -> value_list value','value_list',2,'p_value_list','parser.py',530),
('value_list -> value','value_list',1,'p_value_list_single','parser.py',536),
('const_list_value -> BRACKET_L const_value_list BRACKET_R','const_list_value',3,'p_const_list_value','parser.py',542),
('const_list_value -> BRACKET_L BRACKET_R','const_list_value',2,'p_const_list_value','parser.py',543),
('const_value_list -> const_value_list const_value','const_value_list',2,'p_const_value_list','parser.py',549),
('const_value_list -> const_value','const_value_list',1,'p_const_value_list_single','parser.py',555),
('object_value -> BRACE_L object_field_list BRACE_R','object_value',3,'p_object_value','parser.py',561),
('object_value -> BRACE_L BRACE_R','object_value',2,'p_object_value','parser.py',562),
('object_field_list -> object_field_list object_field','object_field_list',2,'p_object_field_list','parser.py',568),
('object_field_list -> object_field','object_field_list',1,'p_object_field_list_single','parser.py',576),
('object_field -> name COLON value','object_field',3,'p_object_field','parser.py',582),
('const_object_value -> BRACE_L const_object_field_list BRACE_R','const_object_value',3,'p_const_object_value','parser.py',588),
('const_object_value -> BRACE_L BRACE_R','const_object_value',2,'p_const_object_value','parser.py',589),
('const_object_field_list -> const_object_field_list const_object_field','const_object_field_list',2,'p_const_object_field_list','parser.py',595),
('const_object_field_list -> const_object_field','const_object_field_list',1,'p_const_object_field_list_single','parser.py',603),
('const_object_field -> name COLON const_value','const_object_field',3,'p_const_object_field','parser.py',609),
('type -> named_type','type',1,'p_type','parser.py',615),
('type -> list_type','type',1,'p_type','parser.py',616),
('type -> non_null_type','type',1,'p_type','parser.py',617),
('named_type -> name','named_type',1,'p_named_type','parser.py',623),
('list_type -> BRACKET_L type BRACKET_R','list_type',3,'p_list_type','parser.py',629),
('non_null_type -> named_type BANG','non_null_type',2,'p_non_null_type','parser.py',635),
('non_null_type -> list_type BANG','non_null_type',2,'p_non_null_type','parser.py',636),
]
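Each tuple in the generated `_lr_productions` table encodes one grammar production as `(rule string, lhs nonterminal, rhs length, handler name, source file, line)`. As an aside, a minimal standalone sketch (not tied to the generated module) decoding one such entry:

```python
# Decode a PLY-style production tuple into its named parts.
production = ('argument -> name COLON value', 'argument', 3, 'p_argument', 'parser.py', 404)

rule, lhs, rhs_len, handler, src_file, src_line = production
rhs = rule.split('->', 1)[1].split()  # symbols on the right-hand side

assert lhs == 'argument'
assert len(rhs) == rhs_len  # the stored length matches the right-hand side
print(f"{lhs} has {rhs_len} symbols, reduced by {handler} ({src_file}:{src_line})")
```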
# File: buffpy/models/__init__.py (ItsCalebJones/buffpy, MIT license)
from .link import Link
from .profile import Profile
from .user import User
# File: antipetros_discordbot/utility/gidsql/db_reader.py (Giddius/Antipetros_Discord_Bot, MIT license)

# region [Imports]
# * Standard Library Imports ---------------------------------------------------------------------------->
import enum
import logging
import sqlite3 as sqlite
import textwrap

# * Gid Imports ----------------------------------------------------------------------------------------->
import gidlogger as glog
import aiosqlite

# * Local Imports --------------------------------------------------------------------------------------->
from antipetros_discordbot.utility.gidsql.db_action_base import GidSqliteActionBase, AioGidSqliteActionBase

# endregion[Imports]

__updated__ = '2020-11-28 02:04:13'

# region [AppUserData]

# endregion [AppUserData]

# region [Logging]

log = logging.getLogger('gidsql')
glog.import_notification(log, __name__)

# endregion[Logging]

# region [Constants]

# endregion[Constants]


class Fetch(enum.Enum):
    All = enum.auto()
    One = enum.auto()


class GidSqliteReader(GidSqliteActionBase):
    FETCH_ALL = Fetch.All
    FETCH_ONE = Fetch.One

    def __init__(self, in_db_loc, in_pragmas=None, log_execution: bool = True):
        super().__init__(in_db_loc, in_pragmas)
        self.row_factory = None
        self.log_execution = log_execution
        glog.class_init_notification(log, self)

    def query(self, sql_phrase, variables: tuple = None, fetch: Fetch = Fetch.All):
        conn = sqlite.connect(self.db_loc, isolation_level=None, detect_types=sqlite.PARSE_DECLTYPES)
        if self.row_factory is not None:
            conn.row_factory = self.row_factory
        cursor = conn.cursor()
        _out = None  # keep the name defined even if execute() raises before the fetch
        try:
            self._execute_pragmas(cursor)
            if variables is not None:
                cursor.execute(sql_phrase, variables)
                if self.log_execution is True:
                    _log_sql_phrase = ' '.join(sql_phrase.replace('\n', ' ').split())
                    _log_args = textwrap.shorten(str(variables), width=200, placeholder='...')
                    log.debug("Queried sql phrase '%s' with args %s successfully", _log_sql_phrase, _log_args)
            else:
                cursor.execute(sql_phrase)
                if self.log_execution is True:
                    _log_sql_phrase = ' '.join(sql_phrase.replace('\n', ' ').split())
                    log.debug("Queried Script sql phrase '%s' successfully", _log_sql_phrase)
            _out = cursor.fetchone() if fetch is Fetch.One else cursor.fetchall()
        except sqlite.Error as error:
            _log_sql_phrase = ' '.join(sql_phrase.replace('\n', ' ').split())
            _log_args = textwrap.shorten(str(variables), width=200, placeholder='...')
            self._handle_error(error, _log_sql_phrase, _log_args)
        finally:
            conn.close()
        return _out

    def enable_row_factory(self, in_factory=None):
        self.row_factory = in_factory if in_factory is not None else sqlite.Row

    def disable_row_factory(self):
        self.row_factory = None


class AioGidSqliteReader(AioGidSqliteActionBase):
    FETCH_ALL = Fetch.All
    FETCH_ONE = Fetch.One

    def __init__(self, in_db_loc, in_pragmas=None, log_execution: bool = True):
        super().__init__(in_db_loc, in_pragmas)
        self.row_factory = None
        self.log_execution = log_execution
        glog.class_init_notification(log, self)

    async def enable_row_factory(self, in_factory=None):
        self.row_factory = in_factory if in_factory is not None else aiosqlite.Row

    async def disable_row_factory(self):
        self.row_factory = None

    async def query(self, sql_phrase, variables: tuple = None, fetch: Fetch = Fetch.All):
        conn = await aiosqlite.connect(self.db_loc, isolation_level=None, detect_types=sqlite.PARSE_DECLTYPES)
        if self.row_factory is not None:
            conn.row_factory = self.row_factory
        cursor = await conn.cursor()
        _out = None  # keep the name defined even if execute() raises before the fetch
        try:
            await self._execute_pragmas(cursor)
            if variables is not None:
                await cursor.execute(sql_phrase, variables)
                if self.log_execution is True:
                    _log_sql_phrase = ' '.join(sql_phrase.replace('\n', ' ').split())
                    _log_args = textwrap.shorten(str(variables), width=200, placeholder='...')
                    log.debug("Queried sql phrase '%s' with args %s successfully", _log_sql_phrase, _log_args)
            else:
                await cursor.execute(sql_phrase)
                if self.log_execution is True:
                    _log_sql_phrase = ' '.join(sql_phrase.replace('\n', ' ').split())
                    log.debug("Queried Script sql phrase '%s' successfully", _log_sql_phrase)
            _out = await cursor.fetchone() if fetch is Fetch.One else await cursor.fetchall()
        except sqlite.Error as error:
            _log_sql_phrase = ' '.join(sql_phrase.replace('\n', ' ').split())
            _log_args = textwrap.shorten(str(variables), width=200, placeholder='...')
            await self._handle_error(error, _log_sql_phrase, _log_args)
            raise error
        finally:
            await cursor.close()
            await conn.close()
        return _out
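The reader classes above wrap the standard sqlite3 connect/execute/fetch cycle. The same pattern can be exercised directly with the stdlib, which is useful for checking the `row_factory` and fetch-mode behavior in isolation; the table name and data below are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(':memory:', isolation_level=None)
conn.row_factory = sqlite3.Row  # the same factory enable_row_factory() defaults to
cursor = conn.cursor()
cursor.execute('CREATE TABLE demo (name TEXT, score INTEGER)')
cursor.executemany('INSERT INTO demo VALUES (?, ?)', [('a', 1), ('b', 2)])

# Fetch.All corresponds to fetchall(), Fetch.One to fetchone().
cursor.execute('SELECT name, score FROM demo ORDER BY score')
rows = cursor.fetchall()
first = rows[0]
print(first['name'], first['score'])  # sqlite3.Row allows access by column name
conn.close()
```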
# File: res_mods/mods/packages/xvm_integrity/python/hash_table.py (peterbartha/ImmunoMod, MIT license)

""" Generated automatically by XVM builder """
HASH_DATA = {
'res_mods/configs/xvm/py_macro/str.py': '561e169bd878aee3dbded53b62bf78805dd9daf3',
'res_mods/configs/xvm/py_macro/vinfo.py': 'e19e5a65573dab1e43f4d7f733ace459db72c31f',
'res_mods/configs/xvm/py_macro/totalEfficiency.py': 'db155ffa6234a104c0038309c0cfe378506f857c',
'res_mods/configs/xvm/py_macro/damage_log.py': 'a3d23d5152a1137e660fd82b6e74c3e14497f93b',
'res_mods/configs/xvm/py_macro/math.py': 'e694b79bf2e9cfb6ce1541dcdf0faa1f35f60287',
'res_mods/configs/xvm/py_macro/sixthsenseduration.py': 'c5ab885fa53723688903085b93e126cd0ffaa0a1',
'res_mods/configs/xvm/py_macro/repairTime.py': 'ca61ee2ff5ad64da93b15137f7094bdd106ea0b7',
'res_mods/configs/xvm/py_macro/xvm/__init__.py': 'da39a3ee5e6b4b0d3255bfef95601890afd80709',
'res_mods/configs/xvm/py_macro/xvm/xvm2sup.py': '2164a22ab5ab9548a0e050e491ba687990cc7417',
'res_mods/configs/xvm/py_macro/xvm/damageLog.py': 'c1dbeb915de1a7150170ca9bfa47ed0d086db65d',
'res_mods/configs/xvm/py_macro/xvm/total_hp.py': '46e7408bc949b5249e2f440d54a3de2b5a787f9d',
'res_mods/configs/xvm/py_macro/xvm/utils.py': 'f01929a814e55e33a1922742192a04099d9b164a',
'res_mods/configs/xvm/py_macro/xvm/total_Efficiency.py': 'c664c31b9ee151ec26906e4b25f6b6ba9aee2adc',
'res_mods/configs/xvm/py_macro/xvm.py': '4ebd8dccdb6ebbefa9505681d7c5fabf9a14c427',
'res_mods/configs/xvm/py_macro/score_panel.py': '887e357cf819c432aa908cae5288b9c1474d8da5',
'res_mods/configs/xvm/py_macro/xvm_debug.py': 'dfba560d0ed62a14c344baf223634c99d3935468',
'res_mods/mods/packages/xvm_limits/python/__init__.py': '426e7693bea66da433f598511f2e925b673b1af8',
'res_mods/mods/packages/xvm_limits/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_integrity/python/__init__.py': '429a4920f344b0238f0877042316165486bc0c3f',
'res_mods/mods/packages/xvm_integrity/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_integrity/python/hash_table.py': '7447c8871eac81cb55a94de9d8b080b5c969bb72',
'res_mods/mods/packages/xvm_equip/python/__init__.py': '3f4767615b6c494d48faf0ef9b1105d734cfc1bb',
'res_mods/mods/packages/xvm_equip/python/wg_compat.py': '09b82ab329709a498f0946b261a819bc64b2f01c',
'res_mods/mods/packages/xvm_equip/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_ping/python/pinger_wg.py': 'b4c31543a2ddf5f1bedf932a5a37106cb1f05d25',
'res_mods/mods/packages/xvm_ping/python/__init__.py': '8ea408747dd6c0bbcd7e0b196451fad57b844d9b',
'res_mods/mods/packages/xvm_ping/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_ping/python/pinger.py': '9f2794773c369a2c9b9784c84eac9696e83d5eef',
'res_mods/mods/packages/xvm_tooltips/python/__init__.py': '78d1a2e63d326ddb46daaa7f7bab60abc20ccec0',
'res_mods/mods/packages/xvm_tooltips/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_hangar/python/svcmsg.py': 'a8593204d7de2e5ede400e4b62dd79f5c040f7fa',
'res_mods/mods/packages/xvm_hangar/python/__init__.py': '048d4a7c2a694f240204e727f357532c169242b6',
'res_mods/mods/packages/xvm_hangar/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_profiler/python/swfprofiler.py': '4d62dc4cb03891caa5765b022a58fd98f81f3504',
'res_mods/mods/packages/xvm_profiler/python/__init__.py': '6e83007e96091ee3858345b3395df097a1fee5f4',
'res_mods/mods/packages/xvm_profiler/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_profiler/python/profiler.py': '0f959586d6d3354e0fe91991f581266f83c37491',
'res_mods/mods/packages/xvm_contacts/python/__init__.py': '22d163a207a1e1fc0473700ab3912b63bf62be02',
'res_mods/mods/packages/xvm_contacts/python/wg_compat.py': '8b8976f1089c6465977b715566e1be7a99e160e1',
'res_mods/mods/packages/xvm_contacts/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_contacts/python/view.py': 'a99f6ee3b82f2632f394196b11c74d509e4f16d3',
'res_mods/mods/packages/xvm_contacts/python/contacts.py': '57d8332e5a841ea10774541caf2adf650f7d7e06',
'res_mods/mods/packages/xvm_lobby/as_lobby/xvm_lobby_ui.swf': 'a19451b4f47ee7ba042af2304fd09fdb3b7bbdef',
'res_mods/mods/packages/xvm_lobby/as_lobby/xvm_lobbycompany_ui.swf': '0450bee0d60538370b8342d647000f43c0332370',
'res_mods/mods/packages/xvm_lobby/as_lobby/xvm_lobbycontacts_ui.swf': 'cefc73573a5267e800059e0b668be82d1ab53175',
'res_mods/mods/packages/xvm_lobby/as_lobby/xvm_lobby.swf': 'a42932f83ef43b9ebac4ffd9d71db9e90c321a00',
'res_mods/mods/packages/xvm_lobby/as_lobby/xvm_lobbyprofile_ui.swf': 'b56eef58a9e61b2938cddddd5c78b5fe14d082ac',
'res_mods/mods/packages/xvm_battleresults/python/__init__.py': 'd05c25f43b1e8a2b3dda81dbacd0beb3fea3fa42',
'res_mods/mods/packages/xvm_battleresults/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_export/python/fps.py': 'c9b9d74bf0e16d011924327aba1d916a842b7ee9',
'res_mods/mods/packages/xvm_export/python/__init__.py': 'd93f70e3532eb8ce005d3083eef537321aec7362',
'res_mods/mods/packages/xvm_export/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_tankcarousel/python/__init__.py': '2b5832df0587cc2c983fb7ca805f224624dbab0e',
'res_mods/mods/packages/xvm_tankcarousel/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_tankcarousel/python/filter_popover.py': '0b4b6ae5f4c8b28efc72230467f329fe3f2c0bb9',
'res_mods/mods/packages/xvm_tankcarousel/python/tankcarousel.py': 'bd6fac40fbbb1c7c62b145f11c91436ee895fcdf',
'res_mods/mods/packages/xvm_tankcarousel/python/reserve.py': '011311a474abd78dc2caa0ac9417282f91bb2939',
'res_mods/mods/packages/xvm_online/python/__init__.py': '772a89f36cc3579d3187eefe0fafd83a3069e4bd',
'res_mods/mods/packages/xvm_online/python/online.py': 'c06cd34b409ffa960010a83880e1a79c5d766b4e',
'res_mods/mods/packages/xvm_online/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_hotfix/python/__init__.py': '42e101c4aa9caf1f8fe59f6adb7b393185fabbac',
'res_mods/mods/packages/xvm_hotfix/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_squad/python/__init__.py': '1322564261a8c0cb7a14eea4f674775b7d826d62',
'res_mods/mods/packages/xvm_squad/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_autologin/python/__init__.py': '61684a770c9083eec5aeb85da24992d2e7d9e730',
'res_mods/mods/packages/xvm_autologin/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_main/python/xvm_scale_data.py': '624f14249442e971827de7e9bdcf7edbd6dc7396',
'res_mods/mods/packages/xvm_main/python/config.py': 'e9db3ebaab681c078210be422e0227974351f12b',
'res_mods/mods/packages/xvm_main/python/vehinfo_tiers.py': 'c2c14db5ab6e787a13416e2478ebad533f5cdb58',
'res_mods/mods/packages/xvm_main/python/svcmsg.py': 'f25899af88517dfb4fea410e7eb3c50e53faaa01',
'res_mods/mods/packages/xvm_main/python/python_macro.py': '76fa909613f18e1343273c84e72829dbe2edf74d',
'res_mods/mods/packages/xvm_main/python/default_xvm_xc.py': 'db157532a28155e275b85c19137b8d24c4e4f648',
'res_mods/mods/packages/xvm_main/python/__init__.py': 'c572d34dc29b47a47eff6ebfe655668029e92e8e',
'res_mods/mods/packages/xvm_main/python/consts.py': 'e9571090f9a3a8494edcb41df747cf0f8b2c4c12',
'res_mods/mods/packages/xvm_main/python/vehinfo.py': 'b1cad5c78ce23a85a83454f37b93e20ff712f994',
'res_mods/mods/packages/xvm_main/python/vehinfo_stat_avg.py': '8ab0be1481a129df8d9bfe675ba5687547b3ce71',
'res_mods/mods/packages/xvm_main/python/filecache.py': 'd993d4d38b526de3a605f34c0439b56f71c39292',
'res_mods/mods/packages/xvm_main/python/configwatchdog.py': 'bf704e51a89a84de18e25d724011cacf51c5b973',
'res_mods/mods/packages/xvm_main/python/userprefs.py': '7e12846de5220ac59fef68adf9ef09034eac32d1',
'res_mods/mods/packages/xvm_main/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_main/python/topclans.py': '377b18b4109c578521ac793421213869f3b2f1e1',
'res_mods/mods/packages/xvm_main/python/vehinfo_short.py': '5ca79ed724acf4d67b511cad1f5aecf220cc27a5',
'res_mods/mods/packages/xvm_main/python/minimap_circles.py': 'd34b8ead7cc5c846455d083f125ca63ce70fddc6',
'res_mods/mods/packages/xvm_main/python/vehinfo_xte.py': '358d6d7654c56bee208049611e1f68638cfaa64f',
'res_mods/mods/packages/xvm_main/python/xvm_scale.py': '1dbedad35dbea891716cce31d97f328c9bc6e300',
'res_mods/mods/packages/xvm_main/python/wgutils.py': '1210af3c550f847d57dc612e0762085abf57cab0',
'res_mods/mods/packages/xvm_main/python/xvm.py': 'b52b859e6a25356bad1aafae662d5b350e966981',
'res_mods/mods/packages/xvm_main/python/logger.py': '8ce319bb26552a2f6255c9648913c2ed59ea80bc',
'res_mods/mods/packages/xvm_main/python/mutex.py': 'b1073317655a5ac89adad48993b32661a85ac6bb',
'res_mods/mods/packages/xvm_main/python/utils.py': '094c4a55e4817b17d23266831d267e4406dcf883',
'res_mods/mods/packages/xvm_main/python/default_config.py': 'cd16156729bd5560f558ca8d753c7e24675d3c16',
'res_mods/mods/packages/xvm_main/python/xvmapi.py': '2cc40497879fd5d9c943d802b7bf9abf45da9d15',
'res_mods/mods/packages/xvm_main/python/loadurl.py': '962ed63e3d1b520b5095fc10b1d995449eab57e9',
'res_mods/mods/packages/xvm_main/python/dossier.py': 'f4279ae5c320d232bb1e9c8417d052a02c50dbb6',
'res_mods/mods/packages/xvm_main/python/vehinfo_wn8.py': '170724c7d89cd3b88669c4445d28363b93200a1d',
'res_mods/mods/packages/xvm_main/python/vehinfo_xtdb.py': 'd6defb32293efbca7f274ae5ad800870895766c2',
'res_mods/mods/packages/xvm_main/python/test.py': '36108395e33874380b6b5ba493b2ccf03b102a68',
'res_mods/mods/packages/xvm_main/python/stats.py': 'c4dfc87b34f35bef1b78ec56866e830d0bd5939b',
'res_mods/mods/packages/xvm_battle/as_battle/xvm_vehiclemarkers_ui.swf': '83fd71b1ae89f321456b16d7d92a8265e58b3a6d',
'res_mods/mods/packages/xvm_battle/as_battle_classic/xvm_battle_classic.swf': '625bebdbfe09f638532c736789ae5e8c44bdfade',
'res_mods/mods/packages/xvm_battle/as_battle_ranked/xvm_battle_ranked.swf': '3bec910506743fa4a3e971c716cac39c0580396a',
'res_mods/mods/packages/xvm_battle/python/__init__.py': '1356b611b9be5f7e70119808ac0be5e4cd7849d5',
'res_mods/mods/packages/xvm_battle/python/consts.py': '4cfd9b7ceb7f6441156d3b14268fc4f9b2b2e467',
'res_mods/mods/packages/xvm_battle/python/camera.py': 'b21b803a0f355ba2b4201a464f45c157fda15275',
'res_mods/mods/packages/xvm_battle/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_battle/python/xmqp.py': '3a0b6448424cce040c67f16dfebb73bf5436b19b',
'res_mods/mods/packages/xvm_battle/python/minimap.py': '9705d99f9d274138ce7069eac53acf74c51ceb26',
'res_mods/mods/packages/xvm_battle/python/fragCorrelationPanel.py': '1adf337f254f8b5ac6c898912a7d268df231ffa1',
'res_mods/mods/packages/xvm_battle/python/shared.py': 'e479d1923f7acc08201381ae859d5093f5771e9b',
'res_mods/mods/packages/xvm_battle/python/replay.py': 'f5574c521a75816bb87516a713b6da40fa22b99a',
'res_mods/mods/packages/xvm_battle/python/xmqp_events.py': 'f3d940e854c76ebd1a4c9b0224f027c3268b1fae',
'res_mods/mods/packages/xvm_battle/python/vehicleMarkers.py': '02413d5f481ec025558aa0c24201d3b790a3cd1b',
'res_mods/mods/packages/xvm_battle/python/vehicleMarkersBC.py': '85e4db8c750b576d511d5d36e01f2d06fe5045e1',
'res_mods/mods/packages/xvm_battle/python/battle.py': '3b79a89b9ac7ce4283dbd29de963d52c3b868f7d',
'res_mods/mods/packages/xvm_battle/python/battleloading.py': 'b150aafb27fc0cca2c113a139e6403b62c8d9b5b',
'res_mods/mods/packages/xvm_techtree/python/__init__.py': '58a44b09271b2452748ca510492a25f318d68d8d',
'res_mods/mods/packages/xvm_techtree/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_quests/python/__init__.py': '7528d31005e415a53e228cb16f851760cd0b6229',
'res_mods/mods/packages/xvm_quests/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_sounds/python/__init__.py': 'cb904788fa5267a07c19edd2d963c730dfd429d1',
'res_mods/mods/packages/xvm_sounds/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_sounds/python/enemySighted.py': '77879f97e0f6de7f32a28e091f5eacf23db5ebc4',
'res_mods/mods/packages/xvm_sounds/python/battleEnd.py': 'a377227c62b9e5dbb60191f469d853730c960ea4',
'res_mods/mods/packages/xvm_sounds/python/ammoBay.py': 'dda6b964bf47927b58b50ab42902374c08e20373',
'res_mods/mods/packages/xvm_sounds/python/fireAlert.py': 'a710dfed84dfddc5c353e5eb1908dacde5b22730',
'res_mods/mods/packages/xvm_sounds/python/sixthSense.py': 'dab45c8e42a2db3749944ea3641dcc8ee7d8e80d',
'res_mods/mods/packages/xvm_sounds/python/bankManager.py': 'd408f2c47eba30a1f1991e1cf7a6d4b3b8d8c5aa',
'res_mods/mods/packages/xvm_sounds/python/test.py': 'ab06f85ac215f3ac48584a26f350e70a269c08b9',
'res_mods/mods/packages/xvm_profile/python/__init__.py': '2642d1d5620c095917cc2d726e2ceb99febc542a',
'res_mods/mods/packages/xvm_profile/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
'res_mods/mods/packages/xvm_crew/python/__init__.py': '123e9f6f0359332ee3013610c61120b5381c3a32',
'res_mods/mods/packages/xvm_crew/python/wg_compat.py': '8e36d6b06334f11e7354129b9910a80f47c3358c',
'res_mods/mods/packages/xvm_crew/python/__version__.py': '183d271160bca4348026e7297641f18eeab646c5',
}
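The table maps file paths to SHA-1 digests of the file contents. A minimal sketch of how such a table can be checked; the `verify_file` helper is hypothetical, not part of xvm_integrity:

```python
import hashlib
import os
import tempfile

def verify_file(path, expected_sha1):
    """Return True if the file's SHA-1 digest matches the expected hex string (hypothetical helper)."""
    with open(path, 'rb') as f:
        return hashlib.sha1(f.read()).hexdigest() == expected_sha1

# Build a throwaway file and check it against a freshly computed digest.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'example payload')
    tmp_path = tmp.name

expected = hashlib.sha1(b'example payload').hexdigest()
assert verify_file(tmp_path, expected)
assert not verify_file(tmp_path, '0' * 40)
os.remove(tmp_path)
```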
# File: examples/pybullet/examples/graphicsServer.py (felipeek/bullet3, Zlib license)

import pybullet as p
import time
p.connect(p.GRAPHICS_SERVER)
#p.connect(p.GRAPHICS_SERVER_MAIN_THREAD)
while p.isConnected():
p.stepSimulation()
time.sleep(1./240.) | 21.25 | 41 | 0.776471 | 27 | 170 | 4.740741 | 0.592593 | 0.125 | 0.140625 | 0.265625 | 0.359375 | 0 | 0 | 0 | 0 | 0 | 0 | 0.025974 | 0.094118 | 170 | 8 | 42 | 21.25 | 0.805195 | 0.235294 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.333333 | 0 | 0.333333 | 0 | 1 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
4a5cac0f76d685b7a21559cae17475705a44f5ed | 369 | py | Python | src/utils/file_manipulation_tools.py | mfkiwl/ConvLSTM-Computer-Vision-for-Structural-Health-Monitoring-SHM-and-NonDestructive-Testing-NDT | 551f6afd2f4207a4a6a717cabc13fe51f31eb410 | [
"MIT"
] | 17 | 2020-02-25T05:41:41.000Z | 2022-03-25T06:48:30.000Z | src/utils/file_manipulation_tools.py | SubChange/ConvLSTM-Computer-Vision-for-Structural-Health-Monitoring-SHM-and-NonDestructive-Testing-NDT | 0f00291fd7d20d3472709f2941adba722b35f8d5 | [
"MIT"
] | 1 | 2021-01-13T06:07:02.000Z | 2021-01-13T06:07:02.000Z | src/utils/file_manipulation_tools.py | SubChange/ConvLSTM-Computer-Vision-for-Structural-Health-Monitoring-SHM-and-NonDestructive-Testing-NDT | 0f00291fd7d20d3472709f2941adba722b35f8d5 | [
"MIT"
] | 5 | 2020-11-22T12:58:23.000Z | 2021-06-16T14:20:10.000Z | import configs_and_settings
import os
def get_file_folder_names_in_dir(dir_path):
    files_folders_names_list = os.listdir(dir_path)
return files_folders_names_list
def get_num_files_in_dir(dir_path):
files_folders_names_list = get_file_folder_names_in_dir(dir_path)
num_files_folders = len(files_folders_names_list)
return num_files_folders | 30.75 | 71 | 0.864499 | 66 | 369 | 4.257576 | 0.333333 | 0.256228 | 0.241993 | 0.298932 | 0.405694 | 0.405694 | 0.405694 | 0.405694 | 0 | 0 | 0 | 0 | 0.086721 | 369 | 12 | 72 | 30.75 | 0.833828 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.222222 | false | 0 | 0.222222 | 0 | 0.666667 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 5 |
4a6ae3bd20a599fdcb729e06cd752643d7cd2e2d | 1,003 | py | Python | deutschland/lebensmittelwarnung/models/__init__.py | kiranmusze/deutschland | 86d8ead3f38ad88ad66bb338b9f5a8db06992344 | [
"Apache-2.0"
] | 445 | 2021-07-26T22:00:26.000Z | 2022-03-31T08:31:08.000Z | deutschland/lebensmittelwarnung/models/__init__.py | kiranmusze/deutschland | 86d8ead3f38ad88ad66bb338b9f5a8db06992344 | [
"Apache-2.0"
] | 30 | 2021-07-27T15:42:23.000Z | 2022-03-26T16:14:11.000Z | deutschland/lebensmittelwarnung/models/__init__.py | kiranmusze/deutschland | 86d8ead3f38ad88ad66bb338b9f5a8db06992344 | [
"Apache-2.0"
] | 28 | 2021-07-27T10:48:43.000Z | 2022-03-26T14:31:30.000Z | # flake8: noqa
# import all models into this package
# if you have many models here with many references from one model to another this may
# raise a RecursionError
# to avoid this, import only the models that you directly need like:
# from deutschland.lebensmittelwarnung.model.pet import Pet
# or import this package, but before doing it, use:
# import sys
# sys.setrecursionlimit(n)
from deutschland.lebensmittelwarnung.model.inline_object import InlineObject
from deutschland.lebensmittelwarnung.model.request_options import RequestOptions
from deutschland.lebensmittelwarnung.model.response import Response
from deutschland.lebensmittelwarnung.model.response_docs import ResponseDocs
from deutschland.lebensmittelwarnung.model.response_product import ResponseProduct
from deutschland.lebensmittelwarnung.model.response_rapex_information import (
ResponseRapexInformation,
)
from deutschland.lebensmittelwarnung.model.response_safety_information import (
ResponseSafetyInformation,
)
| 43.608696 | 86 | 0.845464 | 118 | 1,003 | 7.118644 | 0.5 | 0.142857 | 0.32381 | 0.371429 | 0.279762 | 0 | 0 | 0 | 0 | 0 | 0 | 0.001116 | 0.10668 | 1,003 | 22 | 87 | 45.590909 | 0.936384 | 0.370887 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.636364 | 0 | 0.636364 | 0 | 0 | 0 | 0 | null | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
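The header comments in the `models/__init__.py` row above recommend `sys.setrecursionlimit(n)` as a workaround when deeply cross-referencing models raise `RecursionError`. A small hedged helper illustrating that advice (only ever raising the limit, never lowering it below the current value):

```python
import sys

def bump_recursion_limit(n):
    """Raise the interpreter recursion limit to at least n; never lower it."""
    if n > sys.getrecursionlimit():
        sys.setrecursionlimit(n)
    return sys.getrecursionlimit()
```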
4a8cb28050303b34b9a45246c10a7242da894370 | 407 | py | Python | magenta/models/my_rnn/__init__.py | 3bst0r/magenta | aa5095b3408762faa8c51baf69b352d72b728d8c | [
"Apache-2.0"
] | null | null | null | magenta/models/my_rnn/__init__.py | 3bst0r/magenta | aa5095b3408762faa8c51baf69b352d72b728d8c | [
"Apache-2.0"
] | null | null | null | magenta/models/my_rnn/__init__.py | 3bst0r/magenta | aa5095b3408762faa8c51baf69b352d72b728d8c | [
"Apache-2.0"
] | null | null | null | from .my_simple_rnn_model import BASIC_EVENT_DIM
from .my_simple_rnn_model import LOOKBACK_RNN_INPUT_EVENT_DIM
from .my_simple_rnn_model import get_simple_rnn_model
from .my_rnn_generate import one_hot_event
from .my_rnn_generate import generate_greedy
from .my_rnn_generate import legacy_generate_beam_search
from .my_rnn_generate import plot_likelihoods_fn
from .my_rnn_generate import melody_seq_to_midi
| 45.222222 | 61 | 0.90172 | 71 | 407 | 4.619718 | 0.366197 | 0.146341 | 0.137195 | 0.259146 | 0.637195 | 0.286585 | 0.207317 | 0.207317 | 0 | 0 | 0 | 0 | 0.078624 | 407 | 8 | 62 | 50.875 | 0.874667 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | null | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 5 |
4350af9a7deed04eb8a0d026c12bf0e2582762cc | 525 | py | Python | component/jvms.py | kaiseu/pat-data-processing | 26cf8c8d4c4de4fa563bfba32309392a70418547 | [
"Apache-2.0"
] | 1 | 2018-03-23T03:03:04.000Z | 2018-03-23T03:03:04.000Z | component/jvms.py | kaiseu/pat-data-processing | 26cf8c8d4c4de4fa563bfba32309392a70418547 | [
"Apache-2.0"
] | 1 | 2017-09-22T05:42:47.000Z | 2017-09-22T06:47:20.000Z | component/jvms.py | kaiseu/pat-data-processing | 26cf8c8d4c4de4fa563bfba32309392a70418547 | [
"Apache-2.0"
] | 2 | 2017-08-21T08:19:42.000Z | 2018-03-23T03:09:41.000Z | #!/usr/bin/python
# encoding: utf-8
"""
@author: xuk1
@license: (C) Copyright 2013-2017
@contact: kai.a.xu@intel.com
@file: jvms.py
@time: 8/21/2017 16:47
@desc:
"""
import numpy as np
import pandas as pd
from component.base import CommonBase
class Jvms(CommonBase):
"""
Node JVMS attribute, not implement yet
"""
def __init__(self):
pass
def get_data_by_time(self, start, end):
return [pd.DataFrame(np.zeros((3, 3)))], pd.DataFrame(np.zeros((3, 3)))
| 17.5 | 80 | 0.609524 | 76 | 525 | 4.118421 | 0.736842 | 0.070288 | 0.083067 | 0.115016 | 0.127796 | 0.127796 | 0 | 0 | 0 | 0 | 0 | 0.062972 | 0.24381 | 525 | 29 | 81 | 18.103448 | 0.725441 | 0.369524 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.25 | false | 0.125 | 0.375 | 0.125 | 0.875 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 1 | 1 | 1 | 0 | 0 | 5 |
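The `jvms.py` row above is a not-yet-implemented component stub: a subclass of `CommonBase` whose `get_data_by_time` returns zero-filled placeholders. A pure-Python sketch of the same pattern (`CommonBase` is reconstructed here as a hypothetical interface, since `component.base` is not shown in this dump, and empty containers stand in for the zero-filled DataFrames):

```python
class CommonBase:
    # Hypothetical stand-in for component.base.CommonBase: a shared
    # interface for time-ranged data access.
    def get_data_by_time(self, start, end):
        raise NotImplementedError

class Jvms(CommonBase):
    """Node JVMS attribute, not implemented yet: returns empty placeholders."""
    def get_data_by_time(self, start, end):
        # Placeholder shape: (list of per-item frames, aggregate frame).
        return [], []
```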
4364c1d606c7401cbb2b804b393d3e340dd84db5 | 37 | py | Python | tests/__init__.py | jwdunne/enguard | 840e878b797f87ce370ca27b4059ad49ef3db5c1 | [
"MIT"
] | 2 | 2020-11-20T10:28:09.000Z | 2021-11-10T10:21:03.000Z | tests/__init__.py | jwdunne/enguard | 840e878b797f87ce370ca27b4059ad49ef3db5c1 | [
"MIT"
] | 17 | 2020-02-01T20:20:35.000Z | 2020-05-30T12:26:16.000Z | tests/__init__.py | jwdunne/enguard | 840e878b797f87ce370ca27b4059ad49ef3db5c1 | [
"MIT"
] | null | null | null | """Unit test package for enguard."""
| 18.5 | 36 | 0.675676 | 5 | 37 | 5 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.135135 | 37 | 1 | 37 | 37 | 0.78125 | 0.810811 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
4370f57963ea74d40327f4e8f764caa0172c2da7 | 94 | py | Python | api/districts/admin.py | AlohaLATAM/alohaweb1.0 | aa2c5cba5e04c6716731e4e4f3d5683ec2a579b1 | [
"MIT"
] | null | null | null | api/districts/admin.py | AlohaLATAM/alohaweb1.0 | aa2c5cba5e04c6716731e4e4f3d5683ec2a579b1 | [
"MIT"
] | 2 | 2020-06-05T19:16:17.000Z | 2021-06-10T20:54:31.000Z | api/districts/admin.py | AlohaLATAM/alohaweb1.0 | aa2c5cba5e04c6716731e4e4f3d5683ec2a579b1 | [
"MIT"
] | null | null | null | from django.contrib import admin
from . models import District
admin.site.register(District)
| 18.8 | 32 | 0.819149 | 13 | 94 | 5.923077 | 0.692308 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.117021 | 94 | 4 | 33 | 23.5 | 0.927711 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 0.666667 | 0 | 0.666667 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
43a66a8ecd3a73e188828998fe46510452b1bbdf | 43 | py | Python | src/ed_logging/__init__.py | iulianPeiu6/EncryptedDatabase | 127c116cc6ca7389fb0df9eaaa9930447fa5a012 | [
"MIT"
] | null | null | null | src/ed_logging/__init__.py | iulianPeiu6/EncryptedDatabase | 127c116cc6ca7389fb0df9eaaa9930447fa5a012 | [
"MIT"
] | null | null | null | src/ed_logging/__init__.py | iulianPeiu6/EncryptedDatabase | 127c116cc6ca7389fb0df9eaaa9930447fa5a012 | [
"MIT"
] | null | null | null | """Contains implementation for logging."""
| 21.5 | 42 | 0.744186 | 4 | 43 | 8 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.093023 | 43 | 1 | 43 | 43 | 0.820513 | 0.837209 | 0 | null | 0 | null | 0 | 0 | null | 0 | 0 | 0 | null | 1 | null | true | 0 | 0 | null | null | null | 1 | 1 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
43d8c524a3665a847393d0bc6cb25768af627767 | 167 | py | Python | dec/utils/__init__.py | tknapen/decode_encode | bc8d86281837fbbe611a1a32fa9175448b2ada2b | [
"MIT"
] | 4 | 2017-08-22T11:08:24.000Z | 2019-05-01T11:04:56.000Z | dec/utils/__init__.py | tknapen/decode_encode | bc8d86281837fbbe611a1a32fa9175448b2ada2b | [
"MIT"
] | null | null | null | dec/utils/__init__.py | tknapen/decode_encode | bc8d86281837fbbe611a1a32fa9175448b2ada2b | [
"MIT"
] | null | null | null | from .utils import create_visual_designmatrix_all, roi_data_from_hdf, get_figshare_data, create_circular_mask
from .css import CompressiveSpatialSummationModelFiltered | 83.5 | 109 | 0.91018 | 21 | 167 | 6.761905 | 0.761905 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.05988 | 167 | 2 | 110 | 83.5 | 0.904459 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |
43e1e2b108ff5968cdd7e38257ec688a3dc24347 | 1,927 | py | Python | core/apps.py | gomezag/MQTTdash | fb95fb1acaa807be4780bd5a30526a93321523d5 | [
"MIT"
] | 2 | 2021-01-23T14:35:23.000Z | 2021-06-03T03:45:07.000Z | core/apps.py | gomezag/mqttdash | fb95fb1acaa807be4780bd5a30526a93321523d5 | [
"MIT"
] | null | null | null | core/apps.py | gomezag/mqttdash | fb95fb1acaa807be4780bd5a30526a93321523d5 | [
"MIT"
] | null | null | null | from django.apps import AppConfig
import paho.mqtt.client as mqtt
import json
import environ
env = environ.Env()
class CoreConfig(AppConfig):
name = 'core'
def set_light(state):
# print("creating new instance")
client = mqtt.Client("dash") # create new instance
client.username_pw_set(username=env('MQTT_USER'), password=env('MQTT_PWD'))
client.connect(env('MQTT_HOST')) # connect to broker
print("Connected")
client.loop_start() # start the loop
print('Loop started')
gateway = "+/#"
client.subscribe(gateway)
if state:
client.publish('home/room/lights/main-light/cmnd/POWER', 'ON')
else:
client.publish('home/room/lights/main-light/cmnd/POWER', 'OFF')
client.loop_stop() #stop the loop
def persiana(state):
# print("creating new instance")
client = mqtt.Client("dash") # create new instance
client.username_pw_set(username=env('MQTT_USER'), password=env('MQTT_PWD'))
client.connect(env('MQTT_HOST')) # connect to broker
print("Connected")
client.loop_start() # start the loop
print('Loop started')
gateway = "+/#"
client.subscribe(gateway)
print('going ',state)
msg = dict(value=state)
msg = json.dumps(msg)
client.publish('home/room/persiana', msg)
client.loop_stop() #stop the loop
def hvac(state):
client = mqtt.Client("dash") # create new instance
client.username_pw_set(username=env('MQTT_USER'), password=env('MQTT_PWD'))
client.connect(env('MQTT_HOST')) # connect to broker
print("Connected")
client.loop_start() # start the loop
print('Loop started')
gateway = "+/#"
client.subscribe(gateway)
print('going ', state)
msg = dict()
if state == 'off':
msg['power'] = 'off'
elif state == 'on':
msg['power'] = 'cold'
msg['temp'] = 18
msg = json.dumps(msg)
    client.publish('home/room/hvac', msg)
    client.loop_stop() #stop the loop, matching set_light and persiana
| 26.763889 | 79 | 0.645044 | 249 | 1,927 | 4.907631 | 0.261044 | 0.051555 | 0.069558 | 0.06874 | 0.797054 | 0.797054 | 0.797054 | 0.751228 | 0.692308 | 0.618658 | 0 | 0.001305 | 0.204463 | 1,927 | 71 | 80 | 27.140845 | 0.795825 | 0.12766 | 0 | 0.566038 | 0 | 0 | 0.190162 | 0.045591 | 0 | 0 | 0 | 0 | 0 | 1 | 0.056604 | false | 0.056604 | 0.075472 | 0 | 0.169811 | 0.150943 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 5 |
602c2cc4f0c5272c5d5bb08144da02bbfa4bfce4 | 12,827 | py | Python | circos_make_centrality.py | wchwang/Method_Pancorona | 9dbe0dfd984497406a129c8029ebf1c0c928c27f | [
"MIT"
] | null | null | null | circos_make_centrality.py | wchwang/Method_Pancorona | 9dbe0dfd984497406a129c8029ebf1c0c928c27f | [
"MIT"
] | null | null | null | circos_make_centrality.py | wchwang/Method_Pancorona | 9dbe0dfd984497406a129c8029ebf1c0c928c27f | [
"MIT"
] | null | null | null | # Created by woochanghwang at 10/06/2020
'''
circos make subcellular
For files generated by "circos_make_input_files_key_genes"
(MoA-based SOM result).
Modified 07/21/2020
Modified 07/19/2021: hidden nodes, sorted by enriched pathways, centrality
'''
import pandas as pd
import numpy as np
def make_circos_subcellular(subcellular_addr):
key_genes_in_circos_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/COVID-19/result/Circos/data/network_backbone_node_info_key_genes_mutlcategory_v4.tsv"
genecards_subcellular_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/General/data/Compartments/genecard_subcellular.txt"
comparment_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/General/data/Compartments/human_compartment_integrated_full.tsv"
with open(genecards_subcellular_addr) as subcellular_f:
subcellular = [x.strip() for x in subcellular_f.readlines()]
comparments_data = pd.read_csv(comparment_addr, sep='\t', names=['Ensembl','Symbol','GO','Subcellular','Confidence'])
print(comparments_data.head())
comparments_data = comparments_data[comparments_data['Confidence'] >=4]
print(comparments_data)
comparments_data = comparments_data[comparments_data['Subcellular'].isin(subcellular)]
print(comparments_data)
comparments_data = comparments_data[['Symbol','Subcellular','Confidence']]
print(comparments_data)
key_genes_in_circos_df = pd.read_csv(key_genes_in_circos_addr, sep='\t', names=['Band','Protein'])
print(key_genes_in_circos_df)
key_genes_in_circos_subcelluar = pd.merge(left=key_genes_in_circos_df, right=comparments_data,how='left',left_on='Protein',right_on='Symbol')
print(key_genes_in_circos_subcelluar)
key_genes_in_circos_subcelluar = key_genes_in_circos_subcelluar.fillna('NA')
key_genes_proteins = key_genes_in_circos_df['Protein'].to_list()
key_genes_symbols = key_genes_in_circos_subcelluar['Symbol'].to_list()
print(len(set(key_genes_proteins)))
print(len(set(key_genes_symbols)))
print(len(set(key_genes_proteins)&set(key_genes_symbols)))
print(set(key_genes_proteins)-set(key_genes_symbols))
key_genes_in_circos_subcelluar = key_genes_in_circos_subcelluar[['Band','Protein','Subcellular','Confidence']]
print(key_genes_in_circos_subcelluar)
key_genes_in_circos_subcelluar.to_csv(subcellular_addr,sep='\t',index=False)
def make_circos_subcellular_knowledgebased(key_genes_in_circos_addr, subcellular_addr):
# key_genes_in_circos_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/COVID-19/result/Circos/data/network_backbone_node_info_key_genes_mutlcategory_v4.tsv"
genecards_subcellular_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/General/data/Compartments/genecard_subcellular.txt"
comparment_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/General/data/Compartments/human_compartment_knowledge_full.tsv"
with open(genecards_subcellular_addr) as subcellular_f:
subcellular = [x.strip() for x in subcellular_f.readlines()]
comparments_data = pd.read_csv(comparment_addr, sep='\t', names=['Ensembl','Symbol','GO','Subcellular','Source','Evidence_code','Confidence'])
print(comparments_data.head())
# comparments_data = comparments_data[comparments_data['Evidence_code'].isin(['ISS','IDA','HDA', ])] #Inferred from Direct Assay (IDA),
comparments_data = comparments_data[comparments_data['Confidence'] >=4]
print(comparments_data)
comparments_data = comparments_data[comparments_data['Subcellular'].isin(subcellular)] # multiple source --> mean
print(comparments_data)
comparments_data = comparments_data[['Symbol', 'Subcellular', 'Confidence']]
comparments_data_groupby = comparments_data.groupby(['Symbol','Subcellular']).agg({'Confidence':'mean'})
comparments_data_groupby = comparments_data_groupby.reset_index()
print(comparments_data_groupby)
key_genes_in_circos_df = pd.read_csv(key_genes_in_circos_addr, sep='\t', names=['Band','Protein'])
print(key_genes_in_circos_df)
key_genes_in_circos_subcelluar = pd.merge(left=key_genes_in_circos_df, right=comparments_data_groupby,how='left',left_on='Protein',right_on='Symbol')
print(key_genes_in_circos_subcelluar)
key_genes_in_circos_subcelluar = key_genes_in_circos_subcelluar.fillna('NA')
key_genes_proteins = key_genes_in_circos_df['Protein'].to_list()
key_genes_symbols = key_genes_in_circos_subcelluar['Symbol'].to_list()
print(len(set(key_genes_proteins)))
print(len(set(key_genes_symbols)))
print(len(set(key_genes_proteins)&set(key_genes_symbols)))
print(set(key_genes_proteins)-set(key_genes_symbols))
key_genes_in_circos_subcelluar = key_genes_in_circos_subcelluar[['Band','Protein','Subcellular','Confidence']]
print(key_genes_in_circos_subcelluar)
key_genes_in_circos_subcelluar.to_csv(subcellular_addr,sep='\t',index=False)
def make_circos_for_a_subcellular(subcellular_addr, location, location_no_space, location_addr,covid_circos_position_dict, covid_circos_for_a_location_hist_addr):
circos_subcellular_df = pd.read_csv(subcellular_addr, sep='\t')
circos_for_a_location_df = circos_subcellular_df[circos_subcellular_df['Subcellular']==location]
circos_for_a_location_df.to_csv(location_addr, sep='\t', index=False)
genes_in_a_location = circos_for_a_location_df['Protein'].to_list()
circos_covid_a_location = []
# for gene, pos in covid_circos_position_dict.items():
# print(gene,pos)
for gene in genes_in_a_location:
gene_position = covid_circos_position_dict[gene]
gene_position_sub = gene_position[:]
gene_position_sub.append('1')
gene_position_sub.append("id={}".format(location_no_space))
print(gene_position)
circos_covid_a_location.append(gene_position_sub)
circos_covid_a_location = ['\t'.join(gene) for gene in circos_covid_a_location]
with open(covid_circos_for_a_location_hist_addr,'w') as covid_circos_f:
covid_circos_f.write('\n'.join(circos_covid_a_location))
def get_covid_circos_position(circos_position_addr):
# circos_position_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/COVID-19/result/Circos/data/COVID_DIP_structure_DEP_HIDDEN_key_genes_in_network.txt"
with open(circos_position_addr) as circos_position_f:
circos_position_list = [x.strip().split('\t') for x in circos_position_f.readlines()]
circos_position_dict = dict()
for gene in circos_position_list:
# circos_position_dict[gene[-1]] = '\t'.join(gene[:-1])
circos_position_dict[gene[-1]] = gene[:-1]
return circos_position_dict
def make_circos_for_a_centrality(circos_centrality_base_addr,centrality,circos_for_a_centrality_addr, covid_circos_position_dict, covid_circos_for_a_location_hist_addr):
circos_centrality_df = pd.read_csv(circos_centrality_base_addr, sep='\t')
circos_for_a_location_df = circos_centrality_df[circos_centrality_df['Centrality']==centrality]
circos_for_a_location_df.to_csv(circos_for_a_centrality_addr, sep='\t', index=False)
genes_in_a_location = circos_for_a_location_df['Protein'].to_list()
circos_covid_a_location = []
# for gene, pos in covid_circos_position_dict.items():
# print(gene,pos)
for gene in genes_in_a_location:
gene_position = covid_circos_position_dict[gene]
gene_position_sub = gene_position[:]
gene_position_sub.append('1')
gene_position_sub.append("id={}".format(centrality))
print(gene_position)
circos_covid_a_location.append(gene_position_sub)
circos_covid_a_location = ['\t'.join(gene) for gene in circos_covid_a_location]
with open(covid_circos_for_a_location_hist_addr,'w') as covid_circos_f:
covid_circos_f.write('\n'.join(circos_covid_a_location))
def make_circos_centrality_base(virus, circos_node_file_a, keyprotein_addr, circos_centrality_addr):
key_gene_SARS_df = pd.read_csv(keyprotein_addr)
key_gene_SARS = key_gene_SARS_df['Gene'].to_list()
network_anaysis_df = pd.read_csv(f"../result/{virus}/network_analysis/{virus}_A549_24h_centrality_RWR_result_pvalue.csv")
eigen_list = network_anaysis_df[network_anaysis_df['Eigen_pvalue']< 0.01]['Gene'].tolist()
degree_list = network_anaysis_df[network_anaysis_df['Degree_pvalue']< 0.01]['Gene'].tolist()
bw_list = network_anaysis_df[network_anaysis_df['Between_plvaue']< 0.01]['Gene'].tolist()
rwr_list = network_anaysis_df[network_anaysis_df['RWR_pvalue']< 0.01]['Gene'].tolist()
key_genes = list(set(key_gene_SARS))
key_genes_eigen = list(set(key_gene_SARS) & set(eigen_list))
key_genes_degree = list(set(key_gene_SARS) & set(degree_list) - set(key_genes_eigen))
key_genes_bw = list(set(key_gene_SARS) & set(bw_list) - set(key_genes_eigen)-set(key_genes_degree))
key_genes_rwr = list(set(key_gene_SARS) & set(rwr_list) - set(key_genes_eigen)-set(key_genes_degree)-set(key_genes_bw))
key_genes_centrality = []
for gene in key_genes_eigen:
key_genes_centrality.append([gene, 'Eigen'])
for gene in key_genes_degree:
key_genes_centrality.append([gene, 'Degree'])
for gene in key_genes_bw:
key_genes_centrality.append([gene, 'BW'])
for gene in key_genes_rwr:
key_genes_centrality.append([gene, 'RWR'])
key_genes_centrality_df = pd.DataFrame(key_genes_centrality,columns=['Protein','Centrality'])
key_genes_in_circos_df = pd.read_csv(circos_node_file_a, sep='\t', names=['Band','Protein'])
print(key_genes_in_circos_df)
key_genes_in_circos_centrality = pd.merge(left=key_genes_in_circos_df, right=key_genes_centrality_df,how='left',left_on='Protein',right_on='Protein')
key_genes_in_circos_centrality = key_genes_in_circos_centrality.dropna()
key_genes_in_circos_centrality = key_genes_in_circos_centrality[['Band','Protein','Centrality']]
print(key_genes_in_circos_centrality)
key_genes_in_circos_centrality.to_csv(circos_centrality_addr,sep='\t',index=False)
def main():
viruslist = ["SARS-CoV","SARS-CoV-2"]
for virus in viruslist:
circos_node_file_a = f"../result/{virus}/Circos/data/{virus}_backbone_node_info_key_genes_high_level_paths.tsv"
keyprotein_addr = f"../result/{virus}/Sig.Genes/{virus}_key_protein_every.txt"
circos_centrality_base_addr = f"../result/{virus}/Circos/data/{virus}_backbone_node_info_key_genes_high_level_paths_centrality.tsv"
make_circos_centrality_base(virus, circos_node_file_a,keyprotein_addr,circos_centrality_base_addr)
circos_data_file_a = f"../result/{virus}/Circos/data/{virus}_DIP_structure_DEP_HIDDEN_key_genes_high_level_paths.txt"
covid_circos_position_dict = get_covid_circos_position(circos_data_file_a)
centrality_list = ["Eigen", "Degree", "BW", "RWR"]
for centrality in centrality_list:
circos_for_a_centrality_addr = f"../result/{virus}/Circos/data/{virus}_backbone_node_info_key_genes_high_level_paths_centrality_{centrality}.tsv"
covid_circos_for_a_location_hist_addr = f"../result/{virus}/Circos/data/{virus}_key_genes_high_level_paths_centrality_hist_{centrality}.txt"
make_circos_for_a_centrality(circos_centrality_base_addr,centrality,circos_for_a_centrality_addr, covid_circos_position_dict, covid_circos_for_a_location_hist_addr)
# ###############################
# # Step 1: make circos file added subcellular
# #################################
# circos_data_file_a = '/Users/woochanghwang/PycharmProjects/LifeArc/COVID-19/result/Circos/data/COVID_DIP_structure_DEP_HIDDEN_key_genes_high_level_paths_v2.txt'
# genecards_subcellular_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/General/data/Compartments/genecard_subcellular.txt"
# with open(genecards_subcellular_addr) as subcellular_f:
# subcellular = [x.strip() for x in subcellular_f.readlines()]
#
# covid_circos_position_dict = get_covid_circos_position(circos_data_file_a)
#
# for location in subcellular:
# location_no_space= location.replace(' ','_')
# circos_for_a_location_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/COVID-19/result/Circos/data/network_backbone_node_info_key_genes_high_level_paths_subcellular_conf4_{}_v2.tsv".format(location)
# covid_circos_for_a_location_hist_addr = "/Users/woochanghwang/PycharmProjects/LifeArc/COVID-19/result/Circos/data/COVID_key_genes_high_level_paths_subcellular_hist_{}_v2.txt".format(location_no_space)
# make_circos_for_a_subcellular(subcellular_addr, location, location_no_space, circos_for_a_location_addr, covid_circos_position_dict, covid_circos_for_a_location_hist_addr)
if __name__ == '__main__':
main() | 52.786008 | 216 | 0.77025 | 1,784 | 12,827 | 5.06222 | 0.100897 | 0.081497 | 0.046506 | 0.072639 | 0.809545 | 0.752741 | 0.713653 | 0.67977 | 0.671133 | 0.645333 | 0 | 0.006092 | 0.116941 | 12,827 | 243 | 217 | 52.786008 | 0.791207 | 0.163795 | 0 | 0.472222 | 1 | 0.006944 | 0.169928 | 0.10924 | 0 | 0 | 0 | 0 | 0 | 1 | 0.048611 | false | 0 | 0.013889 | 0 | 0.069444 | 0.180556 | 0 | 0 | 0 | null | 0 | 0 | 0 | 1 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 5 |
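`make_circos_centrality_base` in the row above assigns each key gene to exactly one centrality band by chained set subtraction: Eigen first, then Degree minus Eigen, then BW minus both, then RWR minus all three. A compact pure-Python restatement of that priority scheme (`assign_centrality` is a hypothetical helper, not part of the original script):

```python
def assign_centrality(key_genes, ranked_buckets):
    """Assign each key gene to its highest-priority centrality bucket.

    ranked_buckets is an ordered list of (label, gene_set) pairs; a gene
    already claimed by an earlier bucket is skipped, mirroring the chained
    set subtractions in make_circos_centrality_base.
    """
    assigned = {}
    for label, bucket in ranked_buckets:
        for gene in (set(key_genes) & set(bucket)) - set(assigned):
            assigned[gene] = label
    return assigned
```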
603ece55084a4879eba4dd7d4815acbc36c7a1bc | 101,650 | py | Python | data/jb-generation/jb-json.py | bravecollective/bravecollective-intel | 8b141bbc7a811c4be3f64023ecd46de401c9197b | [
"MIT"
] | 17 | 2015-01-08T05:23:47.000Z | 2018-04-30T20:54:46.000Z | data/jb-generation/jb-json.py | bravecollective/bravecollective-intel | 8b141bbc7a811c4be3f64023ecd46de401c9197b | [
"MIT"
] | 5 | 2015-01-27T00:44:34.000Z | 2017-11-10T22:48:00.000Z | data/jb-generation/jb-json.py | bravecollective/bravecollective-intel | 8b141bbc7a811c4be3f64023ecd46de401c9197b | [
"MIT"
] | 13 | 2015-03-02T19:05:07.000Z | 2018-05-12T15:50:55.000Z | #!/usr/bin/python
import re
import json
from xml.dom import minidom
systemdict = {30001820:'3-QNM4', 30004271:'Afrah', 30004279:'Ahraghen', 30004277:'Ajna', 30004301:'Anath', 30004291:'Anohel', 30004249:'Avada', 30004261:'Balas', 30004296:'Bapraya', 30004252:'Bazadod', 30004250:'Chibi', 30004284:'Defsunun', 30004256:'Edilkam', 30004297:'Efu', 30004259:'Ertoo', 30004305:'Esaeel', 30004287:'Esubara', 30004276:'Fageras', 30004263:'Feshur', 30004254:'Fihrneh', 30004303:'Fobiner', 30004238:'Gens', 30004266:'Gesh', 30004282:'Getrenjesa', 30004288:'Ghekon', 30004248:'Haimeh', 30004257:'Hakatiz', 30004240:'Hier', 30004309:'Hophib', 30004264:'Hoseen', 30004304:'Huna', 30004294:'Illi', 30004243:'Isid', 30004241:'Jasson', 30004281:'Jerhesh', 30004239:'Kamih', 30004306:'Karan', 30004295:'Keba', 30004237:'Kenahehab', 30004258:'Khnar', 30004234:'Maalna', 30004247:'Marmeha', 30004235:'Maseera', 30004251:'Mishi', 30004300:'Naga', 30004280:'Nalnifan', 30004308:'Ned', 30004267:'Nema', 30004274:'Nielez', 30004307:'Nouta', 30004302:'Omigiav', 30004244:'Onanam', 30004253:'Pahineh', 30004255:'Parouz', 30004262:'Pemsah', 30004293:'Pserz', 30004269:'Rashagh', 30004242:'Sadana', 30004299:'Sakht', 30004270:'Sazilid', 30004283:'Shafrak', 30004231:'Shakasi', 30004268:'Shenda', 30004278:'Sheri', 30004233:'Shirshocin', 30004273:'Soliara', 30004272:'Sota', 30004292:'Soza', 30004298:'Tisot', 30004275:'Tukanas', 30004245:'Udianoor', 30004289:'Vaini', 30004230:'Van', 30004246:'Vehan', 30004286:'Yahyerer', 30004236:'Yehaba', 30004265:'Yekh', 30004260:'Yiratal', 30004290:'Zaveral', 30004232:'Zayi', 30004285:'Zazamye', 30045328:'Ahtila', 30045340:'Aivonen', 30045342:'Akidagi', 30045332:'Asakai', 30045311:'Ashitsu', 30045323:'Astoh', 30045319:'Eha', 30045336:'Elunala', 30045339:'Enaluri', 30045341:'Hallanen', 30045338:'Hikkoken', 30045345:'Hirri', 30045306:'Hykanima', 30045329:'Ichoriya', 30045313:'Ienakkamon', 30045337:'Ikoskio', 30045343:'Immuri', 30045316:'Innia', 30045317:'Iralaja', 30045351:'Iwisoda', 30045346:'Kedama', 30045308:'Kehjari', 30045314:'Kinakka', 
30045312:'Korasen', 30045318:'Martoh', 30045334:'Mushikegi', 30045344:'Nennamaila', 30045352:'Nisuwa', 30045348:'Notoras', 30045347:'Oinasiken', 30045307:'Okagaiken', 30045330:'Okkamon', 30045324:'Onnamon', 30045320:'Pavanakka', 30045333:'Prism', 30045353:'Pynekastoh', 30045315:'Raihbaka', 30045349:'Rakapas', 30045354:'Reitsato', 30045325:'Rohamaa', 30045322:'Samanuni', 30045310:'Sarenemi', 30045350:'Teimo', 30045335:'Teskanen', 30045327:'Tsuruma', 30045321:'Uchomida', 30045326:'Uuhulanen', 30045331:'Vaaralen', 30045309:'Villasen', 30004322:'0P9Z-I', 30004367:'1G-MJE', 30004393:'1IX-C0', 30004394:'2B7A-3', 30004358:'3-N3OO', 30004336:'3-TD6L', 30004387:'313I-B', 30004390:'3F-JZF', 30004403:'3KNA-N', 30004317:'4-48K1', 30004360:'4-BE0M', 30004384:'4DTQ-K', 30004391:'5-0WB9', 30004374:'5-P1Y2', 30004344:'52G-NZ', 30004345:'5LJ-MD', 30004347:'6-O5GY', 30004351:'8-4GQM', 30004397:'9-B1DS', 30004366:'9F-7PZ', 30004359:'A-G1FM', 30004323:'AH-B84', 30004328:'B-GC1T', 30004346:'B8O-KJ', 30004333:'BKG-Q2', 30004399:'BU-IU4', 30004371:'BWI1-9', 30004378:'C-4ZOS', 30004319:'C-HCGU', 30004372:'C-LBQS', 30004365:'C-LP3N', 30004380:'C-VGYO', 30004401:'CH9L-K', 30004357:'CS-ZGD', 30004335:'CX-1XF', 30004386:'D4R-H7', 30004311:'DCI7-7', 30004388:'EQI2-2', 30004314:'EWN-2U', 30004327:'F-9F6Q', 30004325:'HB7R-F', 30004400:'I-7JR4', 30004361:'I-7RIS', 30004373:'J52-BH', 30004312:'J7YR-1', 30004385:'J9-5MQ', 30004330:'JRZ-B9', 30004324:'JTAU-5', 30004379:'K-8SQS', 30004376:'KJ-QWL', 30004340:'KL3O-J', 30004316:'KMC-WI', 30004375:'KMQ4-V', 30004348:'KV-8SN', 30004353:'LRWD-B', 30004364:'LXWN-W', 30004356:'M-HU4V', 30004369:'MA-VDX', 30004398:'ME-4IU', 30004383:'NEH-CS', 30004338:'NLPB-0', 30004318:'NTV0-1', 30004326:'O-JPKH', 30004381:'O94U-A', 30004334:'OJ-A8M', 30004362:'P7Z-R3', 30004313:'PKG4-7', 30004395:'PUWL-4', 30004389:'Q-4DEC', 30004321:'Q-FEEJ', 30004337:'Q-NJZ4', 30004343:'QCWA-Z', 30004354:'QXQ-BA', 30004402:'QYZM-W', 30004339:'R4O-I6', 30004370:'RO90-H', 
30004332:'S-B7IT', 30004377:'SVB-RE', 30004352:'T-Q2DD', 30004349:'UB-UQZ', 30004310:'UQ9-3C', 30004329:'V8W-QS', 30004315:'VL3I-M', 30004392:'W-4FA9', 30004368:'WO-AIJ', 30004331:'X4UV-Z', 30004355:'X7R-JW', 30004342:'XM-4L0', 30004320:'XW-2XP', 30004382:'XW-JHT', 30004396:'Y-1918', 30004350:'YG-82V', 30004341:'Z-K495', 30004363:'ZIU-EP', 30000623:'04EI-U', 30000645:'0LY-W1', 30000611:'2-2EWC', 30000630:'4S0-NP', 30000646:'4YO-QK', 30000637:'5F-MG1', 30000614:'8-BIE3', 30000628:'8-SPNN', 30000648:'8-VC6H', 30000652:'8OYE-Z', 30000616:'995-3G', 30000624:'B-T6BT', 30000619:'BLMX-B', 30000643:'BTLH-I', 30000632:'C-6YHJ', 30000613:'D-6H64', 30000612:'E1W-TB', 30000634:'E5T-CS', 30000636:'I-2705', 30000626:'I6-SYN', 30000622:'IVP-KA', 30000640:'JZ-B5Y', 30000631:'K-RMI5', 30000653:'K85Y-6', 30000647:'LJ-RJK', 30000615:'LMM7-L', 30000649:'LQ-01M', 30000620:'M-CNUD', 30000639:'M-MCP8', 30000633:'M53-1V', 30000650:'NG-M8K', 30000642:'NIF-JE', 30000627:'O-5TN1', 30000638:'P7-45V', 30000654:'PKN-NJ', 30000618:'Q-UEN6', 30000651:'RV5-TT', 30000641:'TPG-DD', 30000629:'U-QMOA', 30000644:'U93O-A', 30000625:'VK-A5G', 30000617:'W2T-TR', 30000635:'W4C8-Q', 30000621:'YE1-9S', 30001170:'1P-WGB', 30001258:'25S-6P', 30001169:'2J-WJY', 30001199:'3-OKDA', 30001212:'3-SFWG', 30001157:'36N-HZ', 30001200:'3GD6-8', 30001260:'4-07MU', 30001201:'4M-HGL', 30001251:'4NBN-9', 30001241:'5-N2EY', 30001167:'6-K738', 30001184:'6-MM99', 30001257:'6BPS-T', 30001197:'6X7-JO', 30001218:'7LHB-Z', 30001232:'7MD-S1', 30001219:'8B-2YA', 30001166:'9-8GBA', 30001154:'9KOE-A', 30001193:'A-803L', 30001210:'A-VILQ', 30001216:'AOK-WQ', 30001203:'AX-DOT', 30001156:'B-3QPD', 30001215:'B-XJX4', 30001235:'BR-N97', 30001245:'BUZ-DB', 30001205:'CB4-Q2', 30001206:'CBL-XP', 30001254:'CNC-4V', 30001224:'CX65-5', 30001253:'CZK-ZQ', 30001229:'E-YJ8G', 30001214:'E1-4YH', 30001217:'E3-SDZ', 30001233:'ERVK-P', 30001181:'EX-0LQ', 30001252:'EX6-AO', 30001171:'F4R2-Q', 30001153:'F9E-KX', 30001256:'FAT-6P', 30001186:'FZ-6A5', 
30001183:'G-7WUF', 30001227:'G-AOTH', 30001198:'GE-8JV', 30001177:'GE-94X', 30001192:'GJ0-OJ', 30001178:'GMLH-K', 30001161:'HED-GP', 30001221:'HP-64T', 30001159:'HY-RWO', 30001188:'I-8D0G', 30001236:'IS-R7P', 30001195:'J-ODE7', 30001230:'J6QB-P', 30001225:'JA-O6J', 30001185:'JBY6-F', 30001243:'JGW-OT', 30001190:'JWZ2-V', 30001172:'K0CN-3', 30001238:'K717-8', 30001231:'KA6D-K', 30001242:'KB-U56', 30001164:'KDF-GY', 30001240:'KH0Z-0', 30001180:'KW-I6T', 30001223:'L-B55M', 30001174:'L7XS-5', 30001182:'MB-NKE', 30001213:'MUXX-4', 30001202:'MY-W1V', 30001209:'N-8BZ6', 30001239:'NH-1X6', 30001191:'OGL8-Q', 30001196:'Q-S7ZD', 30001248:'Q-U96U', 30001165:'QBQ-RF', 30001246:'QETZ-W', 30001163:'QSM-LM', 30001189:'R-K4QY', 30001187:'RNF-YH', 30001259:'RR-D05', 30001176:'S-U2VD', 30001237:'S25C-K', 30001220:'SNFV-I', 30001158:'SV5-8N', 30001228:'TA3T-3', 30001155:'U-QVWD', 30001244:'UCG4-B', 30001234:'UL-7I8', 30001208:'UQ-PWD', 30001162:'V-3YG7', 30001222:'V2-VC2', 30001175:'VA6-DR', 30001250:'W-MPTH', 30001179:'W9-DID', 30001160:'WD-VTV', 30001247:'WFC-MY', 30001207:'WJ-9YO', 30001173:'WLAR-J', 30001194:'WQH-4K', 30001211:'X3FQ-W', 30001249:'X4-WL0', 30001255:'Y-PNRL', 30001204:'YHN-3K', 30001226:'ZQ-Z3Y', 30001168:'ZXIC-7', 30004068:'00TY-J', 30004037:'1-3HWZ', 30004048:'1-NW2G', 30004071:'28O-JY', 30004039:'5-MLDT', 30004067:'5S-KNL', 30004053:'6-4V20', 30004073:'6ON-RW', 30004070:'6RCQ-V', 30004057:'6Z9-0M', 30004045:'77-KDQ', 30004056:'8R-RTB', 30004059:'9-4RP2', 30004040:'B-DBYQ', 30004072:'CX7-70', 30004042:'DY-F70', 30004046:'F7C-H0', 30004043:'FD53-H', 30004058:'FQ9W-C', 30004061:'G8AD-C', 30004064:'MJYW-3', 30004050:'O-0HW8', 30004060:'O-BDXB', 30004049:'O-IVNH', 30004044:'O-ZXUV', 30004052:'OU-X3P', 30004076:'P5-KCC', 30004065:'PPG-XC', 30004054:'Q-UA3C', 30004066:'QA1-BT', 30004041:'QXW-PV', 30004047:'TN-T7T', 30004074:'U65-CN', 30004055:'W-4NUU', 30004075:'X-M9ON', 30004069:'XG-D1L', 30004038:'XT-R36', 30004062:'XZH-4X', 30004051:'YI-8ZM', 30004063:'Z-Y7R7', 
30004214:'0B-VOJ', 30004226:'1GT-MA', 30004189:'1M7-RK', 30004184:'2-9Z6V', 30004165:'4-PCHD', 30004224:'42SU-L', 30004188:'4A-6NI', 30004216:'4GSZ-1', 30004206:'4LNE-M', 30004221:'4T-VDE', 30004166:'5-3722', 30004212:'58Z-IH', 30004168:'5E-EZC', 30004175:'5E6I-W', 30004193:'5ED-4E', 30004185:'5HN-D6', 30004190:'87-1PM', 30004219:'9-7SRQ', 30004201:'9-WEMC', 30004169:'9KE-IT', 30004179:'A-0IIQ', 30004194:'B-U299', 30004163:'BI0Y-X', 30004191:'C2-1B5', 30004180:'CBY8-J', 30004210:'CHP-76', 30004200:'CUT-0V', 30004222:'D9Z-VY', 30004207:'DK0-N8', 30004195:'DN58-U', 30004186:'E-B957', 30004181:'E-BYOS', 30004217:'E-EFAM', 30004208:'E0DR-G', 30004204:'EJ-5X2', 30004182:'ETXT-F', 30004161:'FV-YEA', 30004197:'FV1-RQ', 30004167:'GQLB-V', 30004178:'H23-B5', 30004228:'HB-5L3', 30004205:'HXK-J6', 30004177:'I-CMZA', 30004162:'J-A5QD', 30004215:'J-QOKQ', 30004192:'JE-VLG', 30004209:'KI2-S3', 30004176:'KIG9-K', 30004203:'L-Z9NB', 30004213:'M-VACR', 30004183:'MK-YNM', 30004223:'MO-YDG', 30004199:'O-F4SN', 30004187:'P-H5IY', 30004170:'P-NRD3', 30004229:'Q-VTWJ', 30004198:'QT-EBC', 30004225:'RGU1-T', 30004172:'S-W8CF', 30004218:'SBEN-Q', 30004164:'SK7-G6', 30004211:'T-67F8', 30004202:'U6R-F9', 30004196:'VAF1-P', 30004220:'VEQ-3V', 30004227:'VY-866', 30004173:'X-41DA', 30004171:'Y-RAW3', 30004174:'YVSL-2', 30001009:'0SHT-A', 30001014:'5E-VR8', 30001017:'8G-MQV', 30001022:'AAM-1A', 30001037:'BPK-XK', 30001042:'CL-1JE', 30001025:'CL-85V', 30001034:'CVY-UC', 30001010:'D87E-A', 30001047:'Doril', 30001035:'EQX-AE', 30001029:'ES-UWY', 30001023:'EW-JR5', 30001049:'Farit', 30001041:'G-0Q86', 30001002:'G-G78S', 30001036:'G-R4W1', 30001001:'H-ADOC', 30001016:'HLW-HP', 30001044:'Hemin', 30001043:'J4UD-J', 30001013:'J7A-UR', 30001007:'JWJ-P1', 30001050:'Jamunda', 30001046:'Jorund', 30001011:'K-B2D3', 30001006:'K-MGJ7', 30001026:'K-QWHE', 30001040:'K88X-J', 30001020:'KLMT-W', 30001038:'LJ-YSW', 30001048:'Litom', 30001032:'M-N7WD', 30001027:'MDD-79', 30001005:'OSY-UD', 30001012:'PO4F-3', 
30001033:'QFEW-K', 30001018:'RA-NXN', 30001028:'RMOC-W', 30001030:'S1DP-Y', 30001003:'UW9B-F', 30001045:'Utopia', 30001008:'V-IUEL', 30001015:'V7D-JD', 30001019:'VOL-MI', 30001021:'XX9-WV', 30001031:'Y-DW5K', 30001039:'Y-K50G', 30001024:'YKE4-3', 30001004:'ZZ-ZWC', 30002899:'0P-F3K', 30002934:'0V0R-R', 30002903:'2-KF56', 30002909:'2O9G-D', 30002905:'2R-CRW', 30002919:'33RB-O', 30002937:'3JN9-Q', 30002917:'3OAT-Q', 30002921:'3QE-9Q', 30002938:'3T7-M8', 30002941:'43B-O1', 30002944:'4N-BUI', 30002889:'4U90-Z', 30002950:'5S-KXA', 30002927:'5W3-DG', 30002929:'7T6P-C', 30002924:'85-B52', 30002930:'8S28-3', 30002895:'94-H3F', 30002949:'9CK-KZ', 30002952:'A4L-A2', 30002892:'AD-CBT', 30002933:'AGG-NR', 30002915:'C7Y-7Z', 30002906:'CCP-US', 30002896:'CU9-T0', 30002953:'CZDJ-1', 30002920:'DKUK-G', 30002922:'E-FIC0', 30002931:'E3UY-6', 30002898:'FMB-JP', 30002891:'FO8M-2', 30002947:'GY5-26', 30002908:'I30-3A', 30002907:'II-5O9', 30002942:'J1AU-9', 30002911:'JU-OWQ', 30002900:'K5F-Z2', 30002932:'LEK-N5', 30002928:'LT-DRO', 30002913:'MXX5-9', 30002940:'MZ1E-P', 30002918:'N-TFXK', 30002945:'N2IS-B', 30002910:'NC-N3F', 30002935:'O-2RNZ', 30002936:'OWXT-5', 30002893:'QPO-WI', 30002894:'R8S-1K', 30002954:'RG9-7U', 30002926:'RO0-AF', 30002912:'S-DN5M', 30002890:'T-945F', 30002901:'TXME-A', 30002951:'U-TJ7Y', 30002956:'UEJX-G', 30002955:'UJY-HE', 30002904:'VFK-IV', 30002948:'VPLL-N', 30002939:'WUZ-WM', 30002916:'X-Z4DA', 30002943:'X3-PBC', 30002946:'XCBK-X', 30002897:'XCF-8N', 30002902:'YA0-XJ', 30002925:'YZ-UKA', 30002923:'ZOYW-O', 30002914:'ZZZR-5', 30004756:'0-HDC8', 30004720:'0N-3RO', 30004801:'1-2J4P', 30004738:'1-SMEB', 30004728:'1B-VKF', 30004710:'1DH-SX', 30004759:'1DQ1-A', 30004734:'23G-XC', 30004764:'3-DMQT', 30004722:'319-3D', 30004786:'31X-RE', 30004766:'39P-1J', 30004716:'4K-TRB', 30004746:'4O-239', 30004736:'4X0-8B', 30004789:'5-6QW7', 30004774:'5-CQDA', 30004761:'5BTK-M', 30004740:'6Q-R50', 30004792:'6Z-CKS', 30004790:'7-K6UE', 30004768:'7G-QIG', 30004788:'7UTB-F', 
30004781:'8F-TK3', 30004771:'8RQJ-2', 30004760:'8WA-Z6', 30004796:'9GNS-2', 30004780:'9O-8W1', 30004708:'A-ELE2', 30004717:'AJI-MA', 30004798:'C3N-3S', 30004791:'C6Y-ZF', 30004799:'CX8-6K', 30004750:'D-3GIQ', 30004730:'D-W7F0', 30004725:'E3OI-U', 30004784:'F-9PXR', 30004757:'F-TE1T', 30004732:'FM-JK5', 30004718:'FWST-8', 30004793:'G-M5L3', 30004721:'G-TT5V', 30004778:'GY6A-L', 30004715:'HM-XR2', 30004767:'HZAQ-W', 30004775:'I-E3TG', 30004723:'I3Q-II', 30004726:'IP6V-X', 30004755:'J-LPX7', 30004731:'JP4-AA', 30004751:'K-6K16', 30004794:'KBAK-I', 30004772:'KEE-N6', 30004709:'KFIE-Z', 30004747:'LUA5-L', 30004800:'LWX-93', 30004795:'M-SRKS', 30004802:'M0O-JG', 30004773:'M2-XFE', 30004739:'M5-CGW', 30004743:'MJXW-P', 30004765:'MO-GZ5', 30004762:'N-8YET', 30004783:'N8D9-Z', 30004769:'NIDJ-K', 30004712:'NOL-M9', 30004713:'O-IOAI', 30004733:'PDE-U3', 30004782:'PF-KUQ', 30004711:'PR-8CA', 30004770:'PS-94K', 30004754:'PUIG-F', 30004787:'Q-02UL', 30004737:'Q-HESZ', 30004749:'Q-JQSG', 30004744:'QC-YX6', 30004714:'QX-LIJ', 30004752:'QY6-RK', 30004727:'R5-MM8', 30004742:'RCI-VL', 30004724:'RF-K9W', 30004776:'S-6HHN', 30004758:'SVM-3K', 30004748:'T-IPZB', 30004729:'T-J6HT', 30004745:'T-M0FA', 30004735:'T5ZI-S', 30004779:'UEXO-Z', 30004706:'UHKL-N', 30004753:'W-KQPI', 30004763:'Y-OMTZ', 30004785:'Y5C-YD', 30004797:'YAW-7M', 30004719:'YZ9-F6', 30004707:'Z3V-1W', 30004741:'ZA9-PY', 30004777:'ZXB-VC', 30000105:'Abha', 30000061:'Agha', 30000088:'Akeva', 30000065:'Akhrad', 30000003:'Akpivem', 30000034:'Alkez', 30000058:'Amphar', 30000092:'Aranir', 30000099:'Arena', 30000112:'Arnola', 30000012:'Asabona', 30000084:'Asghatil', 30000095:'Asilem', 30000075:'Assah', 30000113:'Astabih', 30000041:'Bairshir', 30000085:'Bar', 30000039:'Bayuka', 30000110:'Bekirdod', 30000109:'Berta', 30000115:'Bimener', 30000081:'Buftiar', 30000049:'Camal', 30000010:'Chidah', 30000038:'Dooz', 30000102:'Dysa', 30000083:'Ejahi', 30000028:'Eshtah', 30000091:'Eshwil', 30000044:'Faspera', 30000050:'Fera', 
30000023:'Fovihi', 30000017:'Futzchag', 30000107:'Gamis', 30000087:'Gelhan', 30000070:'Gomati', 30000094:'Hahyil', 30000074:'Hasateem', 30000032:'Hasiari', 30000111:'Hothomouh', 30000053:'Ibaria', 30000048:'Ihal', 30000090:'Ilahed', 30000062:'Iosantin', 30000073:'Irshah', 30000093:'Ishkad', 30000071:'Jangar', 30000060:'Janus', 30000082:'Jarizza', 30000004:'Jark', 30000045:'Jaymass', 30000022:'Jayneleb', 30000078:'Jofan', 30000051:'Juddi', 30000030:'Kasrasi', 30000018:'Kazna', 30000098:'Kehrara', 30000116:'Kenobanala', 30000117:'Khabi', 30000056:'Khankenirdia', 30000024:'Kiereend', 30000021:'Kuharah', 30000029:'Lachailes', 30000002:'Lashesih', 30000020:'Lilmad', 30000096:'Mahnagh', 30000104:'Mahti', 30000047:'Majamar', 30000052:'Maspah', 30000046:'Mifrata', 30000079:'Milu', 30000042:'Moh', 30000031:'Mohas', 30000072:'Nakah', 30000016:'Nazhgete', 30000108:'Nieril', 30000057:'Nikh', 30000035:'Nimambal', 30000008:'Nirbhi', 30000077:'Odlib', 30000013:'Onsooh', 30000026:'Ordize', 30000063:'Orva', 30000066:'Pirohdim', 30000019:'Podion', 30000027:'Psasa', 30000033:'Radima', 30000025:'Rashy', 30000059:'Salashayama', 30000043:'Sari', 30000005:'Sasta', 30000015:'Sendaya', 30000103:'Serad', 30000097:'Shach', 30000054:'Shala', 30000014:'Shamahi', 30000067:'Sharir', 30000106:'Shedoo', 30000011:'Shenela', 30000009:'Sooma', 30000089:'Sosa', 30000086:'Sucha', 30000001:'Tanoo', 30000069:'Thiarer', 30000076:'Tidacha', 30000100:'Timeor', 30000118:'Uanzin', 30000114:'Ubtes', 30000101:'Uhtafal', 30000037:'Uplingur', 30000068:'Usroh', 30000040:'Uzistoon', 30000080:'Yadi', 30000036:'Yishinoon', 30000007:'Yuzier', 30000006:'Zaid', 30000055:'Zemalu', 30000064:'Zet', 30000480:'0-G8NO', 30000440:'0-W778', 30000522:'0IF-26', 30000463:'1-GBVE', 30000474:'1-PGSG', 30000445:'1KAW-T', 30000473:'2-X0PF', 30000452:'3-3EZB', 30000448:'3-LJW3', 30000455:'4NDT-W', 30000453:'52CW-6', 30000442:'5J4K-9', 30000468:'5OJ-G2', 30000513:'62O-UE', 30000457:'6OU9-U', 30000525:'7-A6XV', 30000487:'7-P1JO', 
30000483:'77S8-E', 30000494:'8FN-GP', 30000469:'9-02G0', 30000454:'9-OUGJ', 30000458:'9N-0HF', 30000491:'A-7XFN', 30000476:'A-C5TC', 30000435:'B-5UFY', 30000504:'BOZ1-O', 30000446:'C5-SUU', 30000461:'D-0UI0', 30000441:'DG-8VJ', 30000518:'DVWV-3', 30000516:'DX-DFJ', 30000490:'DX-TAR', 30000502:'E-1XVP', 30000503:'E-ACV6', 30000524:'E51-JE', 30000486:'EDQG-L', 30000437:'EU9-J3', 30000497:'F2W-C6', 30000500:'F9O-U9', 30000495:'FIDY-8', 30000484:'FMH-OV', 30000478:'FR46-E', 30000460:'G3D-ZT', 30000464:'GC-LTF', 30000456:'GR-X26', 30000523:'H-93YV', 30000444:'H-FGJO', 30000482:'HZFJ-M', 30000520:'I-9GI1', 30000509:'IAS-I5', 30000489:'J-L9MA', 30000451:'JFV-ID', 30000510:'K7S-FF', 30000519:'KE-0FB', 30000498:'KZ9T-C', 30000528:'L-L7PE', 30000467:'L-QQ6P', 30000462:'L8-WNE', 30000466:'LT-XI4', 30000471:'M-XUZZ', 30000443:'MD-0AW', 30000465:'NB-ALM', 30000492:'O3-4MN', 30000512:'O5Q7-U', 30000439:'OEG-K9', 30000472:'OFVH-Y', 30000450:'P7MI-T', 30000438:'PQRE-W', 30000507:'Q0J-RH', 30000505:'QIMO-2', 30000475:'QLPX-J', 30000481:'QRFJ-Q', 30000526:'QXE-1N', 30000511:'RT-9WL', 30000477:'RZ-PIY', 30000501:'S-51XG', 30000508:'SAI-T9', 30000436:'SK42-F', 30000479:'SLVP-D', 30000515:'SY-UWN', 30000488:'T-0JWP', 30000485:'TYB-69', 30000493:'U-MFTL', 30000459:'U-OVFR', 30000514:'U0W-DR', 30000527:'U69-YC', 30000434:'V-4DBR', 30000521:'W6P-7U', 30000433:'WU-FHQ', 30000517:'X-31TE', 30000496:'X40H-9', 30000470:'XA5-TY', 30000447:'XSUD-1', 30000499:'XW2H-V', 30000506:'Z-2Y2Y', 30000449:'ZLO3-V', 30002992:'Akes', 30002977:'Arayar', 30003007:'Arveyil', 30002960:'Arzad', 30002978:'Asghed', 30002965:'Choonka', 30002967:'Dihra', 30002968:'Dital', 30002997:'Ehnoum', 30002969:'Eredan', 30002964:'Esescama', 30002962:'Ezzara', 30003002:'Faktun', 30002972:'Gheth', 30002990:'Hakshma', 30003003:'Halenan', 30002981:'Halmah', 30002994:'Hati', 30002984:'Ibash', 30002985:'Itsyamil', 30002976:'Labapi', 30002989:'Laddiaha', 30002973:'Lisudeh', 30002996:'Lower Debyl', 30002974:'Mehatoor', 
30002986:'Mendori', 30003001:'Mili', 30002995:'Naeel', 30002988:'Nakatre', 30003006:'Nidebora', 30002963:'Odin', 30002970:'Ohide', 30002961:'Oyeman', 30003008:'Palpis', 30002958:'Raa', 30002982:'Rahadalon', 30002993:'Riavayed', 30002975:'Roushzar', 30002971:'Sasoutikh', 30002999:'Shastal', 30002959:'Sifilar', 30002983:'Soosat', 30002980:'Sosan', 30002979:'Tararan', 30003000:'Thakala', 30002966:'Thasinaz', 30002957:'Tzvi', 30002991:'Uadelah', 30003005:'Uktiad', 30003004:'Ulerah', 30002998:'Upper Debyl', 30002987:'Ussad', 30003512:'Abaim', 30002268:'Adia', 30003535:'Afivad', 30002220:'Aghesi', 30002247:'Ahala', 30002266:'Ahmak', 30002222:'Airshaz', 30002197:'Akhragan', 30003479:'Akila', 30002216:'Aldali', 30003521:'Alkabsi', 30002187:'Amarr', 30003480:'Amod', 30003552:'Ana', 30003485:'Andabiar', 30003515:'Anila', 30003487:'Arbaz', 30002231:'Ardishapur Prime', 30002244:'Arera', 30002205:'Armala', 30002240:'Arodan', 30002253:'Arshat', 30003556:'Arton', 30002278:'Artoun', 30003491:'Ashab', 30002272:'Asoutar', 30002270:'Avair', 30002265:'Azizora', 30003525:'Bagodan', 30003502:'Bahromab', 30003548:'Barira', 30003478:'Basan', 30002199:'Bashakru', 30002282:'Bhizheba', 30002252:'Bika', 30003561:'Biphi', 30003555:'Bittanshal', 30002188:'Boranai', 30002238:'Bourar', 30002207:'Cailanar', 30002224:'Charra', 30003489:'Chaven', 30003529:'Chemilip', 30003527:'Chesoh', 30002275:'Clarelam', 30002233:'Dakba', 30013489:'Deepari', 30003476:'Ealur', 30002281:'Eba', 30002196:'Ebidan', 30002269:'Ebo', 30002277:'Ebtesham', 30003494:'Ekid', 30003537:'Erzoh', 30002260:'Esteban', 30003517:'Etav', 30002221:'Fabin', 30003505:'Fabum', 30003473:'Fahruni', 30003541:'Faswiba', 30023489:'Fora', 30002204:'Gaha', 30003544:'Galeh', 30002232:'Gid', 30003509:'Gosalav', 30002264:'Hadonoo', 30003533:'Hahda', 30002250:'Hai', 30003523:'Hama', 30003547:'Hamse', 30033489:'Hanan', 30002225:'Harva', 30003542:'Hayumtom', 30002189:'Hedion', 30003528:'Herila', 30002214:'Hiramu', 30003531:'Hisoufad', 
30002245:'Hizhara', 30043489:'Horir', 30003560:'Hoshoun', 30002217:'Hutian', 30003513:'Ides', 30002208:'Ilonarav', 30003524:'Irnal', 30002192:'Irnin', 30002276:'Isamm', 30003554:'Jambu', 30002263:'Jarshitsan', 30003551:'Jaswelu', 30002254:'Jerma', 30003532:'Jesoyeh', 30002210:'Joppaya', 30002193:'Kehour', 30003486:'Kheram', 30003490:'Khopa', 30002248:'Knophtikoo', 30003501:'Kudi', 30003519:'Lahnina', 30003549:'Lashkai', 30002261:'Luromooh', 30002190:'Mabnen', 30003558:'Madimal', 30003503:'Madirmilire', 30003520:'Mahrokht', 30003499:'Mai', 30003546:'Maiah', 30002242:'Mamenkhanar', 30003559:'Mamet', 30002194:'Martha', 30002213:'Mazitah', 30003538:'Merz', 30003539:'Miakie', 30002198:'Mikhir', 30002236:'Milal', 30003563:'Misaba', 30003482:'Mista', 30002255:'Miyeli', 30002257:'Moussou', 30002206:'Murema', 30003526:'Murzi', 30002258:'Nadohman', 30003475:'Naguton', 30003496:'Nakri', 30002262:'Nalu', 30003534:'Namaili', 30002202:'Narai', 30002228:'Nererut', 30002246:'Neziel', 30003504:'Niarja', 30002234:'Nifshed', 30002218:'Noli', 30002219:'Nomash', 30003492:'Orkashu', 30002223:'Patzcha', 30003516:'Pedel', 30002211:'Pelkia', 30003488:'Penirgman', 30002273:'Porsharrah', 30002239:'Rammi', 30003530:'Raravath', 30003495:'Raravoss', 30002212:'Raren', 30002227:'Rasile', 30002271:'Rayl', 30003564:'Rephirib', 30002256:'Reyi', 30002241:'Rimbah', 30002249:'Ruchy', 30003506:'Saana', 30002251:'Sadye', 30002279:'Safizon', 30003474:'Sahda', 30002259:'Sahdil', 30003518:'Saheri', 30002215:'Sakhti', 30003522:'Sarum Prime', 30003508:'Sayartchen', 30002243:'Seiradih', 30002267:'Shabura', 30003477:'Shajarleg', 30003498:'Sharhelund', 30003500:'Sharji', 30002235:'Shumam', 30002201:'Shuria', 30003484:'Sibot', 30003557:'Sieh', 30002195:'Simbeloud', 30003540:'Sirkahri', 30002229:'Sitanan', 30002237:'Sobenah', 30003511:'Somouh', 30003510:'Sorzielang', 30002200:'Sukirah', 30002274:'Tastela', 30003507:'Teshi', 30002226:'Thebeka', 30002191:'Toshabia', 30002209:'Uchat', 30003481:'Unefsih', 
30003536:'Uzigh', 30003483:'Valmu', 30002230:'Vashkah', 30003553:'Warouh', 30003514:'Yeeramoun', 30003493:'Youl', 30003545:'Yuhelia', 30003497:'Zaimeth', 30003543:'Zanka', 30002280:'Zatsyaki', 30003550:'Zhilshinou', 30002203:'Ziona', 30003562:'Ziriert', 30003114:'0-O6XF', 30003149:'02V-BK', 30003147:'111-F1', 30003122:'16P-PX', 30003171:'29YH-V', 30003182:'2R-KLH', 30003105:'4-OUKF', 30003131:'450I-W', 30003111:'5-9UXZ', 30003145:'6-TYRX', 30003139:'6EK-BV', 30003179:'6SB-BN', 30003165:'7P-J38', 30003100:'A-CJGE', 30003125:'A1-AUH', 30003150:'A5MT-B', 30003180:'B1D-KU', 30003138:'BY-MSY', 30003121:'BZ-0GW', 30003169:'C-PEWN', 30003113:'C-VZAK', 30003109:'C9N-CC', 30003123:'CR-0E5', 30003134:'CZ6U-1', 30003115:'D-FVI7', 30003135:'D-PNP9', 30003156:'DIBH-Q', 30003170:'DL-CDY', 30003157:'DNEP-Y', 30003108:'DTX8-M', 30003136:'E1UU-3', 30003126:'F-UVBV', 30003118:'FN-GFQ', 30003161:'G-4H4C', 30003155:'G-JC9R', 30003133:'G-YZUX', 30003101:'G2-INZ', 30003144:'H-T40Z', 30003160:'H-YHYM', 30003106:'HAJ-DQ', 30003162:'HHE5-L', 30003099:'HHQ-M1', 30003103:'HT4K-M', 30003176:'IPX-H5', 30003140:'IR-FDV', 30003142:'J-RVGD', 30003107:'JAUD-V', 30003148:'JD-TYH', 30003177:'KSM-1T', 30003168:'L-M6JK', 30003172:'LG-RO2', 30003153:'MS2-V8', 30003117:'NH-R5B', 30003141:'NIZJ-0', 30003132:'OIOM-Y', 30003137:'P-3XVV', 30003163:'P9F-ZG', 30003159:'PE-H02', 30003167:'PK-PHZ', 30003112:'Q0OH-V', 30003146:'Q1-R7K', 30003164:'QFGB-E', 30003181:'QFIU-K', 30003174:'QS-530', 30003151:'R-ARKN', 30003127:'R-FM0G', 30003104:'RBW-8G', 30003152:'SN9S-N', 30003128:'TEIZ-C', 30003130:'V-XANH', 30003143:'V1ZC-S', 30003116:'VL7-60', 30003175:'VR-YRV', 30003129:'VUAC-Y', 30003098:'VYJ-DA', 30003102:'WAC-HW', 30003166:'WT-2J9', 30003120:'WX-6UX', 30003110:'X-7BIX', 30003173:'X-HISR', 30003119:'XKZ8-H', 30003158:'YAP-TN', 30003178:'YRV-MZ', 30003154:'Z-MO29', 30003124:'Z-Y9C3', 30004984:'Abune', 30004981:'Actee', 30005003:'Adirain', 30005021:'Adrel', 30005006:'Aere', 30005008:'Aeschee', 
30004972:'Algogille', 30004995:'Allamotte', 30005009:'Allebin', 30004983:'Amane', 30005028:'Andole', 30005022:'Ane', 30004989:'Annages', 30005019:'Aporulie', 30004994:'Arant', 30005001:'Arnon', 30005024:'Atlangeins', 30005010:'Atlulle', 30005004:'Attyn', 30004973:'Caslemon', 30005026:'Cat', 30004976:'Charmerout', 30005023:'Clorteler', 30014971:'Couster', 30004987:'Deninard', 30005025:'Derririntel', 30004985:'Deven', 30005011:'Droselory', 30004971:'Duripant', 30004986:'Estaunitte', 30004980:'Fliet', 30005012:'Haine', 30024971:'Hecarrin', 30034971:'Henebene', 30004979:'Heydieles', 30004988:'Hulmate', 30005005:'Ignebaener', 30004982:'Indregulle', 30005014:'Isenan', 30004974:'Jolevier', 30004999:'Ladistier', 30005002:'Laurvier', 30005007:'Lisbaetanne', 30004967:'Luminaire', 30044971:'Mesokel', 30004975:'Mesybier', 30004968:'Mies', 30005018:'Noghere', 30004996:'Obalyu', 30005000:'Old Man Star', 30005027:'Ommare', 30004990:'Onne', 30004969:'Oursulaert', 30004992:'Palmon', 30004998:'Parts', 30004978:'Pemene', 30005013:'Perckhevin', 30004970:'Renyn', 30005020:'Seyllin', 30005015:'Synchelle', 30005029:'Vale', 30004997:'Vifrevaert', 30004993:'Villore', 30004991:'Vitrauze', 30005016:'Wysalan', 30005017:'Yona', 30004977:'Yvangier', 30002308:'0M-24X', 30002317:'1ACJ-6', 30002332:'1GH-48', 30002329:'1H5-3W', 30002365:'1PF-BC', 30002342:'2B-UUQ', 30002283:'2G-VDP', 30002302:'3G-LFX', 30002325:'3H58-R', 30002357:'3IK-7O', 30002344:'4-QDIX', 30002350:'43-1TL', 30002297:'4LJ6-Q', 30002291:'5J-62N', 30002313:'5U-3PW', 30002293:'8-MXHA', 30002346:'89-JPE', 30002314:'89JS-J', 30002354:'8KE-YS', 30002284:'9F-3CR', 30002287:'9P-870', 30002339:'9QS5-C', 30002289:'AID-9T', 30002338:'ALC-JM', 30002334:'B-2VXB', 30002318:'BNX-AS', 30002305:'BY-7PY', 30002375:'C-4D0W', 30002367:'C-V6DQ', 30002315:'C9R-NO', 30002362:'CL-IRS', 30002373:'CT8K-0', 30002312:'CYB-BZ', 30002304:'D-CR6W', 30002347:'D-IZT9', 30002366:'D-OJEZ', 30002295:'D3S-EA', 30002378:'DYPL-6', 30002349:'E8-432', 30002369:'EX-GBT', 
30002381:'F69O-M', 30002320:'F9-FUV', 30002321:'FB-MPY', 30002345:'FGJP-J', 30002335:'FIZU-X', 30002316:'FKR-SR', 30002301:'G-QTSD', 30002306:'GN-TNT', 30002328:'GTY-FW', 30002356:'HV-EAP', 30002343:'I64-XB', 30002372:'IL-H0A', 30002333:'IRD-HU', 30002331:'IS-OBW', 30002285:'J7M-3W', 30002336:'JAWX-R', 30002323:'JTA2-2', 30002296:'KGT3-6', 30002311:'KMH-J1', 30002286:'KRPF-A', 30002300:'L-ZJLN', 30002376:'L4X-1V', 30002294:'LPVL-5', 30002355:'LXQ2-T', 30002377:'M-V0PQ', 30002374:'M9-LAN', 30002299:'MF-PGF', 30002359:'MO-I1W', 30002341:'N-SFZK', 30002309:'N06Z-Q', 30002303:'NK-VTL', 30002340:'NWX-LI', 30002358:'O-EUHA', 30002351:'O-LJOO', 30002370:'PX-IHN', 30002290:'PXE-RG', 30002363:'QBZO-R', 30002364:'QHJR-E', 30002307:'QKCU-4', 30002288:'QNXJ-M', 30002330:'QZV-X3', 30002324:'R-6KYM', 30002380:'RK-Q51', 30002322:'RO-0PZ', 30002326:'RV-GA8', 30002298:'SAH-AD', 30002382:'T-IDGH', 30002327:'TP-RTO', 30002353:'TZ-74M', 30002361:'UAV-1E', 30002379:'V-OL61', 30002371:'WPV-JN', 30002348:'WU9-ZR', 30002319:'XB-9U2', 30002310:'YX-0KH', 30002292:'Z-DRIY', 30002368:'Z-FET0', 30002337:'Z0G-XG', 30002352:'ZS-PNI', 30002360:'ZZ5X-M', 30003028:'Aclan', 30003059:'Adeel', 30003015:'Aice', 30003042:'Alachene', 30003012:'Amattens', 30003062:'Angatalie', 30003046:'Angymonne', 30003051:'Antollare', 30003030:'Ardallabier', 30003009:'Arnatele', 30003031:'Athinard', 30003038:'Atlanins', 30003053:'Avele', 30003047:'Averon', 30003055:'Aydoteaux', 30003018:'Azer', 30003014:'Bereye', 30003040:'Bille', 30003026:'Blameston', 30003048:'Carirgnottin', 30003019:'Cherore', 30003041:'Colcer', 30003044:'Elarel', 30003045:'Enedore', 30003033:'Ethernity', 30003036:'Frarolle', 30003023:'Gerper', 30003035:'Gicodel', 30003057:'Groothese', 30003010:'Halle', 30003017:'Harerget', 30003029:'Jaschercis', 30003016:'Junsoraert', 30003013:'Jurlesel', 30003049:'Laic', 30003039:'Leremblompes', 30003025:'Lirsautton', 30003060:'Mannar', 30003024:'Marosier', 30003034:'Mattere', 30003032:'Meves', 30003061:'Mormelot', 
30003011:'Mormoen', 30003021:'Mosson', 30003056:'Muer', 30003022:'Mya', 30003050:'Odixie', 30003058:'Olide', 30003037:'Quier', 30003054:'Scuelazyns', 30003052:'Tolle', 30003020:'Torvi', 30003043:'Uphene', 30003027:'Vaurent', 30003693:'0-ARFO', 30003702:'8QMO-E', 30003695:'8W-OSE', 30003692:'C-OK0R', 30003697:'C4C-Z4', 30003676:'C8-CHY', 30003678:'CR-IFM', 30003681:'DO6H-Q', 30003682:'DW-T2I', 30003677:'E-9ORY', 30003694:'E9KD-N', 30003691:'FIO1-8', 30003698:'GME-PQ', 30003679:'HHK-VL', 30003701:'I-UUI5', 30003687:'K4YZ-Y', 30003689:'L-C3O7', 30003684:'L-SCBU', 30003699:'MPPA-A', 30003683:'O-CNPR', 30003686:'O1Y-ED', 30003680:'P-33KR', 30003685:'VRH-H7', 30003696:'WQY-IQ', 30003688:'X36Y-G', 30003700:'X5-UME', 30003690:'YKSC-A', 30004483:'0OYZ-G', 30004454:'2-F3OE', 30004492:'2-RSC7', 30004467:'23M-PX', 30004447:'2UK4-N', 30004466:'3-BADZ', 30004405:'3-YX2D', 30004419:'3L-Y9M', 30004415:'4AZ-J8', 30004412:'5-IZGE', 30004455:'5-LCI7', 30004445:'5ELE-A', 30004477:'5P-AIP', 30004432:'5XR-KZ', 30004486:'6-ELQP', 30004442:'6O-XIO', 30004429:'75C-WN', 30004409:'9-ZFCG', 30004450:'AZN-D2', 30004427:'BG-W90', 30004417:'BGN1-O', 30004444:'BJ-ZFD', 30004422:'BJD4-E', 30004420:'BLC-X0', 30004434:'C-0ND2', 30004407:'CFLF-P', 30004458:'CL-J9W', 30004491:'D4-2XN', 30004480:'D6SK-L', 30004470:'DB1R-4', 30004418:'DUU1-K', 30004451:'E-PR0S', 30004414:'F-8Y13', 30004474:'GHZ-SJ', 30004465:'GPUS-A', 30004446:'H-P4LB', 30004443:'H65-HE', 30004481:'HYPL-V', 30004430:'I5Q2-S', 30004482:'I9-ZQZ', 30004410:'J-TPTA', 30004460:'J94-MU', 30004435:'JI-LGM', 30004462:'JO-32L', 30004490:'K-9UG4', 30004475:'K-J50B', 30004421:'K-X5AX', 30004488:'KJ-V0P', 30004449:'M-CMLV', 30004478:'M-PGT0', 30004461:'M2GJ-X', 30004464:'MSKR-1', 30004476:'NLO-3Z', 30004479:'NPD9-A', 30004438:'NW2S-A', 30004440:'NX5W-U', 30004424:'O9V-R7', 30004487:'OBK-K8', 30004426:'OCU4-R', 30004413:'OXC-UL', 30004471:'P8-BKO', 30004411:'PMV-G6', 30004431:'PO-3QW', 30004408:'QBH5-F', 30004448:'QK-CDG', 30004473:'R4K-8L', 
30004485:'R97-CI', 30004472:'RIT-A7', 30004484:'SWBV-2', 30004452:'TR07-S', 30004423:'TSG-NO', 30004436:'U-BXU9', 30004439:'U-JJEW', 30004441:'U1-C18', 30004463:'UB5Z-3', 30004404:'UD-VZW', 30004468:'UTDH-N', 30004406:'V-TN6Q', 30004433:'VF-FN6', 30004453:'VNGJ-U', 30004457:'VVO-R6', 30004416:'X6-J6R', 30004428:'Y-YGMW', 30004456:'Y2-I3W', 30004459:'YHP2-D', 30004425:'Z-PNIA', 30004489:'ZID-LE', 30004469:'ZS-2LT', 30004437:'ZXOG-O', 30004656:'006-L3', 30004596:'00GD-D', 30004650:'1-5GBW', 30004639:'14YI-D', 30004611:'15U-JY', 30004632:'38IA-E', 30004555:'3WE-KY', 30004628:'3ZTV-V', 30004553:'4-EP12', 30004661:'4HS-CR', 30004573:'5-D82P', 30004643:'57-KJB', 30004604:'671-ST', 30004591:'6F-H3W', 30004608:'6VDT-H', 30004585:'7-8S5X', 30004620:'75FA-Z', 30004635:'7BIX-A', 30004616:'7BX-6F', 30004587:'7X-02R', 30004640:'87XQ-0', 30004574:'8ESL-G', 30004557:'9-VO0Q', 30004629:'9D6O-M', 30004646:'9DQW-W', 30004582:'9O-ORX', 30004569:'9R4-EJ', 30004619:'A-1CON', 30004605:'A-HZYL', 30004558:'A8-XBW', 30004563:'AC2E-3', 30004578:'AL8-V4', 30004576:'APM-6K', 30004654:'ATQ-QS', 30004614:'AV-VB6', 30004590:'B17O-R', 30004599:'B32-14', 30004562:'BYXF-Q', 30004564:'C-C99Z', 30004651:'C-FER9', 30004600:'C-N4OD', 30004597:'C1XD-X', 30004601:'CHA2-Q', 30004565:'CL-BWB', 30004622:'D-Q04X', 30004588:'D2AH-Z', 30004626:'D4KU-5', 30004595:'DBRN-Z', 30004567:'E-BWUU', 30004586:'EI-O0O', 30004603:'ESC-RI', 30004653:'F-88PJ', 30004652:'F2-2C3', 30004631:'G-UTHL', 30004664:'G1CA-Y', 30004598:'G95F-H', 30004592:'H-NPXW', 30004606:'H-S80W', 30004615:'HMF-9D', 30004636:'I-CUVX', 30004583:'IGE-RI', 30004556:'IR-WT1', 30004637:'J-RQMF', 30004589:'J5A-IX', 30004575:'JGOW-Y', 30004572:'K8L-X7', 30004579:'KCT-0A', 30004642:'KVN-36', 30004593:'L-1SW8', 30004625:'L-A5XP', 30004659:'L7-APB', 30004663:'LBGI-2', 30004630:'LIWW-P', 30004641:'LJ-TZW', 30004633:'M-KXEH', 30004618:'MN5N-X', 30004580:'N2-OQG', 30004609:'NDH-NV', 30004612:'NY6-FH', 30004649:'O-PNSN', 30004645:'OL3-78', 30004581:'OW-TPO', 
30004624:'P5-EFH', 30004657:'PB-0C1', 30004559:'PNQY-Y', 30004647:'PXF-RF', 30004571:'Q-XEB3', 30004610:'QV28-G', 30004648:'R-BGSU', 30004566:'R3W-XU', 30004577:'RE-C26', 30004560:'RP2-OQ', 30004570:'SPLE-Y', 30004623:'Serpentis Prime', 30004638:'TEG-SD', 30004634:'TU-Y2A', 30004594:'U-SOH2', 30004602:'UAYL-F', 30004644:'V6-NY1', 30004662:'WMH-SO', 30004621:'WY-9LL', 30004552:'XF-TQL', 30004613:'XJP-Y7', 30004655:'XUW-3X', 30004568:'Y-1W01', 30004665:'Y-2ANO', 30004627:'YRNJ-8', 30004561:'YVBE-E', 30004617:'YZ-LQL', 30004554:'YZS5-4', 30004666:'Z-YN5Y', 30004607:'Z30S-A', 30004584:'Z9PP-H', 30004660:'ZTS-4D', 30004658:'ZUE-NS', 30002454:'0-GZX9', 30002460:'04-LQM', 30002424:'2E-ZR5', 30002455:'2H-TSE', 30002444:'39-DGG', 30002486:'3SFU-S', 30002449:'3USX-F', 30002441:'4-CUM5', 30002469:'4D9-66', 30002495:'4K0N-J', 30002456:'4NGK-F', 30002461:'4VY-Y1', 30002480:'54-MF6', 30002501:'5F-YRA', 30002459:'6L78-1', 30002446:'6RQ9-A', 30002465:'6YC-TU', 30002484:'8-KZXQ', 30002442:'8MG-J6', 30002450:'9-KWXC', 30002472:'9P4O-F', 30002478:'AD-5B8', 30002453:'AP9-LV', 30002489:'Atioth', 30002496:'B-F1MI', 30002436:'B6-52M', 30002498:'BE-UUN', 30002438:'BND-16', 30002440:'BWF-ZZ', 30002483:'CFYY-J', 30002481:'D-I9HJ', 30002430:'D0-F4W', 30002477:'E-91FV', 30002475:'EOA-ZC', 30002423:'FDZ4-A', 30002476:'G-73MR', 30002468:'HJO-84', 30002485:'HKYW-T', 30002439:'IOO-7O', 30002500:'JE1-36', 30002464:'K25-XD', 30002447:'K42-IE', 30002452:'KR-V6G', 30002434:'L-HV5C', 30002470:'L-TOFR', 30002435:'L4X-FH', 30002493:'LR-2XT', 30002462:'LU-HQS', 30002458:'LX-ZOJ', 30002428:'M-MD31', 30002421:'MR4-MY', 30002492:'N-HK93', 30002433:'NBPH-N', 30002451:'NQ-9IH', 30002457:'O-VWPB', 30002425:'O1-FTD', 30002499:'O2O-2X', 30002427:'OEY-OR', 30002482:'P-6I0B', 30002467:'P-E9GN', 30002490:'PYY3-5', 30002471:'Q-TBHW', 30002431:'QKTR-L', 30002479:'QP0K-B', 30002491:'RFGW-V', 30002443:'RLSI-V', 30002426:'Roua', 30002422:'SR-KBB', 30002445:'SV-K8J', 30002502:'TDE4-H', 30002474:'TJM-JJ', 
30002494:'TZL-WT', 30002463:'U-L4KS', 30002488:'U6D-9A', 30002473:'UBX-CC', 30002503:'UER-TH', 30002504:'UG-UWZ', 30002437:'V-MZW0', 30002487:'VJ-NQP', 30002448:'VSJ-PP', 30002497:'W-3BSU', 30002429:'WH-2EZ', 30002466:'Y8R-XZ', 30002432:'YN3-E3', 30005254:'Abhan', 30005266:'Access', 30005292:'Agal', 30005196:'Ahbazon', 30005225:'Alal', 30005279:'Anara', 30005264:'Angur', 30005261:'Antem', 30005249:'Anyed', 30005252:'Anzalaisio', 30005216:'Apanake', 30005239:'Aring', 30005251:'Asanot', 30005214:'Ashokon', 30005224:'Assez', 30005197:'Atreen', 30005215:'Avyuh', 30005275:'Azedi', 30005294:'Bania', 30005257:'Bantish', 30005234:'Beke', 30005267:'Bherdasopt', 30005287:'Canard', 30005283:'Central Point', 30005228:'Chamja', 30005237:'Chej', 30005253:'Chiga', 30005194:'Cleyd', 30005285:'Dead End', 30005229:'Diaderi', 30005262:'Djimame', 30005226:'Dom-Aphis', 30005293:'Doza', 30005291:'Ebasez', 30005202:'Emsar', 30005281:'Exit', 30005273:'Galnafsad', 30005282:'Gateway', 30005240:'Gayar', 30005244:'Gergish', 30005288:'Girani-Fa', 30005268:'Gonditsa', 30005250:'Habu', 30005223:'Hadji', 30005265:'Hangond', 30005290:'Heorah', 30005213:'Hesarid', 30005248:'Hirizan', 30005227:'Iderion', 30005246:'Imya', 30005256:'Itrin', 30005206:'Kemerk', 30005260:'Keri', 30005220:'Keseya', 30005247:'Kobam', 30005258:'Korridi', 30005259:'Lela', 30005233:'Leran', 30005193:'Lor', 30005243:'Madomi', 30005210:'Makhwasan', 30005235:'Malma', 30005201:'Manarq', 30005230:'Manatirid', 30005238:'Menai', 30005263:'Mozzidit', 30005242:'Naka', 30005207:'Nardiarang', 30005289:'Nasreri', 30005286:'New Eden', 30005236:'Noranim', 30005272:'Olin', 30005274:'Otakod', 30005203:'Ourapheh', 30005198:'Pakhshi', 30005232:'Pamah', 30005280:'Partod', 30005231:'Pashanai', 30005241:'Petidu', 30005277:'Pirna', 30005284:'Promised Land', 30005255:'Saphthar', 30005222:'Serren', 30005278:'Seshi', 30005270:'Shalne', 30005271:'Shapisin', 30005276:'Sharza', 30005192:'Shera', 30005217:'Sheroo', 30005209:'Sibe', 30005219:'Sigga', 
30005269:'Simela', 30005218:'Sosh', 30005245:'Tahli', 30005199:'Tar', 30005205:'Tarta', 30005200:'Tekaima', 30005212:'Toon', 30005195:'Vecamia', 30005204:'Yulai', 30005211:'Zarer', 30005208:'Ziasad', 30005221:'Zoohen', 30000995:'0-3VW8', 30000929:'0NV-YU', 30000966:'0PI4-E', 30000940:'0R-GZQ', 30000931:'168-6H', 30000963:'1C-953', 30000962:'1L-AED', 30000996:'28-QWU', 30000909:'2X7Z-L', 30000976:'4M-P1I', 30000912:'504Z-V', 30000989:'52V6-B', 30000908:'56D-TC', 30000992:'5FCV-A', 30000985:'66U-1P', 30000907:'6EG7-R', 30000967:'6WT-BE', 30000919:'7-IDWY', 30000955:'7JF-0Z', 30000943:'7Q-8Z2', 30000910:'8DL-CP', 30000942:'8YC-AN', 30000939:'9-34L5', 30000994:'92-B0X', 30000969:'9SNK-O', 30000914:'AB-FZE', 30000933:'AI-EVH', 30000920:'AZF-GH', 30000961:'B-ROFP', 30000970:'B-VIP9', 30000986:'BRT-OP', 30000925:'BY5-V8', 30000951:'CI4M-T', 30000923:'CRXA-Y', 30000954:'DE71-9', 30000903:'E02-IK', 30000980:'EOE3-N', 30000934:'F-MKH3', 30000959:'F5-CGW', 30000981:'F7A-MR', 30000913:'F8K-WQ', 30000906:'FVXK-D', 30000936:'GF-3FL', 30000974:'H-8F5Q', 30000973:'H7O-JZ', 30000960:'H9S-WC', 30000979:'HB-1NJ', 30000952:'I-QRJA', 30000956:'IX8-JB', 30000901:'JPL-RA', 30000987:'JUK0-1', 30001000:'K-IYNW', 30000968:'L1S-G1', 30000971:'LXTC-S', 30000922:'M-EKDF', 30000905:'M-MD3B', 30000953:'M-YWAL', 30000998:'M9U-75', 30000915:'N-6Z8B', 30000904:'N-DQ0D', 30000999:'N-RAEL', 30000917:'NE-3GR', 30000900:'NIH-02', 30000902:'NK-7XO', 30000982:'O-8SOC', 30000993:'O-OVOQ', 30000975:'O-RXCZ', 30000983:'OJOS-T', 30000945:'OK-6XN', 30000949:'P1T-LP', 30000977:'P7UZ-T', 30000990:'PUC-JZ', 30000978:'PUZ-IO', 30000946:'Q2FL-T', 30000941:'QM-20X', 30000938:'QQ3-YI', 30000950:'R-ESG0', 30000991:'SB-23C', 30000964:'SL-YBS', 30000944:'SUR-F7', 30000926:'TET3-B', 30000948:'U3K-4A', 30000997:'UD-AOK', 30000911:'UMDQ-6', 30000965:'UNJ-GX', 30000921:'UT-UZB', 30000930:'V-2GYS', 30000988:'V-IH6B', 30000984:'V89M-R', 30000927:'VKU-BG', 30000924:'VXO-OM', 30000932:'W-RFUO', 30000972:'WE3-BX', 
30000928:'WPR-EI', 30000957:'WTIE-6', 30000958:'Y-DSSK', 30000918:'Y4-GQV', 30000947:'Y7-XFD', 30000916:'YUY-LM', 30000937:'ZJ-GOU', 30000935:'ZM-DNR', 30002507:'Abudban', 30002512:'Alakgur', 30002537:'Amamake', 30002511:'Ameinaka', 30002547:'Ammold', 30002565:'Appen', 30002551:'Aralgrund', 30002557:'Atgur', 30002542:'Auga', 30002561:'Auren', 30002523:'Austraka', 30002530:'Avesber', 30002528:'Balginia', 30002553:'Bogelek', 30002514:'Bosboger', 30002577:'Bundindus', 30002541:'Dal', 30002513:'Dammalin', 30002520:'Dumkirinur', 30002535:'Ebasgerdur', 30002536:'Ebodold', 30002552:'Eddar', 30002518:'Edmalbrurdus', 30002563:'Egmur', 30002555:'Eifer', 30002548:'Emolgranlan', 30002558:'Endrulf', 30002543:'Eystur', 30002526:'Frarn', 30002533:'Gerbold', 30002531:'Gerek', 30002517:'Gulmorogod', 30002560:'Gultratren', 30002556:'Gusandall', 30002529:'Gyng', 30012547:'Hadaugago', 30002579:'Hedgiviter', 30002574:'Hrondedir', 30002576:'Hrondmund', 30002505:'Hulm', 30002572:'Hurjafren', 30002527:'Illinfrik', 30002559:'Ingunn', 30002546:'Isendeldik', 30002524:'Ivar', 30002564:'Javrendei', 30002567:'Jorus', 30002580:'Katugumur', 30002566:'Klir', 30022547:'Krilmokenur', 30002519:'Kronsur', 30002540:'Lantorn', 30032547:'Larkugei', 30042547:'Loguttur', 30002516:'Lulm', 30002545:'Lustrevik', 30002570:'Magiko', 30012505:'Malukker', 30002525:'Meirakulf', 30002522:'Obrolber', 30002509:'Odatrik', 30002549:'Offugen', 30002515:'Olfeim', 30002568:'Onga', 30002571:'Oremmulf', 30002569:'Osaumuni', 30002506:'Osoggur', 30002578:'Otraren', 30002544:'Pator', 30002510:'Rens', 30002534:'Rokofur', 30002550:'Roniko', 30002539:'Siseide', 30002521:'Sist', 30002575:'Sotrenzur', 30032505:'Todeko', 30002532:'Tongofur', 30002562:'Trer', 30002508:'Trytedald', 30042505:'Usteli', 30002538:'Vard', 30002573:'Vullat', 30002554:'Wiskeber', 30002168:'08-N7Q', 30002175:'2O-EEW', 30002144:'4-GB14', 30002154:'4DV-1T', 30002167:'6-I162', 30002158:'7-ZT1Y', 30002125:'78TS-Q', 30002177:'7YSF-E', 30002171:'8X6T-8', 
30002159:'9-XN3F', 30002131:'94FR-S', 30002113:'A4B-V5', 30002160:'AC-7LZ', 30002107:'AF0-V5', 30002108:'B-A587', 30002143:'B-KDOZ', 30002157:'B-R5RB', 30002105:'B-S347', 30002165:'B2-UQW', 30002110:'B9E-H6', 30002127:'CJNF-J', 30002170:'CKX-RW', 30002149:'D-BAMJ', 30002184:'DR-427', 30002146:'DW-N2S', 30002119:'DY-P7Q', 30002136:'E1F-LK', 30002163:'E8-YS9', 30002124:'EA-HSA', 30002152:'F76-8Q', 30002156:'FN-DSR', 30002180:'FRTC-5', 30002128:'FYI-49', 30002133:'GM-0K7', 30002123:'GXK-7F', 30002120:'H-RXNZ', 30002134:'I-NGI8', 30002174:'J-QA7I', 30002112:'JDAS-0', 30002150:'JKWP-U', 30002178:'KCDX-7', 30002142:'L-5JCJ', 30002161:'LBA-SO', 30002139:'LK1K-5', 30002114:'LN-56V', 30002181:'M-ZJWJ', 30002185:'NI-J0B', 30002103:'NS2L-4', 30002153:'O3Z5-G', 30002116:'O7-7UX', 30002179:'O7-VJ5', 30002173:'OP9L-F', 30002145:'PH-NFR', 30002106:'PPFB-U', 30002132:'Q-HJ97', 30002138:'QE-E1D', 30002104:'QI-S9W', 30002186:'QN-6J2', 30002182:'R-ORB7', 30002135:'R-ZUOL', 30002140:'REB-KR', 30002129:'RF6T-8', 30002151:'RHE7-W', 30002183:'RU-PT9', 30002111:'SPBS-6', 30002164:'U79-JF', 30002166:'U9U-TQ', 30002147:'W-FHWJ', 30002172:'W4E-IT', 30002126:'WYF8-8', 30002148:'X-6WC7', 30002118:'XD-JW7', 30002155:'XS-K1O', 30002122:'XVV-21', 30002169:'Y-C4AL', 30002162:'Y-FZ5N', 30002176:'Y-N4EF', 30002109:'Y19P-1', 30002115:'Y2-QUV', 30002141:'Z-H2MA', 30002137:'Z4-QLD', 30002117:'Z8-81T', 30002121:'ZBP-TP', 30002130:'ZJA-6U', 30002609:'01TG-J', 30002581:'1-7KWU', 30002582:'3-UCBF', 30002613:'4-MPSJ', 30002616:'442-CS', 30002585:'4OIV-X', 30002625:'4RS-L1', 30002591:'5GQ-S9', 30002593:'68FT-6', 30002623:'6B-GKA', 30002619:'6E-MOW', 30002590:'9-IIBL', 30002589:'9I-SRF', 30002618:'9ZFH-Z', 30002611:'A1BK-A', 30002587:'AFJ-NB', 30002626:'D-L4H0', 30002631:'DDI-B7', 30002599:'DY-40Z', 30002605:'E7VE-V', 30002598:'F-3H2P', 30002628:'FG-1GH', 30002630:'FR-B1H', 30002620:'GBT4-J', 30002627:'GU-9F4', 30002621:'GZ1-A1', 30002588:'H-64KI', 30002596:'HOHF-B', 30002595:'IRE-98', 30002594:'IV-UNR', 
30002607:'L6BY-P', 30002624:'LHGA-W', 30002604:'LJK-T0', 30002601:'M-9V5D', 30002603:'M-VEJZ', 30002612:'N-7ECY', 30002583:'N-CREL', 30002606:'NUG-OF', 30002602:'O2-39S', 30002615:'PZMA-E', 30002584:'TM-0P2', 30002614:'TWJ-AW', 30002608:'U3SQ-X', 30002610:'UK-SHL', 30002629:'WFYM-0', 30002622:'X-0CKQ', 30002600:'XWY-YM', 30002597:'Y-6B0E', 30002586:'Y-JKJ8', 30002592:'YALR-F', 30002617:'Z-N9IP', 30000838:'0-6VZ5', 30000799:'0-VG7A', 30000822:'04-EHC', 30000781:'0UBC-R', 30000766:'1TG7-W', 30000836:'1ZF-PJ', 30000756:'2-Q4YG', 30000832:'27-HP0', 30000757:'2JT-3Q', 30000823:'3-0FYP', 30000829:'38NZ-1', 30000798:'3AE-CP', 30000782:'3U-48K', 30000795:'4CJ-AC', 30000749:'4DS-OI', 30000818:'4LB-EL', 30000776:'4M-QXK', 30000762:'5-2PQU', 30000844:'5C-RPA', 30000820:'5IH-GL', 30000741:'5M2-KP', 30000791:'67Y-NR', 30000764:'6BJH-3', 30000759:'7-JT09', 30000787:'74-VZA', 30000827:'74L2-U', 30000773:'78-0R6', 30000840:'7EX-14', 30000745:'7L3-JS', 30000775:'8-WYQZ', 30000770:'88A-RA', 30000748:'8EF-58', 30000771:'8G-2FP', 30000800:'9OLQ-6', 30000769:'A-TJ0G', 30000794:'A24L-V', 30000760:'AGCP-I', 30000817:'B-II34', 30000772:'C-J6MT', 30000821:'C1G-XC', 30000743:'C8H5-X', 30000845:'CR2-PQ', 30000826:'D-P1EH', 30000816:'DFH-V5', 30000783:'EFM-C4', 30000814:'EJ48-O', 30000740:'EKPB-3', 30000796:'EUU-4N', 30000804:'F2A-GX', 30000810:'F3-8X2', 30000754:'F39H-1', 30000809:'FN0-QS', 30000835:'FX4L-2', 30000778:'G-EURJ', 30000839:'GB-6X5', 30000792:'GDHN-K', 30000789:'GK5Z-T', 30000837:'HFC-AQ', 30000828:'HL-VZX', 30000825:'HZ-O18', 30000788:'I-1QKL', 30000758:'I3CR-F', 30000843:'J-ZYSZ', 30000737:'KD-KPR', 30000736:'KDG-TA', 30000807:'KS-1TS', 30000786:'L5-UWT', 30000813:'LVL-GZ', 30000761:'M4-GJ6', 30000803:'MJ-LGH', 30000801:'MOCW-2', 30000774:'MSG-BZ', 30000824:'N-O53U', 30000811:'N7-BIY', 30000841:'N7-KGJ', 30000744:'O-7LAI', 30000831:'O-9G5Y', 30000738:'PT-21C', 30000797:'Q-3HS5', 30000785:'Q7-FZ8', 30000793:'QTME-D', 30000767:'QYD-WK', 30000768:'R959-U', 30000805:'RD-FWY', 
30000780:'RERZ-L', 30000815:'ROJ-B0', 30000790:'RQN-OO', 30000834:'RZ-TI6', 30000753:'S0U-MO', 30000779:'SHBF-V', 30000763:'SN9-3Z', 30000742:'TK-DLH', 30000812:'TTP-2B', 30000747:'TZN-2V', 30000765:'U-UTU9', 30000819:'UDE-FX', 30000755:'V-QXXK', 30000806:'VBPT-T', 30000842:'VD-8QY', 30000751:'W-6GBI', 30000830:'W-MF6J', 30000746:'WF4C-8', 30000808:'X0-6LH', 30000833:'X1-IZ0', 30000777:'X5-0EM', 30000752:'XKH-6O', 30000750:'XQP-9C', 30000784:'YPW-M4', 30000739:'Z182-R', 30000802:'ZO-4AR', 30004090:'Aband', 30004082:'Aharalel', 30004147:'Akhmoh', 30015042:'Akhwa', 30004104:'Ansasos', 30004079:'Aphend', 30004124:'Aphi', 30004118:'Ardhis', 30004093:'Askonak', 30004156:'Asrios', 30004138:'Bersyrim', 30004106:'Bordan', 30004117:'Bushemal', 30004108:'Chaneya', 30004122:'Chanoun', 30004097:'Dantan', 30004105:'Dehrokh', 30004078:'Dresi', 30004149:'Elmed', 30004110:'Finid', 30004085:'Gamdis', 30004123:'Garisas', 30004119:'Gasavak', 30004083:'Gensela', 30004084:'Ghesis', 30004087:'Gonan', 30004100:'Halibai', 30004145:'Hapala', 30004142:'Hikansog', 30004133:'Hilmar', 30004141:'Hiremir', 30004077:'Hiroudeh', 30004130:'Hostakoh', 30004120:'Iaokit', 30004102:'Inis-Ilix', 30004157:'Ithar', 30004125:'Jakri', 30004148:'Jennim', 30004132:'Jeshideh', 30004086:'Joamma', 30004088:'Joramok', 30004095:'Kador Prime', 30004115:'Kamda', 30004134:'Kasi', 30004096:'Khafis', 30004128:'Koona', 30004152:'Kooreng', 30004103:'Kothe', 30004159:'Lazara', 30004112:'Mandoo', 30004121:'Menri', 30004113:'Miah', 30004153:'Minin', 30004136:'Mod', 30004129:'Munory', 30004092:'Murini', 30004089:'Neburab', 30004126:'Nidupad', 30004094:'Nordar', 30004109:'Oberen', 30004137:'Omam', 30004114:'Peyiri', 30004116:'Rayeret', 30004080:'Romi', 30004146:'Salah', 30004139:'Sechmaren', 30004150:'Shaggoth', 30004155:'Shemah', 30004135:'Shura', 30004099:'Sonama', 30004101:'Suner', 30004143:'Syrikos', 30004158:'Telang', 30004098:'Turba', 30004091:'Uanim', 30004151:'Ustnia', 30004111:'Yarebap', 30004144:'Yebouz', 
30004154:'Yehnifi', 30004131:'Yooh', 30004107:'Zimmem', 30004127:'Zimse', 30004140:'Zinoo', 30004081:'Zororzih', 30004160:'Zorrabed', 30003903:'Afnakat', 30003862:'Agil', 30003895:'Ainsan', 30003917:'Amafi', 30003885:'Arzanni', 30003921:'Arzieh', 30003890:'Ashi', 30003919:'Ashkoo', 30003886:'Ashmarir', 30003915:'Aurejet', 30003888:'Badivefi', 30003910:'Balanaz', 30003920:'Baratar', 30003908:'Bashyam', 30003913:'Bomana', 30003866:'Bukah', 30003934:'Cabeki', 30003905:'Chamemi', 30003925:'Chitiamem', 30003896:'Claini', 30003904:'Col', 30003912:'Danera', 30003924:'Dimoohan', 30003911:'Edani', 30003892:'Efa', 30003867:'Ervekam', 30003938:'Fanathor', 30003906:'Firbha', 30003897:'Gehi', 30003871:'Geztic', 30003877:'Gidali', 30003930:'Goudiyah', 30003858:'Gousoviba', 30003918:'Hakana', 30003900:'Ham', 30003902:'Hemouner', 30003937:'Hezere', 30003875:'Hishai', 30003933:'Ibani', 30003861:'Ipref', 30003935:'Irmalin', 30003864:'Jachanu', 30003873:'Kahah', 30003887:'Kaira', 30003883:'Keberz', 30003863:'Khanid Prime', 30003860:'Kihtaled', 30003926:'Kuhri', 30003882:'Lansez', 30003868:'Mashtarmem', 30003876:'Molea', 30003881:'Moniyyuku', 30003893:'Moro', 30003922:'Nahrneder', 30003936:'Nakis', 30003923:'Nandeza', 30003929:'Neda', 30003859:'Neyi', 30003884:'Nourbal', 30003870:'Osis', 30003878:'Palas', 30003909:'Parses', 30003940:'Pout', 30003941:'Rafeme', 30003914:'Rahabeda', 30003880:'Reteka', 30003916:'Rilera', 30003894:'Sabusi', 30003879:'Safshela', 30003874:'Saloti', 30003931:'Sassecho', 30003865:'Sazre', 30003869:'Sehsasez', 30003898:'Seshala', 30003889:'Talidal', 30003907:'Tegheon', 30003932:'Timudan', 30003891:'Tzashrah', 30003901:'Upt', 30003899:'Vezila', 30003872:'Yezara', 30003927:'Zahefeus', 30003928:'Zephan', 30003939:'Zirsem', 30005062:'Abath', 30005036:'Amdonen', 30005035:'Ami', 30005049:'Andrub', 30025042:'Annad', 30005086:'Arza', 30005065:'Arzi', 30005077:'Atarli', 30005034:'Bridi', 30035042:'Chaktaren', 30005051:'Choga', 30045042:'Conoban', 30005044:'Danyana', 
30005074:'Daran', 30005082:'Enal', 30005030:'Fensi', 30005072:'Gademam', 30005053:'Imih', 30005083:'Jedandan', 30005033:'Jeni', 30005046:'Jinkah', 30005078:'Keproh', 30005066:'Kerying', 30005032:'Khabara', 30005056:'Kizama', 30005038:'Kor-Azor Prime', 30005050:'Kulu', 30005075:'Latari', 30005039:'Leva', 30005087:'Liparer', 30005064:'Mafra', 30005041:'Masanuh', 30005084:'Miroona', 30005059:'Misha', 30005037:'Mora', 30005069:'Nahol', 30005045:'Nahyeen', 30005043:'Nakregde', 30005054:'Nare', 30005031:'Nebian', 30005058:'Neesher', 30005047:'Nibainkier', 30005040:'Nishah', 30005068:'Oguser', 30005060:'Ordion', 30005073:'Pananan', 30005061:'Perbhe', 30005081:'Piri', 30005048:'Polfaly', 30005085:'Ranni', 30005080:'Rannoze', 30005063:'Schmaeel', 30005042:'Sehmy', 30005057:'Shaha', 30005076:'Shokal', 30005052:'Soumi', 30005070:'Tadadan', 30005071:'Tralasa', 30005079:'Zatamaka', 30005055:'Zinkon', 30005067:'Zorenyen', 30001396:'Aakari', 30001418:'Aikantoh', 30001404:'Airkio', 30001431:'Aivoli', 30001414:'Ajanen', 30011407:'Akiainavas', 30001382:'Akonoinen', 30001392:'Amsen', 30001439:'Anin', 30001357:'Antiainen', 30001381:'Arvasaras', 30001419:'Atai', 30001398:'Aunenen', 30001361:'Aurohunen', 30001411:'Autama', 30001384:'Autaris', 30001356:'Dantumi', 30001420:'Daras', 30001378:'Ekura', 30001434:'Elanoda', 30001399:'Elonaya', 30001430:'Endatoh', 30001371:'Erenta', 30001364:'Funtanainen', 30001424:'Haajinen', 30001367:'Hageken', 30001448:'Hakonen', 30031407:'Hitanishio', 30001428:'Ibura', 30041407:'Ichinumi', 30001374:'Iidoken', 30001422:'Iitanmadan', 30001389:'Isanamo', 30001436:'Isie', 30001387:'Isikano', 30001365:'Isikemi', 30001426:'Isinokka', 30001397:'Isseras', 30001385:'Jan', 30001423:'Jotenen', 30011392:'Jouvulen', 30001405:'Kakakela', 30001406:'Kamokor', 30021392:'Kappas', 30001440:'Karjataimon', 30001372:'Kino', 30001410:'Kirras', 30001360:'Kiskoken', 30001394:'Korama', 30001415:'Kuoka', 30001400:'Litiura', 30001416:'Liukikka', 30001393:'Malkalen', 30001388:'Mara', 
30001445:'Nalvula', 30001413:'Nani', 30001438:'Nannaras', 30001401:'Nonni', 30001376:'Nourvukaiken', 30001435:'Ohbochi', 30001444:'Oimmo', 30001425:'Oipo', 30001433:'Oishami', 30001358:'Ossa', 30001421:'Otalieto', 30001446:'Otsasai', 30001370:'Ouranienen', 30001390:'Pakkonen', 30001402:'Passari', 30001403:'Piak', 30001391:'Piekura', 30001417:'Rauntaka', 30001373:'Raussinen', 30001408:'Ruvas', 30001386:'Saatuban', 30001442:'Saranen', 30001377:'Sarekuwa', 30001359:'Semiki', 30001363:'Sobaseki', 30001369:'Sotrentaira', 30001447:'Taisy', 30001437:'Tamo', 30001441:'Tartoken', 30001407:'Todaki', 30001429:'Torrinos', 30001375:'Tsuguwa', 30001412:'Tsukuras', 30001379:'Tunttaras', 30001368:'Uemisaisen', 30001432:'Uesuro', 30001409:'Umokka', 30001366:'Uosusuokko', 30001383:'Vaajaita', 30001362:'Veisto', 30001380:'Vellaine', 30001443:'Vuorrassi', 30001395:'Ylandoki', 30001427:'Yoma', 30001150:'0-N1BJ', 30001065:'0-TRV1', 30001120:'06-70G', 30001109:'1-EVAX', 30001066:'13-49W', 30001081:'1NZV-7', 30001136:'2FL-5W', 30001103:'2XI8-Y', 30001116:'2Z-HPQ', 30001149:'4QY-NT', 30001101:'5-A0PX', 30001104:'5B-YDD', 30001111:'6-WMKE', 30001092:'63-7Q6', 30001134:'6A-FUY', 30001067:'6UT-1K', 30001075:'7-2Z93', 30001057:'7-YHRX', 30001141:'7T-0QS', 30001071:'8-2JZA', 30001132:'863P-X', 30001053:'8AB-Q4', 30001144:'8C-VE3', 30001079:'9F-ERQ', 30001128:'9NI-FW', 30001077:'A0M-R8', 30001114:'APES-G', 30001070:'AZA-QE', 30001076:'B-VFDD', 30001115:'B2J-5N', 30001085:'C-KW6X', 30001124:'C-NMG9', 30001100:'CLW-SI', 30001083:'DAI-SH', 30001130:'DOA-YU', 30001064:'F-TQWO', 30001087:'F-WZYG', 30001148:'FO9-FZ', 30001122:'GL6S-2', 30001129:'H-EBQG', 30001135:'HG-YEQ', 30001110:'I8-AJY', 30001052:'IBOX-2', 30001056:'IF-KD1', 30001146:'IL-OL1', 30001112:'J-Z8C2', 30001055:'JA-G0T', 30001069:'LH-PLU', 30001119:'LO5-LN', 30001078:'LY-WRW', 30001060:'N-H95C', 30001062:'N-YLOE', 30001126:'N6NK-J', 30001063:'NBO-O0', 30001117:'NBW-GD', 30001082:'NIM-FY', 30001094:'NRD-5Q', 30001061:'NSI-MW', 
30001099:'O7-RFZ', 30001068:'O8W-5O', 30001125:'P3X-TN', 30001147:'POQP-K', 30001106:'PWPY-4', 30001080:'QCGG-Q', 30001137:'QSCO-D', 30001107:'QZ1-OH', 30001102:'R-RMDH', 30001090:'RIU-GC', 30001139:'RSE-PT', 30001123:'RUF3-O', 30001142:'RWML-A', 30001138:'RXTY-4', 30001088:'S-R9J2', 30001145:'S5W-1Z', 30001098:'SH-YZY', 30001096:'T-4H0B', 30001151:'T-8GWA', 30001051:'TD-4XL', 30001127:'TP-APY', 30001152:'UW-6MW', 30001121:'UYG-YX', 30001143:'V-JCJS', 30001084:'V3P-AZ', 30001073:'VVB-QH', 30001054:'VW-PXL', 30001105:'W-XY4J', 30001095:'W5-205', 30001140:'WVJU-4', 30001059:'X-PQEX', 30001086:'X1W-AL', 30001093:'XCZ5-Y', 30001113:'XTVZ-E', 30001089:'XU-BF8', 30001108:'Y-XZA7', 30001058:'Y6-9LF', 30001118:'YM-SRU', 30001074:'Z-DDVJ', 30001097:'Z-EKCY', 30001091:'Z0H2-4', 30001133:'ZO-YJZ', 30001131:'ZOPZ-6', 30001072:'ZT-L3S', 30013410:'Abrat', 30003439:'Aderkan', 30003404:'Agtver', 30003415:'Aldagolf', 30003464:'Aldik', 30003379:'Aldilur', 30003416:'Aldrat', 30003380:'Alf', 30003389:'Altrinur', 30002055:'Amo', 30003398:'Anbald', 30002051:'Anher', 30002081:'Ansen', 30003440:'Ansher', 30003426:'Anstard', 30002058:'Ardar', 30002080:'Arifsdald', 30003411:'Arlek', 30003374:'Arlulf', 30002066:'Arnher', 30002064:'Arnstur', 30002078:'Arwa', 30002084:'Aset', 30003384:'Asgeir', 30003455:'Atonder', 30002059:'Auner', 30002089:'Avenod', 30002071:'Barkrik', 30002048:'Bei', 30003397:'Bongveber', 30002067:'Brin', 30003375:'Brundakur', 30003451:'Dantbeinn', 30003405:'Datulen', 30003461:'Diromitur', 30002076:'Dudreda', 30003438:'Earled', 30003441:'Earwik', 30002094:'Ebolfer', 30003401:'Egbonbet', 30002099:'Egmar', 30003393:'Eiluvodi', 30003462:'Eldjaerin', 30003412:'Elgoi', 30023410:'Embod', 30003424:'Enden', 30003454:'Engosi', 30003413:'Eram', 30033410:'Erego', 30003472:'Erindur', 30003463:'Erlendur', 30003425:'Erstet', 30003419:'Erstur', 30002095:'Eszur', 30003466:'Eurgrana', 30003381:'Eust', 30002060:'Evati', 30003408:'Evettullur', 30003385:'Evuldgenzo', 30003392:'Eygfe', 
30002085:'Eytjangard', 30043410:'Fildar', 30003442:'Finanar', 30002082:'Floseswin', 30003382:'Flost', 30003394:'Freatlidur', 30003420:'Fredagod', 30002090:'Frerstorn', 30003467:'Frulegur', 30002093:'Gebuladi', 30003433:'Gedugaud', 30003429:'Geffur', 30002102:'Gukarla', 30002057:'Hadozeko', 30002050:'Hagilur', 30003449:'Hakeri', 30002077:'Hakisalki', 30003418:'Hardbako', 30003435:'Hebisa', 30002053:'Hek', 30002063:'Helgatild', 30003428:'Hilfhurmur', 30002075:'Hjoramold', 30003400:'Hjortur', 30003469:'Hodrold', 30002096:'Hofjaldgund', 30003456:'Hotrardik', 30003468:'Hroduko', 30002054:'Hror', 30003377:'Illuin', 30003445:'Iluin', 30002072:'Inder', 30003452:'Irgrus', 30002087:'Isbrabata', 30003387:'Jondik', 30003447:'Josekorn', 30003458:'Klaevik', 30002097:'Klogori', 30003471:'Konora', 30002079:'Krirald', 30002074:'Lanngisi', 30002065:'Lasleinur', 30003409:'Leurtmar', 30003421:'Libold', 30003459:'Lirerim', 30003432:'Lumegen', 30003444:'Mateber', 30003396:'Maturat', 30003403:'Meimungen', 30003443:'Moselgi', 30002068:'Nakugard', 30003378:'Nedegulf', 30003423:'Nein', 30003448:'Nifflung', 30003470:'Odebeinn', 30003446:'Ofage', 30003460:'Offikatlin', 30002061:'Ofstold', 30003437:'Ogoten', 30003388:'Olbra', 30003386:'Ongund', 30002091:'Ontorn', 30003430:'Oppold', 30003450:'Oraekja', 30003453:'Orduin', 30002098:'Orfrold', 30022505:'Orgron', 30003427:'Osvestmunnur', 30003434:'Polstodur', 30002052:'Ragnarg', 30002056:'Resbroko', 30003391:'Reset', 30003457:'Ridoner', 30003395:'Roleinn', 30003410:'Ryddinjorn', 30002092:'Sirekur', 30003406:'Situner', 30003376:'Stirht', 30003465:'Tabbetzur', 30002100:'Taff', 30003407:'Tamekamur', 30002062:'Todifrauan', 30003383:'Todrir', 30003436:'Tollus', 30003402:'Totkubad', 30003431:'Tratokard', 30002069:'Traun', 30002086:'Turnur', 30002073:'Tvink', 30002101:'Ualkin', 30002083:'Uisper', 30002070:'Uriok', 30003417:'Urnhard', 30002049:'Uttindar', 30003390:'Vilur', 30002088:'Vimeini', 30003399:'Vorsk', 30003422:'Wirdalen', 30003414:'Yrmori', 
30002393:'Aedald', 30002383:'Aeddin', 30002419:'Aeditide', 30002407:'Altbrard', 30002389:'Atlar', 30002395:'Audesder', 30002387:'Bosena', 30002420:'Egbinger', 30002398:'Eldulf', 30002412:'Ennur', 30002408:'Fegomenko', 30002386:'Gelfiven', 30002403:'Gonheim', 30002384:'Gulfonodi', 30002404:'Half', 30002406:'Hedaleolfarber', 30002418:'Hegfunden', 30002390:'Heild', 30002397:'Horaka', 30002392:'Hrober', 30002391:'Hrokkur', 30002396:'Illamur', 30002402:'Istodard', 30002417:'Kadlina', 30002416:'Kattegaud', 30002414:'Klingt', 30002401:'Meildolf', 30002410:'Mimiror', 30002394:'Muttokon', 30002388:'Oddelulf', 30002399:'Orien', 30002409:'Osvetur', 30002405:'Sakulda', 30002411:'Skarkon', 30002385:'Teonusude', 30002413:'Unertek', 30002400:'Varigne', 30002415:'Weld', 30003198:'0-QP56', 30003251:'0PU2-R', 30003258:'1-BK1Q', 30003215:'1-HDQ4', 30003211:'1EO-OE', 30003233:'1P-QWR', 30003188:'2-WNTD', 30003238:'21M1-B', 30003231:'3-JG3X', 30003191:'3VL6-I', 30003214:'5-9L3H', 30003267:'5-IH57', 30003223:'5JEZ-I', 30003210:'6U-MFQ', 30003217:'7-UVMT', 30003189:'83-YGI', 30003253:'91-KD8', 30003249:'9G5J-1', 30003250:'B-ETDW', 30003230:'BQ0-UU', 30003187:'DGDT-3', 30003193:'DS3-6A', 30003196:'DYS-CG', 30003242:'E6Q-LE', 30003192:'F-816R', 30003234:'FJ-GUR', 30003213:'FZCR-3', 30003184:'G-W1ND', 30003202:'GGMF-J', 30003232:'GK3-RX', 30003199:'GTQ-C9', 30003261:'H-MHWF', 30003220:'HF-K3O', 30003243:'HO4E-Q', 30003241:'IAMJ-Q', 30003203:'IG-4OF', 30003219:'IO-R2S', 30003206:'J-D5U7', 30003264:'JXQJ-B', 30003239:'KED-2O', 30003190:'KH-EWC', 30003255:'LA2-KV', 30003204:'LQQH-J', 30003200:'M-NWLB', 30003197:'MTGF-2', 30003185:'MZLW-9', 30003186:'ND-X7X', 30003201:'ORB4-J', 30003254:'OZ-DS5', 30003248:'P-ZWKH', 30003262:'PND-SI', 30003222:'Q-ITV5', 30003183:'QB-AE6', 30003221:'QE2-FS', 30003244:'QY2Y-N', 30003266:'QYT-X8', 30003236:'QZ-DIZ', 30003218:'R-ZESX', 30003209:'R8WV-7', 30003229:'RF-X7V', 30003260:'RJBC-I', 30003246:'RO-AIQ', 30003257:'S7WI-F', 30003225:'SON-TW', 30003195:'T-HMWP', 
30003240:'U-RELP', 30003227:'U9SE-N', 30003235:'UGR-J2', 30003226:'V-X0KM', 30003194:'V0-H4L', 30003247:'VZEG-B', 30003205:'W5-VBR', 30003216:'WVMS-X', 30003256:'WW-OVQ', 30003245:'X-9ZZR', 30003259:'X-CYNC', 30003208:'X-Z4JW', 30003224:'XEF6-Z', 30003263:'XKM-DE', 30003252:'XM-RMD', 30003228:'XXZ-3W', 30003237:'Y-0HVF', 30003207:'Y-770C', 30003265:'Y-BIPM', 30003212:'YQTK-R', 30004917:'0DD-MH', 30004920:'1I6F-9', 30004900:'3FKU-H', 30004923:'66-PMM', 30004908:'6T3I-L', 30004925:'7-8EOE', 30004893:'73-JQO', 30004926:'7L9-ZC', 30004914:'99-0GS', 30004884:'9MWZ-B', 30004904:'AXDX-F', 30004889:'C-WPWH', 30004887:'CO-7BI', 30004902:'D2EZ-X', 30004903:'DJK-67', 30004892:'G-B3PR', 30004916:'H90-C9', 30004905:'J-4FNO', 30004912:'K-1OY3', 30004895:'KR8-27', 30004910:'L-AS00', 30004897:'LOI-L1', 30004896:'LQ-AHE', 30004885:'LS-QLX', 30004901:'M9-FIB', 30004899:'MJ-X5V', 30004913:'MMUF-8', 30004919:'NQH-MR', 30004911:'NZPK-G', 30004924:'OKEO-X', 30004906:'PEM-LC', 30004909:'QSF-EJ', 30004891:'R2TJ-1', 30004918:'RI-JB1', 30004886:'S-XZHU', 30004922:'UEP0-A', 30004890:'VULA-I', 30004915:'X-3AUU', 30004907:'X-EHHD', 30004894:'XPUM-L', 30004898:'Y-MSJN', 30004921:'Z-7OK1', 30004888:'ZJG-7D', 30001779:'0-4VQL', 30001757:'1-10QG', 30001786:'2-84WC', 30001784:'24I-FE', 30001746:'2EV-BA', 30001774:'2ID-87', 30001770:'2ULC-J', 30001799:'2WU-XT', 30001820:'3-QNM4', 30001815:'4AZV-W', 30001785:'4H-YJZ', 30001828:'4O-ZRI', 30001759:'6-GRN7', 30001817:'7F-2FB', 30001768:'8-AA98', 30001745:'80G-H5', 30001776:'8K-QCZ', 30001805:'8RL-OG', 30001814:'90-A1P', 30001796:'9S-GPT', 30001764:'9Z-XJN', 30001825:'CNHV-M', 30001761:'D-JVGJ', 30001781:'DJ-GBH', 30001769:'EZWQ-X', 30001801:'F-WCLC', 30001775:'FVQF-W', 30001802:'G-HE0N', 30001811:'GA58-7', 30001744:'HLR-GL', 30001767:'HZID-J', 30001752:'I-HRX3', 30001782:'I0N-BM', 30001812:'J-0KB3', 30001750:'J4AQ-O', 30001800:'J7X-VN', 30001777:'JBUH-H', 30001743:'JUE-DX', 30001793:'JX-SOA', 30001762:'K4UV-G', 30001790:'KGCF-5', 30001809:'L1YK-V', 
30001804:'LTT-AP', 30001747:'M1-PX9', 30001754:'M4U-EH', 30001818:'MC4C-H', 30001827:'N-I024', 30001826:'NEU-UD', 30001789:'NHKO-4', 30001822:'NQ-M6W', 30001751:'O-O2GN', 30001819:'OW-QXW', 30001823:'P-8PDJ', 30001795:'P-T9VC', 30001763:'Q7E-DU', 30001766:'QFRV-2', 30001772:'QG3-Z0', 30001749:'QHH-13', 30001783:'QOK-SX', 30001806:'R3P0-Z', 30001830:'RQNF-9', 30001773:'RT64-C', 30001780:'SN-DZ6', 30001808:'SN-Q1T', 30001771:'T0DT-T', 30001760:'TFPT-U', 30001788:'U-FQ21', 30001797:'UAJ5-K', 30001813:'UC-8XF', 30001821:'UEPO-D', 30001816:'UNV-3J', 30001787:'V-SEE6', 30001824:'VE-W7O', 30001794:'VH-9VO', 30001748:'W9-TFD', 30001756:'WIO-OL', 30001755:'WK2F-Y', 30001778:'XDTW-F', 30001798:'XJ-AG7', 30001792:'XME-SW', 30001753:'XUPK-Z', 30001829:'Y-7XVJ', 30001791:'Y-UO9U', 30001803:'YC-ANK', 30001758:'YQM-P1', 30001765:'ZEZ1-9', 30001810:'ZJ-5IS', 30001807:'ZZK-VF', 30004510:'0-9UHT', 30004509:'0-WVQS', 30004549:'0D-CHA', 30004501:'1L-BHT', 30004531:'2JJ-0E', 30004517:'33FN-P', 30004520:'3HQC-6', 30004504:'4C-B7X', 30004508:'5WAE-M', 30004525:'7-692B', 30004544:'A-5M31', 30004550:'A2V6-6', 30004527:'AN-G54', 30004532:'B0C-LD', 30004506:'BF-SDP', 30004539:'BMU-V1', 30004545:'BOE7-P', 30004493:'C0T-77', 30004530:'CT7-5V', 30004502:'D5IW-F', 30004524:'DB-6W4', 30004546:'E-GCX0', 30004499:'E9G-MT', 30004503:'F-XWIN', 30004507:'F5FO-U', 30004534:'G-YT55', 30004536:'G5-EN3', 30004523:'GA-2V7', 30004512:'H-M1BY', 30004535:'IZ-AOB', 30004513:'J1H-R4', 30004514:'J9SH-A', 30004515:'JKJ-VJ', 30004526:'L3-XYO', 30004541:'LBV-Q1', 30004505:'LGUZ-1', 30004511:'M-NKZM', 30004519:'MT-2VJ', 30004518:'NM-OEA', 30004533:'NP6-38', 30004543:'O-RIDF', 30004521:'OX-RGN', 30004497:'P-NUWP', 30004522:'R-OCBA', 30004494:'RL-KT0', 30004516:'RTX0-S', 30004529:'T-Z6J2', 30004500:'TQ-RR8', 30004495:'UO9-YG', 30004547:'VBFC-8', 30004551:'VJ0-81', 30004537:'W-Z3HW', 30004538:'W2F-ZH', 30004548:'YVA-F0', 30004542:'Z-40CG', 30004498:'ZJQH-S', 30004496:'ZQP-QV', 30004540:'ZXC8-1', 30004528:'ZXI-K2', 
30004674:'0-MX34', 30004695:'0SUF-3', 30004677:'0ZN7-G', 30004688:'1DDR-X', 30004693:'2I-520', 30004668:'33-JRO', 30004682:'3PPT-9', 30004673:'4Y-OBL', 30004670:'5-CSE3', 30004675:'5AQ-5H', 30004681:'8Q-UYU', 30004672:'9T-APQ', 30004690:'AA-GWF', 30004669:'ARBX-9', 30004696:'G-M4GK', 30004697:'G1D0-G', 30004694:'GQ2S-8', 30004678:'H8-ZTO', 30004686:'HHJD-5', 30004667:'JI-K5H', 30004684:'JK-GLL', 30004698:'KU3-BB', 30004700:'LD-2VL', 30004689:'LG-WA9', 30004680:'LUL-WX', 30004705:'LX5K-W', 30004702:'MP5-KR', 30004692:'O-97ZG', 30004671:'O-MCZR', 30004703:'O-N589', 30004699:'O1Q-P1', 30004691:'O4T-Z5', 30004683:'S-KU8B', 30004676:'T-ZFID', 30004685:'UAAU-C', 30004679:'YV-FDG', 30004701:'ZBY-0I', 30004704:'ZDYA-G', 30004687:'ZWV-GD', 30004930:'0-NTIS', 30004954:'08S-39', 30004947:'0A-KZ0', 30004950:'0OTX-J', 30004940:'1-NJLK', 30004964:'1E-W5I', 30004928:'35-JWD', 30004951:'3OP-3E', 30004949:'48I1-X', 30004942:'8KR9-5', 30004939:'CW9-1Y', 30004948:'E-DOF2', 30004959:'E2-RDQ', 30004946:'EIMJ-M', 30004929:'F-M1FU', 30004945:'G-C8QO', 30004962:'G-Q5JU', 30004957:'GR-J8B', 30004956:'HIX4-H', 30004933:'I6M-9U', 30004952:'JZL-VB', 30004927:'L-YMYU', 30004944:'L5D-ZL', 30004934:'MG0-RD', 30004966:'MVUO-F', 30004958:'OY0-2T', 30004961:'PA-VE3', 30004953:'RJ3H-0', 30004963:'RYQC-I', 30004937:'TCAG-3', 30004960:'TN25-J', 30004935:'TPAR-G', 30004938:'UR-E46', 30004943:'VQE-CN', 30004931:'VR-YIQ', 30004936:'VYO-68', 30004932:'XZ-SKZ', 30004941:'Y-CWQY', 30004965:'Z-M5A1', 30004955:'ZU-MS3', 30005191:'0-U2M4', 30005186:'01B-88', 30005102:'0FG-KS', 30005157:'0XN-SK', 30005117:'1A8-6G', 30005139:'2-YO2K', 30005182:'2AUL-X', 30005096:'2V-ZHM', 30005166:'3LL-O0', 30005149:'4-1ECP', 30005121:'4-M1TY', 30005165:'49V-E4', 30005159:'4F6-VZ', 30005154:'4F9Y-3', 30005151:'5-U12M', 30005115:'5T-A3D', 30005152:'5V-Q1R', 30005110:'5ZU-VG', 30005111:'6-1T6Z', 30005114:'6-8QLA', 30005128:'6Q4-X6', 30005145:'7AH-SF', 30005176:'7JRA-G', 30005138:'7M4-4C', 30005146:'7MMJ-3', 30005164:'8B-A4E', 
30005148:'9-EXU9', 30005168:'9-ZA4Z', 30005130:'972C-1', 30005180:'9IZ-HU', 30005137:'9WVY-F', 30005167:'A1F-22', 30005092:'A9-F18', 30005098:'AK-L0Z', 30005094:'AY9X-Q', 30005160:'B-7LYC', 30005088:'B-B0ME', 30005127:'B9N2-2', 30005129:'BEG-RL', 30005190:'C3I-D5', 30005122:'C6CG-W', 30005125:'D-QJR9', 30005093:'DE-IHK', 30005100:'E-WMT7', 30005104:'EF-QZK', 30005108:'EH2I-P', 30005103:'F-5WYK', 30005184:'F-A3TR', 30005183:'F-HQWV', 30005187:'F18-AY', 30005101:'FLK-LJ', 30005113:'G-GRSZ', 30005123:'H-29TM', 30005116:'H-FOYG', 30005090:'H-HGGJ', 30005107:'HB-KSF', 30005169:'IU-E9T', 30005158:'J9A-BH', 30005161:'JM0A-4', 30005136:'JPEZ-R', 30005135:'JZ-UQC', 30005120:'K-3PQW', 30005124:'KOI8-Z', 30005163:'L-POLO', 30005179:'L5Y4-M', 30005106:'LW-YEW', 30005140:'M-SG47', 30005153:'M4-KX5', 30005134:'M5NO-B', 30005133:'MJ-5F9', 30005155:'MS-RXH', 30005189:'MTO2-2', 30005170:'NGM-OK', 30005171:'O-QKSM', 30005181:'OBV-YC', 30005091:'OJT-J3', 30005109:'OP7-BP', 30005185:'PA-ALN', 30005118:'PE-SAM', 30005178:'PFV-ZH', 30005162:'PT-2KR', 30005147:'PVF-N9', 30005144:'QHY-RU', 30005172:'QKQ3-L', 30005099:'R-AG7W', 30005112:'R-AYGT', 30005119:'RY-2FX', 30005105:'RZ3O-K', 30005188:'RZ8A-P', 30005141:'SR-10Z', 30005174:'SY-OLX', 30005143:'TAL1-3', 30005089:'TDP-T3', 30005156:'U-3FKL', 30005131:'U-W436', 30005126:'U4-V3J', 30005150:'UYOC-1', 30005097:'V-3K7C', 30005173:'VWES-Y', 30005177:'W-CSFY', 30005142:'W-KXEX', 30005095:'XU7-CH', 30005175:'XY-ZCI', 30005132:'Z-ENUD', 30003832:'Adacyne', 30003787:'Agoze', 30003854:'Alamel', 30003837:'Aldranette', 30003811:'Algasienan', 30003850:'Alparena', 30003800:'Alperaute', 30003841:'Alsavoinon', 30003848:'Amasiree', 30003847:'Amoen', 30003840:'Anchauttes', 30003791:'Annancale', 30003824:'Archavoinet', 30003852:'Arderonne', 30003856:'Athounon', 30003827:'Aubenall', 30003849:'Aubonnie', 30003818:'Aulbres', 30003801:'Aunsou', 30003821:'Ausmaert', 30003843:'Avaux', 30003819:'Barleguet', 30003789:'Brarel', 30003809:'Brellystier', 
30003834:'Chardalane', 30003795:'Covryn', 30003802:'Cumemare', 30003797:'Dastryns', 30003807:'Dour', 30003842:'Esesier', 30003822:'Espigoure', 30003825:'Eugales', 30003839:'Evaulon', 30003826:'Frarie', 30003844:'Gallusiene', 30003805:'Gare', 30003808:'Grispire', 30003793:'Harroule', 30003846:'Hedoubel', 30003815:'Iffrue', 30003796:'Iges', 30003788:'Intaki', 30003813:'Ivorider', 30003823:'Kenninck', 30003855:'Mantenault', 30003835:'Maut', 30003853:'Mercomesier', 30003828:'Moclinamaud', 30003814:'Mollin', 30003857:'Odamia', 30003838:'Oicx', 30003817:'Ommaerrer', 30003830:'Orvolle', 30003812:'Osmallanais', 30003831:'Osmeden', 30003792:'Ostingele', 30003833:'Oulley', 30003804:'Pain', 30003806:'Pelille', 30003829:'Renarelle', 30003851:'Reschard', 30003803:'Reynire', 30003845:'Ruerrotta', 30003798:'Slays', 30003794:'Stacmon', 30003799:'Uphallant', 30003820:'Vestouve', 30003790:'Vey', 30003816:'Vilinnon', 30003810:'Vivanier', 30003836:'Vlillirier', 30003743:'08Z-JJ', 30003782:'0B-HLZ', 30003766:'1-1I53', 30003768:'18-GZM', 30003785:'18XA-C', 30003754:'2-TEGJ', 30003774:'2V-CS5', 30003786:'3D-CQU', 30003748:'3GXF-U', 30003781:'3KB-J0', 30003708:'49GC-R', 30003706:'4B-NQN', 30003720:'5IO8-U', 30003738:'5KG-PY', 30003745:'6-OQJV', 30003778:'7YWV-S', 30003771:'8B-VLX', 30003750:'8P9-BM', 30003704:'9-F0B2', 30003707:'9UY4-H', 30003756:'AY-24I', 30003746:'AY-YCU', 30003730:'B-WPLZ', 30003757:'BK4-YC', 30003780:'C1-HAB', 30003727:'D-6WS1', 30003709:'D-GTMI', 30003724:'D61A-G', 30003762:'DNR-7M', 30003721:'DP-JD4', 30003732:'E-YCML', 30003737:'F-DTOO', 30003751:'F-YH5B', 30003718:'FC-3YI', 30003710:'FSW-3C', 30003711:'FX-7EM', 30003703:'G-5EN2', 30003772:'G-B22J', 30003713:'G7AQ-7', 30003777:'GA9P-0', 30003736:'GN7-XY', 30003752:'H-GKI6', 30003723:'H6-CX8', 30003775:'H9-J8N', 30003776:'HP-6Z6', 30003784:'I-MGAB', 30003716:'I7S-1S', 30003740:'INQ-WR', 30003765:'IWZ3-C', 30003760:'JEIV-E', 30003758:'K1I1-J', 30003764:'K1Y-5H', 30003729:'KBP7-G', 30003759:'LF-2KP', 
30003712:'MH9C-S', 30003755:'MVCJ-E', 30003763:'N-RMSH', 30003767:'N8XA-L', 30003761:'O-Y5JQ', 30003722:'OXIY-V', 30003735:'PI5-39', 30003714:'QBL-BV', 30003739:'QO-SRI', 30003719:'QR-K85', 30003769:'R3-K7K', 30003741:'S9X-AX', 30003728:'SI-I89', 30003725:'Shintaht', 30003715:'T-RPFU', 30003733:'TU-O0T', 30003742:'TU-RI6', 30003779:'TXJ-II', 30003717:'U-HYMT', 30003749:'VKI-T7', 30003744:'X-4WZD', 30003770:'X-R3NM', 30003773:'X6AB-Y', 30003731:'XHQ-7V', 30003726:'Y-MPWL', 30003734:'Y9-MDG', 30003753:'YQB-22', 30003705:'YWS0-Z', 30003783:'Z-RFE3', 30003747:'ZT-LPU', 30001975:'12YA-2', 30002007:'2-6TGQ', 30001965:'2D-0SO', 30002011:'3V8-LJ', 30002015:'4-ABS8', 30001972:'5-9WNU', 30002005:'5ZXX-K', 30002025:'6GWE-A', 30002046:'7D-0SQ', 30002016:'7RM-N0', 30002031:'7X-VKB', 30002004:'8S-0E1', 30001990:'93PI-4', 30001993:'A8I-C5', 30002029:'B-9C24', 30002012:'B8EN-S', 30001976:'BDV3-T', 30001992:'C-H9X7', 30001981:'CL6-ZG', 30002003:'CR-AQH', 30001979:'CXN1-Z', 30002036:'D2-HOS', 30001968:'D7T-C0', 30001994:'DK-FXK', 30002014:'DP-1YE', 30002041:'DT-TCD', 30002032:'E-Z2ZX', 30001984:'EC-P8R', 30001970:'EL8-4Q', 30001985:'EWOK-K', 30002019:'F-NMX6', 30002021:'FWA-4V', 30001987:'G-M4I8', 30001982:'G95-VZ', 30002020:'GA-P6C', 30002043:'H1-J33', 30002040:'HPS5-C', 30001991:'ION-FG', 30001977:'J-CIJV', 30002026:'J-OK0C', 30001971:'JC-YX8', 30002006:'JE-D5U', 30002027:'KDV-DE', 30001969:'KI-TL0', 30001980:'KLY-C0', 30001963:'KQK1-2', 30002042:'KU5R-W', 30001989:'L-TS8S', 30001995:'M-76XI', 30002000:'M-YCD4', 30001988:'MI6O-6', 30002035:'MQ-NPY', 30002028:'MT9Q-S', 30001974:'N-H32Y', 30002034:'O-A6YN', 30001964:'O-BY0Y', 30001986:'O-N8XZ', 30002008:'OE-9UF', 30002045:'OGV-AS', 30002030:'P-2TTL', 30002009:'PFU-LH', 30002001:'Q-5211', 30002002:'R-2R0G', 30002013:'R-LW2I', 30002010:'R6XN-9', 30002023:'RD-G2R', 30001983:'ROIR-Y', 30002033:'RORZ-H', 30002039:'RQH-MY', 30002022:'RZC-16', 30002017:'S-MDYI', 30002038:'TFA0-U', 30001997:'U-INPD', 30002024:'UC3H-Y', 30002047:'UI-8ZE', 
30001966:'UR-E6D', 30001998:'WW-KGD', 30001978:'X-7OMU', 30001967:'X47L-Q', 30001973:'XI-VUF', 30001999:'XQ-PXU', 30002044:'Y-C3EQ', 30002037:'Y2-6EA', 30001996:'ZJET-E', 30002018:'ZKYV-W', 30003947:'0-WT2D', 30003957:'0TKF-6', 30003952:'1M4-FK', 30004019:'3-FKCZ', 30004004:'3-JCJT', 30004000:'3BK-O7', 30004036:'3D5K-R', 30004014:'4-2UXV', 30004007:'4-GJT1', 30004009:'49-U6U', 30004008:'5V-BJI', 30003956:'60M-TG', 30003948:'7GCD-P', 30003945:'7V-KHW', 30004001:'8-GE2P', 30003985:'8-SNUD', 30004025:'8-YNBE', 30003974:'8B-SAJ', 30004012:'8QT-H4', 30003961:'9-HM04', 30003979:'9CG6-H', 30003954:'9ES-SI', 30004022:'9SBB-9', 30003977:'A-5F4A', 30003967:'A-BO4V', 30003942:'A2-V27', 30003944:'A3-LOG', 30004006:'AO-N1P', 30004030:'B-7DFU', 30004029:'BX-VEX', 30003993:'BX2-ZX', 30003995:'C-7SBM', 30003976:'C-9RRR', 30004016:'DG-L7S', 30003992:'DS-LO3', 30004028:'E-VKJV', 30004020:'ED-L9T', 30004033:'ES-Q0W', 30004032:'F-NXLQ', 30004013:'F2OY-X', 30003949:'G-3BOG', 30003963:'GOP-GE', 30003986:'H-4R6Z', 30004034:'H74-B0', 30004023:'I1Y-IU', 30003987:'IGE-NE', 30003969:'K-B8DK', 30003982:'K-L690', 30003998:'K-YI1L', 30003972:'K-Z0V4', 30004017:'K4-RFZ', 30003950:'K7D-II', 30003999:'KEJY-U', 30003951:'L-6BE1', 30003970:'L-6W1J', 30004018:'L-FVHR', 30004003:'L3-I3K', 30003973:'LNVW-K', 30004021:'LS-V29', 30004010:'M1BZ-2', 30003962:'MKD-O8', 30004011:'N-M1A3', 30003980:'NDII-Q', 30004035:'NU4-2G', 30003946:'O3L-95', 30003984:'OGY-6D', 30003978:'P-ZMZV', 30003971:'P4-3TJ', 30003975:'Q2-N6W', 30004002:'QXQ-I6', 30004027:'QY1E-N', 30003994:'RF-CN3', 30004015:'RKM-GE', 30003964:'SKR-SP', 30003943:'T8H-66', 30003966:'T8T-RA', 30003958:'TV8-HS', 30004024:'U-HYZN', 30003955:'UQY-IK', 30003988:'UVHO-F', 30003981:'UYU-VV', 30003965:'V-3U8T', 30003953:'V-LEKM', 30003959:'VT-G2P', 30004005:'W-IIYI', 30003968:'W-IX39', 30003983:'W6V-VM', 30003997:'YF-6L1', 30003960:'YOP-0T', 30004026:'YQX-7U', 30003990:'YW-SYT', 30003991:'Z-UZZN', 30003989:'Z-XX2J', 30003996:'ZAU-JW', 30004031:'ZXJ-71', 
30000707:'03-OR2', 30000734:'1-7HVI', 30000720:'1QZ-Y9', 30000680:'1V-LI2', 30000684:'2-KPW6', 30000714:'28Y9-P', 30000733:'S-6VU', 30000730:'4-43BW', 30000718:'4-CM8I', 30000701:'4E-EZS', 30000705:'5-MQQ7', 30000706:'6-EQYE', 30000728:'6-KPAB', 30000726:'71-UTX', 30000731:'8CN-CH', 30000697:'8Q-T7B', 30000724:'9BC-EB', 30000723:'9M-M0P', 30000668:'9PX2-F', 30000702:'A-80UA', 30000691:'AH8-Q7', 30000656:'ARG-3R', 30000672:'AZ3F-N', 30000716:'B-1UJC', 30000700:'C8-7AS', 30000663:'DE-A7P', 30000655:'EIN-QG', 30000676:'ER2O-Y', 30000666:'F-5FDA', 30000695:'F2-NXA', 30000675:'FYD-TO', 30000690:'G15Z-W', 30000671:'GN-PDU', 30000685:'H5N-V7', 30000721:'HJ-BCH', 30000686:'HQ-Q1Q', 30000689:'I-1B7X', 30000709:'IAK-JW', 30000677:'J2-PZ6', 30000708:'JLO-Z3', 30000662:'JMH-PT', 30000665:'K212-A', 30000659:'K7-LDX', 30000710:'KZFV-4', 30000683:'LBC-AW', 30000704:'LQ-OAI', 30000681:'M9-MLR', 30000669:'N3-JBX', 30000696:'NSBE-L', 30000679:'OAQY-M', 30000735:'OX-S7P', 30000661:'P-N5N9', 30000727:'PU-UMM', 30000682:'Q-K2T7', 30000717:'Q-NA5H', 30000715:'Q4C-S5', 30000722:'QPTT-F', 30000658:'R-3FBU', 30000673:'RNM-Y6', 30000712:'RYC-19', 30000657:'S-E6ES', 30000667:'S1-XTL', 30000692:'SD4A-2', 30000670:'SG-75T', 30000660:'U-IVGH', 30000703:'U2-28D', 30000693:'U6K-RG', 30000732:'V-F6DQ', 30000674:'V-KDY2', 30000694:'V-S9YY', 30000725:'WFFE-4', 30000687:'WHI-61', 30000711:'WO-GC0', 30000698:'WV0D-1', 30000713:'X2-ZA5', 30000664:'X9V-15', 30000678:'XV-MWG', 30000729:'Y5-E1U', 30000719:'ZDB-HT', 30000688:'ZFJH-T', 30000699:'ZNF-OK', 30002713:'Abenync', 30002710:'Adiere', 30002704:'Adrallezoen', 30002639:'Adreland', 30002729:'Aetree', 30002658:'Agrallarier', 30002736:'Ainaille', 30002652:'Ala', 30002698:'Aliette', 30002712:'Alillere', 30002664:'Alles', 30002727:'Allipes', 30002678:'Alsottobier', 30002644:'Ambeke', 30002669:'Ansone', 30002702:'Archee', 30002648:'Ardene', 30002687:'Artisine', 30002724:'Assiettes', 30002701:'Atier', 30002695:'Ation', 30002709:'Auberulle', 
30002684:'Audaerne', 30002641:'Aufay', 30002680:'Augnais', 30002657:'Aunia', 30002716:'Aurcel', 30002656:'Auvergne', 30002717:'Aymaerne', 30002634:'Balle', 30002690:'Bamiette', 30002683:'Barmalie', 30002666:'Basgerin', 30002700:'Bawilan', 30002649:'Boillair', 30002661:'Botane', 30002715:'Bourynes', 30002699:'Brapelille', 30002703:'Brybier', 30002735:'Caretyn', 30002645:'Carrou', 30002688:'Chainelant', 30002667:'Chelien', 30002708:'Claysson', 30002682:'Colelie', 30002691:'Crielere', 30002705:'Croleur', 30002635:'Decon', 30002681:'Deltole', 30002646:'Direrie', 30002685:'Dodenvale', 30002659:'Dodixie', 30002706:'Doussivitte', 30002633:'Du Annes', 30002670:'Dunraelare', 30002693:'Egghelende', 30002660:'Eglennaert', 30002640:'Erme', 30002730:'Esmes', 30002673:'Estene', 30002651:'Fasse', 30002643:'Faurent', 30002677:'Fluekele', 30002663:'Foves', 30002734:'Fricoure', 30002674:'Gallareue', 30002725:'Goinard', 30002653:'Gratesier', 30002636:'Grinacanne', 30002647:'Ignoitton', 30002672:'Inghenges', 30002642:'Iyen-Oursta', 30002692:'Jel', 30002679:'Jolia', 30002722:'Lamadent', 30002728:'Lermireve', 30002637:'Metserel', 30002732:'Mirilene', 30002719:'Miroitem', 30002665:'Misneden', 30002671:'Nausschie', 30002650:'Ney', 30002694:'Odette', 30012715:'Odotte', 30022715:'Oirtlair', 30032715:'Olelon', 30002686:'Olettiers', 30002723:'Otou', 30002676:'Parchanier', 30002632:'Pettinck', 30002714:'Pozirblant', 30002733:'Pucherie', 30002662:'Pulin', 30002726:'Raeghoscon', 30002718:'Rancer', 30002697:'Ravarin', 30002721:'Rorsins', 30002654:'Schoorasana', 30002638:'Sharuveil', 30002689:'Sileperer', 30002675:'Stayme', 30002696:'Stegette', 30002711:'Stetille', 30002720:'Thelan', 30002668:'Trosquesere', 30042715:'Trossere', 30002707:'Unel', 30002731:'Vittenyn', 30002655:'Vylade', 30003585:'Aeter', 30003600:'Agaullores', 30003566:'Aimoguier', 30003571:'Anckee', 30003590:'Arasare', 30003595:'Arittant', 30003601:'Babirmoult', 30003574:'Boystin', 30003569:'Cadelanne', 30003565:'Conomette', 
30003605:'Eggheron', 30003570:'Elore', 30003599:'Faurulle', 30003586:'Gererique', 30003597:'Hare', 30003587:'Harner', 30003594:'Heluene', 30003579:'Larryn', 30003592:'Lazer', 30003575:'Lour', 30003576:'Maire', 30003568:'Meunvon', 30003580:'Niballe', 30003578:'Octanneve', 30003582:'Odinesyn', 30003577:'Oerse', 30003598:'Ogaria', 30003603:'Ondree', 30003596:'Oruse', 30003573:'Pertnineere', 30003604:'Pochelympe', 30003581:'Postouvin', 30003602:'Ratillose', 30003584:'Sarline', 30003593:'Stoure', 30003607:'Straloin', 30003606:'Toustain', 30003589:'Vecodie', 30003572:'Vevelonel', 30003583:'Weraroix', 30003588:'Yvaeroure', 30003591:'Yvelet', 30003567:'Yveve', 30001904:'0-7XA8', 30001889:'0G-A25', 30001856:'0GN-VO', 30001961:'0T-LIB', 30001837:'0Y1-M7', 30001880:'1H-I12', 30001952:'1H4V-O', 30001864:'2-V0KY', 30001949:'2B-3M4', 30001860:'2IGP-1', 30001841:'32-GI9', 30001947:'373Z-7', 30001868:'37S-KO', 30001867:'40GX-P', 30001894:'42-UOW', 30001882:'4A-XJ6', 30001858:'4GQ-XQ', 30001844:'4J-ZC9', 30001869:'4J9-DK', 30001940:'4XW2-D', 30001850:'57M7-W', 30001951:'5J-UEX', 30001930:'6QBH-S', 30001907:'6Y-0TW', 30001884:'7-X3RN', 30001845:'7R5-7R', 30001887:'8KQR-O', 30001879:'8O-OSG', 30001877:'8ZO-CK', 30001886:'9O-ZTS', 30001840:'9RQ-L8', 30001857:'9U6-SV', 30001954:'A-DZA8', 30001870:'A-GPTM', 30001950:'A-XASO', 30001896:'A4UG-O', 30001944:'B-2UL0', 30001910:'B-G1LG', 30001885:'BF-FVB', 30001895:'CBGG-0', 30001926:'CJF-1P', 30001881:'D9D-GD', 30001921:'DABV-N', 30001960:'DK6W-I', 30001912:'DP-2WP', 30001831:'DSS-EZ', 30001834:'E-C0SR', 30001909:'E7-WSY', 30001933:'EAWE-2', 30001942:'EOT-XL', 30001906:'F-TVAP', 30001957:'F7-ICZ', 30001888:'F9SX-1', 30001852:'FV-SE8', 30001853:'FZSW-Y', 30001848:'G-ME2K', 30001862:'GDEW-0', 30001873:'GDO-7H', 30001883:'GU-54G', 30001847:'HM-UVD', 30001871:'HQ-TDJ', 30001934:'I-3FET', 30001914:'I-ME3L', 30001843:'IP-MVJ', 30001920:'J-AYLV', 30001941:'J5NU-K', 30001851:'JS-E8E', 30001937:'JU-UYK', 30001948:'JVJ2-N', 30001901:'KP-FQ1', 
30001945:'L-A9FS', 30001876:'L0AD-B', 30001928:'L6B-0N', 30001917:'LB0-A1', 30001923:'LC-1ED', 30001833:'LGK-VP', 30001953:'LGL-SD', 30001832:'MB4D-4', 30001913:'MMR-LZ', 30001962:'NRT4-U', 30001874:'NZG-LF', 30001955:'O-CT8N', 30001938:'O-FTHE', 30001855:'O5Y3-W', 30001946:'OOO-FS', 30001863:'PSJ-10', 30001838:'Q-Q2S6', 30001935:'QCKK-T', 30001903:'QM-O7J', 30001859:'R8-5XF', 30001902:'RLDS-R', 30001936:'RP-H66', 30001924:'RPS-0K', 30001931:'RRWI-5', 30001900:'RV5-DW', 30001943:'RVRE-Z', 30001918:'S-BWWQ', 30001893:'S-DLKC', 30001891:'S91-TI', 30001911:'T-8UOF', 30001959:'T-NNJZ', 30001916:'T7-JNB', 30001842:'TG-Z23', 30001908:'TL-T9Z', 30001865:'U-WLT9', 30001898:'U2-BJ2', 30001927:'U6-FCE', 30001854:'UF-KKH', 30001875:'UJM-RD', 30001899:'UKYS-5', 30001892:'V1V-6F', 30001925:'VNPF-7', 30001836:'VTGN-U', 30001939:'W-Q233', 30001897:'W-VXL9', 30001872:'WBLF-0', 30001878:'WEQT-K', 30001839:'WHG2-7', 30001890:'WJO0-G', 30001849:'WNS-7J', 30001835:'X1E-OQ', 30001905:'X5O1-L', 30001958:'XFBE-T', 30001932:'Y-4U62', 30001846:'Y1-UQ2', 30001915:'YE17-R', 30001956:'Z-6YQC', 30001919:'Z-R96X', 30001929:'Z-XMUC', 30001861:'Z2-QQP', 30001866:'ZG8Q-N', 30001922:'ZH-KEV', 30003296:'0EK-NJ', 30003330:'0LTQ-C', 30003349:'0T-AMZ', 30003297:'1-NKVT', 30003336:'10UZ-P', 30003370:'2G38-I', 30003332:'2P-4LS', 30003304:'2Q-I6Q', 30003275:'2X-PQG', 30003356:'3-IN0V', 30003329:'31-MLU', 30003312:'35-RK9', 30003322:'3KNK-A', 30003324:'3MOG-V', 30003366:'4-JWWQ', 30003351:'4L-E5P', 30003310:'5-75MB', 30003290:'5-DSFH', 30003321:'5-FGQI', 30003302:'5-T0PZ', 30003346:'5-VKCN', 30003354:'51-5XG', 30003350:'57-YRU', 30003348:'5KS-AB', 30003280:'6-CZ49', 30003316:'6-U2M8', 30003315:'617I-I', 30003270:'6E-578', 30003303:'6R-PWU', 30003282:'8-JYPM', 30003320:'8V-SJJ', 30003301:'97X-CH', 30003373:'98Q-8O', 30003338:'9GYL-O', 30003341:'9U-TTJ', 30003319:'A-3ES3', 30003335:'A-SJ8X', 30003305:'A-ZLHX', 30003331:'A9D-R0', 30003293:'AAS-8R', 30003291:'AK-QBU', 30003273:'ATY-2U', 30003327:'BMNV-P', 
30003328:'BY-S36', 30003363:'CIS-7X', 30003371:'CY-ZLP', 30003358:'D-B7YK', 30003345:'D85-VD', 30003364:'DCHR-L', 30003314:'DP34-U', 30003359:'DUV-5Y', 30003355:'EF-F36', 30003337:'EN-VOD', 30003365:'EU0I-T', 30003281:'EZA-FM', 30003269:'F67E-Q', 30003276:'FD-MLJ', 30003367:'G-6SXJ', 30003360:'GRNJ-3', 30003308:'I-YGGI', 30003317:'I0AB-R', 30003311:'IIRH-G', 30003285:'JH-M2W', 30003279:'K5-JRD', 30003343:'KFR-ZE', 30003344:'KLYN-8', 30003300:'KTHT-O', 30003334:'LSC4-P', 30003284:'M2-CF1', 30003268:'MHC-R3', 30003318:'MXYS-8', 30003325:'NG-C6Y', 30003286:'PC9-AY', 30003277:'PF-346', 30003295:'PFP-GU', 30003283:'PVH8-0', 30003271:'Poitot', 30003292:'QWF-6P', 30003333:'RF-GGF', 30003353:'RLL-9R', 30003362:'RSS-KA', 30003340:'S-GKKR', 30003368:'S-U8A4', 30003299:'T-LIWS', 30003287:'T22-QI', 30003323:'TXW-EI', 30003347:'U0V6-T', 30003372:'U4-Q2V', 30003352:'UFXF-C', 30003298:'UM-Q7F', 30003306:'UTKS-5', 30003294:'V4-L0X', 30003339:'VLGD-R', 30003361:'VSIG-K', 30003309:'VV-VCR', 30003274:'X-BV98', 30003278:'X-M2LR', 30003288:'X-PYH5', 30003313:'XS-XAY', 30003326:'XYY-IA', 30003342:'Y-W6GF', 30003307:'Y9G-KS', 30003357:'Z-QENW', 30003289:'ZN0-SR', 30003369:'ZV-72W', 30003272:'ZVN5-H', 30001694:'Abai', 30001648:'Adahum', 30001717:'Adar', 30001697:'Ahkour', 30001649:'Ahrosseas', 30001741:'Ahteer', 30001730:'Alra', 30001647:'Anjedin', 30001740:'Arakor', 30001704:'Arkoz', 30001689:'Asesamy', 30001708:'Asezai', 30001723:'Assiad', 30001678:'Atoosh', 30001711:'Azerakish', 30001705:'Azhgabid', 30001670:'Baviasi', 30001664:'Chamume', 30001660:'Dabrid', 30001672:'Emrayur', 30001736:'Esa', 30001709:'Ferira', 30001698:'Gaknem', 30001663:'Gemodi', 30001714:'Ghishul', 30001716:'Goni', 30001646:'Goram', 30001661:'Gyerzen', 30001737:'Hath', 30001662:'Hibi', 30001674:'Hilaban', 30001680:'Hoona', 30001690:'Hostni', 30001731:'Ilas', 30001655:'Imeshasa', 30001696:'Iro', 30001726:'Iswa', 30001656:'Ivih', 30001653:'Jarzalad', 30001706:'Jinizu', 30001738:'Judra', 30001742:'Kari', 
30011672:'Kerepa', 30001682:'Keshirou', 30001692:'Kibursha', 30001712:'Lari', 30001687:'Lossa', 30001658:'Mani', 30001722:'Marthia', 30001654:'Matyas', 30001676:'Mimen', 30001691:'Mimime', 30001720:'Modun', 30001715:'Moutid', 30001651:'Nafomeh', 30001702:'Nafrivik', 30001683:'Nasesharafa', 30001695:'Nehkiah', 30001725:'Nosodnis', 30001665:'Nuzair', 30001688:'Onazel', 30001685:'Ordat', 30021672:'Pasha', 30001718:'Paye', 30001666:'Pera', 30001693:'Perdan', 30001707:'Phoren', 30001669:'Pimebeka', 30001652:'Pimsu', 30001727:'Rand', 30001700:'Remoriu', 30001686:'Rethan', 30001650:'Riramia', 30001724:'Rumida', 30001675:'Sacalan', 30031672:'Safilbab', 30001719:'Sagain', 30001721:'Saminer', 30001659:'Sehmosh', 30001657:'Seil', 30041672:'Seitam', 30001739:'Sharios', 30001673:'Shesha', 30001667:'Shousran', 30001729:'Sinid', 30001699:'Siyi', 30001728:'Sizamod', 30001703:'Taru', 30001671:'Prime', 30001645:'Tendhyes', 30001681:'Teshkat', 30001733:'Tew', 30001677:'Thashkarai', 30001684:'Tirbam', 30001644:'Tividu', 30001735:'Uhodoh', 30001679:'Unkah', 30001701:'Yanuel', 30001713:'Yasud', 30001710:'Yeder', 30001668:'Yong', 30001734:'Zehru', 30001732:'Zith', 30003615:'0-UVHJ', 30003647:'0M-103', 30003624:'1BWK-S', 30003617:'1QH-0K', 30003668:'2-3Q2G', 30003675:'3-QYVE', 30003632:'30-D5G', 30003642:'33CE-7', 30003609:'3DR-CR', 30003630:'5-O8B1', 30003640:'6-AOLS', 30003648:'6OYQ-Z', 30003671:'7D-PAT', 30003650:'A-1IJ9', 30003655:'A1RR-M', 30003656:'AR-5SY', 30003670:'C-XNUA', 30003644:'DCJ-ZT', 30003662:'EN-GTB', 30003612:'EOY-BG', 30003636:'G06-8Y', 30003653:'GW7P-8', 30003608:'H1-ESN', 30003633:'HB-FSO', 30003649:'HE5T-A', 30003621:'I1-BE8', 30003614:'IG-ZAM', 30003641:'IKTD-P', 30003634:'J1-KJP', 30003661:'JI-1UQ', 30003664:'JSI-LL', 30003625:'KMV-CQ', 30003635:'KW-1MV', 30003643:'L-P3XM', 30003665:'M-UC0S', 30003659:'MZPH-W', 30003616:'NCG-PW', 30003627:'NV-3KA', 30003645:'O36A-P', 30003657:'OE-4HB', 30003674:'P-UCRP', 30003613:'PNS7-J', 30003669:'Q1U-IU', 30003631:'R-YWID', 
30003626:'RKE-CP', 30003610:'RLTG-3', 30003628:'S-1LIO', 30003611:'S-EVIQ', 30003629:'S-KSWL', 30003654:'SF-XJS', 30003667:'SY0W-2', 30003639:'SZ6-TA', 30003673:'T-K10W', 30003637:'U-O2DA', 30003623:'U1TX-A', 30003663:'U5-XW7', 30003672:'V-LDEJ', 30003666:'V7-MID', 30003660:'W0X-MG', 30003622:'W8O-19', 30003638:'WV-0R2', 30003651:'Y-YHZQ', 30003646:'Z-LO6I', 30003652:'Z-SR1I', 30003618:'ZH3-BS', 30003619:'ZJ-QOO', 30003658:'ZK-YQ3', 30003620:'ZXA-V6', 30004874:'0P-U0Q', 30004843:'0VK-43', 30004849:'16AM-3', 30004830:'2PG-KN', 30004808:'3L3N-X', 30004868:'3Q1T-O', 30004828:'4-IT9G', 30004811:'4-P4FE', 30004878:'46DP-O', 30004865:'5-NZNW', 30004810:'6-IAFR', 30004816:'78R-PI', 30004872:'7KIK-H', 30004860:'7M4C-F', 30004869:'8-4KME', 30004862:'8-BEW8', 30004879:'9-980U', 30004826:'9-MJVQ', 30004850:'A-REKV', 30004831:'ABE-M2', 30004873:'B-6STA', 30004848:'B8HU-Z', 30004851:'BB-EKF', 30004804:'BW-WJ2', 30004818:'C-FD0D', 30004824:'C3-0YD', 30004841:'CCE-0J', 30004813:'D-9UEV', 30004806:'DT-PXH', 30004852:'DZ6-I5', 30004880:'EMIG-F', 30004835:'EQWO-Y', 30004867:'F-ZBO0', 30004821:'FE-6YQ', 30004876:'G-D0N3', 30004854:'G1-0UI', 30004814:'H-HWQR', 30004832:'IL-YTR', 30004839:'JI1-SY', 30004836:'JK-Q77', 30004883:'JV1V-O', 30004833:'KW-OAM', 30004827:'L2GN-K', 30004823:'M-4KDB', 30004881:'M-RPN3', 30004861:'MS1-KJ', 30004866:'NR8S-Y', 30004863:'NZW-ZO', 30004858:'OQTY-Z', 30004825:'PDF-3Z', 30004829:'PEK-8Z', 30004845:'Q0G-L8', 30004846:'Q5KZ-W', 30004855:'QCDG-H', 30004837:'QI9-42', 30004857:'QLU-P0', 30004815:'QRBN-M', 30004853:'R-XDKM', 30004871:'R1-IMO', 30004812:'RH0-EG', 30004819:'S-9RCJ', 30004805:'S4-9DN', 30004877:'T-AKQZ', 30004842:'T2-V8F', 30004870:'T6GY-Y', 30004844:'TY2X-C', 30004834:'U2U5-A', 30004807:'UALX-3', 30004822:'W-16DY', 30004803:'WB-AYY', 30004847:'WE-KK2', 30004864:'WSK-1A', 30004840:'X-1QGA', 30004875:'XGH-SH', 30004856:'XUDX-A', 30004859:'Y-EQ0C', 30004809:'Y-ORBJ', 30004838:'YF-P4X', 30004817:'ZD1-Z2', 30004820:'ZMV9-A', 30004882:'ZO-P5K', 
30003071:'Anka', 30003093:'Ayeroilen', 30003078:'Erkinen', 30003095:'Furskeshin', 30003076:'Gammel', 30003085:'Hakodan', 30003087:'Haras', 30003082:'Hatori', 30003067:'Huola', 30003072:'Iesa', 30003094:'Imata', 30003080:'Jarkkolen', 30003083:'Junsen', 30003069:'Kamela', 30003092:'Komaa', 30003068:'Kourmonen', 30003066:'Kuomi', 30003096:'Kurmaru', 30003089:'Kurniainen', 30003063:'Lamaa', 30003084:'Malpara', 30003075:'Myyhera', 30003073:'Netsalakka', 30003065:'Otelen', 30003088:'Oyonata', 30003081:'Ronne', 30003086:'Sahtogas', 30003090:'Saidusairos', 30003079:'Saikamon', 30003074:'Sasiekko', 30003097:'Satalama', 30003070:'Sosala', 30003091:'Tannakan', 30003064:'Tuomuta', 30003077:'Uusanen', 30002752:'Ahynada', 30002753:'Aikoro', 30002754:'Alikara', 30002776:'Annaro', 30002805:'Anttiri', 30002817:'Aramachi', 30002744:'Auviken', 30002740:'Eitu', 30002769:'Enderailen', 30002810:'Eranakko', 30002742:'Erila', 30002800:'Haatomo', 30002781:'Halaima', 30002758:'Hasama', 30002806:'Hasmijaala', 30002764:'Hatakani', 30002773:'Hogimo', 30002741:'Horkkisen', 30002774:'Huttaken', 30002796:'Hysera', 30002766:'Iivinen', 30002786:'Ikao', 30002793:'Inari', 30002788:'Inaro', 30002738:'Inoue', 30002792:'Irjunen', 30002739:'Isaziwa', 30002815:'Isenairos', 30002756:'Ishomilken', 30002804:'Isikesu', 30002777:'Isutaka', 30002748:'Jeras', 30002803:'Juunigaishi', 30002789:'Kaaputenen', 30002751:'Kaimon', 30002747:'Kakki', 30002782:'Kamio', 30002761:'Kassigainen', 30002797:'Kaunokka', 30002749:'Kausaaja', 30031392:'Komo', 30002737:'Konola', 30002767:'Kubinen', 30002771:'Kulelen', 30002802:'Kusomonmon', 30041392:'Laah', 30002760:'Manjonakko', 30002819:'Motsu', 30002780:'Muvolailen', 30002807:'Nagamanen', 30002757:'Nikkishina', 30002743:'Ohvosamon', 30002818:'Oichiya', 30002746:'Oijamon', 30002750:'Oiniken', 30002799:'Oisio', 30002811:'Onatoh', 30002779:'Ono', 30002795:'Oshaima', 30002808:'Oto', 30002775:'Paara', 30002772:'Rairomon', 30002745:'Saikanen', 30002816:'Saila', 30002783:'Sankkasen', 
30002785:'Santola', 30002791:'Sirppala', 30002765:'Sivala', 30002809:'Sujarento', 30002801:'Suroken', 30002813:'Tama', 30002812:'Tannolen', 30002778:'Tasabeshi', 30002763:'Tennen', 30002784:'Tintoh', 30002770:'Tunudan', 30002768:'Uedama', 30002814:'Uotila', 30040141:'Urhinichi', 30002755:'Usi', 30002759:'Uuna', 30002798:'Venilen', 30002787:'Waira', 30002790:'Waskisen', 30002762:'Yashunen', 30002794:'Yria', 30000147:'Abagawa', 30000125:'Ahtulaima', 30000185:'Airaken', 30000166:'Airmia', 30000178:'Akkilen', 30000200:'Akkio', 30000163:'Akora', 30000132:'Ansila', 30021407:'Aokannitoh', 30000203:'Eruka', 30000137:'Eskunen', 30000168:'Friggi', 30000199:'Fuskunen', 30000149:'Gekutami', 30000126:'Geras', 30000152:'Hampinen', 30000188:'Hentogaira', 30000133:'Hirtamon', 30000150:'Hurtoken', 30000134:'Hykkota', 30000169:'Ihakana', 30000159:'Ikami', 30000138:'Ikuchi', 30000182:'Inaya', 30000165:'Ishisomo', 30000119:'Itamo', 30000148:'Jakanerva', 30000121:'Jatate', 30000142:'Jita', 30000156:'Josameto', 30000176:'Keikaken', 30000189:'Kiainti', 30000141:'Kisogo', 30000181:'Korsiki', 30000124:'Kylmabe', 30000154:'Liekuri', 30000122:'Mahtista', 30000162:'Maila', 30000202:'Mastakomon', 30000140:'Maurasi', 30000164:'Messoya', 30000120:'Mitsolen', 30000145:'New Caldari', 30000143:'Niyabainen', 30000131:'Nomaa', 30000183:'Nuken', 30000155:'Obanen', 30000205:'Obe', 30000204:'Ohkunen', 30000136:'Ohmahailen', 30000186:'Oijanen', 30000158:'Olo', 30000174:'Onuse', 30000207:'Osaa', 30000180:'Osmon', 30000192:'Otanuomi', 30000157:'Otela', 30000171:'Otitoh', 30000172:'Otomainen', 30000196:'Otosela', 30000194:'Otsela', 30000135:'Outuni', 30000198:'Paala', 30000144:'Perimeter', 30000153:'Poinen', 30000161:'Purjola', 30000160:'Reisen', 30000146:'Saisio', 30010141:'Sakenta', 30000167:'Sakkikainen', 30020141:'Senda', 30000130:'Shihuken', 30000179:'Silen', 30000127:'Sirseshin', 30000175:'Soshin', 30000195:'Tasti', 30000128:'Tuuriainas', 30000201:'Uchoshi', 30000197:'Uemon', 30030141:'Uitra', 
30000177:'Ukkalen', 30000184:'Uminas', 30000129:'Unpas', 30000151:'Uoyonen', 30000139:'Urlen', 30000123:'Vaankalen', 30000170:'Vahunomi', 30000190:'Vasala', 30000173:'Vattuolen', 30000193:'Vouskiaho', 30000191:'Walvalin', 30000206:'Wirashoda', 30000187:'Wuos', 30002846:'0S1-GI', 30002823:'1-KCSA', 30002866:'1S-SU1', 30002864:'42G-OB', 30002883:'5-VFC6', 30002855:'6FS-CZ', 30002832:'6V-D0E', 30002848:'74-DRC', 30002885:'86L-9F', 30002868:'9-0QB7', 30002836:'A-YB15', 30002834:'AU2V-J', 30002850:'B1UE-J', 30002860:'B3ZU-H', 30002876:'BGMZ-0', 30002843:'BM-VYZ', 30002828:'BVRQ-O', 30002882:'BZ-BCK', 30002858:'C3J0-O', 30002838:'D-6PKO', 30002845:'EPCD-D', 30002826:'F48K-D', 30002881:'FB5U-I', 30002827:'FBH-JN', 30002878:'FZX-PU', 30002853:'G-KCFT', 30002861:'G4-QU6', 30002875:'GQ-7SP', 30002859:'GSO-SR', 30002857:'H7S-5I', 30002863:'HD-HOZ', 30002856:'HPV-RJ', 30002871:'HVGR-R', 30002877:'I2D3-5', 30002886:'IUU3-L', 30002887:'J-OAH2', 30002821:'JT2I-7', 30002872:'K76A-3', 30002873:'K95-9I', 30002847:'L-GY1B', 30002842:'L-TLFU', 30002849:'LE-67X', 30002865:'LEM-I1', 30002830:'LS3-HP', 30002869:'M-75WN', 30002852:'M3-H2Y', 30002840:'MN9P-A', 30002820:'N-JK02', 30002867:'ND-GL4', 30002851:'O31W-6', 30002884:'O5-YNW', 30002879:'O9K-FT', 30002870:'PNFW-O', 30002844:'Q-GICU', 30002829:'QX-4HO', 30002837:'QZX-L9', 30002874:'R1O-GN', 30002839:'RAI-0E', 30002880:'RQOO-U', 30002888:'S-LHPJ', 30002833:'SG-3HY', 30002831:'SH6X-F', 30002835:'SY-0AM', 30002841:'TA9T-P', 30002825:'UDVW-O', 30002824:'UJXC-B', 30002862:'V2-GZS', 30002854:'WNM-V0', 30002822:'XTJ-5Q', 30001551:'0J-MQW', 30001540:'1I5-0V', 30001553:'3ET-G8', 30001578:'4DH-ST', 30001558:'4HF-4R', 30001532:'5LAJ-8', 30001545:'6U-1RX', 30001574:'6W-6O9', 30001588:'7-QOYS', 30001561:'8P-LKL', 30001537:'A-J6SN', 30001539:'AG-SYG', 30001534:'AL-JSG', 30001555:'B6-XE8', 30001529:'B9EA-G', 30001576:'C-BHDN', 30001533:'C6C-K9', 30001557:'DFTK-D', 30001581:'DVN6-0', 30001530:'E-BFLT', 30001527:'E4-E8W', 30001535:'ETO-OT', 
30001571:'EU-WFW', 30001544:'FO1U-K', 30001569:'G-VFVB', 30001580:'GF-GR7', 30001573:'GTB-O4', 30001531:'GZM-KB', 30001575:'H4X-0I', 30001528:'HIK-MC', 30001583:'HPMN-V', 30001593:'JL-ZUQ', 30001556:'JLH-FN', 30001542:'JNG7-K', 30001567:'JVA-FE', 30001550:'K-BBYU', 30001543:'K-XJJT', 30001572:'K-YL9T', 30001536:'KPI-OW', 30001589:'KS8G-M', 30001592:'L-EUY2', 30001526:'L-WG68', 30001560:'L7-BLT', 30001587:'LH-LY1', 30001597:'M-NP5O', 30001554:'MOSA-I', 30001549:'N-PS2Y', 30001579:'OSW-0P', 30001538:'OTJ-4W', 30001586:'OTJ9-E', 30001547:'P-NI4K', 30001568:'P65-TA', 30001562:'Q-UVY6', 30001564:'QFU-4S', 30001565:'QQGH-G', 30001596:'QRH-BF', 30001577:'R-RE2B', 30001563:'RXA-W1', 30001591:'S-CUEA', 30001548:'T6T-BQ', 30001585:'U1-VHY', 30001566:'VK6-EZ', 30001541:'VX1-HV', 30001595:'WIW-X8', 30001594:'X-KHRZ', 30001584:'XR-ZL7', 30001552:'XT-1E0', 30001570:'Y4B-BQ', 30001546:'Y4OK-W', 30001559:'Y8K-5B', 30001582:'Z19-B8', 30001590:'ZWM-BB', 30000857:'0-YMBJ', 30000861:'15W-GC', 30000898:'2CG-5V', 30000881:'2ISU-Y', 30000876:'3G-LHB', 30000875:'9GI-FB', 30000883:'9SL-K9', 30000852:'A-DDGY', 30000860:'AW1-2I', 30000854:'B-S42H', 30000863:'C2X-M5', 30000892:'C8VC-S', 30000867:'D7-ZAC', 30000877:'DBT-GB', 30000879:'DL1C-E', 30000846:'E-OGL4', 30000856:'F-749O', 30000897:'F-G7BO', 30000853:'F-RT6Q', 30000850:'FY0W-N', 30000887:'GIH-ZG', 30000859:'GKP-YT', 30000865:'H-W9TY', 30000895:'IMK-K1', 30000847:'J-GAMP', 30000890:'K-6SNI', 30000874:'L-1HKR', 30000891:'L-VXTK', 30000848:'M-OEE8', 30000851:'MJI3-8', 30000864:'MSHD-4', 30000862:'N-FK87', 30000896:'NJ4X-S', 30000855:'NL6V-7', 30000870:'O-0ERG', 30000885:'OY-UZ1', 30000873:'PBD-0G', 30000866:'PNDN-V', 30000872:'Q-CAB2', 30000899:'QFF-O6', 30000886:'S8-NSQ', 30000868:'SH1-6P', 30000869:'TRKN-L', 30000878:'U-W3WS', 30000858:'UMI-KK', 30000849:'V0DF-2', 30000888:'V7-FB4', 30000893:'W-UQA5', 30000894:'W6VP-Y', 30000871:'WH-JCA', 30000882:'X-CFN6', 30000889:'XD-TOV', 30000884:'Y-PZHM', 30000880:'YLS8-J', 30000222:'0-R5TS', 
30000216:'05R-7A', 30000267:'0J3L-V', 30000262:'0MV-4W', 30000290:'0R-F2F', 30000301:'1-GBBP', 30000294:'1N-FJ8', 30000286:'1VK-6B', 30000288:'1W-0KS', 30000292:'2DWM-2', 30000228:'3HX-DL', 30000240:'4-HWWF', 30000255:'47L-J4', 30000251:'49-0LI', 30000239:'4GYV-Q', 30000299:'5T-KM3', 30000218:'5ZO-NZ', 30000289:'669-IX', 30000258:'6WW-28', 30000297:'6Y-WRK', 30000224:'7-K5EL', 30000287:'7-PO3P', 30000217:'7-UH4Z', 30000323:'7G-H7D', 30000312:'8-TFDX', 30000242:'8TPX-N', 30000318:'9-GBPD', 30000271:'97-M96', 30000247:'9OO-LH', 30000314:'A-QRQT', 30000317:'A3-RQ3', 30000259:'A8A-JN', 30000265:'AZBR-2', 30000211:'B-588R', 30000307:'B-E3KQ', 30000311:'BR-6XP', 30000278:'C-DHON', 30000302:'C-FP70', 30000273:'C-J7CR', 30000253:'DAYP-G', 30000257:'E-D0VZ', 30000283:'E-SCTX', 30000249:'EIDI-N', 30000279:'F-D49D', 30000233:'FA-DMO', 30000226:'FH-TTC', 30000227:'FMBR-8', 30000219:'FS-RFL', 30000213:'G-LOIT', 30000310:'G5ED-Y', 30000305:'G96R-F', 30000234:'GEKJ-9', 30000276:'H-1EOH', 30000225:'H-5GUI', 30000281:'H-EY0P', 30000268:'H-NOU5', 30000223:'H-UCD1', 30000214:'HE-V4V', 30000254:'IFJ-EL', 30000252:'IPAY-2', 30000277:'IR-DYY', 30000285:'IT-YAU', 30000325:'JZV-F4', 30000244:'K8X-6B', 30000246:'KRUN-N', 30000269:'KX-2UI', 30000319:'LS-JEP', 30000300:'LS9B-9', 30000208:'LZ-6SU', 30000272:'MA-XAP', 30000209:'MC6O-F', 30000321:'MGAM-4', 30000270:'MO-FIF', 30000280:'MQ-O27', 30000232:'MY-T2P', 30000236:'N-5QPW', 30000215:'N-HSK0', 30000212:'NCGR-Q', 30000230:'NFM-0V', 30000309:'O-LR1H', 30000250:'P3EN-E', 30000243:'PM-DWE', 30000316:'PX5-LR', 30000274:'Q-EHMJ', 30000256:'Q-L07F', 30000235:'Q-R3GP', 30000324:'Q3-BAY', 30000291:'R-P7KL', 30000320:'R-RSZZ', 30000298:'RVCZ-C', 30000260:'S-NJBB', 30000284:'S6QX-N', 30000261:'T-GCGL', 30000303:'T-ZWA1', 30000263:'TVN-FM', 30000210:'U54-1L', 30000229:'UH-9ZG', 30000313:'UL-4ZW', 30000282:'UNAG-6', 30000264:'V-NL3K', 30000248:'V-OJEN', 30000295:'VI2K-J', 30000322:'VORM-W', 30000238:'WBR5-R', 30000315:'WMBZ-U', 30000245:'X445-5', 
30000221:'X97D-W', 30000293:'XF-PWO', 30000275:'XSQ-TF', 30000237:'XV-8JQ', 30000306:'Y-ZXIO', 30000220:'Y0-BVN', 30000308:'Y5J-EU', 30000241:'YMJG-4', 30000231:'YXIB-I', 30000266:'Z-8Q65', 30000304:'ZA0L-U', 30000296:'ZLZ-1Z', 30001287:'0-BFTQ', 30001322:'0-O2UT', 30001295:'0-XIDJ', 30001270:'1-Y6KI', 30001352:'2IBE-N', 30001319:'2PLH-3', 30001333:'2TH-3F', 30001281:'3A1P-N', 30001318:'4-7IL9', 30001349:'42XJ-N', 30001340:'430-BE', 30001300:'4RX-EE', 30001335:'4S-PVC', 30001317:'65V-RH', 30001290:'6NJ8-V', 30001339:'6UQ-4U', 30001276:'6W-HRH', 30001328:'6ZJ-SC', 30001298:'8CIX-S', 30001273:'9-266Q', 30001279:'9-8BL8', 30001266:'9-R6GU', 30001312:'92D-OI', 30001284:'92K-H2', 30001306:'9IPC-E', 30001283:'A-AFGR', 30001285:'AA-YRK', 30001289:'AJCJ-1', 30001342:'AZ-UWB', 30001321:'B-CZXG', 30001346:'B3QP-K', 30001286:'BV-1JG', 30001355:'C2-DDA', 30001275:'CSOA-B', 30001272:'D-8SI1', 30001299:'D-SKWC', 30001294:'E-7U8U', 30001334:'E1F-E5', 30001307:'EIV-1W', 30001313:'EK2-ET', 30001344:'FHB-QA', 30001348:'G9D-XW', 30001347:'GVZ-1W', 30001331:'H-AJ27', 30001269:'H-PA29', 30001343:'H-S5BM', 30001292:'HBD-CC', 30001330:'HD-JVQ', 30001315:'JURU-T', 30001274:'K3JR-J', 30001264:'KK-L97', 30001350:'L-IE41', 30001337:'LHJ-2G', 30001332:'M2-2V1', 30001316:'MC6-5J', 30001278:'MQFX-Q', 30001309:'N-5476', 30001267:'N-Q5PW', 30001302:'N0C-UN', 30001277:'N5Y-4N', 30001280:'N6G-H3', 30001297:'O-TVTD', 30001341:'OJ-CT4', 30001282:'OZ-VAE', 30001268:'P-FSQE', 30001293:'P-GKF5', 30001329:'P-VYVL', 30001324:'PF-QHK', 30001310:'PZOZ-K', 30001326:'Q-7SUI', 30001323:'Q61Y-F', 30001305:'QHJ-FW', 30001265:'R-KZK7', 30001320:'RQ9-OZ', 30001308:'S-1ZXZ', 30001296:'SBL5-R', 30001314:'SE-SHZ', 30001338:'SHJO-J', 30001288:'SS-GED', 30001301:'V3X-L8', 30001303:'VG-6CH', 30001351:'VG-QW1', 30001327:'VVD-O6', 30001311:'W3KK-R', 30001336:'WLF-D3', 30001325:'XW-6TC', 30001291:'Y-4CFK', 30001261:'Y-W1Q3', 30001262:'Y6-HPG', 30001353:'YJ3-UT', 30001271:'YP-J33', 30001263:'Z-GY5S', 30001304:'Z0-TJW', 
30001345:'Z3U-GI', 30001354:'ZD4-G9', 30015305:'Adallier', 30005307:'Aidart', 30005302:'Alenia', 30005304:'Alentene', 30005329:'Amoderia', 30005311:'Amygnon', 30005326:'Annelle', 30005309:'Ansalle', 30005330:'Arraron', 30025305:'Channace', 30005331:'Chantrousse', 30005327:'Chesiette', 30005305:'Cistuvaert', 30035305:'Clacille', 30005324:'Claulenne', 30045305:'Clellinon', 30005298:'Costolle', 30005317:'Ekuenbiron', 30005315:'Eletta', 30005313:'Ellmay', 30005312:'Gisleres', 30005320:'Hevrice', 30005321:'Jovainnon', 30005308:'Jufvitte', 30005300:'Loes', 30005316:'Luse', 30005325:'Masalle', 30005296:'Melmaniel', 30005303:'Merolles', 30005299:'Muetralle', 30005295:'Murethand', 30005332:'Osmomonne', 30005297:'Ouelletta', 30005319:'Raneilles', 30005328:'Reblier', 30005310:'Scheenins', 30005322:'Scolluzer', 30005323:'Sortet', 30005333:'Stou', 30005314:'Theruesse', 30005334:'Tierijev', 30005301:'Tourier', 30005306:'Vaere', 30005318:'Vay', 30000596:'07-SLO', 30000566:'0RI-OV', 30000543:'0TYR-T', 30000573:'1-7B6D', 30000579:'1L-OEK', 30000534:'30-YOU', 30000535:'384-IN', 30000590:'3Q-VZA', 30000576:'4-EFLU', 30000582:'4-OS2A', 30000536:'4F89-U', 30000562:'5DE-QS', 30000554:'5E-CMA', 30000581:'5H-SM2', 30000609:'5NQI-E', 30000564:'5Q65-4', 30000557:'6-L4YC', 30000606:'7K-NSE', 30000542:'8-OZU1', 30000587:'A-4JOO', 30000610:'B-WQDP', 30000550:'C-62I5', 30000567:'C-LTXS', 30000568:'C0O6-K', 30000595:'DUO-51', 30000531:'E-JCUS', 30000577:'EIH-IU', 30000559:'F-3FOY', 30000578:'F-EM4Q', 30000572:'F-QQ5N', 30000600:'F5M-CC', 30000537:'G063-U', 30000545:'G9L-LP', 30000570:'G9NE-B', 30000553:'GGE-5Q', 30000544:'GM-50Y', 30000598:'GPD5-0', 30000552:'GPLB-C', 30000593:'GRHS-B', 30000547:'H-HHTH', 30000574:'H6-EYX', 30000569:'HD-AJ7', 30000592:'HPBE-D', 30000594:'J-RXYN', 30000538:'J7-BDX', 30000608:'JEQG-7', 30000548:'JQU-KY', 30000540:'L-FM3P', 30000605:'L-Z9KJ', 30000599:'LKZ-CY', 30000533:'LP1M-Q', 30000591:'M-MBRT', 30000556:'M3-KAQ', 30000529:'MKIG-5', 30000539:'MLQ-O9', 
30000580:'MN-Q26', 30000546:'MWA-5Q', 30000560:'OAIG-0', 30000607:'OR-7N5', 30000586:'Q-GQHN', 30000563:'R0-DMM', 30000589:'R4N-LD', 30000571:'SJJ-4F', 30000584:'SO-X5L', 30000565:'SR-4EK', 30000588:'TP7-KE', 30000601:'TZE-UB', 30000575:'U-HVIX', 30000555:'U104-3', 30000558:'UM-SCG', 30000549:'UY5A-D', 30000561:'UZ-QXW', 30000603:'V7G-RL', 30000532:'W-QN5X', 30000602:'WRL4-2', 30000541:'X-ARMF', 30000604:'XEN7-0', 30000585:'XQS-GZ', 30000530:'YHEN-G', 30000583:'YI-GV6', 30000597:'Z-A8FS', 30000551:'ZH-GKG'}
# ---------------------------------------------
import re
import json

bridges = []

def findSystem(name):
    # Reverse lookup: return the system id for a given name, or None if unknown.
    for key in systemdict:
        if systemdict[key] == name:
            return key
    return None

def parse(fin, friendly):
    with open(fin, 'rt') as f:
        content = f.read().splitlines()
    for line in content:
        line = line.strip()
        if line.startswith("#"):
            continue
        parts = re.split(" <-> ", line)
        if len(parts) != 2:
            print("Ignoring: " + line)
            continue
        sysA = parts[0].strip()
        sysB = parts[1].strip()
        # Each endpoint looks like "SystemName <planet>-<moon>".
        mA = re.search(r'([A-Za-z0-9 \-]+) ([0-9]+)-([0-9]+)', sysA)
        if mA is None:
            print("Ignoring: " + line)
            continue
        sysAid = findSystem(mA.group(1))
        if sysAid is None:
            print("Ignoring: " + line)
            continue
        mB = re.search(r'([A-Za-z0-9 \-]+) ([0-9]+)-([0-9]+)', sysB)
        if mB is None:
            print("Ignoring: " + line)
            continue
        sysBid = findSystem(mB.group(1))
        if sysBid is None:
            print("Ignoring: " + line)
            continue
        bridges.append({'nameA': mA.group(1), 'idA': sysAid, 'planetA': mA.group(2), 'moonA': mA.group(3),
                        'nameB': mB.group(1), 'idB': sysBid, 'planetB': mB.group(2), 'moonB': mB.group(3),
                        'friendly': friendly})

print("Parsing ...")
parse("jb_friendly.txt", True)
parse("jb_hostile.txt", False)
print("Found " + str(len(bridges)) + " bridges!")

result = {'bridges': bridges}
with open("jb.svg.json", 'w') as f:
    json.dump(result, f)
# ---------------------------------------------
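# A minimal, self-contained sketch of the line format parse() expects
# ("Name <planet>-<moon> <-> Name <planet>-<moon>"); the system names
# below are placeholders chosen for illustration, not taken from the
# input files themselves.
#
# import re
#
# line = "Jita 4-4 <-> Perimeter 2-1"
# pattern = r'([A-Za-z0-9 \-]+) ([0-9]+)-([0-9]+)'
#
# end_a, end_b = (s.strip() for s in re.split(" <-> ", line))
# mA = re.search(pattern, end_a)
# mB = re.search(pattern, end_b)
#
# print(mA.groups())  # ('Jita', '4', '4')
# print(mB.groups())  # ('Perimeter', '2', '1')

The sketch above mirrors the two-step parsing strategy (split on `" <-> "`, then capture name, planet, and moon with one regex per endpoint) as runnable code:

```python
import re

# Hypothetical jump-bridge line in the "Name P-M <-> Name P-M" format parse() expects.
line = "Jita 4-4 <-> Perimeter 2-1"
pattern = r'([A-Za-z0-9 \-]+) ([0-9]+)-([0-9]+)'

# Split into the two endpoints, then capture (name, planet, moon) from each.
end_a, end_b = (s.strip() for s in re.split(" <-> ", line))
mA = re.search(pattern, end_a)
mB = re.search(pattern, end_b)

print(mA.groups())  # ('Jita', '4', '4')
print(mB.groups())  # ('Perimeter', '2', '1')
```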
"MIT"
] | null | null | null | server/flaskr/lognorm.py | Intel-OpenVINO-Edge-AI-Scholarship/arcface-project | 86458a207c8e265bfc231736234ec38e4e70588b | [
"MIT"
] | 5 | 2020-09-26T01:15:39.000Z | 2022-02-10T02:11:54.000Z | server/flaskr/lognorm.py | Intel-OpenVINO-Edge-AI-Scholarship/arcface-project | 86458a207c8e265bfc231736234ec38e4e70588b | [
"MIT"
] | null | null | null | import keras
class_weights = [1, 0.5, 1, 10, 10]
def intermediate_model(x):
return (
keras.activations.elu((x-keras.backend.mean(x, axis=1)) / keras.backend.square(keras.backend.std(x))
) + keras.backend.square(x-keras.backend.mean(x, axis=1)) / keras.backend.std(x)
) + keras.backend.min(x, axis=1) * keras.backend.log(keras.backend.square(x-keras.backend.mean(x, axis=1))) | 49.375 | 111 | 0.683544 | 65 | 395 | 4.123077 | 0.338462 | 0.447761 | 0.242537 | 0.190299 | 0.66791 | 0.600746 | 0.481343 | 0.481343 | 0.481343 | 0.30597 | 0 | 0.034783 | 0.126582 | 395 | 8 | 111 | 49.375 | 0.742029 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0.142857 | false | 0 | 0.142857 | 0.142857 | 0.428571 | 0 | 0 | 0 | 0 | null | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 5 |
60e98850488ef8de08fb40cd5816838875d2c4c2 | 76 | py | Python | colamatch/__init__.py | FZJ-INM1-BDA/colamatch | 2c98c9a7a675eba724262b1ccd3f099c4656b803 | [
"BSD-3-Clause"
] | null | null | null | colamatch/__init__.py | FZJ-INM1-BDA/colamatch | 2c98c9a7a675eba724262b1ccd3f099c4656b803 | [
"BSD-3-Clause"
] | null | null | null | colamatch/__init__.py | FZJ-INM1-BDA/colamatch | 2c98c9a7a675eba724262b1ccd3f099c4656b803 | [
"BSD-3-Clause"
] | null | null | null | from .constellation_matching import ExhaustiveSampler, RandomSampler, match
| 38 | 75 | 0.881579 | 7 | 76 | 9.428571 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0.078947 | 76 | 1 | 76 | 76 | 0.942857 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | true | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | null | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 1 | 0 | 1 | 0 | 0 | 5 |